CN112121434A - Interaction method and device of special effect prop, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112121434A
CN112121434A
Authority
CN
China
Prior art keywords
special effect
virtual object
influence
prop
radiation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011069424.5A
Other languages
Chinese (zh)
Other versions
CN112121434B (en)
Inventor
练建锋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202011069424.5A priority Critical patent/CN112121434B/en
Publication of CN112121434A publication Critical patent/CN112121434A/en
Application granted granted Critical
Publication of CN112121434B publication Critical patent/CN112121434B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55: Controlling game characters or game objects based on the game progress
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/80: Special adaptations for executing a specific game genre or game mode
    • A63F13/837: Shooting of targets

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application provides an interaction method and apparatus for a special effect prop, an electronic device, and a computer-readable storage medium. The method includes: displaying a virtual scene in a human-computer interaction interface, and displaying at least one special effect prop projected to a target position in the virtual scene, where the special effect prop releases its special effect either when the time elapsed since projection reaches a target duration, or when any virtual object is sensed; and, before the special effect prop releases the special effect, displaying the influence area of the special effect with the target position as a reference. The method and apparatus improve the realism of immersive perception of the virtual scene and the resource utilization of the graphics processing hardware.

Description

Interaction method and device of special effect prop, electronic equipment and storage medium
Technical Field
The present disclosure relates to computer human-computer interaction technologies, and in particular, to an interaction method and apparatus for a special effect prop, an electronic device, and a computer-readable storage medium.
Background
Human-computer interaction technology for virtual scenes, based on graphics processing hardware, enables diversified interaction between virtual objects controlled by users or by artificial intelligence according to actual application requirements, and has wide practical value. For example, in virtual scenes such as military exercise simulations and games, a realistic battle process between virtual objects can be simulated.
Projection-type special effect props are widely embodied as grenades, mines, and similar props, used to injure enemy units and destroy virtual vehicles within a certain range. After such a prop is projected, other virtual objects need to move away before it releases its special effect, so as to avoid being affected by it.
In the related art, a user must continuously adjust the viewing angle and position of the controlled virtual object to locate the exact position of a projection-type special effect prop in the surrounding scene and so avoid its effect. Such frequent adjustment operations impair the realism of immersive perception of the virtual scene and excessively consume graphics processing hardware resources.
Disclosure of Invention
Embodiments of the present application provide an interaction method and apparatus for a special effect prop, an electronic device, and a computer-readable storage medium, which can improve the realism of immersive perception of a virtual scene and the resource utilization of graphics processing hardware.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an interaction method of a special effect prop, which comprises the following steps:
displaying a virtual scene in a human-computer interaction interface, and displaying at least one special effect prop projected to a target position in the virtual scene;
the special effect prop releases its special effect either when the time elapsed since projection reaches a target duration, or when any virtual object is sensed;
and before the special effect prop releases the special effect, displaying an influence area of the special effect by taking the target position as a reference.
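For illustration only (not part of the claimed scheme), the two release triggers described in the steps above, a timer reaching the target duration or any virtual object being sensed, might be sketched as follows; all class and parameter names are hypothetical:

```python
import math
import time

class SpecialEffectProp:
    """Illustrative model of a projected special effect prop (names are hypothetical)."""

    def __init__(self, target_position, target_duration, sensing_radius):
        self.target_position = target_position  # (x, y) where the prop landed
        self.target_duration = target_duration  # seconds until the timed release
        self.sensing_radius = sensing_radius    # proximity-trigger distance
        self.projected_at = time.monotonic()

    def should_release(self, virtual_object_positions):
        """Release when the projected duration is reached, or any object is sensed."""
        if time.monotonic() - self.projected_at >= self.target_duration:
            return True
        return any(
            math.dist(self.target_position, p) <= self.sensing_radius
            for p in virtual_object_positions
        )
```

A game loop would poll `should_release` each tick and, while it returns False, keep the influence area on screen.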
An embodiment of the present application provides an interaction apparatus for a special effect prop, including:
the prop display module is used for displaying a virtual scene in a human-computer interaction interface, and displaying at least one special-effect prop projected to a target position in the virtual scene;
the special effect prop releases its special effect either when the time elapsed since projection reaches a target duration, or when any virtual object is sensed;
and the area display module is used for displaying the influence area of the special effect by taking the target position as a reference before the special effect prop releases the special effect.
In the above scheme, the area display module is further configured to obtain the release distance that the special effect prop can achieve from the target position, and, with the target position as the geometric center, display as the influence area either a regular geometric shape or an irregular geometric shape whose radiation distance is the release distance.
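As a minimal sketch of the regular-shape case above (the circular test is an assumption; the patent also allows irregular shapes), membership in the influence area reduces to a distance check against the release distance:

```python
import math

def in_influence_area(target_position, release_distance, point):
    """Regular (circular) influence area: the release distance is the radiation
    distance measured from the geometric center, i.e. the target position."""
    return math.dist(target_position, point) <= release_distance
```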
In the above scheme, the area display module is further configured to obtain a maximum release distance that can be achieved when the special effect prop is released from the target position; attenuating the maximum release distance based on a protection capability parameter of a target virtual object, and taking the attenuated release distance as a release distance which can be realized by the special-effect prop from the target position; wherein the target virtual object is an arbitrary virtual object in the virtual scene.
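The attenuation step above could be sketched as follows; representing the protection capability parameter as a factor in [0, 1] is an assumption for illustration, not the patent's definition:

```python
def effective_release_distance(max_release_distance, protection_capability):
    """Attenuate the maximum release distance by the target virtual object's
    protection capability parameter (assumed here to be a factor in [0, 1])."""
    protection_capability = min(max(protection_capability, 0.0), 1.0)
    return max_release_distance * (1.0 - protection_capability)
```

A better-protected target thus sees a smaller influence area drawn around the target position.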
In the above scheme, the area display module is further configured to display a plurality of radially arranged influence areas with the target position as a reference; wherein different said areas of influence characterize different degrees of influence of said special effect.
In the above scheme, the area display module is further configured to display a plurality of radially arranged influence areas between the radiation boundary corresponding to the radiation distance from the radiation starting point to the special effect prop, with the target position as the radiation starting point.
In the above scheme, the area display module is further configured to sequentially display a plurality of the affected areas; wherein the sequence is arranged in a direction radiating outward from the radiation starting point to indicate a moving direction away from the special effect prop.
In the above scheme, the area display module is further configured to simultaneously display the plurality of influence areas according to the display parameter corresponding to each of the influence areas; the display parameters of the plurality of influence areas are in an attenuation trend and are attenuated according to the direction radiating outwards from the radiation starting point so as to indicate the moving direction far away from the special effect prop.
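One hedged reading of the display parameters above, using opacity as the parameter and geometric decay as the attenuation trend (both are illustrative assumptions):

```python
def display_parameters(num_areas, base_alpha=1.0, decay=0.6):
    """Display parameters (here: opacity) for the radially arranged influence
    areas, attenuating outward from the radiation starting point so that the
    fading rings suggest the direction of movement away from the prop."""
    return [base_alpha * (decay ** i) for i in range(num_areas)]
```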
In the above scheme, the area display module is further configured to obtain a release distance that can be reached when the special effect of the special effect prop is released from the target position, and determine the release distance as a radiation distance; dividing a radiation range corresponding to the radiation distance into a plurality of influence areas with different influence degrees along a radiation direction by taking the target position as a radiation starting point; wherein the radiation direction is a direction outward from the radiation origin.
In the above scheme, the area display module is further configured to divide a radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation sections of the intensity of the special effect according to an attenuation characteristic of the intensity of the special effect prop along the radiation direction.
In the above scheme, the area display module is further configured to determine an attenuation characteristic of a state value of the target virtual object along the radiation direction according to a protection capability parameter of the target virtual object and an attenuation characteristic of the strength of the special effect prop along the radiation direction; dividing a radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the state value of the target virtual object according to the attenuation characteristic of the state value of the target virtual object along the radiation direction; wherein the target virtual object is an arbitrary virtual object in the virtual scene.
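The division of the radiation range into influence areas described above might be sketched like this; equal-width rings and the sample attenuation function are assumptions, since the patent only requires that the intervals follow the attenuation characteristic:

```python
def divide_influence_areas(radiation_distance, intensity_at, num_areas):
    """Divide the radiation range into ring-shaped influence areas along the
    radiation direction, labeling each with the special effect intensity at its
    inner edge. intensity_at is an assumed attenuation function of the distance
    from the radiation starting point."""
    step = radiation_distance / num_areas
    areas = []
    for i in range(num_areas):
        inner, outer = i * step, (i + 1) * step
        areas.append({"inner": inner, "outer": outer,
                      "intensity": intensity_at(inner)})
    return areas
```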
In the above scheme, the area display module is further configured to display at least one of the following prompt messages: first timing information, prompting the time at which the special effect prop will release its special effect; second timing information, prompting the time required to leave the influence area; direction prompt information, prompting the direction that leads out of the influence area fastest; and influence prompt information, prompting the degree of influence caused when the special effect prop releases its special effect.
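For a circular influence area, the direction prompt and second timing information above admit a simple sketch: the fastest way out is radially away from the radiation starting point, and the time needed is the remaining distance divided by the movement speed. This geometry and all names are illustrative assumptions:

```python
import math

def escape_prompt(object_position, target_position, radiation_distance, move_speed):
    """Compute the fastest escape direction (a unit vector pointing away from the
    radiation starting point) and the time required to leave the influence area."""
    dx = object_position[0] - target_position[0]
    dy = object_position[1] - target_position[1]
    dist = math.hypot(dx, dy)
    remaining = max(radiation_distance - dist, 0.0)
    # Degenerate case: standing exactly on the prop; any direction works.
    direction = (dx / dist, dy / dist) if dist > 0 else (1.0, 0.0)
    return direction, remaining / move_speed
```

Comparing this time against the first timing information (time until release) tells the user whether escape is still possible.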
In the above scheme, the prop display module is further configured to display, in response to a projection operation for the special effect prop, a process of projecting the special effect prop to the target position by a first virtual object in the virtual scene.
In the foregoing solution, the area display module is further configured to display an influence area in the view angle of the first virtual object with the target position as a reference when the distance between the first virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect.
In the foregoing solution, the area display module is further configured to display, when the special effect prop is within the visible range of the first virtual object and the special effect prop has not released the special effect, an influence area of the special effect in the view angle of the first virtual object based on the target position.
In the above scheme, the area display module is further configured to, when a distance between a second virtual object and the target position is smaller than a release distance and the special effect prop has not released the special effect, display an influence area of the special effect in an angle of view of the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, and the first virtual object is a virtual object for projecting the special effect prop.
In the above scheme, the area display module is further configured to, when the special effect prop is within a visible range of a second virtual object and the special effect prop has not released a special effect, display an influence area of the special effect in an angle of view of the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, and the first virtual object is a virtual object for projecting the special effect prop.
In the above scheme, the area display module is further configured to, when it is determined that the special effect prop will affect the state of the first virtual object when releasing the special effect according to a motion trend of the first virtual object, display an affected area of the special effect in an angle of view of the first virtual object and/or the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, the distance between the second virtual object and the first virtual object is smaller than a visual distance threshold value, and the first virtual object is a virtual object for projecting the special effect prop.
In the above scheme, the area display module is further configured to, when a distance between a third virtual object and the target position is smaller than a release distance and the special effect prop has not released the special effect, display an influence area of the special effect in an angle of view of the third virtual object with the target position as a reference; the third virtual object and the first virtual object respectively belong to a group which is confronted with each other, and the first virtual object is a virtual object which projects the special effect prop.
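The display conditions in the schemes above share a common gate, which might be sketched as follows for any viewing virtual object (first, second, or third); the boolean parameters are illustrative simplifications of the visibility and release checks:

```python
import math

def should_show_area(viewer_position, target_position, release_distance,
                     prop_visible, effect_released):
    """Gate for rendering the influence area in a virtual object's view: show it
    while the special effect has not been released and either the viewer is
    within the release distance or the prop is within the viewer's visible range."""
    if effect_released:
        return False
    within = math.dist(viewer_position, target_position) < release_distance
    return within or prop_visible
```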
In the above scheme, the area display module is further configured to stop displaying the influence area in the virtual scene after the special effect prop releases the special effect.
An embodiment of the present application provides an electronic device for interaction with a special effect prop, the electronic device including:
a memory for storing executable instructions;
a processor configured to implement the interaction method for the special effect prop provided in the embodiments of the present application when executing the executable instructions stored in the memory.
An embodiment of the present application provides a computer-readable storage medium storing executable instructions which, when executed by a processor, implement the interaction method for the special effect prop provided in the embodiments of the present application.
The embodiment of the application has the following beneficial effects:
Before the special effect prop releases its special effect, the influence area of the special effect is displayed with the projected target position as a reference, visually conveying the exact position and influence range of the prop. A user can therefore move away from the prop through efficient human-computer interaction, preserving good immersive perception of the virtual scene while significantly reducing the resource consumption of the graphics processing hardware for interaction-related computation.
Drawings
Fig. 1 is a schematic diagram of an application scenario provided by the related art;
FIG. 2 is a diagram of an application scenario provided by the related art;
fig. 3A and 3B are schematic application modes of an interaction method of special effects props provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of an electronic device 500 provided in an embodiment of the present application;
fig. 5A, 5B, and 5C are schematic flow charts of an interaction method of the special effect prop provided in the embodiment of the present application;
fig. 6 is a schematic flowchart of an interaction method of the special effect prop according to an embodiment of the present application;
fig. 7A is a schematic flowchart of an interaction method of the special effect prop according to the embodiment of the present application;
fig. 7B and fig. 7C are schematic diagrams illustrating a principle of dividing a plurality of influence areas according to an embodiment of the present application;
fig. 8A, 8B, and 8C are application scenario diagrams of an interaction method of a special effect prop according to an embodiment of the present application;
fig. 9A and 9B are schematic flow charts of an interaction method of the special effect prop provided in the embodiment of the present application;
fig. 10A, 10B, and 10C are schematic diagrams illustrating an interaction method of the special effect prop according to an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions and advantages of the present application clearer, the present application will be described in further detail with reference to the attached drawings, the described embodiments should not be considered as limiting the present application, and all other embodiments obtained by a person of ordinary skill in the art without creative efforts shall fall within the protection scope of the present application.
In the following description, the terms "first", "second", and the like are used only to distinguish similar objects and do not denote a particular order or importance. Where permissible, the order of "first" and "second" objects may be interchanged, so that the embodiments of the present application described herein can be practiced in orders other than those illustrated or described.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the application.
Before further detailed description of the embodiments of the present application, terms and expressions referred to in the embodiments of the present application will be described, and the terms and expressions referred to in the embodiments of the present application will be used for the following explanation.
1) In response to: indicates the condition or state on which a performed operation depends. When that condition or state is satisfied, the operation or operations may be performed in real time or with a set delay; unless otherwise specified, no restriction is placed on the order in which the operations are performed.
2) Client: an application program running in the terminal to provide various services, such as a game client or a military exercise simulation client.
3) Virtual scene: the application program displays (or provides) a virtual scene when running on the terminal. The virtual scene may be a simulation environment of a real world, a semi-simulation semi-fictional virtual environment, or a pure fictional virtual environment. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene, or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, ocean, etc., the land may include environmental elements such as deserts, cities, etc., and a user may control a virtual object to move in the virtual scene.
4) Virtual object: the image of any person or thing that can interact in the virtual scene, or any movable object in the virtual scene. The movable object may be a virtual character, a virtual animal, an animation character, etc., such as a character, an animal, a plant, an oil drum, a wall, or a stone displayed in the virtual scene. The virtual object may be an avatar representing the user in the virtual scene. A virtual scene may include a plurality of virtual objects, each having its own shape and volume and occupying a portion of the space in the virtual scene.
For example, the virtual object may be a user Character controlled by an operation on the client, an Artificial Intelligence (AI) set in a virtual scene match by training, or a Non-user Character (NPC) set in a virtual scene interaction. For example, the virtual object may be a virtual character that is confrontationally interacted with in a virtual scene. For example, the number of virtual objects participating in the interaction in the virtual scene may be preset, or may be dynamically determined according to the number of clients participating in the interaction.
Taking a shooting game as an example, the user may control a virtual object to fall freely, glide, open a parachute, run, jump, climb, bend over, and move on land, or control a virtual object to swim, float, or dive in the sea. The user may also control a virtual object to move in the virtual scene by riding a virtual vehicle, such as a virtual car, a virtual aircraft, or a virtual yacht; the above scenes are merely examples and are not limiting. The user can also control the virtual object to interact antagonistically with other virtual objects through special effect props; for example, a special effect prop may be a projection-type prop such as a grenade, a cluster mine, a land mine, or a viscous grenade, and the type of special effect prop is not specifically limited in this application.
5) Scene data, representing various features that objects in the virtual scene are exposed to during the interaction, may include, for example, the location of the objects in the virtual scene. Of course, different types of features may be included depending on the type of virtual scene; for example, in a virtual scene of a game, scene data may include a time required to wait for various functions provided in the virtual scene (depending on the number of times the same function can be used within a certain time), and attribute values indicating various states of a game character, for example, a life value (also referred to as a red amount) and a magic value (also referred to as a blue amount), and the like.
6) The special effect prop is one of props in a virtual scene and is used for releasing the special effect when the duration of the projected special effect reaches the target duration or releasing the special effect when any virtual object is sensed. The special effect prop can be a throwing prop of grenades, cluster mines, land mines, viscous grenades and the like, and can also be a shooting prop of grenades, shells and the like.
Referring to fig. 1 and 2, fig. 1 and 2 are schematic diagrams of application scenarios provided by the related art. As shown in fig. 1, in the virtual scene, an enemy has thrown a grenade (i.e., the above-mentioned special effect prop) near the virtual object controlled by the user, but the prompt information 101 for the grenade's danger area is unclear: the user cannot determine the distance between the virtual object and the grenade from the existing prompt information 101, cannot determine in which direction to escape, and cannot determine how far to move to avoid the grenade's damage. As shown in fig. 2, when the virtual object escapes the grenade's danger area (i.e., will not be damaged when the special effect prop releases its special effect), the prompt information 101 in fig. 1 is automatically hidden.
In the related technology, the prompting method of the dangerous area of the grenade is not friendly to the user, and the prompting process of the dangerous area of the grenade is divided into two stages. The first stage is as follows: when the enemy has thrown the grenade to the side of the virtual object controlled by the user and the virtual object is in the danger area of the grenade, the user is prompted to be in the danger area of the grenade as shown in fig. 1. And a second stage: when the user finds that the user is in the dangerous area of the grenade, the user can manipulate the virtual object to move in the direction away from the grenade, and when the virtual object escapes from the dangerous area of the grenade, the prompt message 101 in fig. 1 is automatically hidden, as shown in fig. 2.
The related art has the following technical problems: for a virtual object attacked by a grenade, the prompt is not clear enough, so that a user cannot clearly know the distance between the grenade and the user, cannot determine the direction from which the user escapes, and cannot determine how far the user needs to escape to avoid the harm of the grenade. For the virtual object throwing the grenade, the user cannot clearly judge whether the grenade thrown by the user can damage the enemy.
In view of the above technical problems, embodiments of the present application provide an interaction method for a special effect prop, which improves the realism of immersive perception of a virtual scene and the resource utilization of graphics processing hardware. To facilitate understanding of the method, an exemplary implementation scenario is first described: the virtual scene may be output entirely by a terminal, or output through cooperation between a terminal and a server.
In some embodiments, the virtual scene may be a picture presented in a military exercise simulation, and a user may simulate a tactic, a strategy or a tactics through virtual objects belonging to different teams in the virtual scene, so that the virtual scene has a great guiding effect on the command of military operations.
In some embodiments, the virtual scene may be an environment in which game characters interact, for example, by playing against each other in the virtual scene; two-way interaction can be performed by controlling the actions of the virtual objects, allowing the user to relieve everyday stress during the game.
In one implementation scenario, referring to fig. 3A, fig. 3A is a schematic diagram of an application mode of the interaction method for the special effect prop provided in the embodiment of the present application. It is applicable to application modes in which the calculation of data related to the virtual scene 100 is completed entirely by the graphics processing hardware of the terminal 400, such as a game in standalone/offline mode, where the output of the virtual scene is completed by a terminal 400 such as a smartphone, a tablet computer, or a virtual reality/augmented reality device.
As an example, types of graphics processing hardware include a Central Processing Unit (CPU) and a Graphics Processing Unit (GPU).
When forming the visual perception of the virtual scene 100, the terminal 400 calculates the data required for display through its graphics computing hardware, completes the loading, parsing, and rendering of the display data, and outputs, on its graphics output hardware, video frames capable of forming visual perception of the virtual scene; for example, two-dimensional video frames are displayed on the screen of a smartphone, or video frames realizing a three-dimensional display effect are projected on the lenses of augmented reality/virtual reality glasses. In addition, to enrich the perception effect, the device may also produce one or more of auditory, tactile, motion, and taste perception by means of different hardware.
As an example, the terminal 400 runs a client 410 (e.g. a standalone version of a game application), and outputs a virtual scene including role play during the running process of the client 410, wherein the virtual scene is an environment for interaction of game characters, such as a plain, a street, a valley, and the like for fighting the game characters; the virtual object 110 and the special effect prop 120 are included in the virtual scene, the virtual object 110 may be a game character controlled by a user (or called a player), that is, the virtual object 110 is controlled by a real user, and will move in the virtual scene in response to an operation of the real user on a controller (including a touch screen, a voice-controlled switch, a keyboard, a mouse, a joystick, and the like), for example, when the real user moves the joystick to the left, the virtual object will move to the left in the virtual scene, and can also remain stationary in place, jump, and use various functions (such as skills and props); special effect item 120 may be a special effect item projected by virtual object 110 in the virtual scene, or may be a special effect item projected by the remaining virtual objects (virtual objects other than virtual object 110, such as an enemy or teammate of virtual object 110).
For example, a target location in the virtual scene 100 presents special effects prop 120; before the special effect prop 120 releases the special effect, an influence area of the special effect is displayed with the target position as a reference; the user, based on the presented area of influence, controls, via client 410, virtual object 110 to leave the area of influence to avoid affecting the state (e.g., life value or visual range, etc.) of virtual object 110 when special effect prop 120 releases the special effect.
In another implementation scenario, referring to fig. 3B, fig. 3B is a schematic diagram of an application mode of the interaction method for special effect props provided in the embodiment of the present application. This mode is applied to the terminal 400 and the server 200, and is suited to relying on the computing capability of the server 200 to complete the virtual scene computation while outputting the virtual scene at the terminal 400.
Taking the formation of the visual perception of the virtual scene 100 as an example, the server 200 computes the display data related to the virtual scene and sends it to the terminal 400; the terminal 400 relies on its graphics computing hardware to complete the loading, parsing and rendering of the received display data, and relies on its graphics output hardware to output the virtual scene and form the visual perception; for example, two-dimensional video frames can be presented on the screen of a smartphone, or video frames realizing a three-dimensional display effect can be projected onto the lenses of augmented reality/virtual reality glasses. For other forms of perception of the virtual scene, it can be understood that they are formed by means of the corresponding hardware outputs of the terminal, e.g., auditory perception using a speaker output, tactile perception using a vibrator output, and so on.
As an example, the terminal 400 runs a client 410 (e.g., a network version of a game application) and performs game interactions with other users by connecting to a game server (i.e., the server 200). The terminal 400 outputs the virtual scene 100 of the client 410, which includes a virtual object 110 and a special effect prop 120. The virtual object 110 may be a game character controlled by a user; that is, the virtual object 110 is controlled by a real user and moves in the virtual scene in response to operations of the real user on a controller (a touch screen, a voice-controlled switch, a keyboard, a mouse, a joystick, and the like). For example, when the real user moves the joystick to the left, the virtual object moves to the left in the virtual scene; the virtual object can also remain stationary in place, jump, and use various functions (such as skills and props). The special effect prop 120 may be a special effect prop projected in the virtual scene by the virtual object 110, or by other virtual objects (virtual objects other than the virtual object 110, such as an enemy or teammate of the virtual object 110).
For example, a special effect prop 120 is presented at a target position in the virtual scene 100. Before the special effect prop 120 releases its special effect, the server 200 determines, according to the parameters of the special effect prop 120, the influence range (also called the radiation range) when the special effect prop 120 releases the special effect, and divides the influence range into a plurality of influence areas. The server 200 sends rendering data corresponding to the plurality of influence areas to the client 410. After receiving the rendering data, the client 410 displays the influence areas of the special effect with the target position as a reference. Based on the presented influence areas, the user controls, via the client 410, the virtual object 110 to leave the influence areas, so that the state of the virtual object 110 (e.g., its life value or visual range) is not affected when the special effect prop 120 releases the special effect.
In some embodiments, the terminal 400 may implement the interaction method for special effect props provided in the embodiments of the present application by running a computer program. For example, the computer program may be a native program or software module in an operating system; a native application (APP), i.e., a program that needs to be installed in an operating system to run, such as a game APP (i.e., the above-mentioned client 410); an applet, i.e., a program that can run simply by being downloaded into a browser environment; or a game applet that can be embedded in any APP. In general, the computer program described above may be any form of application, module, or plug-in.
The embodiments of the present application may be implemented by means of Cloud Technology (Cloud Technology), which refers to a hosting Technology for unifying series resources such as hardware, software, and network in a wide area network or a local area network to implement data calculation, storage, processing, and sharing.
Cloud technology is a general term for the network technology, information technology, integration technology, management platform technology, application technology, and the like applied on the basis of the cloud computing business model; it can form a resource pool that is used on demand, flexibly and conveniently. Cloud computing technology will become an important support, since the background services of technical network systems require large amounts of computing and storage resources.
As an example, the server 200 may be an independent physical server, may be a server cluster or a distributed system formed by a plurality of physical servers, and may also be a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a web service, cloud communication, a middleware service, a domain name service, a security service, a CDN, and a big data and artificial intelligence platform. The terminal 400 may be, but is not limited to, a smart phone, a tablet computer, a laptop computer, a desktop computer, a smart speaker, a smart watch, and the like. The terminal 400 and the server 200 may be directly or indirectly connected through wired or wireless communication, and the embodiment of the present application is not limited thereto.
Next, a structure of an electronic device provided in an embodiment of the present application is described, where the electronic device may be the terminal 400 shown in fig. 3A and 3B, referring to fig. 4, fig. 4 is a schematic structural diagram of an electronic device 500 provided in an embodiment of the present application, and the electronic device 500 shown in fig. 4 includes: at least one processor 510, memory 550, at least one network interface 520, and a user interface 530. The various components in the electronic device 500 are coupled together by a bus system 540. It is understood that the bus system 540 is used to enable communications among the components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 540 in fig. 4.
The Processor 510 may be an integrated circuit chip having signal processing capabilities, such as a general-purpose processor, a Digital Signal Processor (DSP), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, where the general-purpose processor may be a microprocessor or any conventional processor.
The user interface 530 includes one or more output devices 531 enabling presentation of media content, including one or more speakers and/or one or more visual display screens. The user interface 530 also includes one or more input devices 532, including user interface components to facilitate user input, such as a keyboard, mouse, microphone, touch screen display, camera, other input buttons and controls.
The memory 550 may be removable, non-removable, or a combination thereof. Exemplary hardware devices include solid state memory, hard disk drives, optical disk drives, and the like. Memory 550 optionally includes one or more storage devices physically located remote from processor 510.
The memory 550 may comprise volatile memory or nonvolatile memory, and may also comprise both volatile and nonvolatile memory. The nonvolatile memory may be a Read Only Memory (ROM), and the volatile memory may be a Random Access Memory (RAM). The memory 550 described in embodiments herein is intended to comprise any suitable type of memory.
In some embodiments, memory 550 can store data to support various operations, examples of which include programs, modules, and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for processing various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and processing hardware-based tasks;
a network communication module 552 for reaching other computing devices via one or more (wired or wireless) network interfaces 520, exemplary network interfaces 520 including: Bluetooth, Wireless Fidelity (WiFi), Universal Serial Bus (USB), etc.;
a presentation module 553 for enabling presentation of information (e.g., a user interface for operating peripherals and displaying content and information) via one or more output devices 531 (e.g., a display screen, speakers, etc.) associated with the user interface 530;
an input processing module 554 to detect one or more user inputs or interactions from one of the one or more input devices 532 and to translate the detected inputs or interactions.
In some embodiments, the interaction device for special effect props provided in the embodiments of the present application may be implemented in software. Fig. 4 illustrates an interaction device 555 for special effect props stored in the memory 550, which may be software in the form of a computer program, a plug-in, and the like, such as a game program or a military exercise simulation system. The interaction device 555 for special effect props comprises the following software modules: a prop display module 5551 and an area display module 5552. These modules are logical divisions and thus can be arbitrarily combined or further split according to the functions implemented. The functions of the respective modules are explained below.
The interaction method for the special effects prop provided by the embodiment of the application may be executed by the terminal 400 in fig. 3A alone, or may be executed by the terminal 400 and the server 200 in fig. 3B in a cooperation manner.
Next, an example in which the terminal 400 in fig. 3A alone executes the interaction method for special effect props provided by the embodiment of the present application is described. Referring to fig. 5A, fig. 5A is a schematic flowchart of the interaction method for special effect props provided in the embodiment of the present application; the steps shown in fig. 5A will be described.
It should be noted that the method shown in fig. 5A can be executed by various forms of computer programs run by the terminal 400 and is not limited to the client 410; for example, it can be executed by the operating system 551 or by the software modules and scripts described above. Therefore the client should not be considered as limiting the embodiments of the present application.
In step S101, a virtual scene is displayed in the human-computer interaction interface.
In some embodiments, a virtual scene may be displayed at a first-person perspective in a human-machine interface (e.g., to play a virtual object in a game at a player's own perspective); or displaying the virtual scene at a third person perspective (e.g., a player follows a virtual object in the game to play the game); the virtual scene can also be displayed at a bird's-eye view angle; the above-mentioned viewing angles can be switched arbitrarily.
As an example, the virtual object may be an object controlled by a user in a game or military simulation; of course, the virtual scene may also include other virtual objects, which may be controlled by other users or by robot programs. The virtual objects may be divided into teams, and teams may be in hostile or cooperative relationships; the teams in the virtual scene may include either or both of these relationships.
Taking the display of the virtual scene at the first-person perspective as an example, displaying the virtual scene in the human-computer interaction interface may include: determining the field-of-view area of the virtual object according to the viewing position and viewing angle of the virtual object within the complete virtual scene, and presenting the part of the complete virtual scene that lies in the field-of-view area; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. Because the first-person perspective is the most visually impactful perspective for the user, it can create the immersive perception of being personally on the scene during operation.
Taking the display of the virtual scene at the bird's-eye view as an example, displaying the virtual scene in the human-computer interaction interface may include: in response to a zoom operation on the panoramic virtual scene, presenting in the human-computer interaction interface a partial virtual scene corresponding to the zoom operation; that is, the displayed virtual scene may be a partial virtual scene relative to the panoramic virtual scene. This can improve the user's operability during the operation process and the efficiency of human-computer interaction.
In step S102, at least one special effect item projected to the target location is displayed in the virtual scene.
In some embodiments, the special effect prop may be a prop that a virtual object throws or launches. When the special effect prop is thrown, it may be a grenade, a landmine, or the like; when the special effect prop is launched, that is, projected automatically by a launching prop (equivalent to machine projection, where projecting can be understood as launching), it may be a tracking grenade, a shell, a bullet, or the like.
As an example, the special effect prop may be configured to release the special effect when the duration after being projected reaches a target duration; for example, after the virtual object throws a grenade, the special effect (e.g., an explosion or smoke release) is released once the timed duration after throwing reaches a duration threshold (which may be a default value or a set value). The special effect prop may also be configured to release the special effect when it senses any virtual object; for example, a landmine may sense the pressure applied by a virtual object through a mechanical sensor, sense the sound of a virtual object through an ultrasonic sensor, or sense the heat of a virtual object through an infrared sensor, and then release the special effect (e.g., an explosion or smoke release). Such diversified special effect props match actual military combat scenarios, so the military data obtained through simulation has higher utilization value.
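As a concrete illustration, the two trigger modes described above (timed release and sensing-based release) can be sketched as follows. This is a minimal sketch in Python; the class and parameter names (`EffectProp`, `duration_threshold`, `sense_radius`) are illustrative assumptions, not identifiers from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class EffectProp:
    """Minimal sketch of a special effect prop with two trigger modes."""
    duration_threshold: float = 3.0   # seconds before a timed prop (e.g. grenade) releases
    sense_radius: float = 1.5         # detection radius for a sensing prop (e.g. landmine)
    elapsed: float = 0.0

    def tick(self, dt: float) -> bool:
        """Timed trigger: release once elapsed time reaches the threshold."""
        self.elapsed += dt
        return self.elapsed >= self.duration_threshold

    def senses(self, prop_pos, object_pos) -> bool:
        """Proximity trigger: release when a virtual object enters the sense radius."""
        dx = object_pos[0] - prop_pos[0]
        dy = object_pos[1] - prop_pos[1]
        return (dx * dx + dy * dy) ** 0.5 <= self.sense_radius
```

In an actual engine, `tick` would be driven by the frame loop and `senses` by collision or sensor events; the thresholds correspond to the default or set values mentioned above.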
In step S103, before the special effect item releases the special effect, an influence area of the special effect is displayed with the target position as a reference.
Here, the shape of the area of influence may be a regular geometric shape, such as a circle, a sector, a square, or the like, but may of course be an irregular geometric shape. When the shape of the region of influence is a regular geometric shape, the reference is the geometric center; when the shape of the region of influence is an irregular geometric shape, the reference is the geometric center of gravity.
Here, the number of influence areas displayed with the target position as a reference may be one, or may be multiple (i.e., at least two); these cases are described separately below.
In some embodiments, when one influence area is displayed, referring to fig. 5B, fig. 5B is a schematic flowchart of the interaction method for special effect props provided in the embodiment of the present application. Based on fig. 5A, step S103 may include steps S1031 to S1033, where step S1032 and step S1033 are alternatives: one or the other is executed.
In step S1031, before the special effect item releases the special effect, a release distance that can be achieved by the special effect item from the target position is acquired.
In some embodiments, the release distance that the special effect prop can achieve from the target position may be a maximum release distance, or may be a value that is a preset proportion to the maximum release distance, for example, 80% of the maximum release distance; the preset ratio may be a default value or a value set by a user.
In other embodiments, the release distance that the special effect prop can achieve from the target position may also be determined by: obtaining the maximum release distance which can be realized when the special-effect prop is released from the target position; and attenuating the maximum release distance based on the protection capability parameter of the target virtual object, and taking the attenuated release distance as the release distance which can be realized by the special-effect prop from the target position.
Here, the protection capability parameter represents the protection capability of the target virtual object, and may be determined by a protection prop held by the target virtual object, such as body armor or shield.
Taking a circular influence area as an example, referring to fig. 10A, fig. 10A is a schematic diagram of the interaction method of the special effect prop provided in the embodiment of the present application. In fig. 10A, the circle center is the target position 501, and the radius A is the maximum release distance that can be achieved when the special effect prop releases its effect from the target position 501. Assuming that the protection capability parameter of the target virtual object attenuates the maximum release distance by 20%, the attenuated release distance is (1 - 20%)A = 80%A; that is, the influence area subsequently formed for this target virtual object is a circular area centered at the target position 501 with radius 80%A.
Here, the target virtual object is an arbitrary virtual object in a virtual scene.
As a first example, the target virtual object may be any virtual object in the virtual scene whose distance from the target position is less than the maximum release distance; that is, the target virtual object is within the influence range of the special effect to be released by the special effect prop.
For example, when there are a plurality of target virtual objects, the target virtual object with the smallest protection capability parameter value is determined among them; the maximum release distance is attenuated based on the protection capability parameter of that target virtual object, and the attenuated release distance is taken as the release distance the special effect prop can achieve from the target position. In this way, attenuation is based on the weakest protection capability, which ensures that every target virtual object is unaffected by the special effect when outside the influence area.
As a second example, the target virtual object may be the virtual object viewing the influence area. That is, the influence area displayed in the human-computer interaction interface is displayed in the perspective of the target virtual object.
For example, when the target virtual object is virtual object A, the influence area corresponding to virtual object A is presented by default in the perspective of virtual object A; of course, the influence area corresponding to another virtual object may also be presented by switching, for example, a virtual object whose distance from the target position is less than the maximum release distance (such as a virtual object in the same group or in a different group).
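The attenuation logic described above (attenuating the maximum release distance by the protection capability parameter, and, when there are multiple target virtual objects, by the smallest such parameter) can be sketched as follows. The function name and the interpretation of the protection parameter as an attenuation ratio in [0, 1) are assumptions made for illustration, matching the 20% example of fig. 10A.

```python
def attenuated_release_distance(max_release, protection_params):
    """Attenuate the maximum release distance by the weakest protection.

    protection_params: attenuation ratios in [0, 1), one per target virtual
    object; e.g. 0.2 means the prop's reach is reduced by 20% for that object.
    Using the minimum ratio yields the largest (most conservative) influence
    area, so every target object outside it is guaranteed to be safe.
    """
    if not protection_params:
        return max_release               # no targets in range: no attenuation
    weakest = min(protection_params)     # smallest protection capability value
    return max_release * (1.0 - weakest)
```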
In step S1032, a regular geometric shape with the release distance as the radiation distance is displayed as the influence region with the target position as the geometric center.
Here, the shape of the area of influence may be a regular geometric shape, such as a circle, a sector, a square, or the like. The radiation distance is a geometrical parameter for constraining the dimensions of the outer contour of the region of influence, the radiation distance having different geometrical meanings depending on the different geometries of the region of influence.
In some embodiments, when the influence area is a circle, the geometric center is the circle's center and the radiation distance is its radius; when the influence area is a square or rectangle, the geometric center is the intersection of the diagonals and the radiation distance is the distance from the geometric center to a vertex; when the influence area is an ellipse, the geometric center is the midpoint of the line connecting the two foci and the radiation distance is the semi-major axis of the ellipse.
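The different geometric meanings of the radiation distance per shape can be made concrete with a small sketch; the function and parameter names here are illustrative assumptions, not part of the embodiment.

```python
import math

def radiation_distance(shape, **params):
    """Radiation distance for the regular shapes described above.

    Parameter names ("radius", "width", "height", "semi_major") are
    illustrative; each branch follows the geometric definition in the text.
    """
    if shape == "circle":
        return params["radius"]              # radius of the circle
    if shape == "rectangle":
        w, h = params["width"], params["height"]
        return math.hypot(w / 2, h / 2)      # geometric center to a vertex
    if shape == "ellipse":
        return params["semi_major"]          # semi-major axis
    raise ValueError(f"unsupported shape: {shape}")
```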
In fig. 10A, the area defined by the boundary 503 is the influence area, and the boundary 503 is a circle having a radius a and having the target position 501 as the center. Where the target location 501 is the geometric center and the radius a is the release distance, i.e., the radiation distance.
In step S1033, with the target position as the geometric center of gravity, an irregular geometric shape with the release distance as the radiation distance is displayed as the region of influence.
Here, the shape of the region of influence may be an irregular geometric shape.
For example, referring to fig. 10B, fig. 10B is a schematic diagram illustrating an interaction method of special effect props provided in an embodiment of the present application, in fig. 10B, an area defined by a boundary 504 is an influence area, wherein a target position 502 is a geometric center of gravity of the influence area, and a longest distance between the geometric center of gravity and the boundary 504 is a radiation distance.
The influence area displayed by the embodiment of the present application intuitively shows the exact position and influence range of the special effect prop, so the prop can be avoided with efficient human-computer interaction: no excessive operations are needed, the user can focus on perceiving the virtual scene, and a good immersive perception of the virtual scene is achieved. Moreover, because human-computer interaction operations are reduced, the graphics computation workload for updating the virtual scene is reduced, saving the computing resources that graphics processing hardware would otherwise consume on the interaction.
In some embodiments, when the number of the displayed influence areas is multiple, referring to fig. 5C, 5C is a schematic flow chart of the interaction method for the special effect prop provided in the embodiment of the present application, and based on fig. 5A, step S103 may specifically include step S1034.
In step S1034, before the special effect prop releases the special effect, a plurality of radially arranged influence regions are displayed with the target position as a reference.
Here, the different influence areas represent different influence degrees of the special effect, for example, when the virtual object is in the influence area a, after the special effect item releases the special effect, a certain influence can be exerted on the state (for example, a life value or a visual range) of the virtual object, where the caused influence degree corresponds to the influence area a.
As an example, the plurality of influence areas may be regular geometric shapes sharing the same geometric center (i.e., the reference), such as circles, sectors, rings, or squares; of course, the plurality of influence areas may also be irregular shapes sharing the same geometric center (i.e., the reference).
Taking an example that the plurality of influence regions may be circular or ring-shaped with the same geometric center (i.e., a reference), see fig. 10C, where fig. 10C is a schematic diagram of an interaction method of the special effect prop provided in the embodiment of the present application, and fig. 10C includes three influence regions, that is, an influence region 511, an influence region 512, and an influence region 513. The area of influence 511 is a circular area with a center at the target position 510 and a radius b; the affected area 512 is a circular ring-shaped area with a center at the target position 510, an inner diameter at b, and an outer diameter at c (where the ring width is c-b); the area of influence 513 is a circular ring shaped area with a center at the target location 510, an inner diameter at c, and an outer diameter at d (where the ring width is d-c). Wherein the area of influence 511, the area of influence 512 and the area of influence 513 have the same geometric center, i.e. the target position 510.
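Determining which radial influence area (e.g., 511, 512 or 513 in fig. 10C) contains a virtual object reduces to comparing its distance from the target position with the ring boundaries b < c < d. A minimal sketch, with illustrative boundary values:

```python
import math

def influence_region(target_pos, object_pos, boundaries=(2.0, 4.0, 6.0)):
    """Return the index of the radial influence area containing object_pos,
    or None when it lies beyond the outermost boundary.

    `boundaries` plays the role of the radii b < c < d in fig. 10C
    (the concrete values here are illustrative)."""
    dist = math.dist(target_pos, object_pos)
    for i, r in enumerate(boundaries):
        if dist <= r:
            return i        # 0 = innermost circle, 1, 2, ... = outer rings
    return None
```

The region index could then be mapped to the degree of influence on the virtual object's state (life value, visual range, etc.), since different influence areas represent different influence degrees.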
In some embodiments, before the special effect prop releases the special effect, the target position is used as a radiation starting point, and the plurality of radially arranged influence areas are displayed between the radiation starting point and the radiation boundary corresponding to the radiation distance of the special effect prop.
Here, each influence area is based on the target position, that is, based on the position of the special effect item.
For example, in fig. 8C, before the special effect prop releases the special effect, a plurality of danger areas (fig. 8C exemplarily shows 807-1, 807-2, 807-3 and 807-4, i.e. the above-mentioned influence areas) arranged in a radial manner are displayed between the radiation boundary 805-4 corresponding to the radiation distance 806 from the radiation start point by taking the explosion point 804 as the radiation start point.
As a first example, displaying the plurality of radially arranged influence areas may be implemented as follows: the plurality of influence areas are displayed sequentially, where the order follows the direction radiating outward from the radiation starting point, so as to indicate the direction of movement away from the special effect prop.
For example, the outer radiation boundaries of the multiple influence areas may be displayed sequentially: in fig. 8C, multiple radiation boundaries (805-1, 805-2, 805-3, and 805-4 are shown in fig. 8C) are displayed sequentially from inside to outside along the radiation distance 806, with the explosion point 804 as the radiation starting point. The outermost radiation boundary 805-4 may be displayed continuously, for example, radiation boundary 805-4 is displayed continuously while radiation boundary 805-1, radiation boundary 805-2, and radiation boundary 805-3 are displayed in sequence; alternatively, the outermost radiation boundary 805-4 may not be displayed continuously, for example, radiation boundary 805-1, radiation boundary 805-2, radiation boundary 805-3, and radiation boundary 805-4 are displayed in sequence.
It is noted that radiation boundary 805-1 is the radiation outer boundary of hazardous area 807-1, radiation boundary 805-2 is the radiation outer boundary of hazardous area 807-2, radiation boundary 805-3 is the radiation outer boundary of hazardous area 807-3, and radiation boundary 805-4 is the radiation outer boundary of hazardous area 807-4.
As a second example, displaying the plurality of radially arranged influence areas may be implemented as follows: the plurality of influence areas are displayed simultaneously according to the display parameters (e.g., color values, line thickness, and the like) corresponding to each influence area, where the display parameters of the plurality of influence areas follow a decay trend, decaying along the direction radiating outward from the radiation starting point, so as to indicate the direction of movement away from the special effect prop. Because the area closer to the target position is more strongly affected when the special effect prop releases the special effect, the area closer to the target position is given larger display parameters; this makes it easy for the user to determine the direction in which the special effect prop will release its effect, and convenient for the user to move away from the special effect prop.
For example, the trend of the display parameters of the multiple influence areas being attenuated may be a trend of the display parameters (e.g., thickness degree, color value) of the outer radiation boundaries of the multiple influence areas being attenuated, e.g., a trend of the color values (e.g., saturation, brightness of the same color) of the outer radiation boundaries of different influence areas being attenuated; the display parameters (e.g., color values) of the whole of the plurality of influence regions may be in a decay trend, for example, the color values (e.g., saturation and brightness of the same color) of the filling colors of different influence regions are in a decay trend.
For example, in FIG. 8C, the thickness degrees of radiation boundary 805-1, radiation boundary 805-2, radiation boundary 805-3, and radiation boundary 805-4 shown may be attenuated in sequence; the saturation of the same color of the displayed radiation boundaries 805-1, 805-2, 805-3, and 805-4 may also be attenuated in sequence.
As a third example, displaying the plurality of radially arranged influence areas may be implemented as follows: the plurality of influence areas are displayed sequentially, where the order is the arrangement order along the radiation distance from the radiation starting point; meanwhile, the display parameters of the plurality of influence areas (e.g., the display parameters of the boundaries of the influence areas) follow a decay trend, decaying from the radiation starting point along the direction of the radiation distance.
For example, the trend of the display parameters of the multiple influence areas being attenuated may be a trend of the display parameters (e.g., thickness degree, color value) of the outer radiation boundaries of the multiple influence areas being attenuated, e.g., a trend of the color values (e.g., saturation, brightness of the same color) of the outer radiation boundaries of different influence areas being attenuated; the display parameters (e.g., color values) of the whole of the plurality of influence regions may be in a decay trend, for example, the color values (e.g., saturation and brightness of the same color) of the filling colors of different influence regions are in a decay trend.
For example, in fig. 8C, multiple radiation boundaries (805-1, 805-2, 805-3, and 805-4 are shown schematically in fig. 8C) are displayed sequentially from inside to outside along the radiation distance 806, with the explosion point 804 as the radiation starting point, and the line thicknesses of the displayed radiation boundaries 805-1, 805-2, 805-3, and 805-4 may be attenuated in sequence.
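The decay trend of the display parameters in the second and third examples can be sketched as follows; the base values, the decay step, and the clamping floor are illustrative assumptions rather than values from the embodiment.

```python
def boundary_display_params(num_regions, base_alpha=1.0, decay=0.25):
    """Per-region display parameters that decay outward from the radiation start.

    Region 0 (innermost) is drawn strongest; each outer boundary is drawn
    progressively fainter and thinner, indicating the direction of movement
    away from the special effect prop."""
    params = []
    for i in range(num_regions):
        alpha = max(base_alpha - i * decay, 0.1)   # color-value attenuation, clamped
        width = max(4 - i, 1)                      # line-thickness attenuation
        params.append({"alpha": round(alpha, 2), "width": width})
    return params
```

A renderer would feed these values into the stroke color (e.g., saturation or opacity of the same hue) and stroke width of each radiation boundary.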
Through the above three examples, the user can easily determine the direction in which the special effect prop will release its effect, making it convenient to move away from the prop and avoid injury; the examples also create a tense atmosphere in which the special effect may be released at any moment, giving the user an immersive, on-the-scene feeling.
In some embodiments, when the special effect item is invisible in the virtual scene (for example, the special effect item is blocked), the influence area of the special effect may also be displayed with the target position as a reference before the special effect item releases the special effect. Therefore, when the special effect prop is in the sight blind area of the user, the user can be still prompted to be far away from the special effect prop.
In the following, specific implementations of displaying the influence area in the perspectives of virtual objects with different role identities are described separately.
In some embodiments, when the distance between the first virtual object and the target position is less than the release distance and the special effect prop has not released the special effect, the influence area is displayed with the target position as a reference in the viewing angle (e.g., the first person viewing angle or the third person viewing angle) of the first virtual object.
As an example, the first virtual object is a virtual object that projects a special effects item.
As an example, the release distance may be the maximum release distance that the special effect prop can achieve from the target location; the release distance may also be a value that is a preset ratio to the maximum release distance, such as 80% of the maximum release distance; the preset proportion can be a default value or a value set by a user; the release distance may also be a release distance after attenuating the maximum release distance based on a protection capability parameter of the first virtual object.
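The release-distance variants above can be sketched as follows (the function name and the way the protection capability parameter attenuates the distance are illustrative assumptions):

```python
def effective_release_distance(max_release_distance, preset_ratio=None, protection=0.0):
    """Release distance used to decide whether to display the influence area.

    - With only the maximum given, the maximum release distance itself is used.
    - preset_ratio (e.g. 0.8 for 80%) scales the maximum; it may be a default
      value or a user-set value.
    - protection models attenuation by the virtual object's protection
      capability parameter, here as a fractional reduction in [0, 1).
    """
    d = max_release_distance
    if preset_ratio is not None:
        d *= preset_ratio
    return d * (1.0 - protection)
```

For example, with a maximum release distance of 50 and a preset ratio of 80%, the effective release distance is 40.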
For example, when the distance between the first virtual object and the target position is not less than the release distance, the state of the first virtual object will not be affected even if the special effect prop releases the special effect; it can therefore be determined that the first virtual object has no need to be aware of the special effect prop. Not displaying the influence area in this case not only avoids interference, but also spares the graphics processing hardware unnecessary calculations related to displaying the influence area, saving resources.
For example, in fig. 8B, when the distance between the virtual object and the special effect prop is less than the release distance and the special effect prop has not released the special effect, the danger area 803 is displayed in the perspective of the virtual object. When the distance between the virtual object and the special effect prop is not less than the release distance, the displayed danger area 803 is hidden.
In the embodiment of the present application, when the virtual object is within the influence range of the special effect prop, the user is prompted in real time, so that the user can learn the position of the special effect prop in time and move away from it. This reduces the viewing-angle adjustment operations the user would otherwise perform to search for the special effect prop, improves human-computer interaction efficiency, reduces the processing resources and power consumed by the terminal, and improves the terminal's battery endurance.
As an example, after displaying the area of influence of the special effect with the target position as a reference in the perspective of the first virtual object, the method may further include: stopping displaying the area of influence in the perspective of the first virtual object when at least one of the following conditions is satisfied: the distance between the first virtual object and the special effect prop is not less than the release distance; according to the motion trend (e.g., the moving direction and speed) of the first virtual object, the state of the first virtual object will not be affected when the special effect prop releases the special effect. In this way, the influence area can stop being displayed when the first virtual object has no need to be aware of the special effect prop, sparing the graphics processing hardware unnecessary calculations related to displaying the influence area and thereby saving resources.
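The two stop-display conditions can be combined into a single predicate (a sketch; the linear extrapolation of the motion trend is an assumed model, and all names are illustrative):

```python
import math

def should_hide_area(obj_pos, target_pos, release_distance, velocity, release_time_left):
    """Decide whether to stop displaying the influence area.

    Condition (1): the object is at least release_distance from the target
    position. Condition (2): extrapolating the object's motion trend
    (direction and speed) linearly, it will be outside the release distance
    by the time the special effect is released.
    """
    dist = math.hypot(obj_pos[0] - target_pos[0], obj_pos[1] - target_pos[1])
    if dist >= release_distance:
        return True
    # Predict the position at release time from the current velocity.
    fx = obj_pos[0] + velocity[0] * release_time_left
    fy = obj_pos[1] + velocity[1] * release_time_left
    return math.hypot(fx - target_pos[0], fy - target_pos[1]) >= release_distance
```

The same predicate applies unchanged to the second and third virtual objects discussed below.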
In some embodiments, when the special effect prop is within the visible range of the first virtual object and the special effect prop has not released the special effect, the influence area of the special effect is displayed with the target position as a reference in the perspective of the first virtual object. In this way, the user is prompted about special effect props within the visible range of the virtual object, preventing the user-controlled virtual object from mistakenly entering the influence range of a special effect prop and being damaged.
As an example, the display of the area of influence may also be stopped in the perspective of the first virtual object when the first virtual object exhibits a tendency to move away from the special effect prop. Since such a movement trend indicates that the first virtual object no longer needs to be aware of the special effect prop, stopping the display of the influence area avoids consuming the terminal's display resources, saves the power consumed by the terminal, and improves the terminal's battery endurance.
In some embodiments, when the distance between the second virtual object and the target position is less than the release distance and the special effect prop has not released the special effect, the influence area of the special effect is displayed with the target position as a reference in the viewing angle of the second virtual object.
As an example, the second virtual object and the first virtual object both belong to the same group (or formation), that is, the second virtual object and the first virtual object have a cooperative relationship, for example, belong to the same group in a game.
As an example, the release distance may be the maximum release distance that the special effect prop can achieve from the target location; the release distance may also be a value that is a preset ratio to the maximum release distance, such as 80% of the maximum release distance; the preset proportion can be a default value or a value set by a user; the release distance may also be a release distance after attenuating the maximum release distance based on a protection capability parameter of the second virtual object.
In the embodiment of the present application, when the second virtual object is within the influence range of a special effect prop projected by a first virtual object of the same group, the second virtual object is prompted in real time, so that it can learn the position of the special effect prop in time and move away from it, avoiding being affected by a special effect prop thrown by a teammate and improving human-computer interaction efficiency.
As an example, after displaying the area of influence of the special effect with the target position as a reference in the perspective of the second virtual object, the method may further include: stopping displaying the area of influence in the perspective of the second virtual object when at least one of the following conditions is satisfied: the distance between the second virtual object and the special effect prop is not less than the release distance; according to the motion trend (e.g., the moving direction and speed) of the second virtual object, the state of the second virtual object will not be affected when the special effect prop releases the special effect. In this way, the influence area can stop being displayed when the second virtual object has no need to be aware of the special effect prop, sparing the graphics processing hardware unnecessary calculations related to displaying the influence area and thereby saving resources.
In some embodiments, when the special effect item is within the visible range of the second virtual object and the special effect item has not released the special effect, the influence area of the special effect is displayed with the target position as a reference in the viewing angle of the second virtual object.
In the embodiment of the application, when the special effect props projected by the first virtual objects in the same group are within the visual range of the second virtual object, the second virtual object is prompted in real time, so that the second virtual object can timely know the positions of the special effect props to be far away from the special effect props, the influence caused by the special effect props projected by the same partner is avoided, and the human-computer interaction efficiency is improved.
In some embodiments, when it is determined that the state (e.g., a life value, a visual range, etc.) of the first virtual object will be affected when the special effect prop releases the special effect according to a motion trend (e.g., a direction and a speed of movement) of the first virtual object, an affected area of the special effect is displayed with reference to the target position in the visual angle of the first virtual object and/or the second virtual object.
As an example, the second virtual object and the first virtual object both belong to the same group, and the distance between the second virtual object and the first virtual object is less than the visible distance threshold, that is, the second virtual object and the first virtual object are both within the visible range of each other.
For example, when it is determined, according to the movement trend of any virtual object, that the state of the virtual object will be affected when the special effect prop releases the special effect, the affected area may be displayed in the perspectives of the virtual object and its surrounding teammates, so that the surrounding teammates can make further action decisions, such as escaping, moving forward to provide support, or summoning teammates.
In some embodiments, when the distance between the third virtual object and the target position is less than the release distance and the special effect prop has not released the special effect, the influence area of the special effect is displayed with the target position as a reference in the viewing angle of the third virtual object.
As an example, the third virtual object and the first virtual object belong to opposing groups, i.e., the third virtual object has a competitive or adversarial relationship with the first virtual object, for example, by belonging to different groups in a game.
As an example, the release distance may be the maximum release distance that the special effect prop can achieve from the target location; the release distance may also be a value that is a preset ratio to the maximum release distance, such as 80% of the maximum release distance; the preset proportion can be a default value or a value set by a user; the release distance may also be a release distance after attenuating the maximum release distance based on a protection capability parameter of the third virtual object.
In the embodiment of the present application, when the third virtual object is within the influence range of a special effect prop projected by a first virtual object of an opposing group, the third virtual object is prompted in real time, so that it can learn the position of the special effect prop in time and move away from it, avoiding being affected by a special effect prop thrown by an enemy and improving human-computer interaction efficiency.
As an example, after displaying the area of influence of the special effect with the target position as a reference in the perspective of the third virtual object, the method may further include: stopping displaying the area of influence in the perspective of the third virtual object when at least one of the following conditions is satisfied: the distance between the third virtual object and the special effect prop is not less than the release distance; according to the motion trend (e.g., the moving direction and speed) of the third virtual object, the state of the third virtual object will not be affected when the special effect prop releases the special effect. In this way, the influence area can stop being displayed when the third virtual object has no need to be aware of the special effect prop, sparing the graphics processing hardware unnecessary calculations related to displaying the influence area and thereby saving resources.
In some embodiments, when the special effect item is within the visible range of the third virtual object and the special effect item has not released the special effect, the influence area of the special effect is displayed with the target position as a reference in the viewing angle of the third virtual object.
In the embodiment of the application, when the special effect props projected by the first virtual objects in different groups are within the visual range of the third virtual object, the third virtual object is prompted in real time, so that the third virtual object can timely know the positions of the special effect props to keep away from the special effect props, influence caused by the special effect props projected by enemies is avoided, and human-computer interaction efficiency is improved.
In some embodiments, after step S103, the method may further include: after the special effect prop releases the special effect, stopping displaying the influence area in the virtual scene. Since the special effect prop can no longer affect virtual objects in the virtual environment after the special effect has been released, the virtual objects no longer need to be aware of it; stopping the display of the influence area avoids consuming the terminal's display resources, saves the power consumed by the terminal, and improves the terminal's battery endurance.
In some embodiments, in addition to displaying the influence area, the human-computer interaction interface may also display prompt information, for example, when the influence area of the special effect is displayed with the target position as a reference, at least one of the following prompt information may be displayed:
(1) First timing information, where the first timing information is used to prompt the time at which the special effect prop will release the special effect.
As an example, the type of the first timing information includes at least one of: a timing progress bar; a timing numerical text. Taking the timing progress bar as an example, its length gradually shortens with the passage of time. Taking the timing numerical text as an example, the displayed number gradually decreases with the passage of time. In this way, the time at which the special effect prop will release its special effect can be intuitively and clearly prompted to the user, so that the user can control the virtual object to leave the influence area before the special effect is released.
(2) Second timing information, where the second timing information is used to prompt the time required to leave the influence area.
As an example, the type of the second timing information includes at least one of: a timing progress bar; a timing numerical text. Taking the second timing information being a timing progress bar as an example, when the virtual object moves away from the special effect prop, the length of the timing progress bar gradually shortens with the passage of time; when the virtual object moves toward the special effect prop, the length of the timing progress bar gradually increases. Taking the second timing information being a timing numerical text as an example, when the virtual object moves away from the special effect prop, the displayed number gradually decreases; when the virtual object moves toward the special effect prop, the displayed number gradually increases. In this way, the time required to leave the influence area can be intuitively and clearly prompted, so that the user can quickly move away from the special effect prop.
As an example, when the human-computer interaction interface displays a plurality of radially arranged influence areas with the target position as a reference, a plurality of second timing information may be displayed, where each second timing information corresponds to one influence area, that is, each second timing information is used to prompt a time required for leaving the corresponding influence area; it is also possible to display only one second timing information, which is used to indicate the time required to leave the outermost area of influence.
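The second timing information for a circular influence area can be sketched as follows (a simplified model assuming the object moves straight away from the target position; names are illustrative):

```python
import math

def time_to_leave(obj_pos, target_pos, outer_radius, speed):
    """Time required to leave a circular influence area, assuming the virtual
    object moves straight away from the target position at a constant speed.
    Returns 0 when the object is already outside the area."""
    dist = math.hypot(obj_pos[0] - target_pos[0], obj_pos[1] - target_pos[1])
    return max(0.0, outer_radius - dist) / speed
```

As the object moves away from the special effect prop, the returned value shrinks, which corresponds to the shrinking progress bar or decreasing numerical text described above; passing the outer radius of each influence area yields one piece of second timing information per area.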
(3) Direction prompt information, where the direction prompt information is used to prompt at least one direction in which the influence area can be left the fastest.
As an example, a plurality of prompting directions and the time required for leaving the influence area corresponding to each prompting direction are presented in the human-computer interaction interface, so that the user can select the prompting direction with the least time required for leaving the influence area from the plurality of prompting directions, and can leave the influence range of the special effect prop most quickly.
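Selecting the prompting direction that leaves a circular influence area fastest can be sketched as follows (an assumed model: candidate directions are evenly spaced, and the exit time along each direction is found by intersecting the ray with the outer boundary circle):

```python
import math

def fastest_escape_direction(obj_pos, target_pos, outer_radius, speed, num_dirs=8):
    """Among num_dirs evenly spaced candidate directions, return the
    ((dx, dy), time) pair that leaves the circular influence area fastest."""
    px, py = obj_pos[0] - target_pos[0], obj_pos[1] - target_pos[1]
    best = None
    for k in range(num_dirs):
        a = 2 * math.pi * k / num_dirs
        dx, dy = math.cos(a), math.sin(a)
        # Solve |(px, py) + t * (dx, dy)| = outer_radius for the positive root.
        b = px * dx + py * dy
        c = px * px + py * py - outer_radius ** 2  # c <= 0 inside the circle
        t = -b + math.sqrt(b * b - c)
        travel_time = t / speed
        if best is None or travel_time < best[1]:
            best = ((dx, dy), travel_time)
    return best
```

The per-direction times can also be presented alongside each candidate direction, letting the user pick the direction with the least time required to leave the influence area.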
(4) Influence prompt information, where the influence prompt information is used to prompt the degree of influence caused when the special effect prop releases the special effect.
As an example, the type of the influence prompt information includes at least one of: an influence progress bar; an influence numerical text. Taking the influence progress bar as an example, when the virtual object moves away from the special effect prop, the length of the influence progress bar gradually shortens with the moving distance; when the virtual object moves toward the special effect prop, the length of the influence progress bar gradually increases. Taking the influence numerical text as an example, when the virtual object moves away from the special effect prop, the displayed number gradually decreases with the moving distance; when the virtual object moves toward the special effect prop, the displayed number gradually increases. In this way, the degree of influence the user would suffer at the current position if the special effect prop released its special effect can be intuitively and clearly prompted, so that the user can perceive the influence degree of the special effect prop in real time.
As an example, when the human-computer interaction interface displays a plurality of radially arranged influence regions with reference to the target position, and the different influence regions represent different influence degrees of the special effect, a plurality of influence prompt messages may be displayed, where each influence prompt message corresponds to one influence region, that is, each influence prompt message is used for prompting the influence degree caused when the special effect prop releases the special effect and is located in the influence region.
Besides presenting the prompt areas described above, sound special effects and somatosensory special effects can also be presented as assistance, so that the user is additionally prompted to leave the influence area through sound and haptic feedback.
In some embodiments, referring to fig. 6, fig. 6 is a schematic flowchart of an interaction method of a special effect prop provided in the embodiment of the present application, and based on fig. 5A, step S104 may be further included before step S102.
In step S104, in response to the projection operation for the special effect item, a process of the first virtual object projecting the special effect item to the target position is displayed in the virtual scene.
In some embodiments, a plurality of candidate special effects items are displayed; responding to the selection operation of the special effect prop, and displaying a throwing preparation state of the selected special effect prop; and responding to the projection operation aiming at the special effect prop in the virtual scene, and displaying the process that the first virtual object projects the special effect prop to the target position in the virtual scene.
As an example, a plurality of candidate special effect props are displayed on the human-computer interaction interface; for example, a plurality of special effect props may be displayed for selection in a special effect prop column of the interface. In response to a selection operation of the virtual object for a special effect prop, the human-computer interaction interface presents a throwing preparation state in which the virtual object prepares to throw the selected special effect prop. The presentation manner includes presenting the throwing posture of the virtual object, that is, representing in image form the throwing initial position and throwing direction of the special effect prop included in the throwing preparation state; alternatively, text may be presented directly, or text may be presented in addition to the image, that is, the throwing initial position and throwing direction included in the throwing preparation state are represented in text form.
As an example, the throwing preparation state includes the throwing initial position and the throwing direction of the special effect prop. In the manual throwing case, the throwing initial position may be the position of the throwing hand of the virtual object that throws the special effect prop, that is, a certain position in space, and the throwing direction is the throwing direction of that hand. There is also an automatic throwing case, in which an automatic throwing prop throws the special effect prop in response to control by the virtual object; in this case, the throwing initial position may be the position from which the automatic throwing prop launches the special effect prop, and the throwing direction is the exit direction of the special effect prop. The throwing preparation state may further include prompt information about the throwing strength or throwing speed, helping the user to perceive the throwing information and thereby effectively adjust parameters such as the throwing initial position, throwing direction, throwing strength, and throwing speed, so that the target object can be attacked effectively, in accordance with an actual military battle scenario.
For example, the above parameters may be adjusted through a selection operation and a movement operation: parameters such as the throwing direction, throwing strength, and throwing speed may be adjusted through the selection operation, while the throwing initial position may be controlled by moving the virtual object through a movement operation, or by moving the automatic throwing prop through a movement operation on the automatic throwing prop. In this way, the user can attack the target object effectively, in accordance with an actual military battle scenario, and the military data obtained through simulation has a higher degree of utility.
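The throwing preparation state and its adjustment can be sketched as a small data structure (all field and function names are illustrative assumptions, not part of the original disclosure):

```python
from dataclasses import dataclass

@dataclass
class ThrowState:
    """Throwing preparation state of a special effect prop."""
    start_pos: tuple   # throwing initial position (hand or auto-thrower)
    direction: tuple   # throwing direction
    force: float       # throwing strength
    speed: float       # throwing speed

def adjust(state, direction=None, force=None, speed=None):
    """Adjust throw parameters through a selection operation; the initial
    position is changed separately, by moving the virtual object or the
    automatic throwing prop (not modeled here)."""
    return ThrowState(
        state.start_pos,
        direction if direction is not None else state.direction,
        force if force is not None else state.force,
        speed if speed is not None else state.speed,
    )
```

Returning a new `ThrowState` rather than mutating in place keeps the previous preparation state available, e.g. for undoing an adjustment.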
In some embodiments, referring to fig. 7A, fig. 7A is a schematic flowchart of an interaction method of a special effect prop provided in the embodiment of the present application, and based on fig. 5A, step S105 and step S106 may be further included before step S103.
In step S105, a release distance that can be reached when the special effect of the special effect prop is released from the target position is acquired, and the release distance is determined as the radiation distance.
As an example, the release distance may be a maximum release distance that the special effect prop can achieve from the target position, or may be a value that is a preset proportion to the maximum release distance, for example, 80% of the maximum release distance; the preset ratio may be a default value or a value set by a user.
For example, in fig. 8C, the explosion point 804 is the target position, and the release distance that can be reached when the special effect of the special effect prop is released is the radiation distance 806.
In step S106, the irradiation range corresponding to the irradiation distance is divided into a plurality of influence regions having different degrees of influence in the irradiation direction with the target position as the irradiation start point.
Here, the radiation direction is a direction outward from the radiation starting point.
When the shape of the radiation range is a circle, the radiation start point is the center of the circle, and the radiation distance is the radius of the circle; when the shape of the radiation range is a square or rectangle, the radiation start point is the intersection of the diagonals, and the radiation distance is the length of the line from the geometric center to a vertex; when the shape of the radiation range is an ellipse, the radiation start point is the midpoint of the line connecting the two foci, and the radiation distance is the semi-major axis of the ellipse; when the shape of the radiation range is an irregular geometric shape, the radiation start point is the geometric center of gravity of the radiation range, and the radiation distance is the longest distance between the geometric center of gravity and the boundary of the radiation range.
Taking the shape of the radiation range as a circle as an example, in fig. 10C, the radiation range is divided, by dividing the radiation distance 530, into a circular region (i.e., the influence area 511) centered on the target position 510 and a plurality of annular regions (i.e., the influence area 512 and the influence area 513) arranged in the radiation direction. For example, the radiation distance 530 is divided at increasing radii b, c, and d, so that b is the radius of the influence area 511 and at the same time the inner radius of the influence area 512, i.e., the circle of radius b is both the outer radiation boundary of the influence area 511 and the inner radiation boundary of the influence area 512; c is the outer radius of the influence area 512 and at the same time the inner radius of the influence area 513, i.e., the circle of radius c is both the outer radiation boundary of the influence area 512 and the inner radiation boundary of the influence area 513; and d is the outer radius of the influence area 513, i.e., the circle of radius d is the outer radiation boundary of the influence area 513, so that the total area enclosed by the circle of radius d is the radiation range corresponding to the radiation distance 530.
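For the circular case, the division into concentric influence areas (the increasing radii b, c, d above) and the classification of a position into its area can be sketched as follows (equal-width rings are an assumption for illustration; the division could instead follow the attenuation characteristic described below):

```python
import math

def divide_radiation_range(radiation_distance, num_regions):
    """Divide a circular radiation range into num_regions concentric influence
    areas of equal radial width; returns the list of outer radii (b, c, d, ...)."""
    step = radiation_distance / num_regions
    return [step * (i + 1) for i in range(num_regions)]

def region_index(pos, start_point, outer_radii):
    """Index of the influence area containing pos (None if outside the range)."""
    dist = math.hypot(pos[0] - start_point[0], pos[1] - start_point[1])
    for i, r in enumerate(outer_radii):
        if dist <= r:
            return i
    return None
```

Here index 0 corresponds to the innermost circular area (511 in fig. 10C) and the last index to the outermost ring (513).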
In some embodiments, according to the attenuation characteristic of the intensity of the special effect prop along the radiation direction, the radiation range corresponding to the radiation distance is divided into a plurality of influence areas representing different attenuation intervals of the intensity of the special effect.
As an example, the attenuation characteristic may be a linear attenuation or a non-linear attenuation, for example, the intensity of the special effect corresponding to the position a located in the influence range of the special effect prop is linearly inversely proportional or non-linearly inversely proportional to the distance between the position a and the target position.
As an example, the radiation range is equally divided into a plurality of influence regions according to the radiation distance, wherein the attenuation values of the attenuation intervals corresponding to different influence regions may be the same or different, for example, when the attenuation characteristic is linear attenuation, the attenuation values of the attenuation intervals corresponding to different influence regions are the same; when the attenuation characteristic is a nonlinear attenuation, the attenuation values of the attenuation intervals corresponding to different influence regions are different.
As an example, the radiation range is divided into a plurality of influence regions by attenuation values such that the attenuation values of the attenuation intervals corresponding to different influence regions are the same, e.g., the attenuation value of the attenuation interval of each influence region is 25%.
Taking the shape of the radiation range as a circle and the intensity of the special effect of the center of the circle (i.e. the target position) as 100% as an example, referring to fig. 7B, fig. 7B is a schematic diagram of the principle provided by the embodiment of the present application for dividing a plurality of influence regions, and the attenuation interval of the influence region 701 is from 100% to 75%, i.e. the attenuation value is 25%; the attenuation interval of the region of influence 702 is from 75% to 50%, i.e. the attenuation value is 25%; the attenuation interval of the region of influence 703 is from 50% to 25%, i.e. the attenuation value is 25%; the attenuation interval of the region of influence 704 is from 25% to 0%, i.e. the attenuation value is 25%. Thus, the attenuation values of the attenuation sections corresponding to different influence regions are the same, and the strength of the special effect corresponding to the region outside the radiation outer boundary of the influence region 704 is 0, that is, when any virtual object is located in the region outside the radiation outer boundary of the influence region 704, the special effect prop does not affect the state of the virtual object when releasing the special effect.
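Under the linear-attenuation characteristic, mapping a position's distance to its special-effect intensity and to the 25%-wide attenuation interval of fig. 7B can be sketched as follows (a simplified model; the function names are assumptions):

```python
def effect_intensity(distance, radiation_distance):
    """Linear attenuation: 100% intensity at the radiation start point,
    0% at the radiation distance and beyond."""
    if distance >= radiation_distance:
        return 0.0
    return 100.0 * (1.0 - distance / radiation_distance)

def attenuation_interval(distance, radiation_distance, num_regions=4):
    """Index of the attenuation interval (influence area) for a position.
    With four regions, each interval spans a 25% attenuation value, as in
    fig. 7B; returns None outside the outermost boundary."""
    s = effect_intensity(distance, radiation_distance)
    if s == 0.0:
        return None
    # Region 0 covers (75%, 100%], region 1 covers (50%, 75%], and so on.
    return min(num_regions - 1, int((100.0 - s) / (100.0 / num_regions)))
```

Because the attenuation is linear and the intervals have equal attenuation values, the regions here also have equal radial widths, matching the uniform division described above.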
According to the embodiment of the application, the radiation range is uniformly divided according to the special effect strength of the special effect prop, the same influence area prompt is presented to all virtual objects in a virtual scene, the calculation resources used for dividing a plurality of influence areas can be reduced, and the display speed of the influence areas is improved.
In some embodiments, determining the attenuation characteristic of the state value of the target virtual object along the radiation direction according to the protection capability parameter of the target virtual object and the attenuation characteristic of the special effect strength of the special effect prop along the radiation direction; and dividing the radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the state value of the target virtual object according to the attenuation characteristic of the state value of the target virtual object along the radiation direction.
Here, the target virtual object is an arbitrary virtual object in a virtual scene.
As a first example, the target virtual object may be any virtual object in the virtual scene whose distance from the target position is less than the maximum release distance, that is, any virtual object within the influence range of the special effect to be released by the special effect prop.
For example, when there are a plurality of target virtual objects, the target virtual object with the smallest protection capability parameter value is determined among them; the attenuation characteristic of the state value of the target virtual object along the radiation direction is determined according to the protection capability parameter of that target virtual object and the attenuation characteristic of the special effect strength of the special effect prop along the radiation direction; and the radiation range corresponding to the radiation distance is divided into a plurality of influence areas representing different attenuation intervals of the state value according to that attenuation characteristic. In this way, the influence areas are divided with reference to the target with the minimum protection capability, ensuring that every target virtual object is unaffected by the special effect when outside the influence area.
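Dividing by the weakest target can be sketched as follows (a hypothetical model in which the protection parameter linearly shrinks the radius within which a target can still be affected, so the smallest protection yields the largest effective radius; all names are assumptions):

```python
def division_for_weakest(targets, radiation_distance, num_regions=4):
    """Divide the radiation range using the target virtual object with the
    smallest protection capability parameter.

    Each target is a dict with "name" and "protection" (a fraction in [0, 1)).
    The weakest target has the largest effective radius, so areas divided for
    it are safe for every target once outside the outermost boundary.
    Returns (weakest_name, list_of_outer_radii).
    """
    weakest = min(targets, key=lambda t: t["protection"])
    effective = radiation_distance * (1.0 - weakest["protection"])
    step = effective / num_regions
    return weakest["name"], [step * (i + 1) for i in range(num_regions)]
```

This matches the intent stated above: a stronger-protected target standing outside the outermost boundary computed for the weakest target is necessarily unaffected as well.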
As example two, the target virtual object may be the virtual object that views the influence area. That is, the influence area displayed in the human-computer interaction interface is displayed from the perspective of the target virtual object.
For example, when the target virtual object is virtual object A, the influence region corresponding to virtual object A is presented by default in the view angle of virtual object A; of course, the display may also be switched to present influence regions corresponding to other virtual objects, for example, virtual objects whose distance from the target position is less than the maximum release distance (such as virtual objects belonging to the same group or to a different group). In this way, influence areas for different virtual objects can be flexibly displayed according to the user's needs, which facilitates the user in formulating a combat strategy.
Here, the state value may characterize the vitality of the target virtual object, such as the health (blood) value in a game; the state value may also characterize the degree to which the target virtual object is visible in the virtual scene, such as the visible distance in a game.
As an example, the state value of the target virtual object may be the difference between the strength of the corresponding special effect and the protection capability parameter value of the target virtual object. Taking the protection capability parameter value of the target virtual object as A and the strength value of the special effect at a first position (located within the influence range of the special effect) as B as an example, the state value of the target virtual object corresponding to the first position is |A - B|. In this way, the state value of the target virtual object corresponding to each position in the radiation range can be determined, and thus the attenuation curve characterizing the attenuation of the state value of the target virtual object along the radiation direction can be determined.
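This computation can be sketched as follows. The linear form of the strength attenuation is an assumption made for illustration only; the embodiment does not fix a particular attenuation curve:

```python
def strength_at(distance: float, radiation_distance: float) -> float:
    """Special-effect strength at a given distance from the target position,
    assuming (for illustration) linear attenuation from 100% at the target
    position down to 0 at the radiation boundary."""
    if distance >= radiation_distance:
        return 0.0
    return 1.0 - distance / radiation_distance

def state_value(protection_a: float, strength_b: float) -> float:
    """State value |A - B| at a position, where A is the protection capability
    parameter of the target virtual object and B is the special-effect
    strength at that position."""
    return abs(protection_a - strength_b)
```

Sampling `state_value(A, strength_at(r, R))` over distances r in [0, R] traces out the attenuation curve along the radiation direction.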
As an example, according to the attenuation characteristic of the state value of the target virtual object along the radiation direction, the radiation range corresponding to the radiation distance is divided into a plurality of influence regions such that the attenuation values of the attenuation intervals of the state value of the target virtual object corresponding to different influence regions are the same.
Take as an example that the shape of the radiation range is a circle, the strength of the special effect at the center of the circle (i.e., the target position) is 100%, and the protection capability parameter value of the target virtual object is 20% (i.e., 20% of the strength of the special effect can be resisted). Referring to fig. 7C, fig. 7C is a schematic diagram of the principle for dividing the multiple influence regions provided in the embodiment of the present application. In fig. 7C, the influence region 701, the influence region 702, the influence region 703, and the influence region 704 are influence regions divided for a virtual object without protection capability; the influence region 711, the influence region 712, the influence region 713, and the influence region 714 are influence regions divided for a target virtual object whose protection capability parameter is 20%. When the target virtual object is at the target position, the influence degree of the special effect is 100% - 20% = 80%. Thus, the attenuation interval of the influence region 711 is from 80% to 60%, i.e., the attenuation value is 20%; the attenuation interval of the influence region 712 is from 60% to 40%, i.e., the attenuation value is 20%; the attenuation interval of the influence region 713 is from 40% to 20%, i.e., the attenuation value is 20%; and the attenuation interval of the influence region 714 is from 20% to 0%, i.e., the attenuation value is 20%. In this way, the attenuation values of the attenuation intervals corresponding to different influence regions are the same, and the strength of the special effect in the region outside the outer radiation boundary of the influence region 714 is 0; that is, when the target virtual object is located outside the outer radiation boundary of the influence region 714, the release of the special effect by the special effect prop does not affect the state of the target virtual object.
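Under the same illustrative assumptions (circular radiation range, linear strength attenuation, equal attenuation value per region), the boundary radii of the influence regions for a given protection capability parameter can be computed as in this sketch; the function name and the choice of four regions mirror the fig. 7C example and are not mandated by the embodiment:

```python
def region_boundaries(radiation_distance: float, protection: float,
                      num_regions: int = 4) -> list:
    """Radii (measured from the target position) of the outer boundaries of
    the influence regions. With protection capability p, the influence degree
    at the target position is 1 - p and, under linear attenuation, reaches 0
    at radius radiation_distance * (1 - p); dividing that radius evenly yields
    regions whose attenuation intervals all have the same attenuation value."""
    outer = radiation_distance * (1.0 - protection)
    return [outer * k / num_regions for k in range(1, num_regions + 1)]
```

For a radiation distance of 10 and a 20% protection parameter this yields boundaries at 2, 4, 6, and 8, corresponding to the four equal 20% attenuation intervals of the regions 711 to 714; a virtual object without protection capability gets boundaries at 2.5, 5, 7.5, and 10 (regions 701 to 704).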
That is to say, the larger the protection capability parameter value of the target virtual object, the smaller the area enclosed by the boundary of the outermost influence region, indicating that the target virtual object can leave the radiation range of the special effect prop more easily than a virtual object without protection capability or with a lower protection capability parameter value.
As an example, when the target virtual object is a first virtual object, the influence region corresponding to the first virtual object is presented by default in the view angle of the first virtual object, and of course, may also be switched to present influence regions corresponding to other virtual objects, for example, virtual objects having a distance from the special effect prop smaller than the maximum release distance (for example, virtual objects belonging to the same group or virtual objects belonging to different groups).
According to the embodiment of the present application, the radiation range is divided in a personalized manner according to the protection capability parameter of the target virtual object and the strength of the special effect prop, and a personalized influence area prompt is presented to the virtual object in the virtual scene, which can improve the effectiveness of the prompt.
In some embodiments, when a target virtual object whose distance from the target position is less than the maximum release distance exists in the virtual scene, the radiation range is divided according to the protection capability parameter of the target virtual object and the special effect strength of the special effect prop; and when no target virtual object whose distance from the target position is less than the maximum release distance exists in the virtual scene, the radiation range is divided according to the special effect strength of the special effect prop alone. In this way, the radiation range can be flexibly divided according to the actual combat situation in the virtual scene, which can improve the effectiveness of the prompt.
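The selection between the two division strategies can be sketched as follows; the object model, helper names, and use of the minimum protection parameter among in-range objects are illustrative assumptions drawn from the examples above:

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    position: tuple      # (x, y) coordinates in the virtual scene
    protection: float    # protection capability parameter in [0, 1)

def _boundaries(outer_radius: float, num_regions: int = 4) -> list:
    # Evenly spaced outer boundaries of the influence regions out to outer_radius.
    return [outer_radius * k / num_regions for k in range(1, num_regions + 1)]

def divide_radiation_range(radiation_distance, objects, target_position,
                           max_release_distance, num_regions=4):
    """If any virtual object is within the maximum release distance of the
    target position, divide the range using the minimum protection capability
    parameter among those objects; otherwise divide it according to the
    special effect strength alone."""
    in_range = [o for o in objects
                if math.dist(o.position, target_position) < max_release_distance]
    if in_range:
        p = min(o.protection for o in in_range)
        return _boundaries(radiation_distance * (1.0 - p), num_regions)
    return _boundaries(radiation_distance, num_regions)
```

With no object in range the full radiation distance is divided; with an in-range object whose protection parameter is 20%, the divided range shrinks accordingly.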
Next, an exemplary application of the interaction method of the special effect item provided in the embodiment of the present application in an application scenario of a game will be described.
The embodiment of the present application provides an interaction method of a special effect prop, which can simply and clearly enable a player to know the distance between a grenade and the player, determine the direction in which to escape, and determine how far the player needs to move to escape the grenade.
Referring to fig. 8A, 8B, and 8C, fig. 8A, 8B, and 8C are application scenario diagrams of an interaction method of a special effect prop according to an embodiment of the present application.
(1) Presenting the danger zone of the grenade (i.e. the aforementioned zone of influence) from the perspective of the player throwing the grenade
In fig. 8A, after the player throws the grenade 801, the dangerous area 802 of the grenade is automatically displayed. Specifically, in the area near the grenade 801, an energy ring (or danger indicating ring, i.e., the aforementioned radiation boundary) effect is presented; no damage is caused to enemies outside the outermost energy ring, so the player who throws the grenade can easily judge whether the thrown grenade can hit the enemy.
As an example, the color of the displayed energy ring may be red or blue, for the purpose of alerting the user to stay away from the grenade.
(2) Presenting a danger zone of a grenade from the perspective of a player struck by the grenade
In fig. 8B, when the player attacked by the grenade is in the dangerous area of the grenade, the player does not necessarily see the position of the grenade, but the energy rings in the dangerous area 803 can clearly remind the player of being in the dangerous area; as long as the player escapes beyond the outermost energy ring, the player will not be hurt by the grenade explosion.
In some embodiments, the energy rings in the dangerous area 803 are displayed as shown in fig. 8C. The outermost energy ring is displayed continuously before the grenade explodes, so that the player always knows where it is safe to go; the outermost energy ring is hidden only after the grenade has exploded. The energy rings from the explosion point 804 to the outermost layer are refreshed from inside to outside in marquee order, so that the player can perceive the direction in which the grenade explosion spreads, and a tense atmosphere in which the grenade may explode at any moment is created, giving the player an immersive experience.
Here, the line thickness d of an energy ring is inversely related to the radius R of the ring from the explosion point, for example, according to the formula d = K/R + b, where K and b are fixed constants.
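The inverse relation can be computed directly from the formula; the constant values used here are placeholders, since the embodiment only states that K and b are fixed:

```python
def ring_thickness(radius: float, k: float = 12.0, b: float = 1.0) -> float:
    """Line thickness d = K / R + b of the energy ring at radius R from the
    explosion point: the farther the ring, the thinner its line."""
    return k / radius + b
```

With these placeholder constants, an inner ring at radius 2 is drawn with thickness 7 while an outer ring at radius 12 gets thickness 2, visually emphasizing the more dangerous inner regions.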
Next, a specific implementation manner of the interaction method for special effects props provided in the embodiment of the present application is described, referring to fig. 9A and 9B, where fig. 9A and 9B are schematic flow diagrams of the interaction method for special effects props provided in the embodiment of the present application.
(1) Referring to FIG. 9A, for a player who throws a grenade.
Firstly, after the grenade is thrown out, the energy ring at the outermost layer of the grenade is displayed.
And secondly, displaying the other energy rings from inside to outside in marquee order.
And thirdly, hiding all energy rings after the grenade explodes.
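The display order implied by the three steps above (outermost ring first, then the remaining rings from inside out in marquee order, all rings hidden after the explosion) can be sketched as follows; the function is an illustrative helper, not part of the embodiment:

```python
def ring_display_order(ring_radii):
    """Return the radii in the order the energy rings are shown after the
    grenade is thrown: the outermost ring first, then the inner rings
    refreshed from inside to outside; hiding all rings after the explosion
    is not modelled here."""
    ordered = sorted(ring_radii)
    return [ordered[-1]] + ordered[:-1]
```

For rings at radii 2, 4, 6, and 8, the display order is 8 (outermost, shown immediately), then 2, 4, 6 in marquee sequence.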
(2) Referring to FIG. 9B, for a player attacked by a grenade.
The method comprises: firstly, judging whether the player is in the dangerous area of the grenade attack, and if so, displaying the outermost energy ring covering the area that the grenade attack can reach.
And secondly, displaying the other energy rings from inside to outside in marquee order.
And thirdly, if the player escapes from the dangerous area of the grenade attack, all energy rings are automatically hidden in the picture displayed to the player.
It should be noted that the display of the energy ring in the embodiment of the present application includes, but is not limited to, the shapes and colors described above, and the embodiment of the present application does not limit this.
An exemplary structure of the interaction device 555 for special effect items provided in the embodiment of the present application is described below with reference to fig. 4 as a software module, and in some embodiments, as shown in fig. 4, the software module stored in the interaction device 555 for special effect items in the memory 550 may include:
the item display module 5551 is configured to display a virtual scene in the human-computer interaction interface, and display at least one special-effect item projected to a target location in the virtual scene;
the special effect prop is used for releasing the special effect when the duration for which it has been projected reaches a target duration, or for releasing the special effect when any virtual object is sensed;
the area display module 5552 is configured to display an influence area of the special effect with the target position as a reference before the special effect prop releases the special effect.
In the above scheme, the area display module 5552 is further configured to obtain a release distance that can be achieved by the special effect prop from the target position; and displaying a regular geometric shape with the release distance as the radiation distance as an influence area with the target position as a geometric center, or displaying an irregular geometric shape with the release distance as the radiation distance as an influence area with the target position as a geometric center.
In the above scheme, the area display module 5552 is further configured to obtain a maximum release distance that can be achieved when the special effect prop is released from the target position; attenuating the maximum release distance based on the protection capability parameter of the target virtual object, and taking the attenuated release distance as the release distance which can be realized by the special-effect prop from the target position; wherein the target virtual object is any virtual object in the virtual scene.
In the above solution, the area display module 5552 is further configured to display a plurality of radially arranged influence areas with the target position as a reference; wherein the different influence regions characterize different degrees of influence of the special effect.
In the above solution, the area display module 5552 is further configured to display a plurality of radially arranged influence areas between the radiation boundaries corresponding to the radiation distances from the radiation starting point to the special effect prop, with the target position as the radiation starting point.
In the above solution, the area display module 5552 is further configured to sequentially display a plurality of affected areas; wherein, the sequence is formed by arranging the radiation directions from the radiation starting points outwards to indicate the moving direction far away from the special effect prop.
In the above solution, the area display module 5552 is further configured to simultaneously display a plurality of influence areas according to the display parameter corresponding to each influence area; the display parameters of the plurality of influence areas are in an attenuation trend and are attenuated according to the direction radiating outwards from the radiation starting point so as to indicate the moving direction away from the special effect prop.
In the above scheme, the area display module 5552 is further configured to obtain a release distance that can be reached when the special effect of the special effect prop is released from the target position, and determine the release distance as a radiation distance; dividing a radiation range corresponding to the radiation distance into a plurality of influence areas with different influence degrees along the radiation direction by taking the target position as a radiation starting point; wherein the radiation direction is a direction outward from the radiation starting point.
In the above scheme, the area display module 5552 is further configured to divide a radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the intensity of the special effect according to the attenuation characteristic of the intensity of the special effect prop along the radiation direction.
In the above solution, the area display module 5552 is further configured to determine an attenuation characteristic of the state value of the target virtual object along the radiation direction according to the protection capability parameter of the target virtual object and the attenuation characteristic of the special effect strength of the special effect prop along the radiation direction; dividing a radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the state value of the target virtual object according to the attenuation characteristic of the state value of the target virtual object along the radiation direction; wherein the target virtual object is any virtual object in the virtual scene.
In the above solution, the area display module 5552 is further configured to display at least one of the following prompt messages: the first timing information is used for prompting the special effect time of the special effect prop to release a special effect; second timing information, wherein the second timing information is used for prompting the time required for leaving the influence area; direction prompt information, wherein the direction prompt information is used for prompting the direction which is the fastest to leave the influence area; and influence prompt information, wherein the influence prompt information is used for prompting the influence degree caused when the special effect prop releases the special effect.
In the above solution, the item display module 5551 is further configured to display a process of projecting the special effect item to the target position by the first virtual object in the virtual scene in response to the projection operation for the special effect item.
In the foregoing solution, the area display module 5552 is further configured to display the affected area based on the target position in the view angle of the first virtual object when the distance between the first virtual object and the target position is less than the release distance and the special effect prop has not released the special effect.
In the foregoing solution, the area display module 5552 is further configured to display an influence area of the special effect with the target position as a reference in the view angle of the first virtual object when the special effect item is within the visible range of the first virtual object and the special effect item has not released the special effect.
In the foregoing solution, the area display module 5552 is further configured to, when the distance between the second virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect, display an influence area of the special effect in the viewing angle of the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, and the first virtual object is a virtual object for projecting the special effect prop.
In the foregoing solution, the area display module 5552 is further configured to, when the special effect prop is within the visible range of the second virtual object and the special effect prop has not released the special effect, display an influence area of the special effect in the viewing angle of the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, and the first virtual object is a virtual object for projecting the special effect prop.
In the above solution, the area display module 5552 is further configured to, when it is determined according to the motion trend of the first virtual object that the special effect prop will affect the state of the first virtual object when releasing the special effect, display an influence area of the special effect in the view angle of the first virtual object and/or the second virtual object with the target position as a reference; the second virtual object and the first virtual object both belong to the same group, the distance between the second virtual object and the first virtual object is smaller than a visible distance threshold, and the first virtual object is a virtual object for projecting the special effect prop.
In the foregoing solution, the area display module 5552 is further configured to, when the distance between the third virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect, display an influence area of the special effect with the target position as a reference in the view angle of the third virtual object; the third virtual object and the first virtual object respectively belong to mutually antagonistic groups, and the first virtual object is a virtual object for projecting the special effect prop.
In the above solution, the area display module 5552 is further configured to stop displaying the affected area in the virtual scene after the special effect item releases the special effect.
Embodiments of the present application provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium and executes the computer instructions, so that the computer device executes the interaction method of the special effect prop provided in the embodiments of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions which, when executed by a processor, will cause the processor to execute the interaction method of the special effect prop provided in the embodiments of the present application, for example, the interaction method of the special effect prop illustrated in fig. 5A, fig. 5B, fig. 5C, fig. 6, fig. 7A, fig. 9A, and fig. 9B.
In some embodiments, the computer-readable storage medium may be a memory such as FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM; or may be various devices including one of, or any combination of, the above memories.
In some embodiments, executable instructions may be written in any form of programming language (including compiled or interpreted languages), in the form of programs, software modules, scripts or code, and may be deployed in any form, including as a stand-alone program or as a module, component, subroutine, or other unit suitable for use in a computing environment.
By way of example, executable instructions may correspond, but do not necessarily have to correspond, to files in a file system, and may be stored in a portion of a file that holds other programs or data, for example, in one or more scripts in a Hypertext Markup Language (HTML) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
By way of example, executable instructions may be deployed to be executed on one computing device or on multiple computing devices at one site or distributed across multiple sites and interconnected by a communication network.
In summary, the embodiment of the present application has the following beneficial effects:
(1) Before the special effect prop releases the special effect, the influence area of the special effect is displayed with the target position as a reference, which helps the user intuitively obtain the accurate position and influence range of the special effect prop, reduces the view-angle adjustment operations performed when the user searches for the special effect prop, improves the efficiency of human-computer interaction, reduces the processing resources consumed by the terminal, saves the power consumed by the terminal, and improves the battery endurance of the terminal.
(2) The user can easily determine the direction in which the special effect prop releases its special effect, which makes it convenient for the user to stay away from the special effect prop to avoid injury; moreover, a tense atmosphere in which the special effect prop may release its special effect at any moment can be created, giving the user an immersive experience.
(3) The influence area is presented and hidden in a personalized manner according to the identity of the virtual object, which can avoid unnecessary consumption of the terminal's display resources, save the power consumed by the terminal, and improve the battery endurance of the terminal.
(4) The radiation range is uniformly divided according to the strength of the special effect prop, the same influence area prompt is presented to all virtual objects in the virtual scene, the calculation resources used for dividing a plurality of influence areas can be reduced, and therefore the display speed of the influence areas is improved.
(5) The radiation range is divided in a personalized manner according to the protection capability parameter of the target virtual object and the strength of the special effect prop, and personalized influence area prompts are presented to the virtual objects in the virtual scene, so that the effectiveness of the prompts can be improved.
The above description is only an example of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, and improvement made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (22)

1. An interaction method of special effect props is characterized by comprising the following steps:
displaying a virtual scene in a human-computer interaction interface, and displaying at least one special effect prop projected to a target position in the virtual scene;
the special effect prop is used for releasing the special effect when the duration for which it has been projected reaches a target duration, or for releasing the special effect when any virtual object is sensed;
and before the special effect prop releases the special effect, displaying an influence area of the special effect by taking the target position as a reference.
2. The method of claim 1, wherein displaying the area of influence of the special effect based on the target position comprises:
obtaining a release distance which can be realized by the special-effect prop from the target position;
displaying a regular geometric shape with the release distance as the radiation distance as an affected area with the target position as a geometric center, or,
and displaying an irregular geometric shape with the release distance as the radiation distance as an influence area by taking the target position as a geometric center of gravity.
3. The method of claim 2, wherein the obtaining a release distance that the special effect prop can achieve from the target location comprises:
obtaining the maximum release distance which can be realized when the special-effect prop is released from the target position;
attenuating the maximum release distance based on a protection capability parameter of a target virtual object, and taking the attenuated release distance as a release distance which can be realized by the special-effect prop from the target position;
wherein the target virtual object is an arbitrary virtual object in the virtual scene.
4. The method of claim 1, wherein displaying the area of influence of the special effect based on the target position comprises:
displaying a plurality of radially arranged influence areas by taking the target position as a reference;
wherein different said areas of influence characterize different degrees of influence of said special effect.
5. The method of claim 4, wherein displaying the plurality of radially arranged regions of influence with reference to the target location comprises:
and displaying a plurality of influence areas which are radially arranged between the radiation boundaries corresponding to the radiation distances from the radiation starting points to the special effect props by taking the target positions as radiation starting points.
6. The method of claim 5, wherein displaying the plurality of radially arranged impact regions comprises:
sequentially displaying a plurality of the regions of influence;
wherein the sequence is arranged in a direction radiating outward from the radiation starting point to indicate a moving direction away from the special effect prop.
7. The method of claim 5, wherein displaying the plurality of radially arranged impact regions comprises:
simultaneously displaying the plurality of influence areas according to the display parameters corresponding to each of the influence areas;
the display parameters of the plurality of influence areas are in an attenuation trend and are attenuated according to the direction radiating outwards from the radiation starting point so as to indicate the moving direction far away from the special effect prop.
8. The method of claim 4, wherein prior to said displaying the plurality of radially arranged regions of influence with reference to the target location, the method further comprises:
obtaining a release distance which can be reached when the special effect of the special effect prop is released from the target position, and determining the release distance as a radiation distance;
dividing a radiation range corresponding to the radiation distance into a plurality of influence areas with different influence degrees along a radiation direction by taking the target position as a radiation starting point;
wherein the radiation direction is a direction outward from the radiation origin.
9. The method of claim 8, wherein the dividing the radiation range corresponding to the radiation distance into a plurality of influence areas with different influence degrees along the radiation direction with the target position as a radiation starting point comprises:
and dividing the radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the special effect strength according to the attenuation characteristic of the special effect strength of the special effect prop along the radiation direction.
10. The method of claim 8, wherein the dividing the radiation range corresponding to the radiation distance into a plurality of influence areas with different influence degrees along the radiation direction with the target position as a radiation starting point comprises:
determining the attenuation characteristic of the state value of the target virtual object along the radiation direction according to the protection capability parameter of the target virtual object and the attenuation characteristic of the special effect strength of the special effect prop along the radiation direction;
dividing a radiation range corresponding to the radiation distance into a plurality of influence areas representing different attenuation intervals of the state value of the target virtual object according to the attenuation characteristic of the state value of the target virtual object along the radiation direction;
wherein the target virtual object is an arbitrary virtual object in the virtual scene.
11. The method of claim 1, wherein when displaying the area of influence with reference to the target position, the method further comprises:
displaying at least one of the following prompt messages:
first timing information, wherein the first timing information is used for prompting the special effect time of the special effect prop to release a special effect;
second timing information, wherein the second timing information is used for prompting the time required for leaving the influence area;
direction prompt information, wherein the direction prompt information is used for prompting a direction which is the fastest to leave the influence area;
and influence prompt information, wherein the influence prompt information is used for prompting the influence degree caused when the special effect prop releases the special effect.
12. The method of any of claims 1 to 11, wherein prior to displaying in the virtual scene at least one special effects prop projected to a target location, the method further comprises:
and responding to the projection operation aiming at the special effect prop, and displaying a process that a first virtual object projects the special effect prop to the target position in the virtual scene.
13. The method of claim 12, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when the distance between the first virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect, displaying the influence area with the target position as a reference in the perspective of the first virtual object.
14. The method of claim 12, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when the special effect prop is within the visual range of the first virtual object and the special effect prop has not released the special effect, displaying the influence area of the special effect with the target position as a reference in the perspective of the first virtual object.
15. The method of any one of claims 1 to 11, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when the distance between a second virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect, displaying the influence area of the special effect with the target position as a reference in the perspective of the second virtual object;
wherein the second virtual object and a first virtual object belong to the same group, and the first virtual object is the virtual object that projects the special effect prop.
16. The method of any one of claims 1 to 11, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when the special effect prop is within the visual range of a second virtual object and the special effect prop has not released the special effect, displaying the influence area of the special effect with the target position as a reference in the perspective of the second virtual object;
wherein the second virtual object and a first virtual object belong to the same group, and the first virtual object is the virtual object that projects the special effect prop.
17. The method of any one of claims 1 to 11, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when it is determined, according to the motion trend of a first virtual object, that the state of the first virtual object will be affected when the special effect prop releases the special effect, displaying the influence area of the special effect with the target position as a reference in the perspective of the first virtual object and/or a second virtual object;
wherein the second virtual object and the first virtual object belong to the same group, the distance between the second virtual object and the first virtual object is smaller than a visual distance threshold, and the first virtual object is the virtual object that projects the special effect prop.
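The motion-trend determination of claim 17 can be sketched by linearly extrapolating the object's current velocity up to the release time and testing whether the extrapolated position falls inside the influence area. Linear motion, a circular area, and all parameter names are illustrative assumptions.

```python
def will_be_affected(pos, velocity, release_time, target_pos, radius):
    """Predict, by linear extrapolation of the current velocity, whether a
    virtual object will be inside the influence area (a circle of `radius`
    around target_pos) at the moment the special effect is released.
    """
    # Extrapolated position at release time, assuming constant velocity.
    future = (pos[0] + velocity[0] * release_time,
              pos[1] + velocity[1] * release_time)
    dx, dy = future[0] - target_pos[0], future[1] - target_pos[1]
    # Squared-distance comparison avoids an unnecessary square root.
    return dx * dx + dy * dy <= radius * radius
```

An object at (10, 0) moving at (-2, 0) toward an area of radius 5 around the origin, with 3 time units until release, ends up at (4, 0) inside the area, so the influence area would be displayed.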
18. The method of any one of claims 1 to 11, wherein displaying the influence area of the special effect with the target position as a reference comprises:
when the distance between a third virtual object and the target position is smaller than the release distance and the special effect prop has not released the special effect, displaying the influence area of the special effect with the target position as a reference in the perspective of the third virtual object;
wherein the third virtual object and a first virtual object belong to mutually opposing groups, and the first virtual object is the virtual object that projects the special effect prop.
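Claims 13–18 all gate the display on the same two viewer-side conditions: the viewer is closer to the target position than the release distance, or the prop is within the viewer's visual range, and the effect has not yet been released (claim 19 then removes the display). A combined predicate might look like the following; the dictionary field names and the single scalar `visual_range` are illustrative assumptions.

```python
def should_display_area(viewer_pos, prop_pos, released, release_distance, visual_range):
    """Decide whether the influence area is shown in a viewer's perspective,
    combining the distance condition (claims 13, 15, 18) and the visual-range
    condition (claims 14, 16) of the patent claims.
    """
    if released:
        return False  # claim 19: stop displaying once the effect is released
    dx = viewer_pos[0] - prop_pos[0]
    dy = viewer_pos[1] - prop_pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    # Either condition suffices: close enough to the target position,
    # or the prop lies within the viewer's sight.
    return dist < release_distance or dist <= visual_range
```

The same predicate applies regardless of whether the viewer is the projecting object, a teammate, or an opponent, which is consistent with the claims granting the display to all three roles.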
19. The method of any one of claims 1 to 11, further comprising:
stopping displaying the influence area in the virtual scene after the special effect prop releases the special effect.
20. An interaction device for a special effect prop, the device comprising:
a prop display module, configured to display a virtual scene in a human-computer interaction interface and display, in the virtual scene, at least one special effect prop projected to a target position;
wherein the special effect prop is configured to release a special effect when the duration for which it has been projected reaches a target duration, or when any virtual object is sensed; and
an area display module, configured to display the influence area of the special effect with the target position as a reference before the special effect prop releases the special effect.
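The two release triggers described in claim 20 — a target duration elapsing after projection, or any virtual object being sensed — can be sketched as a single check. The circular sensing region and all names are illustrative assumptions.

```python
def should_release(elapsed, target_duration, object_positions, prop_pos, sense_radius):
    """Return True when the prop should release its special effect:
    either the time since projection has reached the target duration,
    or any virtual object is sensed within sense_radius of the prop.
    """
    if elapsed >= target_duration:
        return True  # timer trigger: projected duration reached target duration
    for (x, y) in object_positions:
        dx, dy = x - prop_pos[0], y - prop_pos[1]
        if dx * dx + dy * dy <= sense_radius * sense_radius:
            return True  # proximity trigger: a virtual object is sensed
    return False
```

A game loop would evaluate this each tick and, once it returns true, apply the per-area state-value attenuation and stop displaying the influence area.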
21. An electronic device, comprising:
a memory configured to store executable instructions; and
a processor configured to execute the executable instructions stored in the memory to implement the interaction method of the special effect prop of any one of claims 1 to 19.
22. A computer-readable storage medium storing executable instructions which, when executed by a processor, implement the interaction method of the special effect prop of any one of claims 1 to 19.
CN202011069424.5A 2020-09-30 2020-09-30 Interaction method and device of special effect prop, electronic equipment and storage medium Active CN112121434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011069424.5A CN112121434B (en) 2020-09-30 2020-09-30 Interaction method and device of special effect prop, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112121434A true CN112121434A (en) 2020-12-25
CN112121434B CN112121434B (en) 2022-05-10

Family

ID=73843772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011069424.5A Active CN112121434B (en) 2020-09-30 2020-09-30 Interaction method and device of special effect prop, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112121434B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006288528A (en) * 2005-04-07 2006-10-26 Namco Bandai Games Inc Game system, server system, game device, program and information storage medium
CN110215698A (en) * 2019-04-29 2019-09-10 努比亚技术有限公司 Dangerous tip method, wearable device and computer readable storage medium in game
CN111124133A (en) * 2019-12-30 2020-05-08 腾讯科技(深圳)有限公司 Method, device, equipment and storage medium for danger prompt information in virtual scene
CN111111217A (en) * 2019-12-06 2020-05-08 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic device
CN111202982A (en) * 2020-01-02 2020-05-29 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic device
CN111318018A (en) * 2020-02-07 2020-06-23 腾讯科技(深圳)有限公司 Control method and device of virtual prop, storage medium and electronic device
CN111352507A (en) * 2020-02-27 2020-06-30 维沃移动通信有限公司 Information prompting method and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
OKITA SOUJI: "Zhihu", 13 June 2019 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113750530A (en) * 2021-09-18 2021-12-07 腾讯科技(深圳)有限公司 Prop control method, device, equipment and storage medium in virtual scene
CN113750530B (en) * 2021-09-18 2023-07-21 腾讯科技(深圳)有限公司 Prop control method, device, equipment and storage medium in virtual scene
CN113703654A (en) * 2021-09-24 2021-11-26 腾讯科技(深圳)有限公司 Camouflage processing method and device in virtual scene and electronic equipment
CN113703654B (en) * 2021-09-24 2023-07-14 腾讯科技(深圳)有限公司 Camouflage processing method and device in virtual scene and electronic equipment
CN114100128A (en) * 2021-12-09 2022-03-01 腾讯科技(深圳)有限公司 Prop special effect display method and device, computer equipment and storage medium
CN114100128B (en) * 2021-12-09 2023-07-21 腾讯科技(深圳)有限公司 Prop special effect display method, device, computer equipment and storage medium
CN114327059A (en) * 2021-12-24 2022-04-12 北京百度网讯科技有限公司 Gesture processing method, device, equipment and storage medium
CN114939275A (en) * 2022-05-24 2022-08-26 北京字跳网络技术有限公司 Object interaction method, device, equipment and storage medium
WO2024001450A1 (en) * 2022-06-27 2024-01-04 腾讯科技(深圳)有限公司 Method and apparatus for displaying special effect of prop, and electronic device and storage medium

Also Published As

Publication number Publication date
CN112121434B (en) 2022-05-10

Similar Documents

Publication Publication Date Title
CN112121434B (en) Interaction method and device of special effect prop, electronic equipment and storage medium
CN112090069B (en) Information prompting method and device in virtual scene, electronic equipment and storage medium
CN112090070B (en) Interaction method and device of virtual props and electronic equipment
CN112121430B (en) Information display method, device, equipment and storage medium in virtual scene
CN112121414B (en) Tracking method and device in virtual scene, electronic equipment and storage medium
CN111921198B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112416196B (en) Virtual object control method, device, equipment and computer readable storage medium
CN112138385B (en) Virtual shooting prop aiming method and device, electronic equipment and storage medium
CN113633964B (en) Virtual skill control method, device, equipment and computer readable storage medium
CN112057863A (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN112402959A (en) Virtual object control method, device, equipment and computer readable storage medium
CN112057864B (en) Virtual prop control method, device, equipment and computer readable storage medium
US20230033530A1 (en) Method and apparatus for acquiring position in virtual scene, device, medium and program product
CN112057860B (en) Method, device, equipment and storage medium for activating operation control in virtual scene
CN112121432B (en) Control method, device and equipment of virtual prop and computer readable storage medium
CN113769379B (en) Method, device, equipment, storage medium and program product for locking virtual object
CN113703654B (en) Camouflage processing method and device in virtual scene and electronic equipment
CN113144617B (en) Control method, device and equipment of virtual object and computer readable storage medium
CN112121433B (en) Virtual prop processing method, device, equipment and computer readable storage medium
CN114210051A (en) Carrier control method, device, equipment and storage medium in virtual scene
CN114146414A (en) Virtual skill control method, device, equipment, storage medium and program product
CN112870694A (en) Virtual scene picture display method and device, electronic equipment and storage medium
CN112891930B (en) Information display method, device, equipment and storage medium in virtual scene
CN113633991B (en) Virtual skill control method, device, equipment and computer readable storage medium
CN113769392B (en) Method and device for processing state of virtual scene, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant