CN114177616A - Virtual item release method, device, equipment and medium - Google Patents

Virtual item release method, device, equipment and medium Download PDF

Info

Publication number
CN114177616A
Authority
CN
China
Prior art keywords
virtual
moving object
moving
character
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111658380.4A
Other languages
Chinese (zh)
Inventor
刘智洪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Publication of CN114177616A publication Critical patent/CN114177616A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F13/5378 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for displaying an additional top view, e.g. radar screens or maps
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42 Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/55 Controlling game characters or game objects based on the game progress
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/822 Strategy games; Role-playing games
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/303 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying additional data, e.g. simulating a Head Up Display
    • A63F2300/307 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/807 Role playing or strategy games

Abstract

The present application discloses a method, apparatus, device, and medium for releasing a virtual prop, relating to the field of virtual worlds. The method comprises the following steps: during the release of the virtual prop, displaying a virtual map corresponding to the virtual environment in a display interface; in response to a movement operation on the virtual map, displaying a desired movement path of a virtual moving object on the virtual map, the virtual moving object being used to affect an attribute value corresponding to a virtual character in the virtual environment; and in response to a release operation for the virtual moving object, displaying the virtual moving object in the virtual environment within the display interface, wherein the actual movement path of the virtual moving object is determined from the desired movement path.

Description

Virtual item release method, device, equipment and medium
The present application claims priority to Chinese Patent Application No. 202111306198.2, entitled "Method, Apparatus, Device, and Medium for Releasing a Virtual Prop", filed on 11/05/2021, which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of virtual worlds, and in particular, to a method, an apparatus, a device, and a medium for releasing a virtual item.
Background
In a network game with a virtual environment, such as a multiplayer online role-playing game, a player can play one or more virtual characters and control their activities and behaviors in the game's virtual world.
There are typically at least two camps in a match, each camp including at least one virtual character. Taking a first virtual character and a second virtual character belonging to different camps as an example: a first player controls the first virtual character, a second player controls the second virtual character, and each virtual character can release skills using virtual props, thereby affecting the opposing character and reducing its attribute values.
Taking the case where the first virtual character uses a virtual prop as an example: after the first virtual character releases the skill corresponding to the virtual prop, because the skill's range of influence is fixed, the second player can predict that range from the skill's special effect displayed in the display interface and thus control the second virtual character to dodge it.
Disclosure of Invention
The embodiments of the present application provide a method, apparatus, device, and medium for releasing a virtual prop, which can display a virtual moving object in a display interface, where the actual movement path of the virtual moving object is determined from a desired movement path drawn on a virtual map, making the virtual moving object's range of influence mobile and unpredictable.
The technical solution is as follows:
according to an aspect of the application, a method for releasing a virtual prop is provided, the method comprising:
in the process of releasing the virtual prop, displaying a virtual map corresponding to the virtual environment in a display interface;
in response to a moving operation for the virtual map, displaying a desired moving path of a virtual moving object on the virtual map, the virtual moving object being used for influencing an attribute value corresponding to a virtual character in the virtual environment;
and in response to the releasing operation aiming at the virtual moving object, displaying the virtual moving object positioned in the virtual environment in the display interface, wherein the actual moving path of the virtual moving object is determined according to the expected moving path.
According to one aspect of the present application, a device for releasing a virtual prop is provided, the device comprising:
a display module, configured to display a virtual map corresponding to the virtual environment in a display interface during the release of the virtual prop;
a response module, configured to display, in response to a movement operation on the virtual map, a desired movement path of a virtual moving object on the virtual map, the virtual moving object being used to affect an attribute value corresponding to a virtual character in the virtual environment;
the response module being further configured to display, in response to a release operation for the virtual moving object, the virtual moving object in the virtual environment within the display interface, wherein the actual movement path of the virtual moving object is determined from the desired movement path.
According to one aspect of the present application, a computer device is provided, comprising a processor;
the processor being configured to display a virtual map corresponding to the virtual environment in a display interface during the release of the virtual prop;
display, in response to a movement operation on the virtual map, a desired movement path of a virtual moving object on the virtual map, the virtual moving object being used to affect an attribute value corresponding to a virtual character in the virtual environment;
and display, in response to a release operation for the virtual moving object, the virtual moving object in the virtual environment within the display interface, wherein the actual movement path of the virtual moving object is determined from the desired movement path.
According to one aspect of the present application, a computer-readable storage medium is provided, in which a computer program is stored; the computer program is executed by a processor to implement the method for releasing a virtual prop as described above.
According to one aspect of the present application, a chip is provided, comprising programmable logic circuits and/or program instructions, configured to implement the method for releasing a virtual prop as described above when the chip runs.
According to one aspect of the present application, a computer program product or computer program is provided, comprising computer instructions stored in a computer-readable storage medium; a processor reads and executes the computer instructions from the computer-readable storage medium to implement the method for releasing a virtual prop as described above.
The beneficial effects of the technical solution provided in the embodiments of the present application include at least the following:
By displaying the virtual moving object in the display interface, with its actual movement path determined from the desired movement path on the virtual map, the virtual moving object's range of influence becomes mobile and unpredictable. Here, the desired movement path is displayed on the virtual map in response to a movement operation on the display interface.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a terminal provided in an exemplary embodiment of the present application;
FIG. 2 is a block diagram of a computer system provided in an exemplary embodiment of the present application;
FIG. 3 is an interface schematic diagram of a virtual item display provided in an exemplary embodiment of the present application;
fig. 4 is a flowchart of a method for releasing a virtual prop according to an exemplary embodiment of the present application;
fig. 5 is a flowchart of a method for releasing a virtual prop according to an exemplary embodiment of the present application;
FIG. 6 is an interface schematic diagram of a virtual prop display provided in an exemplary embodiment of the present application;
fig. 7 is a flowchart of a method for releasing a virtual prop according to an exemplary embodiment of the present application;
FIG. 8 is an interface schematic diagram of a display interface provided by an exemplary embodiment of the present application;
FIG. 9 is an interface schematic diagram of a display interface provided by an exemplary embodiment of the present application;
FIG. 10 is an interface schematic diagram of a display interface provided by an exemplary embodiment of the present application;
fig. 11 is a flowchart of a method for releasing a virtual prop according to an exemplary embodiment of the present application;
FIG. 12 is a schematic view of a release device for a virtual prop provided in an exemplary embodiment of the present application;
fig. 13 is a block diagram of an electronic device provided in an exemplary embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
First, terms referred to in the embodiments of the present application will be described:
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional world, or a purely fictional world. It may be any of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, or a three-dimensional virtual environment, which is not limited in this application.
Virtual map: a map corresponding to the virtual environment, which can be obtained by scaling the virtual environment down. Compared with the virtual environment, objects included in the virtual map can be omitted as needed; for example, soil piles or swarms in the virtual environment may not be displayed on the virtual map.
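The scale-down relationship between the virtual environment and the virtual map can be illustrated as a simple coordinate transform. This is a hedged sketch, not code from the patent: it assumes a square world with its origin at a corner and a square minimap, and the function names are hypothetical.

```python
def world_to_map(x, z, world_size, map_size):
    """Project a world-space position onto the minimap.

    Assumes a square world of side `world_size` whose origin is at a
    corner, and a square minimap of side `map_size` pixels.
    """
    scale = map_size / world_size
    return (x * scale, z * scale)

def map_to_world(u, v, world_size, map_size):
    """Inverse transform: a minimap pixel back to world coordinates."""
    scale = world_size / map_size
    return (u * scale, v * scale)
```

A path drawn on the minimap can then be mapped point by point into the virtual environment through the inverse transform, which is one plausible way the desired path could yield the actual path.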
Virtual character: a virtual object existing in the virtual environment, which may be a movable object or an associated virtual object corresponding to a movable object. Optionally, the movable object may be a virtual person, a virtual animal, an animation character, and so on, such as characters, animals, plants, oil drums, walls, or stones displayed in a three-dimensional virtual environment. Optionally, the virtual character is a three-dimensional volumetric model created using skeletal animation techniques. Each virtual character has its own shape and volume in the three-dimensional virtual environment, occupying part of its space. Optionally, the associated virtual object may be a virtual item used by the movable object or a temporary reward acquired by it, such as a shooting tool equipped by the virtual character or a mechanical kill-streak reward acquired by the virtual character.
The method provided in the present application can be applied to applications having a virtual environment and virtual characters. Illustratively, an application supporting a virtual environment is one in which the user can control the movement of a virtual character within the virtual environment. By way of example, the methods provided herein may be applied to any of: a Virtual Reality (VR) application, an Augmented Reality (AR) application, a three-dimensional map program, a virtual reality game, an augmented reality game, a First-Person Shooter (FPS) game, a Third-Person Shooter (TPS) game, a Multiplayer Online Battle Arena (MOBA) game, or a Strategy Game (SLG).
Illustratively, a game in the virtual environment consists of one or more maps of the game world. The virtual environment in the game simulates real-world scenes, and the user can control a virtual character in the game to walk, run, jump, shoot, fight, drive, or attack other virtual characters with virtual weapons in the virtual environment. The interactivity is strong, and multiple users can form teams online for competitive matches.
In some embodiments, the application may be a shooting game, a racing game, a role-playing game, an adventure game, a sandbox game, a tactical competition game, and so on. The client can support at least one of the Windows, Apple, Android, iOS, and Linux operating systems, and clients on different operating systems can interconnect and interoperate. In some embodiments, the client is a program adapted to a mobile terminal with a touch screen.
In some embodiments, the client is an application developed based on a three-dimensional engine, such as the three-dimensional engine being a Unity engine.
The terminal in the present application may be a desktop computer, a laptop computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on. The terminal is installed with and runs a client supporting a virtual environment, such as a client of an application supporting a three-dimensional virtual environment. The application may be any of a Battle Royale (BR) game, a virtual reality application, an augmented reality program, a three-dimensional map program, a third-person shooter game, a first-person shooter game, or a multiplayer online battle arena game. The application may be a stand-alone application, such as a stand-alone 3D game program, or a network online application.
Fig. 1 is a schematic structural diagram of a terminal provided in an exemplary embodiment of the present application, where the terminal includes a processor 101, a touch screen 102, and a memory 103.
The processor 101 may be at least one of a single-core processor, a multi-core processor, an embedded chip, and a processor having instruction execution capabilities.
The touch screen 102 may be an ordinary touch screen or a pressure-sensitive touch screen. An ordinary touch screen can detect pressing or sliding operations applied to the touch screen 102; a pressure-sensitive touch screen can additionally measure the degree of pressure exerted on it.
The memory 103 stores programs executable by the processor 101. Illustratively, the memory 103 stores a virtual environment program A, an application program B, an application program C, a touch-and-pressure sensing module 18, and a kernel layer 19 of an operating system. The virtual environment program A is an application developed based on the three-dimensional virtual environment module 17. Optionally, the virtual environment program A includes, but is not limited to, at least one of a game program, a virtual reality program, a three-dimensional map program, and a three-dimensional presentation program developed with the three-dimensional virtual environment module (also referred to as the virtual environment module) 17. For example, when the terminal's operating system is Android, the virtual environment program A is developed in the Java and C# programming languages; when the terminal's operating system is iOS, the virtual environment program A is developed in the Objective-C and C# programming languages.
The three-dimensional virtual environment module 17 is a module supporting multiple operating system platforms. Schematically, it can be used for program development in several fields, such as game development, Virtual Reality (VR), and three-dimensional maps. The specific type of the three-dimensional virtual environment module 17 is not limited in the embodiments of the present application; in the following embodiments, the three-dimensional virtual environment module 17 is described as a module developed with the Unity engine.
The touch (and pressure) sensing module 18 is a module for receiving touch events (and pressure touch events) reported by the touch screen driver 191; optionally, the touch sensing module may lack the pressure sensing function and not receive pressure touch events. A touch event includes a type and coordinate values; touch event types include, but are not limited to, a touch start event, a touch move event, and a touch end event. A pressure touch event includes a pressure value and coordinate values, the coordinate values indicating the touch position of the pressure touch operation on the display screen. Optionally, an abscissa axis is established along the horizontal direction of the display screen and an ordinate axis along the vertical direction, yielding a two-dimensional coordinate system.
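The touch events described above, a type plus coordinate values with an optional pressure value, can be modeled as a small data structure. This is an illustrative sketch with hypothetical names, not code from the patent:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class TouchType(Enum):
    TOUCH_START = "start"  # finger makes contact
    TOUCH_MOVE = "move"    # finger slides across the screen
    TOUCH_END = "end"      # finger lifts off

@dataclass
class TouchEvent:
    type: TouchType
    x: float                          # abscissa (horizontal axis of the screen)
    y: float                          # ordinate (vertical axis of the screen)
    pressure: Optional[float] = None  # set only by pressure-sensitive screens
```

On an ordinary touch screen the `pressure` field stays `None`, mirroring the optional pressure-sensing capability described above.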
Illustratively, the kernel layer 19 includes a touch screen driver 191 and other drivers 192. The touch screen driver 191 is a module for detecting touch events; when it detects a pressure touch event, it transmits the event to the touch-and-pressure sensing module 18.
The other drivers 192 may be drivers related to the processor 101, the memory 103, network components, sound components, and the like.
Those skilled in the art will appreciate that the foregoing is merely a general illustration of the structure of the terminal. A terminal may have more or fewer components in different embodiments. For example, the terminal may further include a gravitational acceleration sensor, a gyro sensor, a power supply, and the like.
Fig. 2 shows a block diagram of a computer system provided in an exemplary embodiment of the present application, where the computer system 200 includes: terminal 210, server cluster 220.
The terminal 210 is installed with and runs a client 211 supporting a virtual environment; the client 211 may be an application supporting a virtual environment. When the terminal runs the client 211, a user interface of the client 211 is displayed on the screen of the terminal 210. The client may be any of an FPS game, a TPS game, a MOBA game, a tactical sports game, or an SLG game. In this embodiment, the client is described using an FPS game as an example. The terminal 210 is used by a first user 212, who controls a first virtual character located in the virtual environment to perform activities; the first virtual character may be referred to as the master virtual character of the first user 212. The activities of the first virtual character include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual character is a virtual persona, such as a simulated character or an animated character.
The device types of the terminal 210 include: at least one of a smartphone, a tablet, an e-book reader, an MP3 player, an MP4 player, a laptop portable computer, and a desktop computer.
Only one terminal is shown in Fig. 2, but in different embodiments there are multiple other terminals 240. In some embodiments, at least one other terminal 240 corresponds to a developer: a development and editing platform for the virtual environment client is installed on the other terminal 240, the developer can edit and update the client there and transmit the updated client installation package to the server cluster 220 through a wired or wireless network, and the terminal 210 can download the installation package from the server cluster 220 to update its client.
The terminal 210 and the other terminals 240 are connected to the server cluster 220 through a wireless network or a wired network.
The server cluster 220 includes at least one of a server, a plurality of servers, a cloud computing platform, and a virtualization center. Server cluster 220 is used to provide background services for clients that support a three-dimensional virtual environment. Optionally, the server cluster 220 undertakes primary computing work and the terminals undertake secondary computing work; or, the server cluster 220 undertakes the secondary computing work, and the terminal undertakes the primary computing work; or, the server cluster 220 and the terminal perform cooperative computing by using a distributed computing architecture.
Optionally, the terminal and the server are both computer devices.
In one illustrative example, the server cluster 220 includes servers 221 and 226, where the server 221 includes a processor 222, a user account database 223, a combat service module 224, and a user-oriented Input/Output Interface (I/O Interface) 225. The processor 222 is configured to load instructions stored in the server 221 and process data in the user account database 223 and the combat service module 224. The user account database 223 stores data of the user accounts used by the terminal 210 and the other terminals 240, such as the profile pictures of the user accounts, their nicknames, their combat capability indexes, and the service regions where they are located. The combat service module 224 provides multiple combat rooms in which users can fight one another. The user-oriented I/O interface 225 establishes communication with the terminal 210 through a wireless or wired network to exchange data.
With the above descriptions of the virtual environment and the implementation environment in mind, the method for releasing a virtual prop provided in the embodiments of the present application is described below.
Fig. 3 is a schematic interface diagram illustrating a display of a virtual prop according to an exemplary embodiment of the present application.
During the triggering of the virtual prop, a virtual map 111 corresponding to the virtual environment is displayed in the display interface 110. The virtual map 111 may be displayed in response to a trigger operation in the display interface 110. For example, a trigger control for the virtual prop is displayed in the display interface 110; the player clicks the trigger control, and in response to the click operation on the trigger control, the virtual map 111 is displayed in the display interface 110.
The virtual environment is displayed in the display interface 110. Optionally, the virtual map 111 is a scaled-down map of the virtual environment.
In response to a movement operation on the virtual map, a desired movement path 112 of the virtual moving object is displayed on the virtual map 111. The virtual moving object is one kind of virtual prop and is used to affect the attribute values corresponding to virtual characters in the virtual environment. Optionally, the virtual moving object is one of a virtual tornado, a virtual monster, and a virtual weapon.
The length and shape of the desired movement path 112 are determined by the movement operation on the display interface 110. For example, a virtual finger is displayed on the display interface 110; the player controls the virtual finger to slide on the virtual map 111, and in response to the sliding operation of the virtual finger, the corresponding sliding route can be displayed on the virtual map 111 and determined as the desired movement path 112. The starting point of the sliding route is the desired starting point of the desired movement path 112, and its end point is the desired end point of the desired movement path 112.
The form of the desired movement path 112 is determined by the sliding route of the virtual finger and may be a straight line or a curve.
Optionally, the sliding route must be completed in one stroke. If the player interrupts the sliding operation, the sliding operation is considered complete, and when the player controls the virtual finger to slide on the virtual map 111 again, the corresponding sliding route is no longer displayed.
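The one-stroke rule could be implemented by a recorder that accepts points only for the first uninterrupted slide. The following is a hedged sketch with hypothetical names, where the incoming points are already (x, y) coordinates on the virtual map:

```python
class PathRecorder:
    """Accumulates touch-move points into a desired movement path.

    The path is valid only for a single uninterrupted stroke: once the
    touch ends, further strokes are ignored, matching the one-stroke rule.
    """
    def __init__(self):
        self.points = []
        self.finished = False

    def on_touch_start(self, x, y):
        if not self.finished:
            self.points = [(x, y)]  # first point is the desired starting point

    def on_touch_move(self, x, y):
        if not self.finished and self.points:
            self.points.append((x, y))

    def on_touch_end(self):
        if self.points:
            self.finished = True  # interrupting the slide completes the path

    @property
    def start(self):
        return self.points[0] if self.points else None

    @property
    def end(self):
        return self.points[-1] if self.points else None
```

The recorded point list captures both the length and the form of the path, straight or curved, since both emerge from the sequence of sampled positions.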
In response to a release operation for the virtual moving object, the virtual moving object located in the virtual environment is displayed in the display interface 110, and the virtual moving object moves along an actual movement path 113. The actual movement path 113 is determined from the desired movement path 112; optionally, the length and shape of the actual movement path 113 follow those of the desired movement path 112.
The virtual moving object may be displayed in response to a release operation in the display interface 110. For example, a release control 114 for the virtual moving object is also displayed in the display interface 110; the player clicks the release control 114, and in response to the release operation on the display interface 110, the virtual moving object is displayed. Optionally, in response to the release operation for the virtual moving object, the virtual moving object is first displayed at the actual starting point in the virtual environment and is then controlled to move along the actual movement path 113.
The display of the virtual moving object is cancelled when the virtual moving object meets a cancel-display condition, which can be set according to actual needs. Optionally, the cancel-display condition includes at least one of the following: the continuous movement time of the virtual moving object reaches a preset duration; the virtual moving object reaches the actual end point; the virtual moving object's movement energy is exhausted; or the number of virtual characters affected by the virtual moving object reaches a preset number.
Alternatively, the virtual moving object moves along the actual moving path 113 at a target moving speed, where the target moving speed is the ratio of the length of the desired moving path to the continuous moving time, and the continuous moving time is a fixed time period.
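As an illustration only, this ratio can be sketched as follows; the function name and the units of length and time are assumptions, not part of the embodiment:

```python
# Hypothetical sketch: derive the target moving speed as the ratio of the
# desired moving path's length to a fixed continuous moving time.
def target_moving_speed(path_length: float, duration: float) -> float:
    """Speed at which the object traverses the whole path in `duration`."""
    if duration <= 0:
        raise ValueError("duration must be positive")
    return path_length / duration

# For the same fixed duration, a longer desired path yields a faster object.
print(target_moving_speed(400.0, 10.0))  # 40.0
print(target_moving_speed(800.0, 10.0))  # 80.0
```

With a fixed continuous moving time, the speed is fully determined by the drawn path, which matches the behavior described above.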
Illustratively, the virtual moving object affects the attribute value corresponding to the virtual character in the virtual environment. Taking as an example a virtual environment that includes a virtual character and an associated virtual object corresponding to the virtual character, when the virtual character is located within the coverage area of the virtual moving object, the influence information of the virtual moving object (for example, a virtual tornado) on the virtual character is displayed in the display interface 110, and the influence information indicates the reduction of the attribute value of the virtual character and/or the associated virtual object.
For example, the virtual environment includes a virtual character and a virtual prop corresponding to the virtual character. While the virtual moving object moves along the actual moving path 113, the virtual character enters the coverage area of the virtual moving object, and the virtual moving object affects the state attribute of the virtual character. During the period in which the virtual character is within the coverage area, the state attribute value of the virtual character is continuously deducted at a constant rate.
Then, the virtual moving object continues to move along the actual moving path 113 and gradually approaches the virtual prop, so that the speed attribute value of the virtual prop continuously decreases. The rate at which the speed attribute value of the virtual prop decreases varies with the distance between the virtual prop and the virtual moving object; for example, the closer the virtual prop is to the virtual moving object, the faster the speed attribute value decreases.
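The distance-dependent decrease described above could, purely as an illustration, be modeled with an inverse-distance rate; the function name and all figures below are assumptions rather than values fixed by the embodiment:

```python
# Hypothetical sketch: the closer the virtual prop is to the virtual moving
# object, the faster its speed attribute value drops per second.
def speed_drop_rate(distance: float, base_rate: float = 10.0,
                    min_distance: float = 1.0) -> float:
    """Per-second reduction of the speed attribute value, growing as the
    distance shrinks (clamped at min_distance to avoid division blow-up)."""
    return base_rate / max(distance, min_distance)

print(speed_drop_rate(10.0))  # 1.0 (far away: slow decrease)
print(speed_drop_rate(2.0))   # 5.0 (closer: faster decrease)
```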
As shown schematically in fig. 4, an embodiment of the present application provides a method for releasing a virtual prop, which is used to display a virtual moving object in a display interface. The method for releasing the virtual prop is applied to a terminal and comprises the following steps:
step 401: and in the process of releasing the virtual prop, displaying a virtual map corresponding to the virtual environment in the display interface.
The virtual map may be obtained by scaling down the virtual environment. Compared with the virtual environment, the virtual objects included in the virtual map can be deleted according to actual needs, for example, a virtual soil heap or a virtual grass in the virtual environment may not be displayed in the virtual map.
Optionally, a trigger control corresponding to the virtual item is displayed in the display interface, and the virtual map is displayed in the display interface in response to a trigger operation on the trigger control.
For example, a player controls the skills associated with a virtual character releasing a virtual item that requires the use of a virtual map. In the process of releasing the virtual prop, a player can trigger the trigger control of the virtual prop, a virtual flat plate is displayed in a display interface in response to the trigger control, a virtual map is displayed on the virtual flat plate, and the player performs related setting on the virtual prop through operation on the virtual map so as to realize use of the virtual prop.
Step 402: in response to a moving operation with respect to the virtual map, a desired moving path of the virtual moving object is displayed on the virtual map.
Illustratively, the virtual moving object is used to influence the attribute value corresponding to the virtual character in the virtual environment.
The virtual moving object is a movable virtual prop, and the actual moving path of the virtual moving object is an unfixed route, and the actual moving path is determined according to the expected moving path. For example, the player controls the virtual character to release the virtual moving object multiple times, and the actual moving path of the virtual moving object is not a preset fixed path but a random path which is not completely the same at each release.
Optionally, the virtual moving object is one of a virtual tornado, a virtual monster, and a virtual weapon. For example, when the virtual moving object is a virtual tornado, the player can control the virtual character to release the virtual tornado, and the virtual tornado whips up strong wind along the actual moving path, thereby affecting virtual characters in the virtual environment. For another example, when the virtual moving object is a virtual monster, the player controls the virtual character to release the virtual monster, and the virtual monster walks along the actual moving path, affecting the virtual characters it contacts or influences while walking. As another example, the virtual moving object is a virtual weapon, such as a throwing knife, a fan, or a stick, and the virtual weapon moves along the actual moving path, thereby affecting virtual characters.
The desired moving path of the virtual moving object is generated according to the moving operation on the display interface, and since the moving operation on the display interface is unpredictable, the desired moving path is also unpredictable. The desired moving path can be determined as follows: starting from the desired starting point, a coordinate point is acquired at a certain time interval, and the coordinate points are connected in sequence to obtain the curve of the desired moving path.
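The coordinate-sampling procedure just described can be sketched as follows; the function name and the point representation are assumptions for illustration:

```python
# Hypothetical sketch: connect coordinate points, sampled at a fixed time
# interval during the move operation, into the desired moving path.
def build_desired_path(samples):
    """samples: iterable of (x, y) points in sampling order.
    Returns the polyline (ordered point list) and its total length."""
    path = list(samples)
    length = 0.0
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        length += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return path, length

path, length = build_desired_path([(0, 0), (3, 4), (3, 10)])
print(length)  # 11.0 (segments of length 5.0 and 6.0)
```

Because the points are connected in sampling order, an erratic finger movement directly yields an erratic (unpredictable) path.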
Illustratively, the moving operation for the virtual map includes, but is not limited to, at least one of the following operations: sliding operation, connecting operation, selecting operation and dragging operation.
For example, the player draws a curve at random on the display interface, and generates a desired movement path from the curve in response to a sliding operation of the player drawing the curve on the display interface. For another example, a plurality of coordinate mark points are displayed on the display interface, the player performs random connection on the display interface, and the virtual moving path is displayed on the display interface in response to the connection operation on the display interface.
For another example, a plurality of candidate desired movement paths or a plurality of randomly generated curves are also displayed in the display interface, and the player performs a selection operation or a drag operation on any one of the candidate desired movement paths or the plurality of randomly generated curves. Taking a dragging operation as an example, the player drags the selected curve to an arbitrary position on the virtual map, and in response to the dragging operation, the determined virtual movement path is displayed on the corresponding position of the curve on the virtual map.
A virtual character refers to a virtual object existing in a virtual environment. Optionally, the virtual character includes, but is not limited to, at least one of the following: virtual characters and associated virtual objects corresponding to the virtual characters.
The virtual character is controlled by a player, the player can control the virtual character to walk, jump, lie down and other different motions in the virtual environment, and the virtual character can be controlled to release the owned virtual prop in the virtual environment. Optionally, the associated virtual object includes, but is not limited to, at least one of: the virtual property corresponding to the virtual character and the temporary reward obtained by the virtual character.
For example, a player controls a virtual character equipped with various shooting tools, and because the player achieves a multi-kill streak while controlling the virtual character, a kill-streak reward is obtained. In this process, the virtual character includes one or more of the virtual character itself, the different kinds of shooting tools, and the kill-streak reward.
Optionally, the virtual moving object is a temporary reward acquired by the virtual character. For example, the player controls the virtual character to achieve a multi-kill streak, the system gives the virtual moving object to the virtual character as a kill-streak reward, and the virtual character obtains the right to use the virtual moving object once.
The virtual moving object may affect the attribute values corresponding to the virtual characters in the virtual environment. Optionally, the attribute value includes, but is not limited to, at least one of: a speed attribute value, a state attribute value. The speed attribute value is used for indicating the current speed of the virtual character, and the state attribute value is used for indicating the current state of the virtual character; the state attribute value includes, but is not limited to, one of the following: blood volume value, physical strength value, endurance value, attack value, defense value and hit rate.
Optionally, the virtual moving object reduces the attribute value corresponding to the virtual character, and the reduction of the attribute value may be at least one of one-time reduction, continuous reduction, uniform speed reduction, acceleration reduction, deceleration reduction, random reduction by a certain value or proportion, and fixed reduction by a certain value or proportion. For example, the virtual moving object affects the state attribute value of the virtual character, and the decrease of the state attribute value shows a continuous uniform-speed decreasing trend; for another example, the virtual moving object affects the speed attribute value corresponding to the virtual prop used by the virtual character, and the reduction of the speed attribute value is a one-time random reduction by a certain proportion.
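Two of the reduction modes listed above (continuous uniform reduction, and a one-time random proportional reduction) can be illustrated as follows; the function names, rates, and proportion bounds are hypothetical:

```python
import random

# Hypothetical sketch of two attribute-reduction modes.
def continuous_uniform_reduction(value: float, rate_per_sec: float,
                                 seconds: float) -> float:
    """Deduct at a constant per-second rate, never dropping below zero."""
    return max(0.0, value - rate_per_sec * seconds)

def one_time_random_proportion(value: float, lo: float = 0.1,
                               hi: float = 0.5) -> float:
    """Deduct a random proportion between lo and hi, exactly once."""
    return value * (1.0 - random.uniform(lo, hi))

print(continuous_uniform_reduction(100.0, 5.0, 3))  # 85.0
```

The first mode matches the state-attribute example (continuous uniform decrease); the second matches the speed-attribute example (one-time random proportional decrease).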
Step 403: and responding to the release operation aiming at the virtual moving object, and displaying the virtual moving object positioned in the virtual environment in the display interface.
Illustratively, the actual movement path of the virtual moving object is determined according to the desired movement path.
Wherein the virtual environment may be continuously displayed in the display interface, and the virtual moving object is displayed in the continuously displayed virtual environment in response to a release operation for the virtual moving object. Optionally, the releasing operation for the virtual moving object includes, but is not limited to, at least one of the following operations: single click operation, double click operation, moving operation, touch operation, sliding operation and dragging operation.
For example, in response to a single-click operation for a virtual moving object, a virtual moving object located in a virtual environment is displayed in a display interface. Optionally, a release control is further displayed on the display interface, and in response to a trigger operation on the release control, the virtual moving object located in the virtual environment is displayed in the display interface.
For another example, after the desired movement path of the virtual moving object is displayed, the player drags the desired movement path to the virtual environment continuously displayed in the display interface, and the virtual moving object located in the virtual environment is displayed in the display interface in response to a drag operation on the display interface.
According to the foregoing, the actual moving path of the virtual moving object is determined according to the desired moving path, and the length and shape of the actual moving path may follow the desired moving path. For example, the length and shape of the actual moving path are the same as those of the desired moving path, and the shape of the actual moving path in the virtual environment can be regarded as a proportional enlargement of the desired moving path on the virtual map.
The length of the actual moving path in the virtual environment may be equal or unequal to the length of the desired moving path on the virtual map. Taking equal lengths as an example, suppose the scale of the virtual map is 1:50 and the scale of the virtual environment is 1:5; if the length of the desired moving path displayed on the virtual map is 8, the true length of the desired moving path is 400 according to the map scale. If the length of the actual moving path equals the length of the desired moving path, the length of the actual moving path displayed in the virtual environment should be 80.
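This scale conversion can be sketched as below, using assumed scales of 1:50 for the virtual map and 1:5 for the virtual environment, chosen so that a map path of length 8 corresponds to a true length of 400 and a displayed length of 80:

```python
# Hypothetical sketch of the map-to-environment length conversion, assuming
# the map path and the actual path share the same true (world) length.
def actual_display_length(map_length: float, map_scale: float,
                          env_scale: float) -> float:
    """map_scale/env_scale are the 'n' in a 1:n representation."""
    world_length = map_length * map_scale  # e.g. 8 * 50 = 400
    return world_length / env_scale        # e.g. 400 / 5 = 80

print(actual_display_length(8, 50, 5))  # 80.0
```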
As a movable virtual prop, the actual starting point, the actual end point, the moving distance, the moving speed and the continuous moving time of the virtual moving object need to be considered. In the actual use process, the above numerical indicators affecting the movement of the virtual moving object can be set according to actual needs, and the following contents are only exemplary examples, and do not specifically limit the present application.
Alternatively, the actual starting point of the actual moving path of the virtual moving object may be determined according to the desired starting point of the desired moving path. For example, if the desired starting point of the desired moving path is located at the top floor of a certain building, the actual starting point of the actual moving path may also be located at the top floor of the building.
Alternatively, the actual end point of the actual moving path of the virtual moving object may be determined according to the desired end point of the desired moving path, or the actual end point may be determined according to the continuous moving time of the virtual moving object. For example, the continuous moving time of the virtual moving object is a fixed value; when the moving duration of the virtual moving object reaches this fixed value, the point the virtual moving object has reached at that moment is the actual end point.
Alternatively, the target moving speed of the virtual moving object may be determined according to the length of the desired moving path. According to the foregoing, there are two determination methods for the actual end point of the actual moving path of the virtual moving object, and different target moving speeds can be determined according to the different determination methods.
For example, when the actual end point is determined according to the desired end point of the desired moving path, the target moving speed may be determined according to the length of the desired moving path and the continuous moving time of the virtual moving object, where the continuous moving time is a fixed value set according to actual needs. For another example, when the actual end point is determined according to the continuous moving time of the virtual moving object, the target moving speed is determined according to the length of the actual moving path and the continuous moving time. Since the actual end point differs from the desired end point, the resulting target moving speed differs as well.
Optionally, the target moving speed of the virtual moving object may also be set according to actual needs. For example, after determining the desired moving path of the virtual moving object, the player sets the target moving speed of the virtual moving object to a fixed value, and the actual moving time period of the virtual moving object can be determined according to the desired moving path and the fixed target moving speed.
In order to influence the attribute value corresponding to the virtual character, the virtual moving object has a certain coverage area. Optionally, the coverage area of the virtual moving object can be determined according to actual needs. For example, the coverage area of the virtual moving object is hemispherical, cylindrical, or another region occupying a certain space, and the relevant size of the coverage area can be set according to actual needs. Taking a hemispherical coverage area as an example, the hemispherical opening may face upward or downward.
Optionally, the coverage of the virtual moving object changes with the movement of the virtual moving object.
The influence of the virtual moving object on the attribute value corresponding to the virtual character needs the virtual character to be within a preset influence range, and the influence range may be a coverage range of the virtual moving object or larger than the coverage range. Optionally, the virtual moving object may affect the attribute value corresponding to the virtual character within the coverage range; or the virtual moving object influences the attribute value corresponding to the virtual character which is out of the coverage range and the distance between the virtual moving object and the virtual character does not exceed the preset value.
If the virtual character is out of the preset influence range, the virtual character is not influenced by the virtual moving object. Optionally, if the virtual moving object is located at a certain position higher than the ground, the virtual moving object affects the attribute value corresponding to the virtual character higher than the certain position, and does not affect the attribute value corresponding to the virtual character lower than the certain position.
For example, suppose the actual starting point of the virtual moving object is located at the peak of a virtual mountain and the virtual moving object moves down the mountainside until it reaches the virtual ground. At the actual starting point, virtual characters located on the peak are affected by the virtual moving object, while the attribute values corresponding to virtual characters at the foot of the mountain are not affected. As the virtual moving object moves, its relative altitude continuously decreases, and it still affects only some of the virtual characters: those whose relative altitude is higher than that of the virtual moving object and whose distance from it is within the preset value. It does not affect the attribute values corresponding to virtual characters whose relative altitude is lower than that of the virtual moving object.
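The altitude-and-distance rule described above might be checked as in the following sketch; the coordinate convention (z as relative altitude) and the function name are assumptions:

```python
# Hypothetical sketch: a character is affected only if its relative altitude
# is not below the moving object's and its distance is within a preset value.
def is_affected(char_pos, obj_pos, max_distance: float) -> bool:
    """Positions are (x, y, z) tuples with z as relative altitude."""
    if char_pos[2] < obj_pos[2]:  # below the moving object: never affected
        return False
    d = sum((a - b) ** 2 for a, b in zip(char_pos, obj_pos)) ** 0.5
    return d <= max_distance

print(is_affected((0, 0, 120), (0, 0, 100), 30))  # True: higher and close
print(is_affected((0, 0, 50), (0, 0, 100), 30))   # False: below the object
```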
In the moving process of the virtual moving object, when the virtual fixed object in the virtual environment is located on the actual moving path of the virtual moving object, the virtual moving object does not affect the virtual fixed object in the virtual environment. The virtual fixed object refers to an obstacle displayed in a virtual environment, such as a virtual house, a virtual wall, a virtual grass, and the like. For example, there is a virtual house on the actual moving path, and when passing through the virtual house, the virtual moving object passes through the virtual house without causing damage to the virtual house.
In order to achieve that the virtual moving object does not affect the virtual fixed object, the virtual moving object may be generated in a three-dimensional model corresponding to the virtual environment. In the moving process of the virtual moving object, the virtual character in the virtual environment has a collision volume, the virtual fixed object does not have the collision volume, and the action area of the virtual moving object is only intersected with the collision volume of the virtual character, so that the virtual character is influenced. The virtual moving object does not have a collision volume, the virtual moving object does not have a solid form in a virtual environment, and a virtual character in the virtual environment cannot influence the virtual moving object.
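One way to sketch the collision-volume distinction: only entities carrying a collision volume can intersect the moving object's action area, so fixed objects are passed through untouched. All class and field names below are hypothetical:

```python
# Hypothetical sketch: virtual characters carry a collision volume (here a
# radius); virtual fixed objects carry none and are therefore never hit.
class Entity:
    def __init__(self, name, position, collision_radius=None):
        self.name = name
        self.position = position
        self.collision_radius = collision_radius  # None = no collision volume

def entities_hit(entities, center, effect_radius):
    hit = []
    for e in entities:
        if e.collision_radius is None:
            continue  # fixed object: passed through, never damaged
        dx = e.position[0] - center[0]
        dy = e.position[1] - center[1]
        if (dx * dx + dy * dy) ** 0.5 <= effect_radius + e.collision_radius:
            hit.append(e.name)
    return hit

world = [Entity("character", (2, 0), 1.0), Entity("house", (1, 0))]
print(entities_hit(world, (0, 0), 2.0))  # ['character']
```

The house is closer to the action area's center than the character, yet is skipped entirely because it has no collision volume, mirroring the pass-through behavior described above.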
In summary, the embodiment of the present application provides a method for releasing a virtual prop, in which a virtual moving object is displayed in a display interface, and an actual moving path of the virtual moving object is determined according to an expected moving path on a virtual map, so that an influence range of the virtual moving object is movable and unpredictable. Wherein the desired movement route is displayed on the virtual map in response to a movement operation on the display interface.
As shown schematically in fig. 5, an embodiment of the present application provides a method for releasing a virtual prop, which is used to display a virtual moving object in a display interface. The method for releasing the virtual prop is applied to a terminal and comprises the following steps:
step 501: and in the process of releasing the virtual prop, displaying a virtual map corresponding to the virtual environment in the display interface.
Optionally, a trigger control corresponding to the virtual item is displayed in the display interface, and the virtual map is displayed in the display interface in response to a trigger operation on the trigger control.
Illustratively, step 501 is the same as step 401, and may be referred to for further description.
Step 502: a virtual finger is displayed.
Illustratively, a virtual finger is displayed in the display interface.
The virtual finger may be displayed in response to a triggering operation for the virtual prop. Taking the virtual prop as an example of virtual tornado, when a player needs to use the skill of the virtual tornado, clicking a trigger control corresponding to the virtual tornado in a display interface, and responding to the clicking operation on the trigger control to display a virtual finger in the display interface.
Step 503: and controlling the virtual finger to slide on the virtual map in response to the sliding operation for the virtual finger.
Wherein the virtual finger is controlled by the player. Optionally, the player may control the virtual finger to slide in the display interface through a touch screen operation, or the player may control the virtual finger to move up, down, left, right, and the like in the display interface through a keyboard, a joystick, a mouse, and the like.
For example, the player performs a slide operation on a virtual map displayed in the display interface by using the touch screen, and in response to this slide operation, the player controls the virtual finger to slide on the display interface. Since the sliding operation of the player on the display interface is random, the acquired sliding route of the virtual finger on the virtual map is also random.
Optionally, the sliding route of the virtual finger is completed at one time. For example, the player controls the virtual finger to slide on the virtual map by using the touch screen, and when the player stops contacting the touch screen, the sliding route of the virtual finger is completed; subsequently, if the player continues to touch the touch screen, the virtual finger cannot be controlled to slide on the virtual map, and the sliding route cannot be displayed.
Step 504: and determining and displaying a desired moving path of the virtual moving object according to the sliding route of the virtual finger.
Illustratively, the virtual moving object is used to influence the attribute value corresponding to the virtual character in the virtual environment.
The foregoing contents may be referred to for the description of the virtual moving object, the expected moving path, the virtual character, and the attribute value, and are not repeated herein.
According to the foregoing, the sliding of the virtual finger on the virtual map can determine the sliding route of the virtual finger. After the sliding route of the virtual finger is determined, a desired moving path of the virtual moving object may be determined according to the sliding route.
Illustratively, the length and shape of the expected moving path of the virtual moving object can be determined according to the sliding route of the virtual finger. For example, the desired movement path coincides with the sliding path.
Fig. 6 shows an interface schematic diagram of a display of a virtual item provided in an exemplary embodiment of the present application, in a process of releasing the virtual item, a virtual map 611 corresponding to a virtual environment and a virtual finger 01 are displayed in a display interface 610. The player performs a sliding operation on the display interface 610 with respect to the virtual finger 01, thereby controlling the virtual finger 01 to slide on the virtual map 611; according to the sliding route of the virtual finger 01 on the virtual map 611, a desired moving path 612 of the virtual moving object can be determined and displayed in the display interface 610. The desired movement path 612 has the same length and shape as the sliding path of the virtual finger 01.
Optionally, a release control 613 is displayed in the display interface 610, and in response to a release operation of the release control 613, a virtual moving object (not shown in the figure) may be displayed in the display interface 610, where an actual moving path of the virtual moving object is determined according to the desired moving path 612.
Optionally, the maximum length of the desired movement path 612 of the virtual moving object may be limited. Referring to fig. 6, a progress control 614 is further displayed in the display interface 610, and the progress control 614 indicates the remaining drawable length of the desired movement path 612. The player may plan the desired movement path 612 with reference to the remaining drawable length shown on the progress control 614.
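The remaining drawable length shown by a progress control such as 614 could be computed as in this sketch; the function name and units are assumptions:

```python
# Hypothetical sketch: remaining drawable length = maximum path length
# minus the length already drawn on the virtual map.
def remaining_drawable_length(max_length, drawn_points):
    drawn = 0.0
    for (x0, y0), (x1, y1) in zip(drawn_points, drawn_points[1:]):
        drawn += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    return max(0.0, max_length - drawn)

print(remaining_drawable_length(20.0, [(0, 0), (3, 4)]))  # 15.0
```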
After the sliding route of the virtual finger is determined, the player may be unsatisfied with it. Optionally, to meet the player's need to redraw the sliding route, referring to fig. 6, a redraw control 615 is also displayed in the display interface 610. The method for releasing the virtual prop provided by the embodiment of the application further comprises the following step: in response to a triggering operation on the redraw control 615, the display of the desired movement path 612 on the virtual map 611 is cancelled.
Wherein the triggering operation on the redraw control 615 includes, but is not limited to, at least one of the following operations: single-click operation, double-click operation, touch operation, and slide operation. For example, if the player clicks the redraw control 615, then in response to this click operation, the desired movement path 612 is no longer displayed on the virtual map 611. Subsequently, the player may control the virtual finger to slide on the virtual map 611 again to acquire a new sliding route, thereby displaying a new desired movement path.
After the desired movement path 612 is displayed, the player may no longer wish to use the virtual prop, i.e., the virtual moving object. Optionally, referring to fig. 6, a cancel control 616 is also displayed in the display interface 610. The method for releasing the virtual prop provided by the embodiment of the application further comprises the following step: in response to a triggering operation on the cancel control 616, the display of the virtual map 611 is cancelled. The triggering operation on the cancel control 616 may refer to the triggering operation on the redraw control 615 and is not described in detail.
Step 505: in response to a release operation for the virtual moving object, the virtual moving object is displayed at an actual starting point in the virtual environment.
Illustratively, the position of the actual starting point is determined in accordance with the position of the desired starting point in the desired movement path.
Wherein the releasing operation for the virtual moving object includes but is not limited to at least one of the following operations: single click operation, double click operation, touch operation, moving operation, sliding operation and dragging operation.
Optionally, the position of the actual starting point in the virtual environment corresponds to the position of the desired starting point on the virtual map.
Step 506: and controlling the virtual moving object to move along the actual moving path.
According to the foregoing, the virtual moving object serves as a movable virtual prop, so its actual starting point, actual end point, moving distance, moving speed, and continuous moving time need to be considered. According to step 505, the actual starting point of the virtual moving object can be determined, and the virtual moving object then moves from the actual starting point along the actual moving path. At this point, the remaining numerical indicators affecting the movement of the virtual moving object still need to be considered.
Optionally, to determine the actual moving manner of the virtual moving object, step 506 may be implemented as follows:
and controlling the virtual moving object to move along the actual moving path at a target moving speed, wherein the target moving speed is determined according to the length of the expected moving path.
Optionally, the target moving speed is a ratio of the length of the desired moving path to the duration of the moving; alternatively, the target moving speed is a fixed value set in advance according to actual needs.
The continuous moving time may be a fixed value preset according to actual needs.
For example, the duration of movement is 10 seconds, and a unique ratio can be determined according to the length of the desired movement path and the duration of movement, and the ratio is determined as the target movement speed of the virtual moving object. After the virtual moving object is displayed at the actual starting point, the virtual moving object can be controlled to move along the actual moving path at the target moving speed.
Alternatively, the target moving speed is varied according to the duration of the moving time. Taking the example that the continuous moving time at least includes a first time and a second time, the first time is earlier than the second time, the virtual moving object moves at a first target moving speed in the first time, and moves at a second target moving speed in the second time, and the first target moving speed is faster than the second target moving speed.
For example, the target moving speed of the virtual moving object is fastest at the actual starting point and decreases at a uniform rate as the continuous moving time elapses, until it drops to 0, at which point the display of the virtual moving object is cancelled.
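The uniform deceleration to 0 just described can be sketched as a linear speed schedule; the function name and figures are illustrative assumptions only:

```python
# Hypothetical sketch: speed starts at its maximum at the actual starting
# point and decreases linearly with elapsed time; once it reaches 0, the
# display of the virtual moving object is cancelled.
def speed_at(initial_speed: float, total_time: float,
             elapsed: float) -> float:
    if elapsed >= total_time:
        return 0.0  # display of the virtual moving object is cancelled
    return initial_speed * (1.0 - elapsed / total_time)

print(speed_at(40.0, 10.0, 0.0))   # 40.0 (fastest at the actual starting point)
print(speed_at(40.0, 10.0, 5.0))   # 20.0
print(speed_at(40.0, 10.0, 10.0))  # 0.0
```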
Step 507: and canceling the display of the virtual moving object under the condition that the virtual moving object meets the display canceling condition.
The display canceling condition can be limited according to actual needs.
Optionally, the cancel display condition includes, but is not limited to, at least one of the following conditions: the continuous moving time of the virtual moving object reaches the preset time length, the virtual moving object reaches the actual end point, the moving energy consumption of the virtual moving object is finished, and the number of virtual characters influenced by the virtual moving object reaches the preset number.
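A check of these cancel-display conditions could look like the following sketch, where any one condition being met suffices; the parameter names are assumptions:

```python
# Hypothetical sketch: the display of the virtual moving object is
# cancelled when at least one of the listed conditions holds.
def should_cancel(moved_seconds, preset_seconds,
                  reached_end, energy, affected_count, preset_count):
    return (moved_seconds >= preset_seconds   # preset time length reached
            or reached_end                    # actual end point reached
            or energy <= 0                    # movement energy consumed
            or affected_count >= preset_count)  # enough characters affected

print(should_cancel(3, 10, False, 50, 1, 5))  # False: no condition met
print(should_cancel(3, 10, False, 0, 1, 5))   # True: energy exhausted
```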
The preset time length can be set according to actual needs, for example, the preset time length is a preset fixed value; or, the preset time length is determined according to the length of the expected moving path of the virtual moving object and the target moving speed.
Alternatively, the actual end point may be determined according to the desired end point of the desired movement path, or the actual end point may be determined according to the duration of movement of the virtual moving object and the desired movement path.
For example, the position of the actual end point in the virtual environment may be determined according to the position of the desired end point on the virtual map, in which case the position where the virtual moving object is cancelled corresponds to the position of the desired end point on the virtual map. For another example, if the actual end point is determined according to the continuous moving time of the virtual moving object and the desired moving path, the position where the virtual moving object is cancelled may correspond to the position of the desired end point on the virtual map, or to a position on the path before the desired end point is reached.
The virtual moving object has a certain energy value at the actual starting point; the energy value decays according to a preset strategy as the virtual moving object moves, and the movement energy of the virtual moving object is determined to be exhausted when the energy value decays to 0 or to a preset value.
The preset strategy can be set according to actual needs. Taking the virtual moving object as a virtual tornado as an example: the initial movement energy value of the virtual tornado is 100 and decreases at a uniform rate as the virtual tornado moves; in addition, each time the virtual tornado passes through a virtual fixed object or affects a virtual character, the energy value is further reduced by a fixed amount.
The preset number of the virtual roles influenced by the virtual moving object can be set according to actual needs.
For example, if the preset number is 5 and the virtual moving object affects 5 virtual characters during its movement, the display of the virtual moving object is cancelled after it affects the 5th virtual character. If the number of affected virtual characters has not reached 5, the virtual moving object continues along the actual moving path until it reaches the position corresponding to the desired end point of the desired moving path, and its display is cancelled at that position.
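The energy decay and the cancel-display conditions above can be sketched together as follows. The decay rate, the per-pass cost, and the function names are illustrative assumptions based on the virtual-tornado example:

```python
def decay_energy(energy: float, dt: float, rate: float = 1.0,
                 passes: int = 0, pass_cost: float = 10.0) -> float:
    """Uniform decay over time, plus a fixed extra cost each time the
    object passes through a fixed object or affects a character."""
    return max(0.0, energy - rate * dt - pass_cost * passes)

def should_cancel_display(elapsed: float, preset_duration: float,
                          reached_end: bool, energy: float,
                          affected: int, preset_count: int) -> bool:
    """Cancel the display when any one of the listed conditions holds:
    time is up, the actual end point is reached, energy is exhausted,
    or the affected-character count hits the preset number."""
    return (elapsed >= preset_duration or reached_end
            or energy <= 0 or affected >= preset_count)
```

Starting from the example value 100, moving for 10 seconds at a decay rate of 2 and passing through one object (cost 10) leaves 70 energy, so the tornado keeps moving.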
Optionally, during the actual movement of the virtual moving object, the virtual objects on the actual moving path differ, so the actual moving modes of the virtual moving object also differ. Several optional moving modes are given below:
the first moving mode is as follows: the virtual moving object may pass through the virtual fixed object.
During the movement of the virtual moving object, a virtual fixed object may exist on the actual moving path of the virtual moving object. Optionally, the method for releasing the virtual prop provided in the embodiment of the present application further includes: and controlling the virtual moving object to pass through the virtual fixed object under the condition that the virtual fixed object is displayed on the actual moving path.
When the virtual fixed object in the virtual environment is located on the actual moving path of the virtual moving object, the virtual moving object does not affect the virtual fixed object in the virtual environment. For example, there is a virtual house on the actual moving path, and when passing through the virtual house, the virtual moving object passes through the virtual house without causing damage to the virtual house.
To ensure that the virtual moving object does not affect virtual fixed objects, the virtual moving object may be generated in the three-dimensional model corresponding to the virtual environment. During its movement, virtual characters in the virtual environment have collision volumes while virtual fixed objects do not, and the action area of the virtual moving object intersects only the collision volumes of virtual characters, so only virtual characters are affected. The virtual moving object itself has no collision volume and no solid form in the virtual environment, so virtual characters cannot affect the virtual moving object in turn.
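The collision-volume filtering described above can be sketched in 2D as follows. The `Entity` class, the axis-aligned boxes, and the overlap test are illustrative stand-ins for the engine's actual collision system:

```python
class Entity:
    def __init__(self, name: str, aabb: tuple, has_collision: bool):
        self.name = name
        self.aabb = aabb                # (min_x, min_y, max_x, max_y)
        self.has_collision = has_collision

def overlaps(a: tuple, b: tuple) -> bool:
    """Axis-aligned box overlap test."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def affected_entities(tornado_area: tuple, entities: list) -> list:
    """Only entities that have a collision volume (virtual characters)
    can intersect the tornado's action area; fixed objects such as
    houses have none, so the tornado passes through them."""
    return [e for e in entities
            if e.has_collision and overlaps(tornado_area, e.aabb)]
```

A house overlapping the tornado's area is ignored because it has no collision volume; a character in the same area is returned as affected.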
And a second moving mode: the virtual moving object moves along the virtual ground.
Optionally, the method for releasing the virtual prop provided in the embodiment of the present application further includes: after the virtual moving object is contacted with the virtual ground on the actual moving path, the bottom of the virtual moving object is controlled to be continuously contacted with the virtual ground.
Taking the virtual moving object as a virtual tornado as an example: a real tornado usually moves close to the ground, so to make the movement of the virtual moving object more realistic, the virtual tornado is controlled to move close to the ground after it contacts the virtual ground in the virtual environment.
A third moving mode: the virtual moving object falls from a high place to a low place during the moving process.
During the movement of the virtual moving object, the heights of the virtual objects on the actual moving path are not all the same. For example, two adjacent houses on the actual moving path have different heights, and the house ahead in the moving direction of the virtual moving object is the lower one.
Optionally, consider the case where a first virtual fixed object and a second virtual fixed object are displayed adjacent to each other on the actual moving path, the virtual moving object is located on the first virtual fixed object, the second virtual fixed object is ahead in the moving direction of the virtual moving object, and the top of the second virtual fixed object is lower than the position of the virtual moving object on the first virtual fixed object. In this case, the method for releasing the virtual prop provided in the embodiment of the present application further includes: controlling the virtual moving object to move from the first virtual fixed object onto the top of the second virtual fixed object.
For example, there are a first virtual house and a second virtual house adjacent to each other on the actual moving path, the virtual moving object is located on the roof of the first virtual house, the roof of the second virtual house is located at a lower height than the roof of the first virtual house, and the virtual moving object, when moving forward, will fall from the roof of the first virtual house onto the roof of the second virtual house.
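The second and third moving modes together amount to the tornado's base tracking the surface beneath it and dropping whenever the next surface is lower. A minimal sketch, assuming surface heights sampled along the actual moving path (the sampling and the function name are illustrative):

```python
def simulate_descent(surface_heights: list) -> list:
    """Returns (height, fell) pairs along the path: the tornado's base
    always contacts the surface beneath it, and a fall is flagged
    whenever the next surface (a lower roof, or the ground) is lower
    than the previous one."""
    result = []
    prev = None
    for h in surface_heights:
        fell = prev is not None and h < prev
        result.append((h, fell))
        prev = h
    return result
```

For a path running along a 10-unit-high roof, onto a 6-unit roof, then down to the ground, the sketch reports a fall at each drop and ground-hugging movement afterwards.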
Optionally, the three moving modes may be implemented separately, may also be implemented in combination, and may also be implemented in combination with other embodiments in the present application.
For example, suppose a virtual building, a virtual ground, and a virtual wall are displayed in sequence along the advancing direction of the actual moving path, and the actual starting point of the virtual moving object is on the top floor of the virtual building. Taking the virtual moving object as a virtual tornado as an example: as the virtual tornado moves forward along the actual moving path, it falls from the top floor of the virtual building onto the virtual ground and continues forward along the ground; when it reaches the virtual wall, it passes through the wall and continues along the ground, and the wall is not damaged by the virtual tornado.
In summary, according to the method for releasing the virtual prop provided by the embodiment of the present application, by controlling the random sliding of the virtual finger on the virtual map, the desired moving path of the virtual moving object can be determined, so that the length and the shape of the desired moving path have randomness and unpredictability.
In addition, the embodiment of the application also provides an actual moving mode of the virtual moving object in the virtual environment. Optionally, since different virtual objects may exist on the actual moving path, the embodiment of the present application further provides three different moving modes of the virtual moving object.
According to the foregoing, the virtual moving object affects the attribute value corresponding to the virtual character in the virtual environment. On the basis of fig. 5, schematically shown in fig. 7, an embodiment of the present application provides a method for releasing a virtual prop, where the method further includes the following steps:
step 404: and displaying the influence information of the virtual moving object on the virtual character.
In light of the foregoing, the affected target includes, but is not limited to, at least one of: the virtual character itself and the associated virtual objects corresponding to the virtual character. Illustratively, the influence information is used to indicate a decrease in the attribute values of the virtual character and/or its associated virtual objects.
Optionally, the attribute value includes, but is not limited to, at least one of: a speed attribute value, a state attribute value. The speed attribute value is used for indicating the current speed of the virtual character, and the state attribute value is used for indicating the current state of the virtual character; the state attribute value includes at least one of: blood volume value, physical strength value, endurance value, attack value, defense value and hit rate.
The virtual moving object reduces the attribute values corresponding to the virtual character. The reduction may be at least one of: a one-time reduction, a continuous reduction, a uniform reduction, an accelerating reduction, a decelerating reduction, a random reduction by a certain value or proportion, or a fixed reduction by a certain value or proportion.
The influence information is different according to the attribute value. For example, if the attribute value includes a speed attribute value, the influence information includes deceleration information indicating a decrease in the speed attribute value of the virtual character; as another example, the attribute value includes a state attribute value, and the impact information includes deduction information for indicating a decrease in the state attribute value of the virtual character.
Optionally, in the case that the influence information includes deceleration information, step 404 may be implemented as follows:
in the case where the virtual character is located in the periphery of the virtual moving object, deceleration information indicating a decrease in the velocity attribute value of the virtual character is displayed.
The periphery of the virtual moving object covers the following two cases: the virtual character is within the coverage area of the virtual moving object, or the virtual character is outside the coverage area but its distance from the virtual moving object does not exceed a preset value.
According to the difference of the relative position of the virtual character and the virtual moving object, the step of displaying the deceleration information under the condition that the virtual character is positioned at the periphery of the virtual moving object can be realized as follows:
displaying first deceleration information under the condition that the virtual character is located in the coverage range of the virtual moving object, wherein the first deceleration information is used for indicating that the speed attribute value is reduced by a first reduction value;
or displaying second deceleration information under the condition that the virtual character is located outside the coverage range and the distance between the virtual character and the virtual moving object does not exceed a preset value, wherein the second deceleration information is used for indicating that the speed attribute value is reduced by a second descending value;
the first drop value is larger than the second drop value, and the preset value can be set according to actual needs.
For example, at a certain point in time, the periphery of the virtual moving object displays a first virtual character and a virtual prop used by the first virtual character on the actual moving path of the virtual moving object, the first virtual character is located within the coverage area of the virtual moving object, and the virtual prop is located outside the coverage area of the virtual moving object and has a distance from the virtual moving object not exceeding a preset value.
The virtual moving object has different influences on the speed attribute values of the first virtual character and the virtual prop at the time point. For example, the speed attribute value of the first virtual character decreases by 20, and the speed attribute value of the virtual prop decreases by 10; or the speed attribute value of the first virtual character is reduced by 10%, and the speed attribute value of the virtual prop is reduced by 5%.
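The two-tier deceleration rule above can be sketched as follows. The default drop values of 20 and 10 come from the example in the text; the function name and tier logic are illustrative:

```python
def deceleration_value(in_coverage: bool, distance: float,
                       preset_distance: float,
                       first_drop: float = 20.0,
                       second_drop: float = 10.0) -> float:
    """First (larger) drop value inside the coverage area; second
    (smaller) drop value outside it but within the preset distance;
    no deceleration beyond that."""
    if in_coverage:
        return first_drop
    if distance <= preset_distance:
        return second_drop
    return 0.0
```

A character inside coverage loses 20 speed; a prop just outside, within the preset distance, loses 10; anything farther is unaffected.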
Optionally, in the case that the influence information includes deduction information, step 404 may be implemented as follows:
and displaying deduction information under the condition that the virtual character is positioned at the periphery of the virtual moving object, wherein the deduction information is used for indicating deduction of the state attribute value of the virtual character.
For the description of the periphery of the virtual moving object, reference is made to the foregoing; details are not repeated here.
According to the difference of the relative positions of the virtual character and the virtual moving object, the deduction information is displayed under the condition that the virtual character is positioned at the periphery of the virtual moving object, and the deduction information can be realized as follows:
displaying deduction information under the condition that the virtual character is located in the coverage range of the virtual moving object;
or, when the distance between the virtual character and the virtual moving object does not exceed a preset value, deduction information is displayed, and the deduction amount of the state attribute value is determined according to the distance between the virtual character and the virtual moving object.
For example, at a certain point in time, a first virtual character and a second virtual character are displayed around the virtual moving object, the first virtual character is located within the coverage area of the virtual moving object, and the second virtual character is located outside the coverage area of the virtual moving object and is not more than a preset distance from the virtual moving object.
The virtual moving object affects the state attribute values of the first virtual character and the second virtual character differently at this point in time. Taking the case that the state attribute value comprises a blood volume value as an example, the blood volume value of the first virtual character is in a uniform descending state, and the blood volume value of the second virtual character is unchanged; or the blood volume values of the first virtual character and the second virtual character are both decreased, and the blood volume value of the first virtual character is decreased faster than that of the second virtual character because the first virtual character is closer to the virtual moving object.
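The distance-dependent deduction above can be sketched as follows. The linear falloff outside the coverage area and the numeric values are illustrative assumptions; the text only requires that the deduction depend on distance:

```python
def hp_drain_per_second(distance: float, coverage_radius: float,
                        preset_distance: float,
                        max_drain: float = 10.0) -> float:
    """Inside coverage: full uniform drain of the blood volume value.
    Outside coverage but within the preset distance: drain scales down
    with distance. Beyond the preset distance: no effect."""
    if distance <= coverage_radius:
        return max_drain
    if distance <= preset_distance:
        span = preset_distance - coverage_radius
        return max_drain * (preset_distance - distance) / span
    return 0.0
```

A first character inside coverage drains at the full rate, while a second character farther out (but within the preset distance) drains more slowly, matching the example where the closer character's blood volume falls faster.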
Optionally, to determine specific values of the attribute values in the influence information, step 404 may be implemented as follows:
determining a reduction value according to the relative position relationship between the virtual character and the virtual moving object, wherein the reduction value is a reduction numerical value of the attribute value of the virtual character and/or the associated virtual object;
and determining and displaying the influence information according to the reduction value.
The relative position relationship between the virtual character and the virtual moving object can refer to the above contents, and is not described again; the reduction value may be a numerical value or a ratio, and is not limited herein.
In addition, in order to enable the management account corresponding to the virtual character to quickly acquire the influence information, the method for releasing the virtual prop provided by the embodiment of the application further includes: and displaying the influence special effects corresponding to the influence information, wherein the influence special effects comprise at least one of sound special effects and animation special effects.
The sound special effect and the animation special effect can be set according to actual needs. Taking a shooting game as an example: if a player controls a skill of a first virtual character to release a virtual moving object so that the blood volume value of a second virtual character continuously decreases, an influence special effect indicating the influence of the virtual moving object on the blood volume value of the second virtual character may be displayed at the sight corresponding to the first virtual character. For example, the influence special effect includes an animation special effect in which a sliding value appears around the sight.
To sum up, in the method for releasing the virtual prop provided in the embodiment of the present application, influence information of the virtual moving object on the virtual character may also be displayed, so as to indicate a decrease in the attribute value of the virtual character and/or the associated virtual object of the virtual character. Optionally, according to different influence information, two implementation manners of the influence information are further provided in the embodiment of the present application.
In addition, the display effect of the virtual moving object during its movement differs for virtual characters controlled by different players. The display of the virtual moving object is described below, taking a virtual moving object released by a first virtual character as an example.
Optionally, the method for releasing the virtual prop provided in the embodiment of the present application further includes: and in the case that an obstacle exists between the first virtual character and the virtual moving object, displaying the perspective effect of the virtual moving object on the obstacle.
Wherein the virtual moving object is released by the first virtual character.
Optionally, the virtual moving object may be released by other virtual characters belonging to the same battle as the first virtual character. For example, two camps exist in the virtual environment, the first virtual character and the second virtual character belong to different camps, the third virtual character and the first virtual character belong to one camp, and if the third virtual character releases the virtual moving object, the perspective effect of the virtual moving object can be displayed on the barrier in both the display interface of the first virtual character and the display interface of the second virtual character.
For the server, whether the terminal is granted the authority to display the perspective effect is determined according to the corresponding situation of the first virtual character and the virtual moving object. In the case where there is an obstacle between the first virtual character and the virtual moving object, the server may perform the following operations:
under the condition that the first virtual character corresponds to the virtual moving object, granting permission to a management account corresponding to the first virtual character, wherein the permission is used for displaying the position of the virtual moving object in the virtual environment; or, in the case that the first virtual character does not correspond to the virtual moving object, canceling the authority to the management account.
The first virtual character corresponds to the virtual moving object, that is, the virtual moving object is released by the first virtual character, or the virtual moving object is released by other virtual characters in a same formation with the first virtual character.
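The server-side permission decision described above can be sketched as follows. The camp comparison as the test for "corresponds to the virtual moving object" follows the definition in the text; the function name and boolean inputs are illustrative:

```python
def grant_perspective(viewer_camp: str, releaser_camp: str,
                      obstacle_between: bool) -> bool:
    """The server grants the see-through permission only when an
    obstacle hides the virtual moving object from the viewer AND the
    viewer's camp matches the releaser's camp (i.e. the viewer
    'corresponds to' the moving object); otherwise it cancels it."""
    return obstacle_between and viewer_camp == releaser_camp
```

A teammate behind a wall keeps seeing the tornado's position; an enemy behind the same wall does not; with no obstacle, no special permission is needed.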
Optionally, the number of currently released virtual moving objects does not exceed the number of camps existing in the virtual environment. For example, two camps exist in the virtual environment, and the number of the virtual moving objects in the virtual environment does not exceed two at the same time. And under the condition that the number of the virtual moving objects in the virtual environment exceeds the number of the campuses, displaying a reminding control for reminding that the virtual moving objects cannot be used at the current moment.
Consider the case where the virtual moving object is released by a first virtual character and a second virtual character belongs to a different camp from the first virtual character. During the movement of the virtual moving object, if an obstacle such as a virtual wall exists between the position of the first virtual character and the current position of the virtual moving object, the server grants a permission to the management account corresponding to the first virtual character, and that management account displays the position of the virtual moving object in the virtual environment in its display interface according to the granted permission. If an obstacle exists between the position of the second virtual character and the current position of the virtual moving object, the server revokes the permission from the management account corresponding to the second virtual character, and that management account cancels the display of the virtual moving object in the virtual environment.
That is, if the virtual moving object is released by the first virtual character, the virtual moving object will be continuously displayed in the display interface of the management account corresponding to the first virtual character. When a barrier for blocking the sight line exists between the virtual moving object and the first virtual character, the position of the virtual moving object can still be displayed in the display interface of the management account corresponding to the first virtual character.
The perspective effect of the virtual moving object displayed on an obstacle can be seen in fig. 8 and 9. In fig. 8, a virtual character 811 is displayed on the display interface 810 of the management account corresponding to the first virtual character; the virtual character 811 is partially hidden by the virtual wall 812, and the gray area in the figure is the perspective effect of the hidden part on the virtual wall 812. The perspective principle is similar in fig. 9: a plurality of virtual enemies 912 are displayed in the display interface 910 of the management account corresponding to a first virtual character 911, and the virtual enemies 912 are blocked by different virtual walls. A ray detection is emitted along the line connecting the first virtual character 911 and a virtual enemy 912, for example with the position of the first virtual character 911 as the starting point, the direction of the muzzle as the ray direction, and the distance between the first virtual character 911 and the virtual enemy 912 as the ray length. If an obstacle is detected between the first virtual character 911 and the virtual enemy 912, the virtual enemy 912 is considered to be blocked by the obstacle. If the first virtual character 911 has perspective capability, the positions of the virtual enemies 912 can still be displayed in the display interface 910 of the corresponding management account. The plurality of virtual enemies 912 are displayed on the obstacle in a manner similar to the way the virtual character 811 is displayed on the virtual wall 812 in fig. 8.
Similarly, the perspective effect of the virtual moving object displayed on the obstacle may refer to the display of the virtual character 811, and if the virtual moving object is also completely or partially occluded by the virtual wall 812, the occluded part of the virtual moving object still displays the perspective effect on the obstacle in the display interface 810, and the perspective effect may be identified by using different colors or other means.
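The ray detection described for fig. 9 can be sketched in 2D with a segment-versus-box (slab) test. The geometry helpers are illustrative stand-ins for the engine's raycast; the logic — occluded if any obstacle lies on the segment between viewer and target — follows the text:

```python
def segment_hits_aabb(p0: tuple, p1: tuple, box: tuple) -> bool:
    """Slab test: does the segment from p0 to p1 cross the box
    (min_x, min_y, max_x, max_y)?"""
    (x0, y0), (x1, y1) = p0, p1
    bx0, by0, bx1, by1 = box
    dx, dy = x1 - x0, y1 - y0
    tmin, tmax = 0.0, 1.0
    for start, delta, lo, hi in ((x0, dx, bx0, bx1), (y0, dy, by0, by1)):
        if abs(delta) < 1e-12:
            if start < lo or start > hi:   # parallel and outside the slab
                return False
        else:
            t0, t1 = (lo - start) / delta, (hi - start) / delta
            if t0 > t1:
                t0, t1 = t1, t0
            tmin, tmax = max(tmin, t0), min(tmax, t1)
            if tmin > tmax:                # slabs do not overlap
                return False
    return True

def is_occluded(viewer: tuple, target: tuple, obstacles: list) -> bool:
    """Ray detection from the viewer toward the target: the target is
    considered blocked if any obstacle box lies on the segment."""
    return any(segment_hits_aabb(viewer, target, box) for box in obstacles)
```

A wall straddling the line of sight makes the target occluded (so the perspective effect is drawn on it); a wall off to the side does not.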
Optionally, according to the relative position relationship between the first virtual character and the virtual moving object, the method for releasing the virtual prop provided in the embodiment of the present application further includes:
canceling the display of the virtual environment within the coverage range of the virtual moving object in the case where the first virtual character is located outside the coverage range of the virtual moving object; or, in the case where the first virtual character is within the coverage of the virtual moving object, canceling the display of the virtual environment outside the coverage of the virtual moving object.
The foregoing contents may be referred to for the description of the coverage of the virtual moving object, and are not repeated herein; the first virtual character may or may not correspond to the virtual moving object.
For the server, whether the terminal is granted the authority to display part of the virtual environment is determined according to the relative position relationship between the first virtual character and the virtual moving object. According to different relative position relations, the server can execute the following operations:
under the condition that the first virtual character is located outside the coverage range of the virtual moving object, a first authority is granted to a management account corresponding to the first virtual character, and the first authority is used for canceling and displaying a virtual environment within the coverage range of the virtual moving object;
and under the condition that the first virtual character is located within the coverage range of the virtual moving object, a second authority is granted to the management account corresponding to the first virtual character, where the second authority is used for canceling the display of the virtual environment outside the coverage range of the virtual moving object.
According to whether the first virtual character is within the coverage range of the virtual moving object, the virtual environment displayed in the display interface of the corresponding management account differs. For example, if the first virtual character is located outside the coverage range, the virtual environment within the coverage range is not displayed in the display interface of the corresponding management account; if the first virtual character is located within the coverage range, only the virtual environment within the coverage range is displayed in the display interface of the corresponding management account.
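The visibility rule above reduces to: a point of the virtual environment is rendered for a viewer only when the viewer and the point are on the same side of the coverage boundary. A minimal sketch (the function name and boolean inputs are illustrative):

```python
def is_point_visible(viewer_in_coverage: bool,
                     point_in_coverage: bool) -> bool:
    """Outside viewers do not see into the coverage area; a character
    inside the coverage area sees only the environment inside it."""
    return viewer_in_coverage == point_in_coverage
```

So a character outside the tornado cannot see what happens inside it, and vice versa.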
In addition, in order to enable the management account corresponding to the first virtual character to acquire the moving position of the virtual moving object, a minimap control of the virtual map is also displayed in the display interface, and the current position of the virtual moving object can be displayed on the minimap control. Optionally, the method for releasing the virtual prop provided in the embodiment of the present application further includes:
and displaying a virtual moving object icon in the small map control, wherein the position of the virtual moving object icon changes along with the position of the virtual moving object when the virtual moving object moves along the actual moving path.
Referring to fig. 10, the virtual moving object icon 02 is displayed in the minimap control 1010, and the minimap control 1010 is displayed in a display interface, which is not shown in the drawing. The virtual moving object may or may not correspond to a virtual character. The display state of the virtual moving object icon 02 in the minimap control 1010 is different according to the correspondence between the virtual moving object and the virtual character.
Taking the example that the virtual moving object corresponds to the first virtual character, and the second virtual character and the first virtual character belong to different avatars, in the display interface of the management account corresponding to the first virtual character, the virtual moving object icon can be displayed in a first display state; in the display interface of the management account corresponding to the second virtual character, the virtual moving object icon may be displayed in a second display state. The first display state and the second display state are different, and the display states can be distinguished through the shapes, colors, sizes and other modes of the icons. For example, the first display state is a blue circle and the second display state is a red circle.
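The minimap icon behavior above involves two pieces: mapping the tornado's world position to minimap coordinates each frame, and choosing a display state by camp. Both sketches are illustrative; the blue/red colors come from the example in the text, while the linear coordinate mapping is an assumption:

```python
def minimap_position(world_pos: tuple, world_size: tuple,
                     minimap_size: tuple) -> tuple:
    """Linear mapping from world coordinates to minimap pixels; the
    icon is re-mapped as the virtual moving object moves along the
    actual moving path."""
    wx, wy = world_pos
    ww, wh = world_size
    mw, mh = minimap_size
    return (wx / ww * mw, wy / wh * mh)

def icon_display_state(viewer_camp: str, releaser_camp: str) -> str:
    """First display state for the releaser's own camp, second display
    state for other camps (colors per the example in the text)."""
    return "blue" if viewer_camp == releaser_camp else "red"
```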
As shown schematically in fig. 11, an embodiment of the present application provides a method for releasing a virtual prop, which is used to display a virtual moving object in a display interface. The method for releasing the virtual prop is applied to a terminal, and takes the example that a virtual moving object is a virtual tornado, and comprises the following steps:
step 1101: and equipping the virtual prop.
Illustratively, a virtual tornado is one of the virtual props.
Step 1102: it is determined whether to use a virtual tornado.
The determination of using the virtual tornado may be based on whether a trigger operation to use the virtual tornado is received. For example, a trigger control using the virtual tornado is displayed on the display interface, and if a trigger operation on the trigger control is received, the virtual tornado is determined to be used.
Step 1103: in the case of using a virtual tornado, a virtual notebook is displayed.
Illustratively, a virtual notebook is displayed in the display interface.
Optionally, a virtual map corresponding to the virtual environment is displayed on the virtual notebook.
Step 1104: and judging whether the virtual finger is pressed down.
Illustratively, the virtual finger is displayed in the display interface, and the player can control the sliding of the virtual finger in the display interface through a touch screen, a button, a game pad, a mouse and the like. The judgment of whether the virtual finger is pressed can be based on whether the sliding operation on the display interface is received. For example, the player can receive a touch point on the display interface by touching the touch screen, and accordingly can determine that the virtual finger has been pressed.
Illustratively, in the case of a virtual finger press, step 1105 is performed; if the virtual finger is not pressed, the process returns to step 1103.
Step 1105: the desired start point of the desired movement path is recorded.
In the case where the virtual finger is pressed, the server can acquire a touch point of the virtual finger on the virtual map and determine the touch point as a desired starting point of a desired movement path.
For the description of the expected moving path, reference is made to the foregoing description and the description is omitted.
Step 1106: determine whether the virtual finger is released.
Whether the virtual finger is released is determined according to the received sliding operation on the display interface. For example, the player performs a sliding operation on the display interface through the touch screen, thereby controlling the movement of the virtual finger on the virtual map. When the player subsequently cancels the sliding operation at a certain point on the touch screen, the server determines from the cancellation that the virtual finger has been released.
Illustratively, in case the virtual finger is released, step 1107 is executed; in the case where the virtual finger is not released, return is made to step 1105.
Step 1107: the desired end point of the desired movement path is recorded.
In the case where the virtual finger is released, the server can acquire the position point at which the virtual finger stops sliding on the virtual map and determine that position point as the desired end point of the desired movement path.
Step 1108: a desired movement path is determined.
According to the above steps, the server can determine the sliding route of the virtual finger on the display interface and, from that sliding route, determine the desired movement path.
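Illustratively, steps 1104 to 1108 can be sketched as a small recorder that accumulates touch points between press and release. The `PathRecorder` class and all names below are hypothetical illustrations, not part of the disclosed implementation:

```python
class PathRecorder:
    """Records a desired movement path from press/drag/release touch events
    on the virtual map (a sketch of steps 1104-1108; names are illustrative)."""

    def __init__(self):
        self.points = []      # sampled touch points on the virtual map
        self.pressed = False

    def on_press(self, point):
        # Step 1105: record the desired start point of the desired movement path.
        self.pressed = True
        self.points = [point]

    def on_drag(self, point):
        # Sample intermediate points of the sliding route.
        if self.pressed:
            self.points.append(point)

    def on_release(self, point):
        # Step 1107: record the desired end point.
        if self.pressed:
            self.points.append(point)
            self.pressed = False
        # Step 1108: the accumulated points form the desired movement path.
        return self.points

rec = PathRecorder()
rec.on_press((10, 10))
rec.on_drag((20, 15))
path = rec.on_release((30, 30))
```

In this sketch the desired movement path is simply the ordered list of sampled touch points; an actual implementation might smooth or resample the route.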
Optionally, a progress control is further provided on the display interface, and the progress control is used for indicating the remaining drawable length of the desired movement path. The player can plan the expected moving path by referring to the remaining drawing length on the progress control.
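A minimal sketch of such a progress control, assuming a hypothetical fixed cap `MAX_DRAW_LENGTH` on the total drawable length:

```python
import math

MAX_DRAW_LENGTH = 100.0  # hypothetical cap on the drawable path length

def remaining_length(points, max_length=MAX_DRAW_LENGTH):
    """Length still available on the progress control after the player has
    drawn the given touch points (a sketch; the cap value is illustrative)."""
    drawn = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    return max(0.0, max_length - drawn)
```

The player-facing progress control would simply render this remaining length, e.g. as a shrinking bar.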
Optionally, a redrawing control is further displayed on the display interface. If the player wishes to re-plan the desired movement path, the display of the desired movement path on the virtual map can be cancelled through a trigger operation on the redrawing control. The player can then control the virtual finger to slide on the display interface again to obtain a new sliding route and thereby determine a new desired movement path.
Optionally, a cancel control is also displayed in the display interface. If the player does not want to use the virtual item of the virtual tornado, the display of the virtual map can be cancelled through the triggering operation on the cancelling control.
Step 1109: in response to a release operation for the virtual tornado, the virtual tornado is displayed at an actual starting point in the virtual environment on the display interface.
For the release operation of the virtual tornado, the foregoing contents may be referred to, and are not described again.
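Illustratively, the position of the actual starting point can be derived from the desired starting point drawn on the virtual map by a coordinate mapping. The linear mapping and all names below are assumptions for illustration only:

```python
def map_to_world(map_point, map_origin, world_origin, scale):
    """Convert a point on the virtual map into a position in the virtual
    environment (a sketch; the linear mapping is an assumption)."""
    mx, my = map_point
    ox, oy = map_origin
    wx, wy = world_origin
    return (wx + (mx - ox) * scale, wy + (my - oy) * scale)

# The actual starting point follows from the desired starting point:
actual_start = map_to_world((12, 8), map_origin=(0, 0),
                            world_origin=(1000, 2000), scale=10.0)
```

The same mapping applied to every point of the desired movement path would yield the actual moving path in the virtual environment.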
Step 1110: it is determined whether the virtual character is located in the periphery of the virtual tornado.
Illustratively, the virtual character includes a virtual character and/or an associated virtual object corresponding to the virtual character. Being located at the periphery of the virtual tornado covers the following two cases: the virtual character is within the coverage range of the virtual tornado, or the virtual character is outside the coverage range and its distance from the virtual tornado does not exceed a preset value.
The foregoing contents may be referred to for the description of the virtual character, the associated virtual object, and the coverage area, and are not repeated.
Step 1111: when the virtual character is not located in the periphery of the virtual tornado, the virtual tornado is moved along the actual movement path.
The actual moving path is determined according to the expected moving path, and the specific determination manner of the actual moving path may refer to the foregoing contents, which are not described in detail.
Illustratively, after step 1111 is executed, step 1110 is executed again to determine whether the virtual character is located at the periphery of the virtual tornado.
Step 1112: when the virtual character is located at the periphery of the virtual tornado, calculate a reduction value of the attribute value corresponding to the virtual character according to the relative position relationship between the virtual character and the virtual tornado.
Illustratively, the attribute values corresponding to the virtual character include, but are not limited to, a speed attribute value and/or a state attribute value; the relative position relationship between the virtual character and the virtual tornado can be represented by whether the virtual character is located in the influence range of the virtual tornado or the distance between the virtual character and the virtual tornado. The attribute value, the influence range of the virtual tornado, and the distance between the virtual character and the tornado can refer to the above contents, and are not described again.
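A sketch of such a reduction-value calculation, assuming the two-tier periphery described in step 1110; the concrete drop values are illustrative, not taken from the disclosure:

```python
def reduction_value(distance, coverage_radius, preset_distance,
                    inner_drop=30, outer_drop=10):
    """Reduction of the character's attribute value as a function of its
    relative position to the tornado (drop values are illustrative)."""
    if distance <= coverage_radius:   # within the coverage range
        return inner_drop
    if distance <= preset_distance:   # outside the coverage, but nearby
        return outer_drop
    return 0                          # not at the periphery: unaffected
```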
Step 1113: determine whether the virtual tornado has moved to the actual end point.
Illustratively, if the virtual tornado has moved to the actual end point, step 1114 is executed; if the virtual tornado has not moved to the actual end point, step 1111 is executed so that the virtual tornado continues to move along the actual moving path.
Step 1114: in the case where the virtual tornado has moved to the actual end point, the display of the virtual tornado is cancelled.
The actual endpoint can be determined by referring to the above contents, and details are not repeated.
Optionally, the display of the virtual tornado can also be cancelled when another cancel-display condition is met. The foregoing may be referred to for the explanation of the cancel-display conditions, and details are not repeated.
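The loop formed by steps 1110 to 1114 can be sketched as follows; the `Character` class, the periphery radius, and the drop value are assumptions for illustration:

```python
from dataclasses import dataclass
import math

@dataclass
class Character:
    x: float
    y: float
    hp: int = 100          # state attribute value (illustrative)

PERIPHERY = 5.0            # hypothetical periphery radius

def step_tornado(path, characters, drop=10):
    # Step 1111: move the tornado point by point along the actual moving path.
    for pos in path:
        for c in characters:
            # Step 1110: check whether the character is at the periphery.
            if math.dist((c.x, c.y), pos) <= PERIPHERY:
                # Step 1112: reduce the character's attribute value.
                c.hp -= drop
    # Steps 1113-1114: the path is exhausted, so cancel the display.
    return "display_cancelled"

hero = Character(0.0, 0.0)
state = step_tornado([(0.0, 3.0), (0.0, 20.0)], [hero])
```

Here the character loses attribute value only while the tornado passes close by, and the display is cancelled once the actual end point is reached.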
To sum up, in the method for releasing the virtual prop provided in the embodiment of the present application, by controlling the sliding of the virtual finger on the virtual map, the server can obtain the desired movement path; the virtual tornado is then displayed in the virtual environment on the display interface, and the actual moving path of the virtual tornado is determined according to the desired movement path. Since the desired movement path is determined by the player's free sliding of the virtual finger, the desired movement path is unpredictable, so that the actual moving path of the virtual tornado is also unpredictable.
The following are embodiments of the apparatus of the present application, and for details that are not described in detail in the embodiments of the apparatus, reference may be made to corresponding descriptions in the above method embodiments, and details are not described herein again.
Fig. 12 is a schematic diagram illustrating a device for releasing a virtual prop, provided in this embodiment, the device including:
the display module 1220 is configured to display a virtual map corresponding to the virtual environment in the display interface in the process of releasing the virtual item;
a response module 1240, configured to display a desired movement path of a virtual moving object on the virtual map in response to the movement operation with respect to the virtual map, where the virtual moving object is used to affect an attribute value corresponding to a virtual character in the virtual environment;
the response module 1240 is further configured to display, in response to the release operation for the virtual moving object, the virtual moving object located in the virtual environment in the display interface, where an actual moving path of the virtual moving object is determined according to the desired moving path.
Optionally, the response module 1240 is configured to display the virtual moving object at an actual starting point in the virtual environment in response to the release operation for the virtual moving object, where the position of the actual starting point is determined according to the position of the desired starting point in the desired moving path; controlling the virtual moving object to move along the actual moving path; and canceling the display of the virtual moving object under the condition that the virtual moving object meets the display canceling condition.
Optionally, the response module 1240 is configured to control the virtual moving object to move along the actual moving path at the target moving speed, where the target moving speed is determined according to the length of the desired moving path.
Optionally, the target moving speed is the ratio of the length of the desired moving path to the movement duration.
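A minimal sketch of this ratio, assuming the movement duration is a fixed positive constant so that paths of any drawn length finish in the same time:

```python
def target_speed(path_length, duration):
    """Target moving speed = desired path length / movement duration
    (a sketch; units and the fixed-duration assumption are illustrative)."""
    if duration <= 0:
        raise ValueError("duration must be positive")
    return path_length / duration
```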
Optionally, the display module 1220 is further configured to control the virtual moving object to pass through the virtual fixed object when the virtual fixed object is displayed on the actual moving path.
Optionally, the display module 1220 is further configured to control the bottom of the virtual moving object to continuously maintain contact with the virtual ground after the virtual moving object contacts the virtual ground on the actual moving path.
Optionally, a first virtual fixed object and a second virtual fixed object which are adjacent to each other are displayed on the actual moving path, the virtual moving object is located on the first virtual fixed object, the second virtual fixed object is located in front of the moving direction of the virtual moving object, and the height of the top of the second virtual fixed object is lower than the height of the virtual moving object on the first virtual fixed object; the display module 1220 is further configured to control the virtual moving object to move from the first virtual fixed object to the top of the second virtual fixed object.
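The two optional behaviours above (keeping the bottom of the moving object in contact with the surface underneath, and descending onto the lower top of an adjacent second fixed object) can be sketched with a hypothetical height query; all names are illustrative:

```python
def follow_terrain(path_xy, surface_height):
    """Attach a height to each point of the actual moving path so that the
    bottom of the moving object stays in contact with whatever is below it
    (a sketch; `surface_height` is a hypothetical height-query function)."""
    return [(x, y, surface_height(x, y)) for x, y in path_xy]

# e.g. moving from a tall first fixed object onto a lower adjacent one:
heights = {(0, 0): 8.0, (1, 0): 3.0}
path = follow_terrain([(0, 0), (1, 0)], lambda x, y: heights[(x, y)])
```

The height drop between consecutive points models the moving object descending from the first virtual fixed object to the top of the lower second one.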
Optionally, the virtual moving object is released by the first virtual character, and the display module 1220 is further configured to display a perspective effect of the virtual moving object on the obstacle in a case where the obstacle exists between the first virtual character and the virtual moving object.
Optionally, the virtual moving object is released by the first virtual character, and the display module 1220 is further configured to cancel displaying the virtual environment within the coverage area of the virtual moving object when the first virtual character is located outside the coverage area of the virtual moving object; in the case where the first virtual character is located within the coverage of the virtual moving object, the virtual environment outside the coverage of the virtual moving object is canceled from being displayed.
Optionally, the virtual roles include: the virtual character and/or the associated virtual object corresponding to the virtual character, and the display module 1220, are further configured to display influence information of the virtual moving object on the virtual character, where the influence information is used to indicate a decrease in the attribute value of the virtual character and/or the associated virtual object.
Optionally, the influence information includes deceleration information, and the display module 1220 is configured to display the deceleration information in a case where the virtual character is located in the periphery of the virtual moving object, where the deceleration information is used to indicate a decrease in the speed attribute value of the virtual character.
Optionally, the display module 1220 is configured to display first deceleration information when the virtual character is located within the coverage area of the virtual moving object, where the first deceleration information is used to indicate that the speed attribute value decreases by a first drop value; or displaying second deceleration information under the condition that the virtual character is located outside the coverage range and the distance between the virtual character and the virtual moving object does not exceed a preset value, wherein the second deceleration information is used for indicating that the speed attribute value is reduced by a second descending value; wherein the first drop value is greater than the second drop value.
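A sketch of this two-tier deceleration display; the concrete drop values are illustrative, and only their ordering (the first drop value greater than the second) follows the description above:

```python
def deceleration_info(distance, coverage_radius, preset_distance,
                      first_drop=50, second_drop=20):
    """Select which deceleration information to display for a character
    near the virtual moving object (drop values are illustrative)."""
    if distance <= coverage_radius:     # within the coverage range
        return ("first", first_drop)
    if distance <= preset_distance:     # outside coverage, within preset value
        return ("second", second_drop)
    return None                         # no deceleration information shown
```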
Optionally, the influence information includes deduction information, and the display module 1220 is configured to display the deduction information when the virtual character is located around the virtual moving object, where the deduction information is used to indicate deduction of the state attribute value of the virtual character.
Optionally, the display module 1220 is configured to display deduction information when the virtual character is located within the coverage of the virtual moving object; or, when the distance between the virtual character and the virtual moving object does not exceed a preset value, deduction information is displayed, and the deduction amount of the state attribute value is determined according to the distance between the virtual character and the virtual moving object.
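A sketch of a distance-dependent deduction; the linear falloff is an assumption, since the disclosure only states that the deduction amount is determined according to the distance between the virtual character and the virtual moving object:

```python
def deduction_amount(distance, preset_distance, max_deduction=30):
    """State-attribute deduction that shrinks with distance to the moving
    object: closer characters lose more (linear falloff is an assumption)."""
    if distance > preset_distance:
        return 0
    return round(max_deduction * (1 - distance / preset_distance))
```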
Optionally, the display module 1220 is configured to determine a reduction value according to the relative position relationship between the virtual character and the virtual moving object, where the reduction value is a reduction value of the attribute value of the virtual character and/or the associated virtual object; and determining and displaying the influence information according to the reduction value.
Optionally, a response module 1240 for displaying the virtual finger; controlling the virtual finger to slide on the virtual map in response to a slide operation on the virtual finger; and determining and displaying a desired moving path according to the sliding route of the virtual finger.
Fig. 13 shows a block diagram of a terminal 1300 according to an exemplary embodiment of the present application. The terminal 1300 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a notebook computer, or a desktop computer. Terminal 1300 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, desktop terminal, etc.
In general, terminal 1300 includes: a processor 1301 and a memory 1302.
Processor 1301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and the like. The processor 1301 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 1301 may also include a main processor and a coprocessor, where the main processor is a processor for Processing data in an awake state, and is also referred to as a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 1301 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing content that the display screen needs to display. In some embodiments, processor 1301 may further include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 1302 may include one or more computer-readable storage media, which may be non-transitory. The memory 1302 may also include high speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in memory 1302 is used to store at least one instruction for execution by processor 1301 to implement the method of releasing a virtual prop provided by method embodiments herein.
In some embodiments, terminal 1300 may further optionally include: a peripheral interface 1303 and at least one peripheral. Processor 1301, memory 1302, and peripheral interface 1303 may be connected by a bus or signal line. Each peripheral device may be connected to the peripheral device interface 1303 via a bus, signal line, or circuit board. Specifically, the peripheral device includes: at least one of radio frequency circuitry 1304, touch display 1305, camera assembly 1306, audio circuitry 1307, and power supply 1308.
Peripheral interface 1303 may be used to connect at least one peripheral associated with I/O (Input/Output) to processor 1301 and memory 1302. In some embodiments, processor 1301, memory 1302, and peripheral interface 1303 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1301, the memory 1302, and the peripheral device interface 1303 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The Radio Frequency circuit 1304 is used to receive and transmit RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuitry 1304 communicates with communication networks and other communication devices via electromagnetic signals. The radio frequency circuit 1304 converts an electrical signal into an electromagnetic signal to transmit, or converts a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 1304 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so forth. The radio frequency circuitry 1304 may communicate with other terminals via at least one wireless communication protocol. The wireless communication protocols include, but are not limited to: the world wide web, metropolitan area networks, intranets, generations of mobile communication networks (2G, 3G, 4G, and 5G), Wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 1304 may also include NFC (Near Field Communication) related circuits, which are not limited in this application.
The touch display 1305 is used to display a UI (User Interface). The UI may include graphics, text, icons, video, and any combination thereof. The touch display 1305 also has the capability to collect touch signals on or over the surface of the touch display 1305. The touch signal may be input to the processor 1301 as a control signal for processing. At this point, the touch display 1305 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, touch display 1305 may be one, providing the front panel of terminal 1300; in other embodiments, touch display 1305 may be at least two, either on different surfaces of terminal 1300 or in a folded design; in still other embodiments, touch display 1305 may be a flexible display disposed on a curved surface or on a folded surface of terminal 1300. Even more, the touch screen 1305 may be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The touch Display 1305 may be made of LCD (Liquid Crystal Display), OLED (Organic Light-Emitting Diode), or the like.
The camera assembly 1306 is used to capture images or video. Optionally, camera assembly 1306 includes a front camera and a rear camera. Generally, a front camera is disposed at a front panel of the terminal, and a rear camera is disposed at a rear surface of the terminal. In some embodiments, the number of the rear cameras is at least two, and each of the rear cameras is any one of a main camera, a depth-of-field camera, a wide-angle camera and a telephoto camera, so that the main camera and the depth-of-field camera are fused to realize a background blurring function, and the main camera and the wide-angle camera are fused to realize panoramic shooting and a VR (Virtual Reality) shooting function or other fusion shooting functions. In some embodiments, camera assembly 1306 may also include a flash. The flash lamp can be a monochrome temperature flash lamp or a bicolor temperature flash lamp. The double-color-temperature flash lamp is a combination of a warm-light flash lamp and a cold-light flash lamp, and can be used for light compensation at different color temperatures.
The audio circuit 1307 may include a microphone and a speaker. The microphone is used for collecting sound waves of a user and the environment, converting the sound waves into electric signals, and inputting the electric signals to the processor 1301 for processing, or inputting the electric signals to the radio frequency circuit 1304 for realizing voice communication. For stereo capture or noise reduction purposes, multiple microphones may be provided, each at a different location of terminal 1300. The microphone may also be an array microphone or an omni-directional pick-up microphone. The speaker is used to convert electrical signals from the processor 1301 or the radio frequency circuitry 1304 into sound waves. The loudspeaker can be a traditional film loudspeaker or a piezoelectric ceramic loudspeaker. When the speaker is a piezoelectric ceramic speaker, the speaker can be used for purposes such as converting an electric signal into a sound wave audible to a human being, or converting an electric signal into a sound wave inaudible to a human being to measure a distance. In some embodiments, audio circuitry 1307 may also include a headphone jack.
Power supply 1308 is used to provide power to various components within terminal 1300. The power source 1308 may be alternating current, direct current, disposable batteries, or rechargeable batteries. When the power source 1308 includes a rechargeable battery, the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery. The wired rechargeable battery is a battery charged through a wired line, and the wireless rechargeable battery is a battery charged through a wireless coil. The rechargeable battery may also be used to support fast charge technology.
In some embodiments, terminal 1300 also includes one or more sensors 1309. The one or more sensors 1309 include, but are not limited to: acceleration sensor 1310, gyro sensor 1311, pressure sensor 1312, optical sensor 1313, and proximity sensor 1314.
Acceleration sensor 1310 may detect acceleration in three coordinate axes of the coordinate system established with terminal 1300. For example, the acceleration sensor 1310 may be used to detect the components of the gravitational acceleration in three coordinate axes. The processor 1301 may control the touch display screen 1305 to display the user interface in a landscape view or a portrait view according to the gravitational acceleration signal collected by the acceleration sensor 1310. The acceleration sensor 1310 may also be used for acquisition of motion data of a game or a user.
The gyro sensor 1311 may detect a body direction and a rotation angle of the terminal 1300, and the gyro sensor 1311 may cooperate with the acceleration sensor 1310 to acquire a 3D motion of the user on the terminal 1300. The processor 1301 may implement the following functions according to the data collected by the gyro sensor 1311: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
Pressure sensors 1312 may be disposed on a side bezel of terminal 1300 and/or underlying touch display 1305. When the pressure sensor 1312 is disposed on the side frame of the terminal 1300, a user holding signal of the terminal 1300 can be detected, and the processor 1301 performs left-right hand recognition or shortcut operation according to the holding signal collected by the pressure sensor 1312. When the pressure sensor 1312 is disposed at a lower layer of the touch display 1305, the processor 1301 controls the operability control on the UI interface according to the pressure operation of the user on the touch display 1305. The operability control comprises at least one of a button control, a scroll bar control, an icon control and a menu control.
The optical sensor 1313 is used to collect the ambient light intensity. In one embodiment, the processor 1301 can control the display brightness of the touch display screen 1305 according to the intensity of the ambient light collected by the optical sensor 1313. Specifically, when the ambient light intensity is high, the display brightness of the touch display screen 1305 is increased; when the ambient light intensity is low, the display brightness of the touch display 1305 is turned down. In another embodiment, the processor 1301 can also dynamically adjust the shooting parameters of the camera assembly 1306 according to the ambient light intensity collected by the optical sensor 1313.
Proximity sensor 1314, also known as a distance sensor, is typically disposed on the front panel of terminal 1300. Proximity sensor 1314 is used to capture the distance between the user and the front face of terminal 1300. In one embodiment, the processor 1301 controls the touch display 1305 to switch from the bright screen state to the dark screen state when the proximity sensor 1314 detects that the distance between the user and the front face of the terminal 1300 gradually decreases; the processor 1301 controls the touch display 1305 to switch from the dark screen state to the bright screen state when the proximity sensor 1314 detects that the distance between the user and the front face of the terminal 1300 gradually increases.
Those skilled in the art will appreciate that the configuration shown in fig. 13 is not intended to be limiting with respect to terminal 1300 and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components may be employed.
The application also provides a computer device comprising a processor; the processor is used for displaying a virtual map corresponding to the virtual environment in the display interface in the process of releasing the virtual prop; in response to a moving operation for the virtual map, displaying a desired moving path of a virtual moving object on the virtual map, the virtual moving object being used for influencing an attribute value corresponding to a virtual character in the virtual environment; and in response to the releasing operation aiming at the virtual moving object, displaying the virtual moving object positioned in the virtual environment in the display interface, wherein the actual moving path of the virtual moving object is determined according to the expected moving path.
The present application also provides a computer-readable storage medium, in which a computer program is stored, where the computer program is used to be executed by a processor, so as to implement the method for releasing a virtual item as described above.
The application also provides a chip, which comprises a programmable logic circuit and/or a program instruction, and when the chip runs, the chip is used for realizing the release method of the virtual prop.
The present application also provides a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium, and a processor reads and executes the computer instructions from the computer-readable storage medium to implement the method for releasing a virtual prop as described above.
It should be understood that reference to "a plurality" herein means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The above description is only exemplary of the present application and should not be taken as limiting, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (20)

1. A method for releasing a virtual prop, the method comprising:
in the process of releasing the virtual prop, displaying a virtual map corresponding to a virtual environment in a display interface;
in response to a moving operation for the virtual map, displaying a desired moving path of a virtual moving object on the virtual map, the virtual moving object being used for influencing an attribute value corresponding to a virtual character in the virtual environment;
in response to a release operation for the virtual moving object, displaying the virtual moving object located in the virtual environment in the display interface, wherein an actual moving path of the virtual moving object is determined according to the expected moving path.
2. The method of claim 1, wherein said displaying the virtual moving object located in the virtual environment in the display interface in response to the release operation for the virtual moving object comprises:
displaying the virtual moving object at an actual starting point in the virtual environment in response to a release operation for the virtual moving object, the position of the actual starting point being determined according to the position of a desired starting point in the desired movement path;
controlling the virtual moving object to move along the actual moving path;
and canceling the display of the virtual moving object under the condition that the virtual moving object meets the display canceling condition.
3. The method of claim 2, wherein said controlling the virtual moving object to move along the actual moving path comprises:
and controlling the virtual moving object to move along the actual moving path at a target moving speed, wherein the target moving speed is determined according to the length of the expected moving path.
4. The method of any of claims 1 to 3, further comprising:
and controlling the virtual moving object to pass through the virtual fixed object under the condition that the virtual fixed object is displayed on the actual moving path.
5. The method of any of claims 1 to 3, further comprising:
after the virtual moving object is contacted with the virtual ground on the actual moving path, controlling the bottom of the virtual moving object to be continuously contacted with the virtual ground.
6. The method according to any one of claims 1 to 3, wherein a first virtual fixed object and a second virtual fixed object are displayed adjacent to each other on the actual moving path, the virtual moving object is located on the first virtual fixed object, the second virtual fixed object is located in front of the moving direction of the virtual moving object, and the top of the second virtual fixed object is located at a height lower than the height of the virtual moving object on the first virtual fixed object, and the method further comprises:
controlling the virtual moving object to move from the first virtual fixed object to the top of the second virtual fixed object.
7. The method of any one of claims 1 to 3, wherein the virtual moving object is released by a first virtual character, the method further comprising:
in the case where there is an obstacle between the first virtual character and the virtual moving object, a perspective effect of the virtual moving object is displayed on the obstacle.
8. The method of any one of claims 1 to 3, wherein the virtual moving object is released by a first virtual character, the method further comprising:
canceling the display of the virtual environment within the coverage of the virtual moving object in the case where the first virtual character is located outside the coverage of the virtual moving object;
canceling the display of the virtual environment outside the coverage of the virtual moving object in the case where the first virtual character is located within the coverage of the virtual moving object.
9. The method of any of claims 1 to 3, wherein the virtual character comprises: a virtual character and/or an associated virtual object corresponding to the virtual character, the method further comprising:
displaying influence information of the virtual moving object on the virtual character, wherein the influence information is used for indicating a decrease in an attribute value of the virtual character and/or the associated virtual object.
10. The method of claim 9, wherein the influence information comprises deceleration information, and wherein the displaying the influence information of the virtual moving object on the virtual character comprises:
displaying the deceleration information in the case where the virtual character is located in the periphery of the virtual moving object, the deceleration information indicating a decrease in the speed attribute value of the virtual character.
11. The method according to claim 10, wherein the displaying the deceleration information in the case where the virtual character is located in the periphery of the virtual moving object includes:
displaying first deceleration information in the case where the virtual character is located within the coverage of the virtual moving object, the first deceleration information being used to indicate that the speed attribute value decreases by a first reduction value;
or, displaying second deceleration information in the case where the virtual character is located outside the coverage and the distance between the virtual character and the virtual moving object does not exceed a preset value, the second deceleration information being used to indicate that the speed attribute value decreases by a second reduction value;
wherein the first reduction value is greater than the second reduction value.
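The tiered slow-down of claims 10 and 11 amounts to a distance-banded lookup. The sketch below is illustrative only; the radii and drop values are invented constants (the claims leave them unspecified), and a circular coverage area is assumed.

```python
# Illustrative sketch of claims 10-11: larger speed drop inside the coverage,
# smaller drop within a preset distance outside it, none beyond that.
# All constants are assumptions; the claims only fix the ordering
# (first drop > second drop).
import math

COVERAGE_RADIUS = 5.0
NEAR_RADIUS = 8.0       # the claim's "preset value"
FIRST_DROP = 40         # inside coverage
SECOND_DROP = 15        # outside coverage but within NEAR_RADIUS

def speed_drop(character_pos, mover_pos):
    d = math.dist(character_pos, mover_pos)
    if d <= COVERAGE_RADIUS:
        return FIRST_DROP
    if d <= NEAR_RADIUS:
        return SECOND_DROP
    return 0

print(speed_drop((3, 0), (0, 0)))   # inside coverage -> 40
print(speed_drop((7, 0), (0, 0)))   # near band -> 15
print(speed_drop((20, 0), (0, 0)))  # out of range -> 0
```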
12. The method of claim 9, wherein the influence information comprises deduction information, and wherein the displaying the influence information of the virtual moving object on the virtual character comprises:
displaying the deduction information in the case where the virtual character is located at the periphery of the virtual moving object, wherein the deduction information is used for indicating a deduction of a state attribute value of the virtual character.
13. The method of claim 12, wherein displaying the deduction information in a case where the virtual character is located at a periphery of the virtual moving object comprises:
displaying the deduction information in the case that the virtual character is located within the coverage of the virtual moving object;
or, displaying the deduction information in the case where the distance between the virtual character and the virtual moving object does not exceed a preset value, wherein the deduction amount of the state attribute value is determined according to the distance between the virtual character and the virtual moving object.
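Claims 12 and 13 leave open how the deduction "is determined according to the distance"; a linear falloff outside the coverage is one plausible reading. The sketch below is hypothetical; every constant and the falloff shape are assumptions.

```python
# Illustrative sketch of claims 12-13: full state-attribute deduction inside
# the coverage, distance-scaled deduction within a preset distance, none
# beyond. Linear falloff is an assumed example of "determined according to
# the distance".
import math

COVERAGE_RADIUS = 5.0
PRESET_DISTANCE = 10.0
MAX_DEDUCTION = 20.0

def deduction(character_pos, mover_pos):
    d = math.dist(character_pos, mover_pos)
    if d <= COVERAGE_RADIUS:
        return MAX_DEDUCTION                       # full deduction inside coverage
    if d <= PRESET_DISTANCE:
        # Fraction of the way from the coverage edge to the preset limit.
        t = (d - COVERAGE_RADIUS) / (PRESET_DISTANCE - COVERAGE_RADIUS)
        return MAX_DEDUCTION * (1.0 - t)           # fades out with distance
    return 0.0

print(deduction((0, 0), (0, 0)))    # 20.0
print(deduction((7.5, 0), (0, 0)))  # halfway through the band -> 10.0
print(deduction((15, 0), (0, 0)))   # 0.0
```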
14. The method of claim 9, wherein the displaying the information about the influence of the virtual moving object on the virtual character comprises:
determining a reduction value according to the relative positional relationship between the virtual character and the virtual moving object, wherein the reduction value is the amount by which the attribute value of the virtual character and/or the associated virtual object decreases;
and determining and displaying the influence information according to the reduction value.
15. The method according to any one of claims 1 to 3, wherein the displaying an expected moving path of a virtual moving object on the virtual map in response to a moving operation on the display interface comprises:
displaying a virtual finger;
controlling the virtual finger to slide on the virtual map in response to a slide operation on the virtual finger;
and determining and displaying the expected moving path according to the sliding route of the virtual finger.
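The slide-to-path step of claim 15 is, in effect, down-sampling raw touch points into waypoints. A hypothetical sketch, assuming a minimum-spacing filter (the threshold, the sampling rule, and all names are illustrative, not the claimed method):

```python
# Illustrative sketch of claim 15: turn the raw slide route of the virtual
# finger into a sparse waypoint list forming the expected moving path.
import math

MIN_SPACING = 1.0  # drop touch samples closer than this to the last waypoint

def expected_path(slide_samples):
    """Down-sample raw (x, y) touch samples into path waypoints."""
    path = [slide_samples[0]]
    for p in slide_samples[1:]:
        if math.dist(p, path[-1]) >= MIN_SPACING:
            path.append(p)
    return path

samples = [(0, 0), (0.2, 0), (1.5, 0), (1.6, 0), (3.0, 0)]
print(expected_path(samples))  # [(0, 0), (1.5, 0), (3.0, 0)]
```

Sparse waypoints keep the displayed path smooth and give later steps (such as the target-speed computation in claim 3) a compact polyline to measure.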
16. A device for releasing a virtual prop, the device comprising:
a display module, configured to display, in the process of releasing the virtual prop, a virtual map corresponding to a virtual environment in a display interface;
a response module, configured to display, in response to a moving operation for the virtual map, an expected moving path of a virtual moving object on the virtual map, wherein the virtual moving object is used to affect an attribute value corresponding to a virtual character in the virtual environment;
the response module being further configured to display, in response to a release operation for the virtual moving object, the virtual moving object located in the virtual environment in the display interface, wherein an actual moving path of the virtual moving object is determined according to the expected moving path.
17. A computer device, characterized in that the computer device comprises a processor;
the processor is used for displaying a virtual map corresponding to a virtual environment in a display interface in the process of releasing the virtual prop;
in response to a moving operation for the virtual map, displaying an expected moving path of a virtual moving object on the virtual map, the virtual moving object being used for influencing an attribute value corresponding to a virtual character in the virtual environment;
in response to a release operation for the virtual moving object, displaying the virtual moving object located in the virtual environment in the display interface, wherein an actual moving path of the virtual moving object is determined according to the expected moving path.
18. A computer-readable storage medium, characterized in that a computer program is stored in the storage medium, the computer program being executed by a processor to implement the method for releasing a virtual prop according to any one of claims 1 to 15.
19. A chip, characterized in that the chip comprises programmable logic circuits and/or program instructions for implementing the method for releasing a virtual prop according to any one of claims 1 to 15 when the chip is in operation.
20. A computer program product or computer program, characterized in that the computer program product or computer program comprises computer instructions stored in a computer-readable storage medium, and a processor reads and executes the computer instructions from the storage medium to implement the method for releasing a virtual prop according to any one of claims 1 to 15.
CN202111658380.4A 2021-11-05 2021-12-30 Virtual item release method, device, equipment and medium Pending CN114177616A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2021113061982 2021-11-05
CN202111306198 2021-11-05

Publications (1)

Publication Number Publication Date
CN114177616A true CN114177616A (en) 2022-03-15

Family

ID=80545340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111658380.4A Pending CN114177616A (en) 2021-11-05 2021-12-30 Virtual item release method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN114177616A (en)

Similar Documents

Publication Publication Date Title
WO2021143259A1 (en) Virtual object control method and apparatus, device, and readable storage medium
CN110433488B (en) Virtual character-based fight control method, device, equipment and medium
CN110665230B (en) Virtual role control method, device, equipment and medium in virtual world
CN111589133B (en) Virtual object control method, device, equipment and storage medium
CN112494955B (en) Skill releasing method, device, terminal and storage medium for virtual object
CN111589140B (en) Virtual object control method, device, terminal and storage medium
CN113289331B (en) Display method and device of virtual prop, electronic equipment and storage medium
CN110465083B (en) Map area control method, apparatus, device and medium in virtual environment
CN111467802B (en) Method, device, equipment and medium for displaying picture of virtual environment
CN111589124A (en) Virtual object control method, device, terminal and storage medium
WO2021147468A9 (en) Method and apparatus for virtual character control in virtual environment, and device and medium
CN112604305B (en) Virtual object control method, device, terminal and storage medium
CN111589136B (en) Virtual object control method and device, computer equipment and storage medium
CN111596838B (en) Service processing method and device, computer equipment and computer readable storage medium
CN112083848B (en) Method, device and equipment for adjusting position of control in application program and storage medium
CN111481934A (en) Virtual environment picture display method, device, equipment and storage medium
CN112569607B (en) Display method, device, equipment and medium for pre-purchased prop
WO2023134272A1 (en) Field-of-view picture display method and apparatus, and device
CN111921194A (en) Virtual environment picture display method, device, equipment and storage medium
CN111672102A (en) Virtual object control method, device, equipment and storage medium in virtual scene
CN112354180A (en) Method, device and equipment for updating integral in virtual scene and storage medium
CN113559495B (en) Method, device, equipment and storage medium for releasing skill of virtual object
CN111589144B (en) Virtual character control method, device, equipment and medium
CN111589102B (en) Auxiliary tool detection method, device, equipment and storage medium
JPWO2021143259A5 (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination