WO2021143253A1 - Method, apparatus, device, and readable medium for operating virtual props in a virtual environment - Google Patents

Method, apparatus, device, and readable medium for operating virtual props in a virtual environment

Info

Publication number
WO2021143253A1
WO2021143253A1 · PCT/CN2020/123547 · CN2020123547W
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
virtual
prop
virtual environment
state
Prior art date
Application number
PCT/CN2020/123547
Other languages
English (en)
French (fr)
Inventor
刘智洪 (LIU Zhihong)
Original Assignee
Tencent Technology (Shenzhen) Company Limited (腾讯科技(深圳)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Publication of WO2021143253A1 publication Critical patent/WO2021143253A1/zh
Priority to US17/591,460 priority Critical patent/US11786817B2/en
Priority to US18/243,022 priority patent/US20230405466A1/en

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/52 — Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/55 — Controlling game characters or game objects based on the game progress
    • A63F13/56 — Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F13/57 — Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F13/577 — Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F13/80 — Special adaptations for executing a specific game genre or game mode
    • A63F13/833 — Hand-to-hand fighting, e.g. martial arts competition
    • A63F13/837 — Shooting of targets
    • A63F2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 — Features of games specially adapted for executing a specific type of game
    • A63F2300/8082 — Virtual reality

Definitions

  • the embodiments of the present application relate to the field of virtual environments, and in particular, to a method, apparatus, device, and readable medium for operating virtual props in a virtual environment.
  • the embodiments of the present application provide a method, apparatus, device, and readable medium for operating virtual props in a virtual environment, which can improve the attack efficiency of a first virtual object attacking through a first prop.
  • the technical solution is as follows:
  • a method for operating virtual props in a virtual environment includes:
  • displaying a virtual environment interface, the virtual environment interface including a picture of a first virtual object observing the virtual environment, the first virtual object being equipped with a first prop, and the first prop being a short-range attack prop;
  • receiving a sliding shovel state trigger operation, and controlling the first virtual object to be in a sliding shovel state in the virtual environment, the sliding shovel state indicating a state in which the first virtual object slides forward in the virtual environment in a leaning, squatting posture;
  • receiving an attack operation in response to the first virtual object being in the sliding shovel state; and
  • controlling the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • a device for operating virtual props in a virtual environment includes:
  • the display module is configured to display a virtual environment interface, the virtual environment interface including a picture of a first virtual object observing the virtual environment, the first virtual object being equipped with a first prop, and the first prop being a short-range attack prop;
  • the receiving module is configured to receive a sliding shovel state trigger operation and control the first virtual object to be in the sliding shovel state in the virtual environment, the sliding shovel state indicating a state in which the first virtual object slides forward in the virtual environment in a leaning, squatting posture;
  • the receiving module is further configured to receive an attack operation in response to the first virtual object being in the sliding shovel state; and
  • the control module is configured to control the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • In another aspect, a computer device is provided, including a processor and a memory.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for operating virtual props in a virtual environment as described in any of the above embodiments of the present application.
  • In another aspect, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by a processor to implement the method for operating virtual props in a virtual environment as described in any of the foregoing embodiments of the present application.
  • In another aspect, a computer program product is provided, which, when run on a computer, causes the computer to execute the method for operating virtual props in a virtual environment as described in any of the above embodiments of the present application.
  • Compared with the walking state, the body position in the sliding shovel state is relatively close to the ground, so the height of the swing path of the first prop fits the height of other virtual objects more closely, and other virtual objects cannot directly evade the attack of the first prop.
  • This improves the attack efficiency of the first virtual object attacking through the first prop, thereby improving the human-computer interaction efficiency of the first virtual object attacking through the first prop.
  • FIG. 1 is a schematic diagram of attacking a hostile virtual object through a long-range attack prop provided by an exemplary embodiment of the present application;
  • FIG. 2 is a schematic diagram of attacking a hostile virtual object through a short-range attack prop provided by an exemplary embodiment of the present application;
  • FIG. 3 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application;
  • FIG. 4 is a schematic diagram of an implementation environment provided by an exemplary embodiment of the present application;
  • FIG. 5 is a schematic diagram of a user interface of a method for operating virtual props provided by an embodiment of the present application;
  • FIG. 6 is a flowchart of a method for operating virtual props in a virtual environment provided by an exemplary embodiment of the present application;
  • FIG. 7 is a schematic diagram of an interface of the prop equipping process provided based on the embodiment shown in FIG. 6;
  • FIG. 8 is a schematic diagram of an interface when the first virtual object is in the sliding shovel state provided based on the embodiment shown in FIG. 6;
  • FIG. 9 is a schematic diagram of a collision detection box corresponding to the first prop provided based on the embodiment shown in FIG. 6;
  • FIG. 10 is a flowchart of a method for operating virtual props in a virtual environment provided by another exemplary embodiment of the present application;
  • FIG. 11 is a schematic diagram of the triggering process of the sliding shovel state provided based on the embodiment shown in FIG. 10;
  • FIG. 12 is a schematic diagram of the triggering process of the sliding shovel state provided by an exemplary embodiment of the present application;
  • FIG. 13 is a flowchart of a method for operating virtual props in a virtual environment provided by another exemplary embodiment of the present application;
  • FIG. 14 is a schematic diagram of an interface for displaying actions provided based on the embodiment shown in FIG. 13;
  • FIG. 15 is a flowchart of attacking a target during the sliding shovel process provided by an exemplary embodiment of the present application;
  • FIG. 16 is a structural block diagram of a device for operating virtual props in a virtual environment provided by an exemplary embodiment of the present application;
  • FIG. 17 is a structural block diagram of a device for operating virtual props in a virtual environment provided by another exemplary embodiment of the present application;
  • FIG. 18 is a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when the application is running on the terminal.
  • the virtual environment may be a simulation environment of the real world, a semi-simulation and semi-fictional environment, or a purely fictitious environment.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
  • the virtual environment is a three-dimensional virtual environment as an example.
  • Virtual object: refers to a movable object in the virtual environment.
  • the movable objects may be virtual characters, virtual animals, cartoon characters, etc., such as: characters, animals, plants, oil barrels, walls, stones, etc. displayed in a three-dimensional virtual environment.
  • the virtual object is a three-dimensional model created based on animation skeletal technology.
  • Each virtual object has its own shape and volume in the three-dimensional virtual environment, and occupies a part of the space in the three-dimensional virtual environment.
  • In the embodiments of the present application, virtual objects are divided into a target virtual object and a hostile virtual object.
  • the target virtual object is a virtual object currently controlled by the player, and the hostile virtual object is a virtual object that initiates an attack on the target virtual object.
  • the hostile virtual object's attack on the target virtual object may be spontaneous, that is, when the target virtual object appears in the line of sight of the hostile virtual object, the hostile virtual object initiates an attack on the target virtual object; or, the attack initiated by the hostile virtual object on the target virtual object may be passive, that is, after the target virtual object attacks the hostile virtual object, the hostile virtual object initiates an attack on the target virtual object according to the attack received.
  • the hostile virtual object may be an artificial intelligence (AI) attack object provided by the system, or may be a virtual object controlled by another player.
  • Attack props: refer to props held by a virtual object in the virtual environment for attacking other virtual objects, where the other virtual objects may be virtual objects that are hostile to the target virtual object, or may include both virtual objects that are hostile to the target virtual object and virtual objects that are teammates of the target virtual object.
  • the attack props can be divided into long-range attack props and short-range attack props.
  • Long-range attack props refer to props that carry out the attack by launching a projectile, where the projectile may be launched by the prop body (such as virtual firearms and virtual bows and arrows), or may be the attack prop itself (such as stones and sandbags).
  • Short-range attack props refer to props that the virtual object directly swings to carry out the attack, such as virtual knives, virtual sticks, virtual axes, and virtual pans.
  • Virtual objects can fight with virtual props picked up in the virtual environment, or with virtual props equipped when entering the game. For example, when entering a virtual game, each virtual object is correspondingly equipped with a virtual knife, and the virtual object attacks other virtual objects by swinging the virtual knife.
  • FIG. 1 is a schematic diagram of attacking a hostile virtual object through a long-range attack prop provided by an exemplary embodiment of the present application.
  • As shown in FIG. 1, a virtual environment interface 100 includes a virtual firearm 110 and a hostile virtual object 120. The virtual environment interface 100 is a screen for observing the virtual environment from the first-person perspective of the virtual object. The virtual object controls the virtual firearm 110 to aim at the hostile virtual object 120 and shoot, thereby realizing a long-range attack on the hostile virtual object 120.
  • FIG. 2 is a schematic diagram of attacking a hostile virtual object through a short-range attack prop provided by an exemplary embodiment of the present application.
  • As shown in FIG. 2, the virtual environment interface 200 includes a virtual prop 210 and a hostile virtual object 220. The virtual environment interface 200 is a screen for observing the virtual environment from the first-person perspective of the virtual object. The virtual prop 210 is a short-range attack prop; since the hostile virtual object 220 is close to the virtual object, the virtual object can control the virtual prop 210 to swing, so as to perform a close-range attack on the hostile virtual object 220.
  • Sliding shovel: indicates the manner in which the virtual object slides forward in the virtual environment in a leaning, squatting posture, where the leaning squat is a posture in which the virtual object leans back in the virtual environment and supports its two legs at different distances in front of the body.
  • Optionally, in the sliding shovel state, the forward speed of the virtual object is faster than the normal walking speed; optionally, it is also faster than the normal running speed.
  • the method provided in this application can be applied to virtual reality applications, three-dimensional map programs, military simulation programs, first-person shooter (FPS) games, third-person shooter (TPS) games, multiplayer online battle arena (MOBA) games, and the like.
  • the following embodiments are examples of applications in games.
  • Games based on virtual environments are often composed of one or more maps of the game world. The virtual environment in the game simulates scenes of the real world. Users can manipulate virtual objects in the game to walk, run, jump, shoot, fight, drive, switch between virtual weapons, and use virtual weapons to attack other virtual objects, among other actions. The interactivity is strong, and multiple users can team up online for competitive matches.
  • When the user controls the virtual object to use a virtual weapon to attack a first virtual object, the user selects a suitable virtual weapon according to the location of the first virtual object or the user's operating habits.
  • Virtual weapons include at least one of firearms, close-range weapons, and throwing weapons.
  • Firearms include rifles, sniper rifles, pistols, shotguns, and other types of firearms.
  • Close-range weapons include at least one of daggers, knives, axes, swords, sticks, and pots (for example, a pan).
  • Throwing weapons include ordinary grenades, sticky grenades, flash bombs, smoke bombs, and the like.
  • the terminal in this application can be a desktop computer, a laptop portable computer, a mobile phone, a tablet computer, an e-book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, and so on.
  • An application program supporting a virtual environment is installed and running in the terminal, such as an application program supporting a three-dimensional virtual environment.
  • the application program can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, and MOBA games.
  • the application program may be a stand-alone version application program, such as a stand-alone version of a 3D game program, or a network online version application program.
  • Fig. 3 shows a structural block diagram of an electronic device provided by an exemplary embodiment of the present application.
  • the electronic device 300 includes an operating system 320 and an application program 322.
  • the operating system 320 is basic software that provides the application program 322 with secure access to computer hardware.
  • the application program 322 is an application program supporting a virtual environment.
  • the application program 322 is an application program supporting a three-dimensional virtual environment.
  • the application program 322 may be any one of a virtual reality application program, a three-dimensional map program, a military simulation program, a TPS game, an FPS game, a MOBA game, and a multiplayer gun battle survival game.
  • the application program 322 may be a stand-alone application program, such as a stand-alone 3D game program.
  • Fig. 4 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 400 includes: a first device 420, a server 440, and a second device 460.
  • the first device 420 installs and runs an application program supporting the virtual environment.
  • the application can be any of virtual reality applications, three-dimensional map programs, military simulation programs, TPS games, FPS games, MOBA games, and multiplayer gun battle survival games.
  • the first device 420 is a device used by the first user.
  • the first user uses the first device 420 to control the first virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an animation character.
  • the first device 420 is connected to the server 440 through a wireless network or a wired network.
  • the server 440 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 440 is used to provide background services for applications supporting the three-dimensional virtual environment.
  • Optionally, the server 440 is responsible for the main calculation work, and the first device 420 and the second device 460 are responsible for the secondary calculation work; or, the server 440 is responsible for the secondary calculation work, and the first device 420 and the second device 460 are responsible for the main calculation work; or, the server 440, the first device 420, and the second device 460 adopt a distributed computing architecture for collaborative computing.
  • the second device 460 installs and runs an application program supporting the virtual environment.
  • the application program can be any of virtual reality applications, three-dimensional map programs, military simulation programs, FPS games, MOBA games, and multiplayer gun battle survival games.
  • the second device 460 is a device used by the second user.
  • the second user uses the second device 460 to control the second virtual object in the virtual environment to perform activities, including but not limited to at least one of: adjusting body posture, crawling, walking, running, riding, jumping, driving, picking up, shooting, attacking, and throwing.
  • the second virtual object is a second virtual character, such as a simulated character or an animation character.
  • The first virtual character and the second virtual character are in the same virtual environment.
  • Optionally, the first virtual character and the second virtual character may belong to the same team or the same organization, have a friend relationship, or have temporary communication permissions.
  • Optionally, the first virtual character and the second virtual character may also belong to different teams, different organizations, or two groups that are hostile to each other.
  • the applications installed on the first device 420 and the second device 460 are the same, or the applications installed on the two devices are the same type of application on different control system platforms.
  • the first device 420 may generally refer to one of multiple devices, and the second device 460 may generally refer to one of multiple devices. In this embodiment, only the first device 420 and the second device 460 are used as an example for illustration.
  • the device types of the first device 420 and the second device 460 are the same or different.
  • the device types include at least one of: game consoles, desktop computers, smartphones, tablet computers, e-book readers, MP3 players, MP4 players, and portable laptop computers.
  • The following embodiments take the device being a desktop computer as an example.
  • the number of the above-mentioned devices can be more or less. For example, there may be only one device mentioned above, or there may be dozens or hundreds of devices mentioned above, or more.
  • the embodiments of the present application do not limit the quantity and type of devices.
  • FIG. 5 shows a schematic diagram of a user interface of the method for operating virtual props provided by an embodiment of the present application.
  • The virtual prop being a virtual axe is taken as an example for description, as shown in FIG. 5:
  • a virtual object 510 is displayed in the virtual environment interface 500.
  • the virtual object 510 is triggered to move forward in the virtual environment in a sliding shovel state.
  • the virtual object 510 holds a virtual axe 511, and the virtual axe 511 is used to attack other virtual objects in the virtual environment.
  • the virtual object 510 swings the virtual axe 511 in the sliding shovel state to attack in the virtual environment.
  • FIG. 6 is a flowchart of a method for operating virtual props in a virtual environment provided by an exemplary embodiment of the present application. The method being applied to a terminal is taken as an example for description. As shown in FIG. 6, the method includes:
  • Step 601: Display a virtual environment interface.
  • the virtual environment interface includes a picture of a first virtual object observing the virtual environment, and the first virtual object is equipped with a first prop.
  • the first prop is a short-range attack prop.
  • the first prop is a prop currently held by the first virtual object, or may be a prop carried by the first virtual object but not held by the first virtual object.
  • The first prop being a virtual axe is taken as an example for description.
  • Optionally, the virtual axe is a prop assembled by the first virtual object before the start of the virtual game; or, the virtual axe is a prop picked up by the first virtual object in the virtual environment; or, the virtual axe is a prop obtained by the first virtual object through exchange in the virtual game. The following description takes the virtual axe being a prop assembled by the first virtual object before the start of the virtual game as an example.
  • Optionally, a prop assembly interface is displayed; the prop assembly interface includes candidate props, and the candidate props include the first prop and a second prop, where the second prop is the default equipped prop of the first virtual object and the attack range of the first prop is greater than the attack range of the second prop. An assembly operation on the first prop is received on the prop assembly interface, the assembly operation being used to equip the first prop onto the first virtual object.
  • Optionally, the first prop is a prop obtained by exchanging resources in the game.
  • As shown in FIG. 7, a candidate prop selection area 710 is displayed in the prop assembly interface 700, and includes a candidate prop 711 and a candidate prop 712. The candidate prop 711 is the prop that the first virtual object is equipped with by default, and the candidate prop 712 is a prop acquired by the player through purchase in the game.
  • An introduction of the candidate prop 712 is displayed in the prop assembly interface 700, and the candidate prop 712 can be selected; the equipment operation on the candidate prop 712 provides the first virtual object with the ability to use the candidate prop 712 in the virtual battle.
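  • The equipping flow just described — displaying candidate props, showing an introduction when one is selected, and equipping the chosen prop onto the first virtual object — can be sketched as follows. This is a minimal illustration, not the patent's implementation; the class names, prop names, and attack-range numbers are all assumptions.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Prop:
    name: str
    attack_range: float  # assumed units of virtual-environment distance

@dataclass
class VirtualObject:
    equipped: Optional[Prop] = None

class PropAssemblyInterface:
    """Minimal sketch of the prop assembly interface (cf. FIG. 7)."""

    def __init__(self, candidates: List[Prop]):
        self.candidates = candidates

    def select(self, index: int) -> str:
        # Selecting a candidate prop displays its introduction.
        return f"Introduction of {self.candidates[index].name}"

    def assemble(self, index: int, obj: VirtualObject) -> None:
        # The assembly operation equips the prop onto the virtual object.
        obj.equipped = self.candidates[index]

# The second prop (default equipment) has a smaller attack range than
# the first prop, as in the embodiment above; the numbers are invented.
default_prop = Prop("virtual knife", attack_range=1.0)
first_prop = Prop("virtual axe", attack_range=2.0)

ui = PropAssemblyInterface([default_prop, first_prop])
player = VirtualObject(equipped=default_prop)
ui.assemble(1, player)  # equip the first prop onto the first virtual object
```

  • After the assembly operation, the first virtual object enters the virtual battle with the first prop equipped.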
  • In this embodiment, the first prop being a short-range attack prop is taken as an example for description.
  • In other embodiments, the first prop may also be a long-range attack prop, such as a virtual firearm or a virtual magic wand.
  • The following description takes the first prop being a virtual axe as an example.
  • the screen in the virtual environment interface may be a screen for observing the virtual environment from a first-person perspective of the first virtual object, or a screen for observing the virtual environment from a third-person perspective of the first virtual object.
  • Step 602: Receive a sliding shovel state trigger operation, and control the first virtual object to be in the sliding shovel state in the virtual environment.
  • the sliding shovel state is used to indicate a state in which the first virtual object slides forward in the virtual environment in an inclined and squatting posture.
  • the inclined squat refers to a posture in which the first virtual object leans backward in the virtual environment and supports two legs at different distances in front of the body.
  • Optionally, in the sliding shovel state, the forward speed of the first virtual object is faster than the normal walking speed; optionally, it is also faster than the normal running speed.
  • Optionally, the single duration for which the first virtual object is in the sliding shovel state in the virtual environment includes at least one of the following situations:
  • First, the single duration of the first virtual object in the sliding shovel state corresponds to a time limit; when the time limit is reached, the state of the first virtual object is automatically restored to the state before the sliding shovel state. For example, the first virtual object first enters the continuous running state and switches to the sliding shovel state; when the sliding shovel state reaches the time limit, the state of the first virtual object is automatically restored to the continuous running state.
  • Second, the single duration of the first virtual object in the sliding shovel state corresponds to a time limit; when the time limit is reached, the state of the first virtual object is automatically restored to a preset state, such as the standing state.
  • Third, the single duration of the first virtual object in the sliding shovel state is determined according to the control operation of the sliding shovel state; when the control operation ends, the state of the first virtual object is automatically restored to the state before the sliding shovel state. For example, the first virtual object first enters the continuous running state; when a long press operation on the squat control is received, the first virtual object is controlled to switch to the sliding shovel state, and when the long press operation ends, the state of the first virtual object is automatically restored to the continuous running state.
  • Fourth, the single duration of the first virtual object in the sliding shovel state is determined according to the control operation of the sliding shovel state; when the control operation ends, the state of the first virtual object is automatically restored to a preset state, such as the standing state.
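  • The four duration rules above amount to a small state machine: the slide ends either when a time limit expires (cases one and two) or when the long press on the squat control ends (cases three and four), and the object then returns either to its pre-slide state or to a preset standing state. The sketch below is illustrative only; the class names and the 1.0-second limit are assumptions, not values from the patent.

```python
import enum

class State(enum.Enum):
    STANDING = "standing"
    RUNNING = "continuous running"
    SLIDING = "sliding shovel"

class SlideController:
    """Illustrative sketch of the sliding shovel duration rules."""

    SLIDE_TIME_LIMIT = 1.0  # assumed time limit, in seconds

    def __init__(self, mode: str = "timed", restore_previous: bool = True):
        self.mode = mode                      # "timed" (cases 1-2) or "hold" (cases 3-4)
        self.restore_previous = restore_previous
        self.state = State.STANDING
        self.previous = State.STANDING
        self.elapsed = 0.0

    def trigger_slide(self) -> None:
        if self.state is not State.SLIDING:
            self.previous = self.state        # remember the pre-slide state
            self.state = State.SLIDING
            self.elapsed = 0.0

    def tick(self, dt: float, squat_held: bool = False) -> None:
        """Advance the simulation by dt seconds."""
        if self.state is not State.SLIDING:
            return
        self.elapsed += dt
        if self.mode == "timed":
            done = self.elapsed >= self.SLIDE_TIME_LIMIT  # time limit reached
        else:
            done = not squat_held                         # long press ended
        if done:
            # Restore the pre-slide state, or the preset standing state.
            self.state = self.previous if self.restore_previous else State.STANDING

# Case one: run, slide, and return to running once the limit is reached.
c = SlideController(mode="timed", restore_previous=True)
c.state = State.RUNNING
c.trigger_slide()
c.tick(0.5)   # still within the limit, still sliding
c.tick(0.6)   # limit exceeded, restored to continuous running
```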
  • Optionally, the first virtual object holds the first prop and enters the sliding shovel state. The first prop being a virtual axe is taken as an example for description.
  • As shown in FIG. 8, the virtual environment interface 800 includes a first virtual object 810 holding a virtual axe 820 in its hand; the first virtual object 810 is in the sliding shovel state in the virtual environment while holding the virtual axe 820.
  • Step 603 In response to the first virtual object being in the sliding shovel state, an attack operation is received.
  • Optionally, the control mode of the attack operation on the first virtual object includes any one of the following situations:
  • First, the first virtual object holds the first prop, and when an attack operation is received, the attack is performed with the prop currently held by the first virtual object;
  • Second, the first virtual object holds another prop; when the first virtual object is in the sliding shovel state and an attack operation is received, it switches to the first prop by default to attack. For example, the first virtual object carries the virtual axe on its back; when an attack operation is received, the first virtual object switches the virtual axe from the back to the hand and attacks with the virtual axe.
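  • The two control modes above can be sketched as a single attack handler: if the first prop is already in hand it is used directly; otherwise, in the sliding shovel state, the object switches to the first prop by default before attacking. The names and defaults below are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Carrier:
    held: str                        # prop currently in the hand
    first_prop: str = "virtual axe"  # carried on the back by default

def on_attack(c: Carrier, sliding: bool, default_switch: bool = True) -> str:
    """Return the name of the prop used for this attack."""
    # Mode one: the first prop is already held, so attack with it directly.
    # Mode two: another prop is held; in the sliding shovel state, switch
    # to the first prop by default (move the axe from the back to the hand).
    if sliding and default_switch and c.held != c.first_prop:
        c.held = c.first_prop
    return c.held

player = Carrier(held="virtual firearm")
used = on_attack(player, sliding=True)  # switches to the virtual axe
```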
  • Step 604: Control the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • Optionally, the first virtual object slides forward in the sliding shovel state and, while sliding, swings the first prop to perform a close-range attack, so that other virtual objects within the swing range are attacked by the first prop.
  • the virtual environment also includes a second virtual object; a collision detection box is mounted on the first prop, collision detection between the first prop and the second virtual object is performed through the collision detection box, and in response to a collision between the collision detection box and the second virtual object, the damage value of the first prop to the second virtual object is determined.
  • the virtual environment interface 900 includes a first virtual object 910.
  • the first virtual object 910 holds a virtual axe 920 in its hand.
  • the virtual axe 920 is swung to perform a short-range attack.
  • the swing path of the virtual axe 920 corresponds to the movement path of the collision detection box 930, and a virtual object that collides with the collision detection box 930 is attacked by the virtual axe 920.
  • the virtual axe 920 when the virtual object being attacked is a hostile virtual object of the first virtual object 910, the virtual axe 920 generates a damage value to the virtual object being attacked.
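As an illustration of the collision detection described above, the hit test along the swing path can be sketched as follows. This is a minimal Python sketch, assuming axis-aligned bounding boxes and hypothetical class and field names; the patent does not specify the engine's actual API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AABB:
    """Axis-aligned bounding box used as a collision detection box."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def intersects(self, other: "AABB") -> bool:
        # Two boxes collide when their extents overlap on every axis.
        return (self.x_min <= other.x_max and self.x_max >= other.x_min and
                self.y_min <= other.y_max and self.y_max >= other.y_min)

@dataclass
class VirtualObject:
    name: str
    hp: int
    hitbox: AABB
    hostile: bool  # hostile to the attacking virtual object

def resolve_swing(prop_box: AABB, objects: List[VirtualObject],
                  damage: int) -> List[str]:
    """Apply the prop's damage to every hostile object whose hitbox
    collides with the prop's collision detection box; return hit names."""
    hit = []
    for obj in objects:
        if prop_box.intersects(obj.hitbox) and obj.hostile:
            obj.hp = max(0, obj.hp - damage)
            hit.append(obj.name)
    return hit
```

In practice the collision box would be moved along with the axe each frame so that its path matches the swing path; as in the description of FIG. 9, only objects hostile to the attacker receive the damage value.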
  • FIG. 10 is a flowchart of a method for operating virtual props in a virtual environment provided by another exemplary embodiment of the present application. The method applied to a terminal is taken as an example for description. As shown in FIG. 10, the method includes:
  • Step 1001 Display a virtual environment interface.
  • the virtual environment interface includes a picture of a first virtual object observing the virtual environment, and the first virtual object is equipped with a first prop.
  • the first prop being a virtual axe is taken as an example for description.
  • the virtual axe is a prop assembled by the first virtual object before the start of the virtual game; or, the virtual axe is a prop picked up by the first virtual object in the virtual environment; or, the virtual axe is a prop obtained by the first virtual object during the virtual game.
  • in this embodiment, the virtual axe being a prop assembled by the first virtual object before the start of the virtual game is taken as an example for description.
  • Step 1002 Receive a running state trigger operation, and the running state trigger operation is used to control the first virtual object to continue to be in a running state.
  • the running state trigger operation is used to control the first virtual object to continue to be in the running state even when the terminal does not receive any control operations.
  • the running state trigger operation includes any of the following situations:
  • first, the virtual environment interface includes a continuous running control; when a trigger operation on the continuous running control is received, the first virtual object is controlled to run continuously in the current facing direction;
  • the virtual environment interface includes a forward joystick control.
  • when the forward joystick control is dragged in the target direction by a preset distance, the first virtual object is controlled to move forward in the virtual environment;
  • when the forward joystick control is dragged in the target direction to a target position, the first virtual object is controlled to run continuously along the joystick direction in the virtual environment.
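The two joystick behaviors above can be distinguished by how far the control is dragged. A minimal sketch, assuming normalized drag distances and hypothetical threshold values (the source gives no concrete numbers):

```python
def interpret_joystick_drag(drag_distance: float,
                            preset_distance: float = 0.3,
                            target_distance: float = 1.0) -> str:
    """Map a normalized joystick drag distance to a movement mode.

    - beyond the preset distance: ordinary forward movement;
    - at or beyond the target position: lock into continuous running,
      so the object keeps running without further input.
    """
    if drag_distance >= target_distance:
        return "continuous_run"
    if drag_distance >= preset_distance:
        return "move_forward"
    return "idle"
```

A short drag thus only nudges the object forward, while dragging the stick all the way to the target position latches the continuous running state described above.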
  • Step 1003 Receive a jump trigger operation during the running of the first virtual object, and the jump trigger operation is used to control the first virtual object to jump in the virtual environment.
  • the virtual environment interface includes a jump control, and when a trigger operation on the jump control is received, the first virtual object is controlled to jump in the virtual environment.
  • Step 1004 During the jumping process of the first virtual object, a squat trigger operation is received as the sliding shovel state trigger operation.
  • the squat trigger operation is used to control the first virtual object to squat in the virtual environment.
  • the jumping process includes a take-off phase, a falling phase, and a landing phase.
  • the take-off phase is used to indicate the stage in which the first virtual object jumps from the ground until it reaches the highest point of the jump;
  • the falling phase is used to indicate the stage in which the first virtual object falls from the highest point until just before it touches the ground;
  • the landing phase is used to indicate the stage from touchdown to the completion of the jump.
  • during the landing phase, the squat trigger operation is received as the sliding shovel state trigger operation.
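The phase-dependent trigger can be modeled as a small check on the jump state. The phase names follow the description above; classifying phases by vertical velocity and height, and the landing-window threshold, are assumptions for illustration only:

```python
def jump_phase(vertical_velocity: float, height: float) -> str:
    """Classify a jump into take-off, falling, and landing phases."""
    if height <= 0.0:
        return "grounded"
    if vertical_velocity > 0.0:
        return "take_off"       # rising toward the highest point
    if height > 0.2:            # hypothetical landing-window threshold
        return "falling"        # descending, not yet about to touch down
    return "landing"            # just before touchdown

def should_enter_slide(vertical_velocity: float, height: float,
                       squat_pressed: bool) -> bool:
    # The squat trigger counts as a slide trigger only in the landing phase.
    return squat_pressed and jump_phase(vertical_velocity, height) == "landing"
```

Pressing squat during take-off or falling therefore does nothing; only a squat received in the landing window converts into the sliding shovel state.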
  • the virtual environment interface 1100 includes a first virtual object 1110, a continuous running control 1120, a jumping control 1130, a squat control 1140, and an attack control 1150; after the continuous running control 1120 is first clicked, the first virtual object 1110 is controlled to run continuously in the virtual environment.
  • then the jump control 1130 is clicked to control the first virtual object 1110 to jump in the virtual environment, and during the jump of the first virtual object 1110, the squat control 1140 is long-pressed as the sliding shovel state trigger operation.
  • Step 1005 In response to the first virtual object being in the sliding shovel state, an attack operation is received.
  • the control mode of the attack operation on the first virtual object includes any one of the following situations:
  • first, the first virtual object holds the first prop, and when an attack operation is received, it attacks with the prop it currently holds;
  • second, the first virtual object holds another prop; when the first virtual object is in the sliding shovel state and an attack operation is received, it switches to the first prop by default to attack.
  • Step 1006 Control the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • the first virtual object slides forward in the sliding shovel state and, while sliding, swings the first prop to perform a close-range attack, so that other virtual objects within the attack range are affected by the attack of the first prop.
  • in the method provided in this embodiment, the first virtual object is controlled to be in a continuous running state in the virtual environment and is triggered to jump during the run, whereby the first virtual object is controlled to enter the sliding shovel state in the virtual environment; this simulates the whole run-up, take-off, and slide sequence of a sliding shovel, improving the realism of the sliding shovel state.
  • FIG. 12 is a schematic diagram of the triggering process of the sliding shovel state provided by an exemplary embodiment of the present application.
  • the above-mentioned first virtual object includes the following processes in triggering the sliding shovel:
  • Step 1201 purchase and equip the first item.
  • the first item is an item purchased by the player in the game and equipped in the item assembly interface.
  • Step 1202 Determine whether the first virtual object is running.
  • the continuous running state is used to perform a run-up for the sliding shovel state of the first virtual object.
  • Step 1203 When the first virtual object is running, it enters a running state.
  • Step 1204 Determine whether the jump is clicked.
  • when the jump is clicked, the first virtual object jumps while continuing to run.
  • Step 1205 when the jump is clicked, the first virtual object is controlled to jump.
  • Step 1206 Determine whether the squat control is pressed for a long time at the moment of landing.
  • the jumping process includes a take-off phase, a falling phase, and a landing phase.
  • the take-off phase is used to indicate the stage in which the first virtual object jumps from the ground until it reaches the highest point of the jump;
  • the falling phase is used to indicate the stage in which the first virtual object falls from the highest point until just before it touches the ground;
  • the landing phase is used to indicate the stage from touchdown to the completion of the jump.
  • during the landing phase, the squat trigger operation is received as the sliding shovel state trigger operation.
  • Step 1207 When the squat control is pressed and held, the sliding shovel state is entered and a specific function is triggered.
  • the specific function may be to trigger the first virtual object to perform a specific action in the virtual environment.
  • Step 1208 It is judged whether the playing of the specific function is over.
  • Step 1209 When the playback of the specific function ends, restore to the initial state.
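Steps 1201 to 1209 above amount to a linear trigger flow. A sketch of that flow, with state names paraphrased from FIG. 12 rather than taken from an actual implementation:

```python
def slide_shovel_flow(equipped: bool, running: bool, jump_clicked: bool,
                      squat_long_pressed_on_landing: bool) -> list:
    """Walk the FIG. 12 trigger flow and return the states passed through."""
    states = ["initial"]
    if not equipped:
        return states                      # first prop must be purchased/equipped
    if running:
        states.append("continuous_run")    # run-up for the slide
        if jump_clicked:
            states.append("jumping")
            if squat_long_pressed_on_landing:
                states.append("sliding_shovel")
                states.append("specific_function_playing")
                states.append("initial")   # restore after playback ends
    return states
```

The flow only reaches the sliding shovel state when every earlier check passes; failing any check leaves the object in whatever state it had reached, matching the branching in the flowchart.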
  • the virtual environment further includes a second virtual object.
  • when the first virtual object successfully attacks the second virtual object, the first virtual object is controlled to perform a display action.
  • FIG. 13 is a flowchart of a method for operating virtual props in a virtual environment provided by another exemplary embodiment of the present application. The method applied to a terminal is taken as an example for description. As shown in FIG. 13, the method includes:
  • a virtual environment interface is displayed.
  • the virtual environment interface includes a picture of a first virtual object observing the virtual environment, and the first virtual object is equipped with a first prop.
  • the first prop being a virtual axe is taken as an example for description.
  • the virtual axe is a prop assembled by the first virtual object before the start of the virtual game; or, the virtual axe is a prop picked up by the first virtual object in the virtual environment; or, the virtual axe is a prop obtained by the first virtual object during the virtual game.
  • in this embodiment, the virtual axe being a prop assembled by the first virtual object before the start of the virtual game is taken as an example for description.
  • Step 1302 Receive the sliding shovel state trigger operation, and control the first virtual object to be in the sliding shovel state in the virtual environment.
  • the sliding shovel state is used to indicate a state in which the first virtual object slides forward in the virtual environment in an inclined and squatting posture.
  • the inclined squat refers to a posture in which the first virtual object leans backward in the virtual environment with its two legs stretched out at different distances in front of the body.
  • Step 1303 In response to the first virtual object being in the sliding shovel state, an attack operation is received.
  • the control mode of the attack operation on the first virtual object includes any one of the following situations:
  • first, the first virtual object holds the first prop, and when an attack operation is received, it attacks with the prop it currently holds;
  • second, the first virtual object holds another prop; when the first virtual object is in the sliding shovel state and an attack operation is received, it switches to the first prop by default to attack.
  • Step 1304 Control the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • the first virtual object slides forward in the sliding shovel state and, while sliding, swings the first prop to perform a close-range attack, so that other virtual objects within the attack range are affected by the attack of the first prop.
  • Step 1305 In response to the first virtual object successfully attacking the second virtual object through the first prop, control the first virtual object to perform a display action in the virtual environment.
  • the display action is used to represent the attack result of the first virtual object on the second virtual object, that is, it indicates that the attack of the first virtual object on the second virtual object was successful.
  • the successful attack of the first virtual object on the second virtual object includes any one of the following situations:
  • first, a collision detection box is mounted on the first prop, collision detection between the first prop and the second virtual object is performed through the collision detection box, and in response to a collision between the collision detection box and the second virtual object, the damage value of the first prop to the second virtual object is determined.
  • second, when the first virtual object attacks the second virtual object through the first prop and the life value of the second virtual object drops to 0, it is determined that the first virtual object eliminates the second virtual object through the first prop.
  • the display action can be a custom action, a preset action, or a random action, which is not limited in the embodiment of the present application.
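Either success condition (a damage value is generated, or the target is eliminated) can gate the display action. A minimal sketch with hypothetical action names:

```python
def attack(display_actions: list, target_hp: int, damage: int) -> int:
    """Apply damage to the target; on a successful attack, queue a
    display action for the attacker.

    Success follows either condition above: a positive damage value is
    generated, or the target's life value drops to 0 (elimination).
    """
    new_hp = max(0, target_hp - damage)
    if damage > 0:
        action = "eliminate_flourish" if new_hp == 0 else "hit_flourish"
        display_actions.append(action)
    return new_hp
```

Queuing the action rather than playing it inline reflects that the display action (e.g. rotating the prop, as in FIG. 14) is an animation played after the hit resolves.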
  • the virtual environment interface 1400 includes a first virtual object 1410.
  • the first virtual object 1410 holds a first prop 1420 in its hand.
  • after a successful attack, the first virtual object 1410 rotates the hand-held first prop 1420 as a display action.
  • in the method provided in this embodiment, when the first virtual object successfully attacks the second virtual object through the first prop during the sliding shovel, the first virtual object is controlled to perform the display action. This is because, during the sliding shovel, after attacking the second virtual object, the first virtual object usually slides from one side of the second virtual object to the other, so the state of the second virtual object cannot be obtained directly.
  • through the display action, the user can determine whether the second virtual object was hit without rotating the viewing angle to observe, which improves the efficiency of human-computer interaction.
  • FIG. 15 is a flowchart of attacking a target in the sliding shovel process provided by an exemplary embodiment of the present application. As shown in FIG. 15, the process includes:
  • Step 1501 purchase and equip the first item.
  • the first item is an item purchased by the player in the game and equipped in the item assembly interface.
  • Step 1502 Determine whether the first virtual object is performing a sliding shovel.
  • Step 1503 When the first virtual object is performing a sliding shovel, it enters the sliding shovel state.
  • Step 1504 Determine whether fire is clicked.
  • the first virtual object attacks through the first prop.
  • Step 1505 When fire is clicked, enter the firing state.
  • the first item is a short-range attack item, so when clicking to fire, the first virtual object is controlled to swing the first item to attack.
  • Step 1506 it is judged whether the target is hit.
  • a collision detection box is mounted on the first item, and it is determined whether the target is hit by the collision between the collision detection box and the target.
  • Step 1507 When the target is hit, a specific function is triggered.
  • the specific function may be to trigger the first virtual object to perform a specific action in the virtual environment.
  • Step 1508 It is judged whether the playing of the specific function is over.
  • Step 1509 When the playback of the specific function ends, restore to the initial state.
  • FIG. 16 is a structural block diagram of a device for operating virtual props in a virtual environment provided by an exemplary embodiment of the present application.
  • the device is used in a terminal as an example for description.
  • the device includes: a display module 1610, a receiving module 1620, and a control module 1630;
  • the display module 1610 is configured to display a virtual environment interface, the virtual environment interface includes a screen for a first virtual object to observe the virtual environment, the first virtual object is equipped with a first prop, and the first prop is Short range attack props;
  • the receiving module 1620 is configured to receive a sliding shovel state trigger operation, and control the first virtual object to be in the sliding shovel state in the virtual environment, and the sliding shovel state is used to indicate that the first virtual object is tilted and squatted The state of the posture sliding forward in the virtual environment;
  • the receiving module 1620 is further configured to receive an attack operation in response to the first virtual object being in the sliding shovel state
  • the control module 1630 is configured to control the first virtual object to perform a short-range attack through the first prop in the sliding shovel state.
  • the receiving module 1620 is further configured to receive a running state trigger operation, and the running state trigger operation is used to control the first virtual object to continue to be in a running state;
  • the receiving module 1620 is further configured to receive a jump trigger operation during the running of the first virtual object, where the jump trigger operation is used to control the first virtual object to jump in the virtual environment;
  • the receiving module 1620 is further configured to receive a squat trigger operation as the sliding shovel state trigger operation during the jump process of the first virtual object, and the squat trigger operation is used to control the first virtual object Squat down in the virtual environment.
  • the jumping process includes a take-off phase, a landing phase, and a landing phase;
  • the receiving module 1620 is further configured to receive the squat trigger operation as the sliding shovel state trigger operation during the landing stage of the jumping process.
  • the virtual environment further includes a second virtual object
  • the control module 1630 is further configured to control the first virtual object to perform a display action in the virtual environment in response to the first virtual object successfully attacking the second virtual object through the first prop,
  • the display action is used to represent the result of the attack of the first virtual object on the second virtual object.
  • the control module 1630 is further configured to control the first virtual object to perform the display action in the virtual environment in response to the first virtual object generating a damage value to the second virtual object through the first prop;
  • the control module 1630 is further configured to control the first virtual object to perform the display action in the virtual environment in response to the first virtual object to eliminate the second virtual object through the first prop.
  • a collision detection box is mounted on the first prop
  • the control module 1630 includes:
  • the detection unit 1631 is configured to perform collision detection between the first prop and the second virtual object through the collision detection box;
  • the determining unit 1632 is configured to determine that the first item causes the damage value to the second virtual object in response to a collision between the collision detection box and the second virtual object.
  • the display module 1610 is further configured to display a prop assembly interface, the prop assembly interface includes candidate props, and the candidate props include the first prop and the second prop, wherein ,
  • the second prop is a default assembly prop of the first virtual object, and the attack range of the first prop is larger than the attack range of the second prop;
  • the receiving module 1620 is further configured to receive an assembling operation on the first prop on the prop assembling interface, and the assembling operation is used to assemble the first prop to the first virtual object.
  • in summary, with the device provided in this embodiment, when the first prop is applied, the first virtual object is first controlled to be in the sliding shovel state, and attacks through the first prop in the sliding shovel state.
  • since the body of the first virtual object in the sliding shovel state is closer to the ground than in the normal standing or walking state, when the first prop is swung, its swing path is at a height that other virtual objects cannot directly evade; this improves the attack efficiency of the first virtual object attacking through the first prop, thereby improving the efficiency of human-computer interaction when the first virtual object attacks through the first prop.
  • the operating device for virtual props in a virtual environment provided in the above embodiment is illustrated only by the division of the above functional modules.
  • in practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
  • the operating device for virtual props in the virtual environment provided by the above-mentioned embodiment and the embodiment of the method for operating virtual props in the virtual environment belong to the same concept. For the specific implementation process, please refer to the method embodiment, which will not be repeated here.
  • FIG. 18 shows a structural block diagram of a terminal 1800 provided by an exemplary embodiment of the present invention.
  • the terminal 1800 may be: a smartphone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, or a desktop computer.
  • the terminal 1800 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal 1800 includes a processor 1801 and a memory 1802.
  • the processor 1801 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 1801 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), or PLA (Programmable Logic Array).
  • the processor 1801 may also include a main processor and a coprocessor.
  • the main processor is a processor used to process data in the wake state, also called a CPU (Central Processing Unit, central processing unit); the coprocessor is A low-power processor used to process data in the standby state.
  • the processor 1801 may be integrated with a GPU (Graphics Processing Unit, image processor), and the GPU is used to render and draw content that needs to be displayed on the display screen.
  • the processor 1801 may further include an AI (Artificial Intelligence) processor, and the AI processor is used to process computing operations related to machine learning.
  • the memory 1802 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 1802 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1802 is used to store at least one instruction, and the at least one instruction is used to be executed by the processor 1801 to implement the virtual environment provided by the method embodiment of the present application. The operation method of virtual props.
  • the terminal 1800 may optionally further include: a peripheral device interface 1803 and at least one peripheral device.
  • the processor 1801, the memory 1802, and the peripheral device interface 1803 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 1803 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 1804, a display screen 1805, a camera component 1806, an audio circuit 1807, a positioning component 1808, and a power supply 1809.
  • the peripheral device interface 1803 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1801 and the memory 1802.
  • in some embodiments, the processor 1801, the memory 1802, and the peripheral device interface 1803 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1801, the memory 1802, and the peripheral device interface 1803 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • the radio frequency circuit 1804 is used for receiving and transmitting RF (Radio Frequency, radio frequency) signals, also called electromagnetic signals.
  • the radio frequency circuit 1804 communicates with a communication network and other communication devices through electromagnetic signals.
  • the radio frequency circuit 1804 converts electrical signals into electromagnetic signals for transmission, or converts received electromagnetic signals into electrical signals.
  • the radio frequency circuit 1804 includes: an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a user identity module card, and so on.
  • the radio frequency circuit 1804 can communicate with other terminals through at least one wireless communication protocol.
  • the wireless communication protocol includes but is not limited to: World Wide Web, Metropolitan Area Network, Intranet, various generations of mobile communication networks (2G, 3G, 4G, and 5G), wireless local area network and/or WiFi (Wireless Fidelity, wireless fidelity) network.
  • the radio frequency circuit 1804 may also include a circuit related to NFC (Near Field Communication), which is not limited in this application.
  • the display screen 1805 is used to display a UI (User Interface).
  • the UI can include graphics, text, icons, videos, and any combination thereof.
  • the display screen 1805 also has the ability to collect touch signals on or above the surface of the display screen 1805.
  • the touch signal may be input to the processor 1801 as a control signal for processing.
  • the display screen 1805 may also be used to provide virtual buttons and/or virtual keyboards, also called soft buttons and/or soft keyboards.
  • in some embodiments, there may be one display screen 1805, provided on the front panel of the terminal 1800; in other embodiments, there may be at least two display screens 1805, respectively arranged on different surfaces of the terminal 1800 or in a folded design; in still other embodiments, the display screen 1805 may be a flexible display screen disposed on a curved or folding surface of the terminal 1800. Furthermore, the display screen 1805 may also have a non-rectangular irregular shape, that is, a special-shaped screen.
  • the display screen 1805 may be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1806 is used to capture images or videos.
  • the camera assembly 1806 includes a front camera and a rear camera.
  • the front camera is set on the front panel of the terminal, and the rear camera is set on the back of the terminal.
  • the camera assembly 1806 may also include a flashlight.
  • the flash can be a single-color flash or a dual-color flash. Dual color temperature flash refers to a combination of warm light flash and cold light flash, which can be used for light compensation under different color temperatures.
  • the audio circuit 1807 may include a microphone and a speaker.
  • the microphone is used to collect sound waves of the user and the environment, and convert the sound waves into electrical signals and input them to the processor 1801 for processing, or input to the radio frequency circuit 1804 to implement voice communication. For the purpose of stereo collection or noise reduction, there may be multiple microphones, which are respectively set in different parts of the terminal 1800.
  • the microphone can also be an array microphone or an omnidirectional collection microphone.
  • the speaker is used to convert the electrical signal from the processor 1801 or the radio frequency circuit 1804 into sound waves.
  • the speaker can be a traditional thin-film speaker or a piezoelectric ceramic speaker.
  • the speaker When the speaker is a piezoelectric ceramic speaker, it can not only convert the electrical signal into human audible sound waves, but also convert the electrical signal into human inaudible sound waves for distance measurement and other purposes.
  • the audio circuit 1807 may also include a headphone jack.
  • the positioning component 1808 is used to locate the current geographic location of the terminal 1800 to implement navigation or LBS (Location Based Service, location-based service).
  • the positioning component 1808 may be a positioning component based on the GPS (Global Positioning System, Global Positioning System) of the United States, the Beidou system of China, or the Galileo system of Russia.
  • the power supply 1809 is used to supply power to various components in the terminal 1800.
  • the power source 1809 may be alternating current, direct current, disposable batteries, or rechargeable batteries.
  • the rechargeable battery may be a wired rechargeable battery or a wireless rechargeable battery.
  • a wired rechargeable battery is a battery charged through a wired line
  • a wireless rechargeable battery is a battery charged through a wireless coil.
  • the rechargeable battery can also be used to support fast charging technology.
  • the terminal 1800 further includes one or more sensors 1810.
  • the one or more sensors 1810 include, but are not limited to: an acceleration sensor 1811, a gyroscope sensor 1812, a pressure sensor 1813, a fingerprint sensor 1814, an optical sensor 1815, and a proximity sensor 1816.
  • the acceleration sensor 1811 can detect the magnitude of acceleration on the three coordinate axes of the coordinate system established by the terminal 1800.
  • the acceleration sensor 1811 can be used to detect the components of gravitational acceleration on three coordinate axes.
  • the processor 1801 may control the display screen 1805 to display the user interface in a horizontal view or a vertical view according to the gravity acceleration signal collected by the acceleration sensor 1811.
  • the acceleration sensor 1811 may also be used for the collection of game or user motion data.
  • the gyroscope sensor 1812 can detect the body direction and rotation angle of the terminal 1800, and the gyroscope sensor 1812 can cooperate with the acceleration sensor 1811 to collect the user's 3D actions on the terminal 1800. Based on the data collected by the gyroscope sensor 1812, the processor 1801 can implement the following functions: motion sensing (such as changing the UI according to the user's tilt operation), image stabilization during shooting, game control, and inertial navigation.
  • the pressure sensor 1813 may be disposed on the side frame of the terminal 1800 and/or the lower layer of the display screen 1805.
  • the processor 1801 performs left and right hand recognition or quick operation according to the holding signal collected by the pressure sensor 1813.
  • the processor 1801 controls the operability controls on the UI interface according to the user's pressure operation on the display screen 1805.
  • the operability control includes at least one of a button control, a scroll bar control, an icon control, and a menu control.
  • the fingerprint sensor 1814 is used to collect the user's fingerprint.
  • the processor 1801 identifies the user's identity according to the fingerprint collected by the fingerprint sensor 1814, or the fingerprint sensor 1814 identifies the user's identity according to the collected fingerprint. When it is recognized that the user's identity is a trusted identity, the processor 1801 authorizes the user to perform related sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, paying, and changing settings.
  • the fingerprint sensor 1814 may be provided on the front, back or side of the terminal 1800. When a physical button or a manufacturer logo is provided on the terminal 1800, the fingerprint sensor 1814 can be integrated with the physical button or the manufacturer logo.
  • the optical sensor 1815 is used to collect the ambient light intensity.
  • the processor 1801 may control the display brightness of the display screen 1805 according to the intensity of the ambient light collected by the optical sensor 1815. Specifically, when the ambient light intensity is high, the display brightness of the display screen 1805 is increased; when the ambient light intensity is low, the display brightness of the display screen 1805 is decreased.
  • the processor 1801 may also dynamically adjust the shooting parameters of the camera assembly 1806 according to the ambient light intensity collected by the optical sensor 1815.
  • the proximity sensor 1816, also called a distance sensor, is usually disposed on the front panel of the terminal 1800.
  • the proximity sensor 1816 is used to collect the distance between the user and the front of the terminal 1800.
  • when the proximity sensor 1816 detects that the distance between the user and the front of the terminal 1800 gradually decreases, the processor 1801 controls the display screen 1805 to switch from the screen-on state to the screen-off state; when the proximity sensor 1816 detects that the distance gradually increases, the processor 1801 controls the display screen 1805 to switch from the screen-off state to the screen-on state.
  • those skilled in the art can understand that the structure shown in FIG. 18 does not constitute a limitation on the terminal 1800, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different arrangement of components.
  • An embodiment of the present application also provides a computer device that includes a memory and a processor.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded by the processor to implement the method for operating virtual props in the virtual environment as described in any of the above embodiments.
  • the embodiment of the present application also provides a computer-readable storage medium, the readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for operating virtual props in the virtual environment as described in any of the foregoing embodiments.
  • This application also provides a computer program product which, when run on a computer, causes the computer to execute the method for operating virtual props in a virtual environment as described in any of the above embodiments.
  • the program can be stored in a computer-readable storage medium.
  • the medium may be a computer-readable storage medium included in the memory in the foregoing embodiment; or may be a computer-readable storage medium that exists alone and is not assembled into the terminal.
  • the computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for operating virtual props in the virtual environment as described in any of the above embodiments.
  • the computer-readable storage medium may include: read only memory (ROM, Read Only Memory), random access memory (RAM, Random Access Memory), solid state drive (SSD, Solid State Drives), optical disks, and the like.
  • random access memory may include resistive random access memory (ReRAM, Resistance Random Access Memory) and dynamic random access memory (DRAM, Dynamic Random Access Memory).
  • the program can be stored in a computer-readable storage medium.
  • the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

一种虚拟环境中虚拟道具的操作方法、装置、设备及可读介质,涉及虚拟环境领域。该方法包括:显示虚拟环境界面(800),虚拟环境界面(800)中包括第一虚拟对象(810)对虚拟环境进行观察的画面,第一虚拟对象(810)装配有第一道具(820)(601),第一道具(820)为近程攻击道具;接收滑铲状态触发操作,控制第一虚拟对象(810)在虚拟环境中处于滑铲状态(602);响应于第一虚拟对象(810)处于滑铲状态,接收攻击操作(603);控制第一虚拟对象(810)在滑铲状态下通过第一道具(820)进行近程攻击(604)。在对第一道具(820)进行应用时,首先将第一虚拟对象(810)控制处于滑铲状态,由于滑铲状态下第一虚拟对象(810)的身体位置与普通站立或行走状态下的身体位置相对贴近地面,其他虚拟对象无法直接对第一道具(820)的攻击进行规避,从而提高了第一虚拟对象(810)通过第一道具(820)进行攻击的人机交互效率。

Description

虚拟环境中虚拟道具的操作方法、装置、设备及可读介质
本申请要求于2020年1月15日提交的申请号为202010042540.1、发明名称为“虚拟环境中虚拟道具的操作方法、装置、设备及可读介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及虚拟环境领域,特别涉及一种虚拟环境中虚拟道具的操作方法、装置、设备及可读介质。
背景技术
在包括虚拟环境的应用程序中,通常需要通过控制虚拟环境中的虚拟对象在虚拟环境中进行活动,如:步行、驾驶、游泳、作战、捡拾物品等,其中,虚拟对象能够在虚拟环境中对虚拟道具进行应用从而实现作战过程。然而,相关技术中提供的虚拟对象使用虚拟道具的方案,人机交互过程较为繁琐。
发明内容
本申请实施例提供了一种虚拟环境中虚拟道具的操作方法、装置、设备及可读介质,可以提高第一虚拟对象通过第一道具进行攻击的攻击效率。所述技术方案如下:
一方面,提供了一种虚拟环境中虚拟道具的操作方法,所述方法包括:
显示虚拟环境界面,所述虚拟环境界面中包括第一虚拟对象对所述虚拟环境进行观察的画面,所述第一虚拟对象装配有第一道具,所述第一道具为近程攻击道具;
接收滑铲状态触发操作,控制所述第一虚拟对象在所述虚拟环境中处于滑铲状态,所述滑铲状态用于表示所述第一虚拟对象以倾斜下蹲的姿势在所述虚拟环境中滑动前进的状态;
响应于所述第一虚拟对象处于所述滑铲状态,接收攻击操作;
控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击。
另一方面,提供了一种虚拟环境中虚拟道具的操作装置,所述装置包括:
显示模块,用于显示虚拟环境界面,所述虚拟环境界面中包括第一虚拟对象对所述虚拟环境进行观察的画面,所述第一虚拟对象装配有第一道具,所述第一道具为近程攻击道具;
接收模块,用于接收滑铲状态触发操作,控制所述第一虚拟对象在所述虚拟环境中处于滑铲状态,所述滑铲状态用于表示所述第一虚拟对象以倾斜下蹲的姿势在所述虚拟环境中滑动前进的状态;
所述接收模块,还用于响应于所述第一虚拟对象处于所述滑铲状态,接收攻击操作;
控制模块,用于控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击。
另一方面,提供了一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述本申请实施例中任一所述的虚拟环境中虚拟道具的操作。
另一方面,提供了一种计算机可读存储介质,所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述本申请实施例中任一所述的虚拟环境中虚拟道具的操作。
另一方面,提供了一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得计算机执行如上述本申请实施例中任一所述的虚拟环境中虚拟道具的操作。
本申请实施例提供的技术方案带来的有益效果至少包括:
在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,提高第一虚拟对象通过第一道具进行攻击的攻击效率,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请一个示例性实施例提供的通过远程攻击道具对敌对虚拟对象进行攻击的示意图;
图2是本申请一个示例性实施例提供的通过近程攻击道具对敌对虚拟对象进行攻击的示意图;
图3是本申请一个示例性实施例提供的终端的结构框图;
图4是本申请一个示例性实施例提供的实施环境示意图;
图5是本申请实施例提供的虚拟道具操作方法的用户界面示意图;
图6是本申请一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图;
图7是基于图6示出的实施例提供的道具装备过程的界面示意图;
图8是基于图6示出的实施例提供的第一虚拟对象处于滑铲状态的界面示意图;
图9是基于图6示出的实施例提供的第一道具对应的碰撞检测盒的示意图;
图10是本申请另一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图;
图11是基于图10示出的实施例提供的滑铲状态的触发过程示意图;
图12是本申请一个示例性实施例提供的滑铲状态的触发过程示意图;
图13是本申请另一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图;
图14是基于图13示出的实施例提供的展示动作的界面示意图;
图15是本申请一个示例性实施例提供的滑铲过程中对目标进行攻击的流程图;
图16是本申请一个示例性实施例提供的虚拟环境中虚拟道具的操作装置的结构框图;
图17是本申请另一个示例性实施例提供的虚拟环境中虚拟道具的操作装置的结构框图;
图18是本申请一个示例性的实施例提供的终端的结构框图。
具体实施方式
为使本申请的目的、技术方案和优点更加清楚,下面将结合附图对本申请实施方式作进一步地详细描述。
首先,对本申请实施例中涉及的名词进行简单介绍:
虚拟环境:是应用程序在终端上运行时显示(或提供)的虚拟环境。该虚拟环境可以是对真实世界的仿真环境,也可以是半仿真半虚构的环境,还可以是纯虚构的环境。虚拟环境可以是二维虚拟环境、2.5维虚拟环境和三维虚拟环境中的任意一种,本申请对此不加以限定。下述实施例以虚拟环境是三维虚拟环境来举例说明。
虚拟对象:是指虚拟环境中的可活动对象。该可活动对象可以是虚拟人物、虚拟动物、动漫人物等,比如:在三维虚拟环境中显示的人物、动物、植物、油桶、墙壁、石块等。可选地,虚拟对象是基于动画骨骼技术创建的三维立体模型。每个虚拟对象在三维虚拟环境中具有自身的形状和体积,占据三维虚拟环境中的一部分空间。可选的,本申请实施例中分为目标虚拟对象和敌对虚拟对象,其中,目标虚拟对象为玩家当前控制的虚拟对象,敌对虚拟对象为对该目标虚拟对象发起攻击的虚拟对象,其中,该敌对虚拟对象对目标虚拟对象发起的攻击可以是自发的,也即,当目标虚拟对象出现在敌对虚拟对象的视线范围内时,该敌对虚拟对象即发起对该目标虚拟对象的攻击;或,该敌对虚拟对象对目标虚拟对象发起的攻击也可以是被动的,也即,当目标虚拟对象对该敌对虚拟对象进行攻击后,该敌对虚拟对象根据受到的攻击向目标虚拟对象发起攻击。可选地,该敌对虚拟对象可以是系统提供的人工智能(Artificial Intelligence,AI)攻击对象,也可以是其他玩家所控制的虚拟对象。
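示意性的,上述敌对虚拟对象"主动攻击(目标出现在视线范围内)或被动还击(受到攻击后)"的判定逻辑,可用如下 Python 代码示意(函数名与参数均为本示例的假设,并非本申请限定的实现):

```python
def hostile_should_attack(target_in_sight: bool, was_attacked: bool) -> bool:
    """敌对虚拟对象的攻击判定(示意):
    目标虚拟对象出现在视线范围内时主动发起攻击,
    或在受到目标虚拟对象的攻击后进行还击。"""
    return target_in_sight or was_attacked
```

该判定可在敌对虚拟对象的每帧逻辑更新中调用,任一条件满足即向目标虚拟对象发起攻击。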
攻击道具:是指在虚拟环境中由虚拟对象持有的用于对其他虚拟对象进行攻击的道具,其中,该其他虚拟对象可以为与目标虚拟对象呈敌对状态的虚拟对象,也可以为既包括与目标虚拟对象呈敌对状态的虚拟对象,又包括与目标虚拟对象呈队友状态的虚拟对象。可选地,该攻击道具还可以分为远程攻击道具和近程攻击道具。其中,远程攻击道具是指通过对发射物进行发射,从而实现攻击过程的道具,其中,该发射物可以是通过道具本体实现发射的,如:虚拟枪械、虚拟弓箭等,也可以是攻击道具本身,如:石子、沙袋等。近程攻击道具是指由虚拟对象直接以挥动方式进行控制从而实现攻击过程的道具,如:虚拟刀具、虚拟棍棒、虚拟斧子、虚拟平底锅等。
在相关技术中,虚拟对象可以在虚拟环境中对捡拾得到的虚拟道具进行应用从而进行作战,也可以对进入对局时所装配的虚拟道具进行应用从而进行作战,如在进入虚拟对局时,每个虚拟对象对应装配有虚拟小刀,在虚拟对局中,虚拟对象通过挥动虚拟小刀对其他虚拟对象进行攻击。
然而,上述对虚拟道具的应用过程中,由于虚拟道具的应用方式皆为已知的,较易对虚拟道具的攻击过程进行规避,如:对虚拟小刀的攻击范围进行规避,从而导致虚拟对象需要通过多次攻击淘汰其他虚拟对象,如:多次挥动小刀并调整位置对其他虚拟对象进行攻击,人机交互过程较为繁琐。
示意性的,对远程攻击道具和近程攻击道具分别进行说明,请参考图1和图2。
图1是本申请一个示例性实施例提供的通过远程攻击道具对敌对虚拟对象进行攻击的示意图,如图1所示,在虚拟环境界面100中包括虚拟枪械110和敌对虚拟对象120,该虚拟环境界面100为以虚拟对象的第一人称视角对虚拟环境进行观察的画面,虚拟对象控制该虚拟枪械110瞄准该敌对虚拟对象120进行射击,从而实现对该敌对虚拟对象120进行远程攻击。
图2是本申请一个示例性实施例提供的通过近程攻击道具对敌对虚拟对象进行攻击的示意图,如图2所示,在虚拟环境界面200中包括虚拟道具210和敌对虚拟对象220,该虚拟环境界面为以虚拟对象的第一人称视角对虚拟环境界面进行观察的画面,该虚拟道具210为近程攻击道具,由于敌对虚拟对象220与虚拟对象距离较近,故该虚拟对象能够通过控制该虚拟道具210进行挥动,从而对该敌对虚拟对象220进行近程攻击。
滑铲:用于表示虚拟对象以倾斜下蹲的姿势在虚拟环境中滑动前进的方式,可选地,倾斜下蹲是指虚拟对象在虚拟环境中向后呈仰体,并将两条腿在身体前方不同距离位置处进行支撑的姿势。可选地,滑铲状态下,虚拟对象的前进速度比正常步行前进的速度快,可选地,滑铲状态下,虚拟对象的前进速度比正常跑步前进的速度快。
本申请中提供的方法可以应用于虚拟现实应用程序、三维地图程序、军事仿真程序、第一人称射击游戏(First-Person Shooting game,FPS)、第三人称射击游戏(Third-Person Shooting game,TPS)、多人在线战术竞技游戏(Multiplayer Online Battle Arena Games,MOBA)等,下述实施例是以在游戏中的应用来举例说明。
基于虚拟环境的游戏往往由一个或多个游戏世界的地图构成,游戏中的虚拟环境模拟现实世界的场景,用户可以操控游戏中的虚拟对象在虚拟环境中进行行走、跑步、跳跃、射击、格斗、驾驶、切换使用虚拟武器、使用虚拟武器攻击其他虚拟对象等动作,交互性较强,并且多个用户可以在线组队进行竞技游戏。用户控制虚拟对象使用虚拟武器对第一虚拟对象发起攻击时,用户根据第一虚拟对象所在的位置,或操作习惯选择合适的虚拟武器对虚拟对象进行攻击。其中,虚拟武器包括枪械武器、近身武器、投掷类武器中的至少一种,其中,枪械武器包括步枪、狙击枪、手枪、霰弹枪等类型的枪械,近身武器包括匕首、刀、斧子、剑、棍子、锅(比如,平底锅)中的至少一种类型,投掷类武器包括普通手雷、粘性手雷、闪光弹、烟雾弹等类型。
本申请中的终端可以是台式计算机、膝上型便携计算机、手机、平板电脑、电子书阅读器、MP3(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)播放器、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器等等。该终端中安装和运行有支持虚拟环境的应用程序,比如支持三维虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏中的任意一种。可选地,该应用程序可以是单机版的应用程序,比如单机版的3D游戏程序,也可以是网络联机版的应用程序。
图3示出了本申请一个示例性实施例提供的电子设备的结构框图。该电子设备300包括:操作系统320和应用程序322。
操作系统320是为应用程序322提供对计算机硬件的安全访问的基础软件。
应用程序322是支持虚拟环境的应用程序。可选地,应用程序322是支持三维虚拟环境的应用程序。该应用程序322可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。该应用程序322可以是单机版的应用程序,比如单机版的3D游戏程序。
图4示出了本申请一个示例性实施例提供的计算机系统的结构框图。该计算机系统400包括:第一设备420、服务器440和第二设备460。
第一设备420安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、TPS游戏、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第一设备420是第一用户使用的设备,第一用户使用第一设备420控制位于虚拟环境中的第一虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第一虚拟对象是第一虚拟人物,比如仿真人物角色或动漫人物角色。
第一设备420通过无线网络或有线网络与服务器440相连。
服务器440包括一台服务器、多台服务器、云计算平台和虚拟化中心中的至少一种。服务器440用于为支持三维虚拟环境的应用程序提供后台服务。可选地,服务器440承担主要计算工作,第一设备420和第二设备460承担次要计算工作;或者,服务器440承担次要计算工作,第一设备420和第二设备460承担主要计算工作;或者,服务器440、第一设备420和第二设备460三者之间采用分布式计算架构进行协同计算。
第二设备460安装和运行有支持虚拟环境的应用程序。该应用程序可以是虚拟现实应用程序、三维地图程序、军事仿真程序、FPS游戏、MOBA游戏、多人枪战类生存游戏中的任意一种。第二设备460是第二用户使用的设备,第二用户使用第二设备460控制位于虚拟环境中的第二虚拟对象进行活动,该活动包括但不限于:调整身体姿态、爬行、步行、奔跑、骑行、跳跃、驾驶、拾取、射击、攻击、投掷中的至少一种。示意性的,第二虚拟对象是第二虚拟人物,比如仿真人物角色或动漫人物角色。
可选地,第一虚拟人物和第二虚拟人物处于同一虚拟环境中。可选地,第一虚拟人物和第二虚拟人物可以属于同一个队伍、同一个组织、具有好友关系或具有临时性的通讯权限。可选地,第一虚拟人物和第二虚拟人物也可以属于不同队伍、不同组织、或具有敌对性的两个团体。
可选地,第一设备420和第二设备460上安装的应用程序是相同的,或两个设备上安装的应用程序是不同控制系统平台的同一类型应用程序。第一设备420可以泛指多个设备中的一个,第二设备460可以泛指多个设备中的一个,本实施例仅以第一设备420和第二设备460来举例说明。第一设备420和第二设备460的设备类型相同或不同,该设备类型包括:游戏主机、台式计算机、智能手机、平板电脑、电子书阅读器、MP3播放器、MP4播放器和膝上型便携计算机中的至少一种。以下实施例以设备是台式计算机来举例说明。
本领域技术人员可以知晓,上述设备的数量可以更多或更少。比如上述设备可以仅为一个,或者上述设备为几十个或几百个,或者更多数量。本申请实施例对设备的数量和设备类型不加以限定。
本申请实施例提供了一种虚拟环境中虚拟道具的操作方法,图5示出了本申请实施例提供的虚拟道具操作方法的用户界面示意图,以该虚拟道具为虚拟斧子为例进行说明,如图5所示:
在虚拟环境界面500中显示有虚拟对象510,首先触发虚拟对象510以滑铲状态在虚拟环境中前进,可选地,虚拟对象510持有虚拟斧子511,虚拟斧子511用于在虚拟环境中对其他虚拟对象进行攻击,当虚拟对象510处于滑铲状态,且接收到攻击操作时,虚拟对象510在虚拟环境中以滑铲状态挥动虚拟斧子511进行攻击。
结合上述名词简介以及实施环境说明,对本申请实施例提供的虚拟环境中虚拟道具的操作方法进行说明,图6是本申请一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图,以该方法应用于终端中为例进行说明,如图6所示,该方法包括:
步骤601,显示虚拟环境界面,虚拟环境界面中包括第一虚拟对象对虚拟环境进行观察的画面,第一虚拟对象装配有第一道具。
可选地,第一道具为近程攻击道具。可选地,第一道具为第一虚拟对象当前所手持的道具,也可以是第一虚拟对象携带但未手持的道具。示意性的,本实施例中,以第一道具为虚拟斧子为例进行说明。虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具;或,虚拟斧子为第一虚拟对象在虚拟环境中捡拾得到的道具;或,虚拟斧子为第一虚拟对象在虚拟对局中通过兑换的方式获取的道具,本申请实施例中,以虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具为例进行说明。
可选地,在虚拟对局开局前,显示道具装配界面,道具装配界面中包括候选道具,候选道具中包括第一道具和第二道具,其中,第二道具为第一虚拟对象的默认装配道具,第一道具的攻击范围大于第二道具的攻击范围,在道具装配界面上接收对第一道具的装配操作,该装配操作用于向第一虚拟对象装配第一道具。
可选地,第一道具为在游戏中通过交换资源交换得到的道具。
示意性的,请参考图7,在道具装配界面700中显示有候选道具选择区域710,其中包括候选道具711和候选道具712,其中,候选道具711为默认装备于第一虚拟对象的道具,候选道具712为玩家在游戏中通过购买的方式获取的道具,对候选道具712进行选择后,在道具装配界面700中显示有候选道具712的简介,在装备控件720上进行选择后,实现对候选道具712的装备操作,也即向第一虚拟对象提供在虚拟对战中应用候选道具712的能力。
值得注意的是,上述实施例中,以第一道具为近程攻击道具为例进行说明,实际操作中,第一道具也可以为远程攻击道具,如:虚拟枪械、虚拟魔法棒等。本实施例中,以第一道具为虚拟斧子为例进行说明。
可选地,虚拟环境界面中的画面可以是以第一虚拟对象的第一人称视角对虚拟环境进行观察的画面,也可以是以第一虚拟对象的第三人称视角对虚拟环境进行观察的画面。
步骤602,接收滑铲状态触发操作,控制第一虚拟对象在虚拟环境中处于滑铲状态。
可选地,滑铲状态用于表示第一虚拟对象以倾斜下蹲的姿势在虚拟环境中滑动前进的状态。可选地,倾斜下蹲是指第一虚拟对象在虚拟环境中向后呈仰体,并将两条腿在身体前方不同距离位置处进行支撑的姿势。可选地,滑铲状态下,虚拟对象的前进速度比正常步行前进的速度快,可选地,滑铲状态下,虚拟对象的前进速度比正常跑步前进的速度快。
可选地,第一虚拟对象在虚拟环境中处于滑铲状态的单次持续时长包括如下情况中的至少一种:
第一,第一虚拟对象处于滑铲状态的单次持续时长对应有时长限制,当持续时长达到时长限制时,自动将第一虚拟对象的状态恢复至滑铲状态之前的状态,如:第一虚拟对象首先进入持续奔跑状态,并切换至滑铲状态,则当滑铲状态达到时长限制时,自动恢复第一虚拟对象的状态至持续奔跑状态;
第二,第一虚拟对象处于滑铲状态的单次持续时长对应有时长限制,当持续时长达到时长限制时,自动将第一虚拟对象的状态恢复至预设状态,如:站立状态;
第三,第一虚拟对象处于滑铲状态的单次持续时长根据滑铲状态的控制操作确定,当控制操作结束,自动将第一虚拟对象的状态恢复至滑铲状态之前的状态,如:第一虚拟对象首先进入持续奔跑状态,并当接收到在下蹲控件的长按操作时,控制第一虚拟对象切换至滑铲状态,当长按操作结束时,自动恢复第一虚拟对象的状态至持续奔跑状态;
第四,第一虚拟对象处于滑铲状态的单次持续时长根据滑铲状态的控制操作确定,当控制操作结束,自动将第一虚拟对象的状态恢复至预设状态,如:站立状态。
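示意性的,上述四种单次持续时长的恢复策略可归纳为"触发条件(达到时长限制/控制操作结束)×恢复目标(滑铲前的状态/预设状态)"两个维度,以下 Python 代码给出一种可能的示意实现(状态枚举与函数名均为本示例假设):

```python
from enum import Enum, auto

class CharState(Enum):
    STAND = auto()   # 站立状态(预设状态)
    RUN = auto()     # 持续奔跑状态
    SLIDE = auto()   # 滑铲状态

def state_after_slide(prev_state: CharState, restore_to_previous: bool) -> CharState:
    """滑铲结束(达到时长限制或控制操作结束)时的状态恢复:
    restore_to_previous 为 True 时恢复至滑铲前的状态(情况一、三),
    否则恢复至预设的站立状态(情况二、四)。"""
    return prev_state if restore_to_previous else CharState.STAND
```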
可选地,本申请实施例中,以第一虚拟对象持有第一道具进入滑铲状态,且第一道具为虚拟斧子为例进行说明,示意性的,请参考图8,在虚拟环境界面800中包括第一虚拟对象810,且第一虚拟对象810的手部持有虚拟斧子820,当接收到滑铲状态触发操作时,第一虚拟对象810在虚拟环境中持有虚拟斧子820的同时处于滑铲状态。
步骤603,响应于第一虚拟对象处于滑铲状态,接收攻击操作。
可选地,攻击操作对第一虚拟对象的控制方式包括如下情况中的任意一种:
第一,第一虚拟对象持有第一道具,当接收到攻击操作时,通过第一虚拟对象当前持有的道具进行攻击;
第二,第一虚拟对象持有其他道具,当第一虚拟对象处于滑铲状态并接收到攻击操作时,默认切换至第一道具进行攻击,示意性的,第一道具为背负在第一虚拟对象肩部的虚拟斧子,则当接收到攻击操作时,第一虚拟对象从背部将虚拟斧子切换至手部,并通过虚拟斧子进行攻击。
步骤604,控制第一虚拟对象在滑铲状态下通过第一道具进行近程攻击。
可选地,第一虚拟对象在滑铲状态下向前滑行,并在滑行的同时挥动第一道具进行近程攻击,当其他虚拟对象在第一道具的挥动路径上时,则其他虚拟对象受到第一道具的攻击。
示意性的,虚拟环境中还包括第二虚拟对象,第一道具上挂载有碰撞检测盒,通过碰撞检测盒对第一道具与第二虚拟对象之间进行碰撞检测,响应于碰撞检测盒与第二虚拟对象之间存在碰撞情况,确定第一道具对第二虚拟对象产生伤害值。
示意性的,请参考图9,虚拟环境界面900中包括第一虚拟对象910,第一虚拟对象910手部持有虚拟斧子920,在滑铲过程中,挥动虚拟斧子920进行近程攻击,沿虚拟斧子920的挥动路径对应有碰撞检测盒930的移动路径,与碰撞检测盒930之间存在碰撞情况的虚拟对象被虚拟斧子920攻击。可选地,当被攻击的虚拟对象为第一虚拟对象910的敌对虚拟对象,则虚拟斧子920对被攻击的虚拟对象产生伤害值。
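示意性的,上述碰撞检测盒的碰撞检测可以基于轴对齐包围盒(AABB)相交测试实现:沿第一道具的挥动路径对碰撞检测盒逐帧采样,任一采样与第二虚拟对象的包围盒相交即判定命中。以下 Python 代码为一种可能的简化实现(包围盒表示方式与函数名均为本示例假设):

```python
def aabb_intersect(a, b):
    """判断两个三维 AABB 是否相交;
    包围盒以 (min_x, min_y, min_z, max_x, max_y, max_z) 表示。"""
    return all(a[i] <= b[i + 3] and b[i] <= a[i + 3] for i in range(3))

def melee_hit(hitbox_samples, target_box):
    """沿第一道具挥动路径对碰撞检测盒逐帧采样,
    任一采样与第二虚拟对象的包围盒相交即判定命中。"""
    return any(aabb_intersect(box, target_box) for box in hitbox_samples)
```

实际引擎中碰撞检测盒通常由物理引擎挂载并以回调方式通知碰撞事件,此处仅示意其几何判定原理。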
综上所述,本实施例提供的虚拟环境中虚拟道具的操作方法,在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,提高第一虚拟对象通过第一道具进行攻击的攻击效率,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
在一个可选的实施例中,上述滑铲状态的触发操作是通过持续奔跑、跳跃等过程结合实现的,图10是本申请另一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图,以该方法应用于终端中为例进行说明,如图10所示,该方法包括:
步骤1001,显示虚拟环境界面,虚拟环境界面中包括第一虚拟对象对虚拟环境进行观察的画面,第一虚拟对象装配有第一道具。
示意性的,本实施例中,以第一道具为虚拟斧子为例进行说明。虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具;或,虚拟斧子为第一虚拟对象在虚拟环境中捡拾得到的道具;或,虚拟斧子为第一虚拟对象在虚拟对局中通过兑换的方式获取的道具,本申请实施例中,以虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具为例进行说明。
步骤1002,接收奔跑状态触发操作,奔跑状态触发操作用于控制第一虚拟对象持续处于奔跑状态。
可选地,奔跑状态触发操作用于控制第一虚拟对象在终端不接收任何控制操作的状态下也能持续处于奔跑状态,可选地,奔跑状态触发操作包括如下情况中的任意一种:
第一,虚拟环境界面中包括持续奔跑控件,接收对持续奔跑控件的触发操作,控制第一虚拟对象以当前面对方向进行持续奔跑;
第二,虚拟环境界面中包括前进摇杆控件,当将前进摇杆控件向目标方向拖动预设距离时,控制第一虚拟对象在虚拟环境中前进,当将前进摇杆控件向目标方向拖动至目标位置时,控制第一虚拟对象在虚拟环境中沿摇杆方向持续奔跑。
步骤1003,在第一虚拟对象的奔跑过程中,接收跳跃触发操作,跳跃触发操作用于控制第一虚拟对象在虚拟环境中跳跃。
可选地,虚拟环境界面中包括跳跃控件,当接收到在跳跃控件上的触发操作时,控制第一虚拟对象在虚拟环境中跳跃。
步骤1004,在第一虚拟对象的跳跃过程中,接收下蹲触发操作作为滑铲状态触发操作。
可选地,下蹲触发操作用于控制第一虚拟对象在虚拟环境中下蹲。
可选地,跳跃过程包括起跳阶段、降落阶段和落地阶段,其中,起跳阶段用于指示第一虚拟对象从地面开始跳起直至达到跳跃最高点的阶段,降落阶段用于指示从最高点开始降落直至触地之前的阶段,落地阶段用于指示降落过程中从触地开始直至跳跃完成的阶段。可选地,在跳跃过程的落地阶段,接收下蹲触发操作作为滑铲状态触发操作。
示意性的,请参考图11,在虚拟环境界面1100中包括第一虚拟对象1110、持续奔跑控件1120、跳跃控件1130、下蹲控件1140和攻击控件1150;首先对持续奔跑控件1120进行点击后,控制第一虚拟对象1110在虚拟环境中持续处于奔跑状态,在第一虚拟对象1110的奔跑过程中,点击跳跃控件1130,控制第一虚拟对象1110在虚拟环境中跳跃,并在第一虚拟对象1110的落地时刻点击下蹲控件1140,从而触发第一虚拟对象1110在虚拟环境中处于滑铲状态。
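示意性的,图11所示"持续奔跑—跳跃—落地阶段长按下蹲"的滑铲触发条件,可归纳为如下判定函数(阶段标识 "landing" 等取值为本示例假设):

```python
def slide_triggered(is_sprinting: bool, jump_phase: str, crouch_held: bool) -> bool:
    """滑铲状态触发判定(示意):第一虚拟对象处于持续奔跑状态,
    且在跳跃过程的落地阶段('landing')长按下蹲时,触发滑铲。"""
    return is_sprinting and jump_phase == "landing" and crouch_held
```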
步骤1005,响应于第一虚拟对象处于滑铲状态,接收攻击操作。
可选地,攻击操作对第一虚拟对象的控制方式包括如下情况中的任意一种:
第一,第一虚拟对象持有第一道具,当接收到攻击操作时,通过第一虚拟对象当前持有的道具进行攻击;
第二,第一虚拟对象持有其他道具,当第一虚拟对象处于滑铲状态并接收到攻击操作时,默认切换至第一道具进行攻击。
步骤1006,控制第一虚拟对象在滑铲状态下通过第一道具进行近程攻击。
可选地,第一虚拟对象在滑铲状态下向前滑行,并在滑行的同时挥动第一道具进行近程攻击,当其他虚拟对象在第一道具的挥动路径上时,则其他虚拟对象受到第一道具的攻击。
综上所述,本实施例提供的虚拟环境中虚拟道具的操作方法,在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
本实施例提供的方法,通过控制第一虚拟对象在虚拟环境中处于持续奔跑状态,并在奔跑过程中触发第一虚拟对象起跳,从而控制第一虚拟对象在虚拟环境中进入滑铲状态,模拟滑铲过程中助跑、起跳、滑铲的整个过程,提高了滑铲状态的真实性。
示意性的,请参考图12,图12是本申请一个示例性实施例提供的滑铲状态的触发过程示意图,如图12所示,上述第一虚拟对象在触发滑铲的过程中包括如下过程:
步骤1201,购买并装备第一道具。
可选地,第一道具为玩家在游戏中购买,并在道具装配界面中装备的道具。
步骤1202,判断第一虚拟对象是否奔跑。
可选地,判断第一虚拟对象是否处于持续奔跑状态,持续奔跑状态用于对第一虚拟对象的滑铲状态进行助跑。
步骤1203,当第一虚拟对象奔跑时,进入奔跑状态。
步骤1204,判断是否点击起跳。
可选地,点击起跳时,第一虚拟对象在持续奔跑的过程中进行跳跃。
步骤1205,当点击起跳时,控制第一虚拟对象跳跃。
步骤1206,判断落地瞬间是否长按下蹲控件。
可选地,跳跃过程包括起跳阶段、降落阶段和落地阶段,其中,起跳阶段用于指示第一虚拟对象从地面开始跳起直至达到跳跃最高点的阶段,降落阶段用于指示从最高点开始降落直至触地之前的阶段,落地阶段用于指示降落过程中从触地开始直至跳跃完成的阶段。可选地,在跳跃过程的落地阶段,接收下蹲触发操作作为滑铲状态触发操作。
步骤1207,当长按下蹲控件时,进入滑铲状态并触发特定功能。
可选地,该特定功能可以是触发第一虚拟对象在虚拟环境中执行特定的动作。
步骤1208,判断特定功能是否播放结束。
步骤1209,当特定功能播放结束时,恢复至初始状态。
在一个可选的实施例中,虚拟环境中还包括第二虚拟对象,当第一虚拟对象对第二虚拟对象攻击成功时,控制第一虚拟对象执行展示动作,图13是本申请另一个示例性实施例提供的虚拟环境中虚拟道具的操作方法的流程图,以该方法应用于终端中为例进行说明,如图13所示,该方法包括:
步骤1301,显示虚拟环境界面,虚拟环境界面中包括第一虚拟对象对虚拟环境进行观察的画面,第一虚拟对象装配有第一道具。
示意性的,本实施例中,以第一道具为虚拟斧子为例进行说明。虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具;或,虚拟斧子为第一虚拟对象在虚拟环境中捡拾得到的道具;或,虚拟斧子为第一虚拟对象在虚拟对局中通过兑换的方式获取的道具,本申请实施例中,以虚拟斧子为第一虚拟对象在虚拟对局开局前装配的道具为例进行说明。
步骤1302,接收滑铲状态触发操作,控制第一虚拟对象在虚拟环境中处于滑铲状态。
可选地,滑铲状态用于表示第一虚拟对象以倾斜下蹲的姿势在虚拟环境中滑动前进的状态。可选地,倾斜下蹲是指第一虚拟对象在虚拟环境中向后呈仰体,并将两条腿在身体前方不同距离位置处进行支撑的姿势。
步骤1303,响应于第一虚拟对象处于滑铲状态,接收攻击操作。
可选地,攻击操作对第一虚拟对象的控制方式包括如下情况中的任意一种:
第一,第一虚拟对象持有第一道具,当接收到攻击操作时,通过第一虚拟对象当前持有的道具进行攻击;
第二,第一虚拟对象持有其他道具,当第一虚拟对象处于滑铲状态并接收到攻击操作时,默认切换至第一道具进行攻击。
步骤1304,控制第一虚拟对象在滑铲状态下通过第一道具进行近程攻击。
可选地,第一虚拟对象在滑铲状态下向前滑行,并在滑行的同时挥动第一道具进行近程攻击,当其他虚拟对象在第一道具的挥动路径上时,则其他虚拟对象受到第一道具的攻击。
步骤1305,响应于第一虚拟对象通过第一道具对第二虚拟对象攻击成功,控制第一虚拟对象在虚拟环境中执行展示动作。
可选地,展示动作用于表示第一虚拟对象对第二虚拟对象的攻击结果,也即表示第一虚拟对象对第二虚拟对象进行攻击时,攻击成功。
可选地,第一虚拟对象对第二虚拟对象攻击成功包括如下情况中的任意一种:
第一,响应于第一虚拟对象通过第一道具对第二虚拟对象产生伤害值,控制第一虚拟对象在虚拟环境中执行展示动作;
可选地,第一道具上挂载有碰撞检测盒,通过碰撞检测盒对第一道具与第二虚拟对象之间进行碰撞检测,当碰撞检测盒与第二虚拟对象之间存在碰撞情况时,确定第一道具对第二虚拟对象产生伤害值。
第二,响应于第一虚拟对象通过第一道具淘汰第二虚拟对象,控制第一虚拟对象在虚拟环境中执行展示动作;
可选地,当第一虚拟对象通过第一道具对第二虚拟对象进行攻击后,第二虚拟对象的生命值降低至0时,则确定第一虚拟对象通过第一道具淘汰第二虚拟对象。
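示意性的,上述"产生伤害值"与"淘汰"两种攻击成功情况的结算逻辑,可用如下 Python 代码示意(伤害结算方式为本示例假设):

```python
def settle_melee_attack(target_hp: int, damage: int):
    """结算一次近程攻击:返回 (剩余生命值, 是否产生伤害, 是否淘汰)。
    第二虚拟对象的生命值降低至 0 时判定其被淘汰。"""
    remaining = max(0, target_hp - damage)
    return remaining, damage > 0, remaining == 0
```

任一攻击成功条件(产生伤害值或淘汰)满足时,即可触发第一虚拟对象执行展示动作。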
可选地,该展示动作可以是自定义动作,也可以是预先设置的动作,还可以是随机动作,本申请实施例对此不加以限定,示意性的,请参考图14,在虚拟环境界面1400中包括第一虚拟对象1410,第一虚拟对象1410手部持有第一道具1420,当第一虚拟对象1410在滑铲过程中通过第一道具1420对第二虚拟对象攻击成功时,则第一虚拟对象1410对手持的第一道具1420进行旋转,作为攻击成功后的展示动作。
综上所述,本实施例提供的虚拟环境中虚拟道具的操作方法,在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
本实施例提供的方法,当第一虚拟对象在滑铲的过程中通过第一道具对第二虚拟对象攻击成功时,控制第一虚拟对象执行展示动作,由于滑铲过程中当第一虚拟对象对第二虚拟对象进行攻击后,通常已从第二虚拟对象的一侧滑行至另一侧,而导致第二虚拟对象的状态无法直接得到,通过展示动作能够确定第二虚拟对象是否受到攻击,避免对视角旋转后进行观察,提高了人机交互效率。
图15是本申请一个示例性实施例提供的滑铲过程中对目标进行攻击的流程图,如图15所示,该过程包括:
步骤1501,购买并装备第一道具。
可选地,第一道具为玩家在游戏中购买,并在道具装配界面中装备的道具。
步骤1502,判断第一虚拟对象是否滑铲。
可选地,判断第一虚拟对象是否通过持续奔跑、跳跃以及长按下蹲后处于滑铲状态。
步骤1503,当第一虚拟对象滑铲时,进入滑铲状态。
步骤1504,判断是否点击开火。
可选地,点击开火时,第一虚拟对象通过第一道具进行攻击。
步骤1505,当点击开火时,进入开火状态。
可选地,第一道具为近程攻击道具,故,点击开火时,控制第一虚拟对象挥动第一道具进行攻击。
步骤1506,判断是否击中目标。
可选地,第一道具上挂载有碰撞检测盒,通过碰撞检测盒与目标之间的碰撞情况判断是否击中目标。
步骤1507,当击中目标时,触发特定功能。
可选地,该特定功能可以是触发第一虚拟对象在虚拟环境中执行特定的动作。
步骤1508,判断特定功能是否播放结束。
步骤1509,当特定功能播放结束时,恢复至初始状态。
综上所述,本实施例提供的虚拟环境中虚拟道具的操作方法,在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
图16是本申请一个示例性实施例提供的虚拟环境中虚拟道具的操作装置的结构框图,以该装置应用于终端中为例进行说明,如图16所示,该装置包括:显示模块1610、接收模块1620以及控制模块1630;
显示模块1610,用于显示虚拟环境界面,所述虚拟环境界面中包括第一虚拟对象对所述虚拟环境进行观察的画面,所述第一虚拟对象装配有第一道具,所述第一道具为近程攻击道具;
接收模块1620,用于接收滑铲状态触发操作,控制所述第一虚拟对象在所述虚拟环境中处于滑铲状态,所述滑铲状态用于表示所述第一虚拟对象以倾斜下蹲的姿势在所述虚拟环境中滑动前进的状态;
所述接收模块1620,还用于响应于所述第一虚拟对象处于所述滑铲状态,接收攻击操作;
控制模块1630,用于控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击。
在一个可选的实施例中,所述接收模块1620,还用于接收奔跑状态触发操作,所述奔跑状态触发操作用于控制所述第一虚拟对象持续处于奔跑状态;
所述接收模块1620,还用于在所述第一虚拟对象的奔跑过程中,接收跳跃触发操作,所述跳跃触发操作用于控制所述第一虚拟对象在所述虚拟环境中跳跃;
所述接收模块1620,还用于在所述第一虚拟对象的跳跃过程中,接收下蹲触发操作作为所述滑铲状态触发操作,所述下蹲触发操作用于控制所述第一虚拟对象在所述虚拟环境中下蹲。
在一个可选的实施例中,所述跳跃过程中包括起跳阶段、降落阶段和落地阶段;
所述接收模块1620,还用于在所述跳跃过程的落地阶段,接收所述下蹲触发操作作为所述滑铲状态触发操作。
在一个可选的实施例中,所述虚拟环境中还包括第二虚拟对象;
所述控制模块1630,还用于响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象攻击成功,控制所述第一虚拟对象在所述虚拟环境中执行展示动作,所述展示动作用于表示所述第一虚拟对象对所述第二虚拟对象的攻击结果。
在一个可选的实施例中,所述控制模块1630,还用于响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象产生伤害值,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作;
或,
所述控制模块1630,还用于响应于所述第一虚拟对象通过所述第一道具淘汰所述第二虚拟对象,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作。
在一个可选的实施例中,所述第一道具上挂载有碰撞检测盒;
如图17所示,所述控制模块1630,包括:
检测单元1631,用于通过所述碰撞检测盒对所述第一道具与所述第二虚拟对象之间进行碰撞检测;
确定单元1632,用于响应于所述碰撞检测盒与所述第二虚拟对象之间存在碰撞情况,确定所述第一道具对所述第二虚拟对象产生所述伤害值。
在一个可选的实施例中,所述显示模块1610,还用于显示道具装配界面,所述道具装配界面中包括候选道具,所述候选道具中包括所述第一道具和第二道具,其中,所述第二道具为所述第一虚拟对象的默认装配道具,所述第一道具的攻击范围大于所述第二道具的攻击范围;
所述接收模块1620,还用于在所述道具装配界面上接收对所述第一道具的装配操作,所述装配操作用于向所述第一虚拟对象装配所述第一道具。
综上所述,本实施例提供的虚拟环境中虚拟道具的操作装置,在对第一道具进行应用时,首先将第一虚拟对象控制处于滑铲状态,并在滑铲状态下使用第一道具进行攻击,由于滑铲状态下第一虚拟对象的身体位置与普通站立或行走状态下的身体位置相对贴近地面,对第一道具进行挥动时,第一道具的挥动路径高度与虚拟对象的高度更为贴合,其他虚拟对象无法直接对第一道具的攻击进行规避,提高第一虚拟对象通过第一道具进行攻击的攻击效率,从而提高了第一虚拟对象通过第一道具进行攻击的人机交互效率。
需要说明的是:上述实施例提供的虚拟环境中虚拟道具的操作装置,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的虚拟环境中虚拟道具的操作装置与虚拟环境中虚拟道具的操作方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
图18示出了本发明一个示例性实施例提供的终端1800的结构框图。该终端1800可以是:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端1800还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
通常,终端1800包括有:处理器1801和存储器1802。
处理器1801可以包括一个或多个处理核心,比如4核心处理器、8核心处理器等。处理器1801可以采用DSP(Digital Signal Processing,数字信号处理)、FPGA(Field-Programmable Gate Array,现场可编程门阵列)、PLA(Programmable Logic Array,可编程逻辑阵列)中的至少一种硬件形式来实现。处理器1801也可以包括主处理器和协处理器,主处理器是用于对在唤醒状态下的数据进行处理的处理器,也称CPU(Central Processing Unit,中央处理器);协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中,处理器1801可以集成有GPU(Graphics Processing Unit,图像处理器),GPU用于负责显示屏所需要显示的内容的渲染和绘制。在一些实施例中,处理器1801还可以包括AI(Artificial Intelligence,人工智能)处理器,该AI处理器用于处理有关机器学习的计算操作。
存储器1802可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是非暂态的。存储器1802还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1802中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1801所执行以实现本申请中方法实施例提供的虚拟环境中虚拟道具的操作方法。
在一些实施例中,终端1800还可选包括有:外围设备接口1803和至少一个外围设备。处理器1801、存储器1802和外围设备接口1803之间可以通过总线或信号线相连。各个外围设备可以通过总线、信号线或电路板与外围设备接口1803相连。具体地,外围设备包括:射频电路1804、显示屏1805、摄像头组件1806、音频电路1807、定位组件1808和电源1809中的至少一种。
外围设备接口1803可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1801和存储器1802。在一些实施例中,处理器1801、存储器1802和外围设备接口1803被集成在同一芯片或电路板上;在一些其他实施例中,处理器1801、存储器1802和外围设备接口1803中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
射频电路1804用于接收和发射RF(Radio Frequency,射频)信号,也称电磁信号。射频电路1804通过电磁信号与通信网络以及其他通信设备进行通信。射频电路1804将电信号转换为电磁信号进行发送,或者,将接收到的电磁信号转换为电信号。可选地,射频电路1804包括:天线系统、RF收发器、一个或多个放大器、调谐器、振荡器、数字信号处理器、编解码芯片组、用户身份模块卡等等。射频电路1804可以通过至少一种无线通信协议来与其它终端进行通信。该无线通信协议包括但不限于:万维网、城域网、内联网、各代移动通信网络(2G、3G、4G及5G)、无线局域网和/或WiFi(Wireless Fidelity,无线保真)网络。在一些实施例中,射频电路1804还可以包括NFC(Near Field Communication,近距离无线通信)有关的电路,本申请对此不加以限定。
显示屏1805用于显示UI(User Interface,用户界面)。该UI可以包括图形、文本、图标、视频及其它们的任意组合。当显示屏1805是触摸显示屏时,显示屏1805还具有采集在显示屏1805的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1801进行处理。此时,显示屏1805还可以用于提供虚拟按钮和/或虚拟键盘,也称软按钮和/或软键盘。在一些实施例中,显示屏1805可以为一个,设置在终端1800的前面板;在另一些实施例中,显示屏1805可以为至少两个,分别设置在终端1800的不同表面或呈折叠设计;在再一些实施例中,显示屏1805可以是柔性显示屏,设置在终端1800的弯曲表面上或折叠面上。甚至,显示屏1805还可以设置成非矩形的不规则图形,也即异形屏。显示屏1805可以采用LCD(Liquid Crystal Display,液晶显示屏)、OLED(Organic Light-Emitting Diode,有机发光二极管)等材质制备。
摄像头组件1806用于采集图像或视频。可选地,摄像头组件1806包括前置摄像头和后置摄像头。通常,前置摄像头设置在终端的前面板,后置摄像头设置在终端的背面。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头、长焦摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能、主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能或者其它融合拍摄功能。在一些实施例中,摄像头组件1806还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。
音频电路1807可以包括麦克风和扬声器。麦克风用于采集用户及环境的声波,并将声波转换为电信号输入至处理器1801进行处理,或者输入至射频电路1804以实现语音通信。出于立体声采集或降噪的目的,麦克风可以为多个,分别设置在终端1800的不同部位。麦克风还可以是阵列麦克风或全向采集型麦克风。扬声器则用于将来自处理器1801或射频电路1804的电信号转换为声波。扬声器可以是传统的薄膜扬声器,也可以是压电陶瓷扬声器。当扬声器是压电陶瓷扬声器时,不仅可以将电信号转换为人类可听见的声波,也可以将电信号转换为人类听不见的声波以进行测距等用途。在一些实施例中,音频电路1807还可以包括耳机插孔。
定位组件1808用于定位终端1800的当前地理位置,以实现导航或LBS(Location Based Service,基于位置的服务)。定位组件1808可以是基于美国的GPS(Global Positioning System,全球定位系统)、中国的北斗系统或俄罗斯的伽利略系统的定位组件。
电源1809用于为终端1800中的各个组件进行供电。电源1809可以是交流电、直流电、一次性电池或可充电电池。当电源1809包括可充电电池时,该可充电电池可以是有线充电电池或无线充电电池。有线充电电池是通过有线线路充电的电池,无线充电电池是通过无线线圈充电的电池。该可充电电池还可以用于支持快充技术。
在一些实施例中,终端1800还包括有一个或多个传感器1810。该一个或多个传感器1810包括但不限于:加速度传感器1811、陀螺仪传感器1812、压力传感器1813、指纹传感器1814、光学传感器1815以及接近传感器1816。
加速度传感器1811可以检测以终端1800建立的坐标系的三个坐标轴上的加速度大小。比如,加速度传感器1811可以用于检测重力加速度在三个坐标轴 上的分量。处理器1801可以根据加速度传感器1811采集的重力加速度信号,控制显示屏1805以横向视图或纵向视图进行用户界面的显示。加速度传感器1811还可以用于游戏或者用户的运动数据的采集。
陀螺仪传感器1812可以检测终端1800的机体方向及转动角度,陀螺仪传感器1812可以与加速度传感器1811协同采集用户对终端1800的3D动作。处理器1801根据陀螺仪传感器1812采集的数据,可以实现如下功能:动作感应(比如根据用户的倾斜操作来改变UI)、拍摄时的图像稳定、游戏控制以及惯性导航。
压力传感器1813可以设置在终端1800的侧边框和/或显示屏1805的下层。当压力传感器1813设置在终端1800的侧边框时,可以检测用户对终端1800的握持信号,由处理器1801根据压力传感器1813采集的握持信号进行左右手识别或快捷操作。当压力传感器1813设置在显示屏1805的下层时,由处理器1801根据用户对显示屏1805的压力操作,实现对UI界面上的可操作性控件进行控制。可操作性控件包括按钮控件、滚动条控件、图标控件、菜单控件中的至少一种。
指纹传感器1814用于采集用户的指纹,由处理器1801根据指纹传感器1814采集到的指纹识别用户的身份,或者,由指纹传感器1814根据采集到的指纹识别用户的身份。在识别出用户的身份为可信身份时,由处理器1801授权该用户执行相关的敏感操作,该敏感操作包括解锁屏幕、查看加密信息、下载软件、支付及更改设置等。指纹传感器1814可以被设置在终端1800的正面、背面或侧面。当终端1800上设置有物理按键或厂商Logo时,指纹传感器1814可以与物理按键或厂商Logo集成在一起。
光学传感器1815用于采集环境光强度。在一个实施例中,处理器1801可以根据光学传感器1815采集的环境光强度,控制显示屏1805的显示亮度。具体地,当环境光强度较高时,调高显示屏1805的显示亮度;当环境光强度较低时,调低显示屏1805的显示亮度。在另一个实施例中,处理器1801还可以根据光学传感器1815采集的环境光强度,动态调整摄像头组件1806的拍摄参数。
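示意性的,上述根据环境光强度调节显示亮度的策略,可用一个分段线性映射示意(阈值 50/500 与亮度区间 0.2~1.0 均为本示例假设的数值,并非终端实际采用的参数):

```python
def display_brightness(ambient_lux: float, low: float = 50.0, high: float = 500.0) -> float:
    """环境光强度到显示亮度(0.2~1.0)的线性映射(示意):
    环境光强度较高时调高亮度,较低时调低亮度;
    低于 low 取最低亮度,高于 high 取最高亮度。"""
    if ambient_lux <= low:
        return 0.2
    if ambient_lux >= high:
        return 1.0
    return 0.2 + 0.8 * (ambient_lux - low) / (high - low)
```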
接近传感器1816,也称距离传感器,通常设置在终端1800的前面板。接近传感器1816用于采集用户与终端1800的正面之间的距离。在一个实施例中,当接近传感器1816检测到用户与终端1800的正面之间的距离逐渐变小时,由处理器1801控制触摸显示屏1805从亮屏状态切换为息屏状态;当接近传感器1816检测到用户与终端1800的正面之间的距离逐渐变大时,由处理器1801控制触摸显示屏1805从息屏状态切换为亮屏状态。
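示意性的,接近传感器控制亮屏/息屏切换的逻辑可示意如下(返回 True 表示亮屏、False 表示息屏,均为本示例约定):

```python
def next_screen_state(prev_distance: float, distance: float, screen_on: bool) -> bool:
    """根据用户与终端正面之间距离的变化切换屏幕状态(示意):
    距离逐渐变小 -> 切换为息屏(False);
    距离逐渐变大 -> 切换为亮屏(True);距离不变则保持当前状态。"""
    if distance < prev_distance:
        return False
    if distance > prev_distance:
        return True
    return screen_on
```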
本领域技术人员可以理解,图18中示出的结构并不构成对终端1800的限定,可以包括比图示更多或更少的组件,或者组合某些组件,或者采用不同的组件布置。
本申请实施例还提供一种计算机设备,该计算机设备包括存储器和处理器,存储器中存储有至少一条指令、至少一段程序、代码集或指令集,至少一条指令、至少一段程序、代码集或指令集由处理器加载并实现如上述实施例中任一所述的虚拟环境中虚拟道具的操作方法。
本申请实施例还提供一种计算机可读存储介质,该可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述实施例中任一所述的虚拟环境中虚拟道具的操作方法。
本申请还提供了一种计算机程序产品,当计算机程序产品在计算机上运行时,使得计算机执行上述如上述实施例中任一所述的虚拟环境中虚拟道具的操作方法。
本领域普通技术人员可以理解上述实施例的各种方法中的全部或部分步骤是可以通过程序来指令相关的硬件来完成,该程序可以存储于一计算机可读存储介质中,该计算机可读存储介质可以是上述实施例中的存储器中所包含的计算机可读存储介质;也可以是单独存在,未装配入终端中的计算机可读存储介质。该计算机可读存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如上述实施例中任一所述的虚拟环境中虚拟道具的操作方法。
可选地,该计算机可读存储介质可以包括:只读存储器(ROM,Read Only Memory)、随机存取记忆体(RAM,Random Access Memory)、固态硬盘(SSD,Solid State Drives)或光盘等。其中,随机存取记忆体可以包括电阻式随机存取记忆体(ReRAM,Resistance Random Access Memory)和动态随机存取存储器(DRAM,Dynamic Random Access Memory)。上述本申请实施例序号仅仅为了描述,不代表实施例的优劣。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本申请的可选实施例,并不用以限制本申请,凡在本申请的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (15)

  1. 一种虚拟环境中虚拟道具的操作方法,应用于计算机设备中,所述方法包括:
    显示虚拟环境界面,所述虚拟环境界面中包括第一虚拟对象对所述虚拟环境进行观察的画面,所述第一虚拟对象装配有第一道具,所述第一道具为近程攻击道具;
    接收滑铲状态触发操作,控制所述第一虚拟对象在所述虚拟环境中处于滑铲状态,所述滑铲状态用于表示所述第一虚拟对象以倾斜下蹲的姿势在所述虚拟环境中滑动前进的状态;
    响应于所述第一虚拟对象处于所述滑铲状态,接收攻击操作;
    控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击。
  2. 根据权利要求1所述的方法,其中,所述接收滑铲状态触发操作,包括:
    接收奔跑状态触发操作,所述奔跑状态触发操作用于控制所述第一虚拟对象持续处于奔跑状态;
    在所述第一虚拟对象的奔跑过程中,接收跳跃触发操作,所述跳跃触发操作用于控制所述第一虚拟对象在所述虚拟环境中跳跃;
    在所述第一虚拟对象的跳跃过程中,接收下蹲触发操作作为所述滑铲状态触发操作,所述下蹲触发操作用于控制所述第一虚拟对象在所述虚拟环境中下蹲。
  3. 根据权利要求2所述的方法,其中,所述跳跃过程中包括起跳阶段、降落阶段和落地阶段;
    所述在所述第一虚拟对象的跳跃过程中,接收下蹲触发操作作为所述滑铲状态触发操作,包括:
    在所述跳跃过程的落地阶段,接收所述下蹲触发操作作为所述滑铲状态触发操作。
  4. 根据权利要求1至3任一所述的方法,其中,所述虚拟环境中还包括第二虚拟对象;
    所述控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击之后,还包括:
    响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象攻击成功,控制所述第一虚拟对象在所述虚拟环境中执行展示动作,所述展示动作用于表示所述第一虚拟对象对所述第二虚拟对象的攻击结果。
  5. 根据权利要求4所述的方法,其中,所述响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象攻击成功,控制所述第一虚拟对象在所述虚拟环境中执行展示动作,包括:
    响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象产生伤害值,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作;
    或,
    响应于所述第一虚拟对象通过所述第一道具淘汰所述第二虚拟对象,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作。
  6. 根据权利要求5所述的方法,其中,所述第一道具上挂载有碰撞检测盒;
    所述响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象产生伤害值,包括:
    通过所述碰撞检测盒对所述第一道具与所述第二虚拟对象之间进行碰撞检测;
    响应于所述碰撞检测盒与所述第二虚拟对象之间存在碰撞情况,确定所述第一道具对所述第二虚拟对象产生所述伤害值。
  7. 根据权利要求1至3任一所述的方法,其中,所述显示虚拟环境界面之前,还包括:
    显示道具装配界面,所述道具装配界面中包括候选道具,所述候选道具中包括所述第一道具和第二道具,其中,所述第二道具为所述第一虚拟对象的默认装配道具,所述第一道具的攻击范围大于所述第二道具的攻击范围;
    在所述道具装配界面上接收对所述第一道具的装配操作,所述装配操作用于向所述第一虚拟对象装配所述第一道具。
  8. 一种虚拟环境中虚拟道具的操作装置,所述装置包括:
    显示模块,用于显示虚拟环境界面,所述虚拟环境界面中包括第一虚拟对象对所述虚拟环境进行观察的画面,所述第一虚拟对象装配有第一道具,所述第一道具为近程攻击道具;
    接收模块,用于接收滑铲状态触发操作,控制所述第一虚拟对象在所述虚拟环境中处于滑铲状态,所述滑铲状态用于表示所述第一虚拟对象以倾斜下蹲的姿势在所述虚拟环境中滑动前进的状态;
    所述接收模块,还用于响应于所述第一虚拟对象处于所述滑铲状态,接收攻击操作;
    控制模块,用于控制所述第一虚拟对象在所述滑铲状态下通过所述第一道具进行近程攻击。
  9. 根据权利要求8所述的装置,其中,所述接收模块,还用于接收奔跑状态触发操作,所述奔跑状态触发操作用于控制所述第一虚拟对象持续处于奔跑状态;
    所述接收模块,还用于在所述第一虚拟对象的奔跑过程中,接收跳跃触发操作,所述跳跃触发操作用于控制所述第一虚拟对象在所述虚拟环境中跳跃;
    所述接收模块,还用于在所述第一虚拟对象的跳跃过程中,接收下蹲触发操作作为所述滑铲状态触发操作,所述下蹲触发操作用于控制所述第一虚拟对象在所述虚拟环境中下蹲。
  10. 根据权利要求9所述的装置,其中,所述跳跃过程中包括起跳阶段、降落阶段和落地阶段;
    所述接收模块,还用于在所述跳跃过程的落地阶段,接收所述下蹲触发操作作为所述滑铲状态触发操作。
  11. 根据权利要求8至10任一所述的装置,其中,所述虚拟环境中还包括第二虚拟对象;
    所述控制模块,还用于响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象攻击成功,控制所述第一虚拟对象在所述虚拟环境中执行展示动作,所述展示动作用于表示所述第一虚拟对象对所述第二虚拟对象的攻击结果。
  12. 根据权利要求11所述的装置,其中,所述控制模块,还用于响应于所述第一虚拟对象通过所述第一道具对所述第二虚拟对象产生伤害值,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作;
    或,
    所述控制模块,还用于响应于所述第一虚拟对象通过所述第一道具淘汰所述第二虚拟对象,控制所述第一虚拟对象在所述虚拟环境中执行所述展示动作。
  13. 根据权利要求12所述的装置,其中,所述第一道具上挂载有碰撞检测盒;
    所述控制模块,包括:
    检测单元,用于通过所述碰撞检测盒对所述第一道具与所述第二虚拟对象之间进行碰撞检测;
    确定单元,用于响应于所述碰撞检测盒与所述第二虚拟对象之间存在碰撞情况,确定所述第一道具对所述第二虚拟对象产生所述伤害值。
  14. 一种计算机设备,所述计算机设备包括处理器和存储器,所述存储器中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由所述处理器加载并执行以实现如权利要求1至8任一所述的虚拟环境中虚拟道具的操作方法。
  15. 一种计算机可读存储介质,所述存储介质中存储有至少一条指令、至少一段程序、代码集或指令集,所述至少一条指令、所述至少一段程序、所述代码集或指令集由处理器加载并执行以实现如权利要求1至8任一所述的虚拟环境中虚拟道具的操作方法。
PCT/CN2020/123547 2020-01-15 2020-10-26 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质 WO2021143253A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/591,460 US11786817B2 (en) 2020-01-15 2022-02-02 Method and apparatus for operating virtual prop in virtual environment, device and readable medium
US18/243,022 US20230405466A1 (en) 2020-01-15 2023-09-06 Method and apparatus for operating virtual prop in virtual environment, device and readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010042540.1A CN111249726B (zh) 2020-01-15 2020-01-15 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质
CN202010042540.1 2020-01-15

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/591,460 Continuation US11786817B2 (en) 2020-01-15 2022-02-02 Method and apparatus for operating virtual prop in virtual environment, device and readable medium

Publications (1)

Publication Number Publication Date
WO2021143253A1 true WO2021143253A1 (zh) 2021-07-22

Family

ID=70946995

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/123547 WO2021143253A1 (zh) 2020-01-15 2020-10-26 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质

Country Status (3)

Country Link
US (2) US11786817B2 (zh)
CN (1) CN111249726B (zh)
WO (1) WO2021143253A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111249726B (zh) * 2020-01-15 2021-08-24 腾讯科技(深圳)有限公司 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质
CN113694526B (zh) * 2021-09-18 2023-06-09 腾讯科技(深圳)有限公司 虚拟对象的控制方法、系统、装置、设备、介质及程序
CN114225372B (zh) * 2021-10-20 2023-06-27 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端、存储介质及程序产品

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170291106A1 (en) * 2012-04-26 2017-10-12 Steelseries Aps Method and apparatus for presenting gamer performance at a social network
CN110354489A (zh) * 2019-08-08 2019-10-22 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端及存储介质
CN111249726A (zh) * 2020-01-15 2020-06-09 腾讯科技(深圳)有限公司 虚拟环境中虚拟道具的操作方法、装置、设备及可读介质

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6773349B2 (en) * 2002-07-31 2004-08-10 Intec, Inc. Video game controller with integrated video display
US7565618B2 (en) * 2003-02-13 2009-07-21 LumaPix Inc. Method and system for distributing multiple dragged objects
US8038533B2 (en) * 2003-05-09 2011-10-18 Nintendo Co., Ltd. Game system using parent game machine and child game machine
US7552399B2 (en) * 2005-12-27 2009-06-23 International Business Machines Corporation Extensible icons with multiple drop zones
US20070173334A1 (en) * 2006-01-20 2007-07-26 David Whitby Video game instruction card holder
US8882590B2 (en) * 2006-04-28 2014-11-11 Nintendo Co., Ltd. Touch-controlled game character motion providing dynamically-positioned virtual control pad
US9050534B2 (en) * 2010-04-23 2015-06-09 Ganz Achievements for a virtual world game
US8777746B2 (en) * 2011-09-23 2014-07-15 2343127 Ontario Inc. Gestures to encapsulate intent
US9174128B2 (en) * 2012-04-26 2015-11-03 Zynga Inc. Dynamic quests in game
JP6022807B2 (ja) * 2012-04-26 2016-11-09 Nintendo Co., Ltd. Information processing program, information processing apparatus, information processing system and information processing control method
JP6678682B2 (ja) * 2015-12-09 2020-04-08 Capcom Co., Ltd. Recording medium storing game program, effect control method and game device
JP6185123B1 (ja) * 2016-08-18 2017-08-23 GREE, Inc. Program, control method and information processing apparatus
CN108970112A (zh) * 2018-07-05 2018-12-11 Tencent Technology (Shenzhen) Co., Ltd. Posture adjustment method and apparatus, storage medium and electronic device
CN109847369B (zh) * 2019-03-18 2023-05-16 NetEase (Hangzhou) Network Co., Ltd. Method and apparatus for switching posture of virtual character in game
CN110201391B (zh) * 2019-06-05 2023-04-07 NetEase (Hangzhou) Network Co., Ltd. Method and apparatus for controlling virtual character in game
CN110201403B (zh) * 2019-06-05 2023-01-10 Tencent Technology (Shenzhen) Co., Ltd. Method, apparatus and medium for controlling virtual object to discard virtual item
CN111249730B (zh) * 2020-01-15 2021-08-24 Tencent Technology (Shenzhen) Co., Ltd. Virtual object control method and apparatus, device and readable storage medium

Non-Patent Citations (2)

Title
ANONYMOUS: "Call of Duty mobile game how to set the sliding shovel skills", 3H3, 14 October 2019 (2019-10-14), XP055829721, Retrieved from the Internet <URL:http://www.3h3.com/codm/10883.html> *
WǑ SHÌ XIĀOXIĀO YǓ XIĒ A [I AM XIAOXIAOYU REST]: "Star Wars Parkour Battle Skills", 8 January 2018 (2018-01-08), CN, pages 1 - 3, XP009529310, Retrieved from the Internet <URL:https://jingyan.baidu.com/article/ff411625e5512e12e5823764.html> *

Also Published As

Publication number Publication date
CN111249726A (zh) 2020-06-09
US11786817B2 (en) 2023-10-17
CN111249726B (zh) 2021-08-24
US20220152508A1 (en) 2022-05-19
US20230405466A1 (en) 2023-12-21

Similar Documents

Publication Publication Date Title
US11229840B2 (en) Equipment display method, apparatus, device and storage medium in virtual environment battle
CN110694261B (zh) Method for controlling virtual object to attack, terminal and storage medium
CN110413171B (zh) Method, apparatus, device and medium for controlling virtual object to perform shortcut operation
WO2021143259A1 (zh) Virtual object control method and apparatus, device and readable storage medium
CN110448891B (zh) Method, apparatus and storage medium for controlling virtual object to operate remote virtual prop
CN110755841B (zh) Method, apparatus, device and readable storage medium for switching props in virtual environment
KR102619439B1 (ko) Method for controlling virtual object and related apparatus
WO2020253832A1 (zh) Method, apparatus and medium for controlling virtual object to mark virtual item
US11656755B2 (en) Method and apparatus for controlling virtual object to drop virtual item and medium
CN110917619B (zh) Interactive prop control method and apparatus, terminal and storage medium
CN111399639B (zh) Method, apparatus, device and readable medium for controlling motion state in virtual environment
WO2021143253A1 (zh) Method, apparatus, device and readable medium for operating virtual prop in virtual environment
CN111475029B (zh) Virtual prop operation method, apparatus, device and storage medium
WO2021031765A1 (zh) Method for applying scope in virtual environment and related apparatus
CN111330278B (zh) Virtual-environment-based animation playing method, apparatus, device and medium
CN113713383A (zh) Throwing prop control method and apparatus, computer device and storage medium
CN111921190A (zh) Method, apparatus, terminal and storage medium for equipping virtual object with prop
JPWO2021143259A5 (zh)
CN112402969B (zh) Virtual object control method, apparatus, device and storage medium in virtual scene
CN118059496A (zh) Virtual object control method, apparatus, device and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20913568

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20913568

Country of ref document: EP

Kind code of ref document: A1