WO2023197777A1 - Method and apparatus for using virtual props, device, medium and program product - Google Patents

Method and apparatus for using virtual props, device, medium and program product

Info

Publication number
WO2023197777A1
Authority
WO
WIPO (PCT)
Prior art keywords
control
virtual character
response
hand
virtual
Prior art date
Application number
PCT/CN2023/079804
Other languages
English (en)
Chinese (zh)
Inventor
谢琳滢
潘佳绮
杨泽锋
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2023197777A1


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F 13/53 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/533 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • A63F 13/537 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5372 Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game using indicators, e.g. showing the condition of a game character on screen for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F 13/55 Controlling game characters or game objects based on the game progress
    • A63F 13/56 Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • A63F 13/57 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
    • A63F 13/573 Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game using trajectories of game objects, e.g. of a golf ball according to the point of impact

Definitions

  • Embodiments of the present application relate to the field of human-computer interaction, and in particular to a method, device, equipment, medium and program product for using virtual props.
  • In the related art, users can control a virtual character in the virtual environment and use virtual props; for example, a user can use a hook prop to pull the virtual character to a target position.
  • When the virtual character is pulled toward that position by the hook prop, the user interface only displays the virtual character's field of view in the moving direction, so the virtual character is vulnerable to attacks from enemy virtual characters located to the side or behind.
  • This application provides a method, device, equipment, medium and program product for using virtual props. By using hidden controls, the angle of view is changed in the process of the hook prop pulling the virtual character to move to the first position.
  • a method of using virtual props includes:
  • displaying the virtual character located in the virtual environment and a first use control, where the virtual character possesses a hook prop, and the first use control is a control used to trigger the use of the hook prop;
  • a device for using virtual props includes:
  • a display module used to display a virtual character located in the virtual environment and a first usage control, the virtual character possesses a hook prop, and the first usage control is a control used to trigger the use of the hook prop;
  • the display module is further configured to, in response to the first triggering operation on the first use control, display a picture of the virtual character launching the hook prop to the first position, and a picture of the virtual character being pulled by the hook prop to move to the first position;
  • the display module is further configured to, in response to a second triggering operation on the first use control, display a picture in which the first use control is switched to a first roulette control;
  • the display module is further configured to, in response to a direction selection operation on the first roulette control, display the perspective picture after the observation perspective is changed based on the direction selection operation.
  • a computer device includes a processor and a memory, where at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the method of using virtual props described in the above aspects.
  • a computer storage medium stores at least one computer program, and the at least one computer program is loaded and executed by a processor to implement the method of using virtual props described above.
  • a computer program product includes a computer program stored in a computer-readable storage medium; a processor of a computer device reads the computer program from the computer-readable storage medium and executes it, so that the computer device performs the method of using virtual props described in the above aspect.
  • Figure 1 is a schematic diagram of a method of using virtual props provided by an exemplary embodiment of the present application
  • Figure 2 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • Figure 3 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 4 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 5 is a schematic diagram of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 6 is a schematic diagram of using a control to switch to a roulette control provided by an exemplary embodiment of the present application
  • Figure 7 is a schematic diagram of the first roulette control controlling the conversion of viewing angles provided by an exemplary embodiment of the present application.
  • Figure 8 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 9 is a schematic diagram of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 10 is a schematic diagram of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 11 is a schematic diagram of changing trajectories during airborne jumping provided by an exemplary embodiment of the present application.
  • Figure 12 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • Figure 13 is a block diagram of a human-computer interaction device for virtual items provided by an exemplary embodiment of the present application.
  • Figure 14 is a schematic structural diagram of a computer device provided by an exemplary embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when the application runs on the terminal.
  • the virtual environment can be a simulation of the real world, a semi-simulation and semi-fictional world, or a purely fictitious world.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment, which is not limited in this application.
  • the following embodiments illustrate that the virtual environment is a three-dimensional virtual environment.
  • Virtual character: a movable object in the virtual environment.
  • the movable object may be a virtual character, a virtual animal, an animation character, etc., such as: characters, animals, plants, oil barrels, walls, stones, etc. displayed in a three-dimensional virtual environment.
  • the virtual character is a three-dimensional model created based on animation skeleton technology.
  • Each virtual character has its own shape and volume in the three-dimensional virtual environment and occupies a part of the space in the three-dimensional virtual environment.
  • the virtual character has a health value. When the virtual character's health value reaches zero, the virtual character cannot continue to move in the virtual world.
  • the health value is a standard for judging whether a virtual character can move in the virtual world.
  • the health value can also be called a signal value, a red bar, etc.
  • UI: user interface.
  • Control: any visual control or element that can be seen on the user interface of the application, such as pictures, input boxes, text boxes, buttons, and labels. Some of these UI controls respond to user operations; for example, the user triggers the first use control to control the virtual character to use the hook prop.
  • the UI controls involved in the embodiments of this application include, but are not limited to: first usage controls, second usage controls, and aiming controls.
  • Figure 1 shows a technical solution for a method of using virtual props provided by an embodiment of the present application. This method can be executed by the terminal or the client on the terminal.
  • a virtual environment 101 and a first use control 102 are displayed on the user interface.
  • the virtual environment 101 includes a virtual character 103.
  • the limbs of the virtual character 103 are equipped with a hook prop.
  • the first use control 102 is a control used to trigger the use of the hook prop.
  • the client In response to the first triggering operation of the first use control 102, the client displays a picture of the virtual character 103 launching the hook prop to the first position 107, and a picture of the virtual character 103 being pulled by the hook prop to move toward the first position 107.
  • the hook prop includes a launching device 104, a projecting component 106, and a traction component 105.
  • the launching device 104 is fixed on the limb of the virtual character 103
  • the traction component 105 is connected to the launching device 104
  • the projection component 106 is connected to the traction component 105.
  • the client controls the virtual character 103 to use the launching device 104 to eject the ejection component 106; in response to the ejection component 106 hitting the first position 107 and being fixed there, the client controls the traction component 105 to pull the virtual character 103 toward the first position 107.
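  • As a rough illustration of the launch-then-pull sequence above, the following Python sketch (not taken from the patent; all names such as HookProp and move_towards are hypothetical) models the ejection component being fired, fixed at the first position on hit, and the traction component then pulling the character toward that position frame by frame.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def move_towards(pos: Vec3, target: Vec3, step: float) -> Vec3:
    """Move `pos` towards `target` by at most `step` units."""
    dx, dy, dz = (t - p for t, p in zip(target, pos))
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    if dist <= step or dist == 0.0:
        return target
    k = step / dist
    return (pos[0] + dx * k, pos[1] + dy * k, pos[2] + dz * k)

@dataclass
class HookProp:
    anchored: bool = False            # ejection component fixed at the first position
    anchor: Optional[Vec3] = None

    def launch(self, first_position: Vec3) -> None:
        # The launching device ejects the ejection component; on hitting the
        # first position it is fixed there.
        self.anchored = True
        self.anchor = first_position

@dataclass
class VirtualCharacter:
    position: Vec3
    hook: HookProp

    def update(self, dt: float, pull_speed: float = 12.0) -> None:
        # While anchored, the traction component pulls the character toward
        # the first position a little further each frame.
        if self.hook.anchored and self.hook.anchor is not None:
            self.position = move_towards(self.position, self.hook.anchor, pull_speed * dt)
```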
  • in response to the triggering time of the first use control 102 being greater than a first threshold, the client displays a picture in which the first use control 102 is switched to the first roulette control; while the virtual character 103 is being pulled by the hook prop toward the first position 107, in response to a direction selection operation on the first roulette control, the client keeps the virtual character 103 moving toward the first position 107 along the first trajectory and displays the perspective picture after the observation perspective of the virtual character 103 is changed based on the direction selection operation.
  • that is, the first use control 102 is switched to the first roulette control, and the user rotates or triggers the first roulette control to change the observation angle of the virtual character 103.
  • the virtual character 103 still moves to the first position 107 on the first trajectory.
  • the observation angle of the virtual character 103 changes based on the direction selection operation.
  • the viewing angle of the virtual character 103 maintains the field of view direction selected in the direction selection operation.
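  • One way to read this behaviour, sketched below under assumptions the patent does not spell out (hypothetical names, a yaw-only camera), is that the roulette selection only offsets the camera's observation angle and never touches the pull target, so the first trajectory is preserved.

```python
from dataclasses import dataclass, field
from typing import Tuple

@dataclass
class Camera:
    yaw_degrees: float = 0.0          # 0 corresponds to the current moving direction

@dataclass
class TractionState:
    target: Tuple[float, float, float]   # the first position; never modified here
    camera: Camera = field(default_factory=Camera)

    def on_roulette_direction_selected(self, selected_degrees: float) -> None:
        # Rotating or tapping the first roulette control by N degrees rotates the
        # observation angle by N degrees; the pull target (first trajectory) is untouched.
        self.camera.yaw_degrees = (self.camera.yaw_degrees + selected_degrees) % 360.0

state = TractionState(target=(10.0, 0.0, 5.0))
state.on_roulette_direction_selected(90.0)   # look to the side while still being pulled
```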
  • the client responds to the first triggering operation on the first use control 102 and displays the second use control 109 in the user interface.
  • the second use control 109 is a control used to control the virtual character to jump in the air.
  • the client displays the second use control 109 while the virtual character 103 launches the hook prop to the first position 107;
  • the client In response to the first triggering operation on the first use control 102, the client displays the second use control 109 while the virtual character 103 is moved toward the first position 107 by the hook prop.
  • the second use control 109 is highlighted in the user interface, where the highlighting method includes at least one of highlighting, inverting colors, bolding, adding a background color, and adding prompt lines, but is not limited thereto; the embodiments of the present application do not limit this.
  • in response to the triggering time of the second use control 109 being greater than a second threshold, the client displays a screen in which the second use control 109 is switched to the second roulette control.
  • in response to the direction selection operation on the second roulette control, the client displays a picture of the virtual character 103 jumping into the air toward the second position 108 along a third trajectory.
  • for example, the second threshold is set to 0.5 s; when the user's trigger time on the second use control 109 is greater than 0.5 s, the second roulette control is displayed in place of the second use control 109.
  • the user determines the third trajectory corresponding to the observation perspective of the virtual character 103 by rotating or triggering the second roulette control, and controls the virtual character 103 to jump into the air to the second position 108 along the third trajectory.
  • the second usage control 109 is canceled.
  • the method provided by this embodiment displays a virtual character with a hook prop and a first use control in a virtual environment, and controls the virtual character to move to the first position through the first trigger operation on the first use control; Through the second trigger operation on the first use control, the first use control is switched to the first roulette control; through the direction selection operation of the first roulette control, the observation perspective is changed while the virtual character moves to the first position.
  • This application uses hidden controls to realize the combined operation of changing the perspective while the hook prop pulls the virtual character toward the first position, reducing the proportion of controls in the user interface and the occurrence of accidental touches, and improving the human-computer interaction efficiency of the virtual props.
  • FIG. 2 shows a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • the computer system 100 includes: a first terminal 110, a server 120, and a second terminal 130.
  • the first terminal 110 is installed and runs a client 111 that supports a virtual environment.
  • the client 111 may be a multi-player online battle program.
  • the user interface of the client 111 is displayed on the screen of the first terminal 110.
  • the client 111 can be any one of a battle royale shooting game, a virtual reality (VR) application, an augmented reality (AR) program, a three-dimensional map program, a virtual reality game, an augmented reality game, a first-person shooting game (FPS), a third-person shooting game (TPS), a multiplayer online battle arena game (MOBA), and a simulation/strategy game (SLG).
  • FPS: first-person shooting game
  • TPS: third-person shooting game
  • MOBA: multiplayer online battle arena game
  • SLG: simulation game (strategy game)
  • the first terminal 110 is a terminal used by the first user 112.
  • the first user 112 uses the first terminal 110 to control the first virtual character located in the virtual environment to perform activities, or to operate virtual items owned by the first virtual character.
  • the first virtual character may be referred to as the virtual character of the first user 112.
  • the first user 112 can assemble, disassemble, uninstall and other operations on the virtual items owned by the first virtual character, which is not limited in this application.
  • the first virtual character is a simulation character or an animation character.
  • the second terminal 130 is installed and runs a client 131 that supports a virtual environment.
  • the client 131 may be a multi-player online battle program.
  • the user interface of the client 131 is displayed on the screen of the second terminal 130.
  • the client can be any one of a battle royale shooting game, a VR application, an AR program, a three-dimensional map program, a virtual reality game, an augmented reality game, an FPS, a TPS, a MOBA, and an SLG.
  • this embodiment takes the client being a MOBA game as an example.
  • the second terminal 130 is a terminal used by the second user 113.
  • the second user 113 uses the second terminal 130 to control the second virtual character located in the virtual environment to perform activities and operate virtual items owned by the second virtual character.
  • the second virtual character may be referred to as the virtual character of the second user 113.
  • the second virtual character is a simulation character or an animation character.
  • the first virtual character and the second virtual character are in the same virtual environment.
  • the first virtual character and the second virtual character may belong to the same camp, the same team, or the same organization, have a friend relationship, or have temporary communication permissions.
  • alternatively, the first virtual character and the second virtual character may belong to different camps, different teams, or different organizations, or have a hostile relationship.
  • the clients installed on the first terminal 110 and the second terminal 130 are the same, or the clients installed on the two terminals are the same type of client on different operating system platforms (Android or iOS).
  • the first terminal 110 may generally refer to one of the plurality of terminals
  • the second terminal 130 may generally refer to another of the plurality of terminals.
  • This embodiment only takes the first terminal 110 and the second terminal 130 as an example.
  • the device types of the first terminal 110 and the second terminal 130 are the same or different.
  • the device types include: smart phones, smart TVs, vehicle-mounted terminals, wearable devices, tablet computers, e-book readers, MP3 players, MP4 players, At least one of a laptop computer and a desktop computer.
  • only two terminals are shown in FIG. 2, but there are multiple other terminals 140 that can access the server 120 in different embodiments.
  • there are also one or more terminals 140 corresponding to developers, and a development and editing platform for clients that support the virtual environment is installed on these terminals 140. Developers can edit and update the client on the terminal 140 and transmit the updated client installation package to the server 120 through a wired or wireless network.
  • the first terminal 110 and the second terminal 130 can download the client installation package from the server 120 to update the client.
  • the first terminal 110, the second terminal 130 and other terminals 140 are connected to the server 120 through a wireless network or a wired network.
  • the server 120 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 120 is used to provide background services for clients that support the three-dimensional virtual environment.
  • the server 120 undertakes the main calculation work and the terminal undertakes the secondary calculation work; or the server 120 undertakes the secondary calculation work and the terminal undertakes the main calculation work; or the server 120 and the terminal adopt a distributed computing architecture for collaborative computing.
  • the server 120 includes a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (Input/Output Interface, I/O interface) 125.
  • the processor 122 is used to load the instructions stored in the server 120 and process the data in the user account database 123 and the battle service module 124; the user account database 123 is used to store data of the user accounts used by the first terminal 110, the second terminal 130 and the other terminals 140, such as the avatar of the user account, the nickname of the user account, the combat effectiveness index of the user account, and the service area where the user account is located;
  • the battle service module 124 is used to provide multiple battle rooms for users to compete in, such as 1V1 battles, 3V3 battles, 5V5 battles, etc.;
  • the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network.
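  • Purely as an illustration of the server composition described above, and not an implementation detail from the patent, the data could be organised along these lines; every name below is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UserAccount:                      # one record in the user account database 123
    avatar_url: str
    nickname: str
    combat_index: int
    service_region: str

@dataclass
class BattleServiceModule:              # battle service module 124
    rooms: Dict[str, List[str]] = field(default_factory=dict)   # rooms keyed by mode, e.g. "1V1"

    def open_room(self, mode: str, room_id: str) -> None:
        self.rooms.setdefault(mode, []).append(room_id)

@dataclass
class GameServer:
    accounts: Dict[str, UserAccount] = field(default_factory=dict)
    battles: BattleServiceModule = field(default_factory=BattleServiceModule)
```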
  • Figure 3 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • the method may be executed by a terminal in the system as shown in Figure 2 or by a client on the terminal.
  • the method includes:
  • Step 302 Display the virtual character and the first use control located in the virtual environment.
  • the virtual character has a hook prop.
  • a virtual environment is a virtual activity space provided by an application in a terminal during its running process, allowing virtual characters to perform various activities in the virtual activity space.
  • Hook props are items owned by virtual characters in the virtual world.
  • the hook props can be obtained by at least one of picking up, robbing, and purchasing methods, and this application does not limit this.
  • the first use control is the control used to trigger the use of the hook prop.
  • the first usage control is at least one of a button and a UI control, but is not limited thereto, and is not limited in this embodiment of the present application.
  • the client displays a user interface
  • the user interface includes a virtual environment screen.
  • the user interface may also include UI controls located on the virtual environment screen.
  • the UI controls include controls for using virtual props, controls for releasing skills, etc.
  • the user interface may also include a teaming interface for forming a team with friends, a matching interface for matching the virtual character with other virtual characters, a loading interface for loading information of the current game, and the like.
  • the user controls the virtual character to move in the virtual environment
  • the client observes the virtual environment from the perspective of the virtual character and collects virtual environment pictures.
  • the virtual character in this embodiment is a virtual character controlled by the client.
  • the observation perspective refers to the perspective from which the camera model observes the virtual environment.
  • the camera model is located directly above the virtual character's head, above the left shoulder, in the sky, etc.
  • Step 304 In response to the first triggering operation on the first use control, display a picture of the virtual character launching the hook prop to the first position, and a picture of the virtual character being pulled by the hook prop to move to the first position.
  • the first trigger operation includes at least one of a long press, a single click, a double click, a sliding operation, and a circle drawing operation, which is not limited in the embodiments of the present application.
  • the client controls the virtual character to launch the hook prop to the first position.
  • the virtual character is then pulled by the hook prop to move toward the first position.
  • the embodiment of the present application does not specifically limit the way in which the hook prop pulls the virtual character to move.
  • Step 306 In response to the second triggering operation on the first use control, display a screen in which the first use control is switched to the first roulette control.
  • the client responds to the second triggering operation on the first use control and displays a picture of the first use control switching to the first roulette control.
  • the second trigger operation includes at least one of a long press, a single click, a double click, a sliding operation, and a circle drawing operation, which is not limited in the embodiments of the present application.
  • the first trigger operation is used to control the virtual character to use the hook prop
  • the second trigger operation is used to switch the first use control to the first roulette control.
  • the first roulette control is a control used to change the viewing angle. Specifically, the first roulette control is a control used to change the observation angle of the camera model.
  • the roulette control refers to a circular roulette selector that can select the direction.
  • the first roulette control is a control used to change the viewing angle of the camera model; for example, rotating the first roulette control by 90° causes the observation angle of the camera model to rotate by 90°.
  • the screen in which the first use control is switched to the first roulette control includes at least one of canceling the display of the first use control and displaying the first roulette control at its original position, and additionally displaying the first roulette control around the first use control, but is not limited thereto; the embodiments of the present application do not limit this.
  • Step 308 In the process of the virtual character being pulled by the hook prop to move to the first position, in response to the direction selection operation on the first roulette control, display the perspective picture after changing the observation perspective based on the direction selection operation.
  • the direction selection operation on the first roulette control includes at least one of sliding to rotate the roulette control and clicking the roulette control, which is not limited in the embodiments of the present application.
  • Perspective refers to the observation angle when observing in the virtual environment from the first-person perspective or third-person perspective of the virtual character.
  • the first-person perspective is a perspective in which the camera model is used as the "eyes" of the virtual character in the virtual environment to observe the virtual environment.
  • the third-person perspective is the perspective of observing the virtual character through the camera model in the virtual environment.
  • the camera model is located behind the virtual character at this time.
  • the perspective is the first-person perspective of the virtual character as an example.
  • the perspective is the angle when the virtual environment is observed through the camera model in the virtual environment.
  • the camera model automatically follows the virtual character in the virtual environment, that is, when the virtual character's position in the virtual environment changes, the camera model follows the virtual character's position in the virtual environment and changes simultaneously, and the camera The model is always within a preset distance of the virtual character in the virtual environment.
  • the relative positions of the camera model and the virtual character do not change.
  • the method provided by this embodiment displays a virtual character with a hook prop and a first use control in a virtual environment, and controls the virtual character to move to the first position through the first trigger operation on the first use control; through the second trigger operation on the first use control, the first use control is switched to the first roulette control; and through the direction selection operation on the first roulette control, the observation perspective is changed while the virtual character moves to the first position.
  • This application uses hidden controls to realize the combined operation of changing the perspective while the hook prop pulls the virtual character toward the first position, reducing the proportion of controls in the user interface and the occurrence of accidental touches, thereby improving the human-computer interaction efficiency of the virtual props and the user experience.
  • Figure 4 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • the method may be executed by a terminal in the system as shown in Figure 2 or by a client on the terminal.
  • the method includes:
  • Step 402 Display the virtual character and the first use control located in the virtual environment.
  • the virtual character has a hook prop.
  • a virtual environment is a virtual activity space provided by an application in a terminal during its running process, allowing virtual characters to perform various activities in the virtual activity space.
  • the virtual environment displayed on the virtual environment screen includes at least one element among ladders, straight ladders, rock climbing areas, mountains, plains, rivers, lakes, oceans, deserts, swamps, quicksand, sky, plants, buildings, and vehicles.
  • the virtual environment picture includes a virtual character.
  • if the virtual environment picture is obtained from the virtual character's first-person perspective, the virtual environment picture includes the virtual character's hands; if the virtual environment picture is obtained from the virtual character's third-person perspective, the virtual environment picture includes the upper body or whole body of the virtual character.
  • Hook props are items owned by virtual characters in the virtual world.
  • the hook props can be obtained by at least one of picking up, robbing, and purchasing methods, and this application does not limit this.
  • the first use control is the control used to trigger the use of the hook prop.
  • Step 404 In response to the first triggering operation on the first use control, display a picture of the virtual character launching the hook prop to the first position, and a picture of the virtual character being pulled by the hook prop to move to the first position.
  • the first trigger operation includes at least one of a long press, a single click, a double click, a sliding operation, and a circle drawing operation, which is not limited in the embodiments of the present application.
  • the hook prop includes a launching device, a projecting component and a traction component.
  • the launching device is fixed on the limb of the virtual character
  • the traction component is connected to the launching device
  • the projecting component is connected to the traction component.
  • the client displays a picture of the virtual character using the launching device to eject the ejection component; in response to the ejection component hitting the first position and being fixed there, the client displays a picture of the traction component pulling the virtual character toward the first position.
  • a virtual environment 501 and a first usage control 502 are displayed on the user interface.
  • the virtual environment 501 includes a virtual character 503.
  • the limbs of the virtual character 503 are equipped with a hook prop, and the hook prop includes a launching device 504, a traction component 505 and an ejection component 506.
  • the client controls the virtual character 503 to use the launching device 504 to eject the ejection component 506; in response to the ejection component 506 hitting the first position 507 and being fixed there, the client controls the traction component 505 to pull the virtual character 503 toward the first position 507.
  • Step 406 In response to the long press triggering operation on the first use control, display a screen in which the first use control is switched to the first roulette control.
  • the first roulette control is a control used to change the viewing angle. Specifically, the first roulette control is a control used to change the observation angle of the camera model.
  • in response to a long press trigger operation on the first use control, the client displays a screen in which the first use control is switched to the first roulette control; the long press trigger operation is a press operation whose trigger time is greater than a first threshold.
  • for example, the first threshold is set to 0.3 s.
  • the first use control 601 is switched to the first roulette control 602 .
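  • A minimal sketch of how a client might tell a short press (launch the hook) apart from a long press (reveal the roulette control) using the 0.3 s threshold mentioned above; the timer logic and callback names are assumptions rather than the patent's implementation, which only specifies the threshold comparison.

```python
import time

FIRST_THRESHOLD_S = 0.3   # the example first threshold from the description

class FirstUseControl:
    def __init__(self, on_launch_hook, on_show_roulette):
        self._pressed_at = None
        self._switched = False
        self._on_launch_hook = on_launch_hook        # first trigger operation
        self._on_show_roulette = on_show_roulette    # switch to the first roulette control

    def on_press(self) -> None:
        self._pressed_at = time.monotonic()
        self._switched = False

    def on_frame(self) -> None:
        # Called every frame while the control is still held down.
        if self._pressed_at is None or self._switched:
            return
        if time.monotonic() - self._pressed_at >= FIRST_THRESHOLD_S:
            self._switched = True
            self._on_show_roulette()    # the control is replaced by the roulette control

    def on_release(self) -> None:
        if self._pressed_at is not None and not self._switched:
            self._on_launch_hook()      # a short tap is treated as the launch operation
        self._pressed_at = None
```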
  • Step 408 While the hook prop pulls the virtual character toward the first position along the first trajectory, in response to the direction selection operation on the first roulette control, keep the virtual character moving toward the first position along the first trajectory and display the perspective picture after the observation perspective is changed based on the direction selection operation.
  • in response to the direction selection operation on the first roulette control, the virtual character keeps moving toward the first position along the first trajectory, and the observation perspective is changed based on the direction selection operation; when the virtual character reaches the first position, the observation perspective maintains the direction selected by the direction selection operation on the first roulette control.
  • the direction selection operation on the first roulette control includes at least one of sliding to rotate the roulette control and clicking the roulette control, which is not limited in the embodiments of the present application.
  • the current observation angle of the virtual character corresponds to the 0 scale line of the first roulette control.
  • a direction selection operation is performed on the first roulette control by sliding it; for example, if the first roulette control is rotated 90 degrees, the observation angle of the camera model is also rotated 90 degrees, that is, the perspective picture after the observation angle is rotated 90 degrees is displayed; if the first roulette control is rotated 180 degrees, the observation angle of the camera model is rotated 180 degrees while the virtual character moves to the first position, that is, the perspective picture behind the virtual character is displayed.
  • alternatively, the current observation angle of the virtual character corresponds to the 0 scale line of the first roulette control, and a direction selection operation is performed on the first roulette control by clicking it; for example, if the position corresponding to 75 degrees on the first roulette control is clicked, the observation angle of the camera model is rotated 75 degrees, that is, the perspective picture after the observation angle is rotated 75 degrees is displayed; if the position corresponding to 180 degrees is clicked, the observation angle of the camera model is rotated 180 degrees while the virtual character moves to the first position, that is, the perspective picture behind the virtual character is displayed.
  • Perspective refers to the observation angle when observing in the virtual environment from the first-person perspective or third-person perspective of the virtual character.
  • the camera model automatically follows the virtual character in the virtual environment, that is, when the virtual character's position in the virtual environment changes, the camera model follows the virtual character's position in the virtual environment and changes simultaneously, and the camera The model is always within a preset distance of the virtual character in the virtual environment.
  • the relative positions of the camera model and the virtual character do not change.
  • a point is determined in the virtual character 701 as the rotation center 702, and the camera model rotates around the rotation center 702.
  • the camera model is configured with an initial position, which is a position behind and above the virtual character (for example, behind the head) and corresponds to the initial observation angle of the virtual character 701.
  • the initial position is position 703.
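  • The rotation around the rotation center can be pictured with the following sketch (yaw-only, hypothetical names and offsets): the camera's initial offset behind the head is rotated about the vertical axis through the pivot.

```python
import math
from typing import Tuple

Vec3 = Tuple[float, float, float]

def camera_position(pivot: Vec3, initial_offset: Vec3, yaw_degrees: float) -> Vec3:
    """Rotate the camera's initial offset (e.g. behind the head) around the
    rotation center by `yaw_degrees` about the vertical (y) axis."""
    rad = math.radians(yaw_degrees)
    ox, oy, oz = initial_offset
    rx = ox * math.cos(rad) + oz * math.sin(rad)
    rz = -ox * math.sin(rad) + oz * math.cos(rad)
    return (pivot[0] + rx, pivot[1] + oy, pivot[2] + rz)

# Example: pivot at the rotation center, camera 2 m behind and 0.5 m above it.
behind = camera_position((0.0, 1.6, 0.0), (0.0, 0.5, -2.0), 0.0)    # initial position
side = camera_position((0.0, 1.6, 0.0), (0.0, 0.5, -2.0), 90.0)     # after a 90° selection
```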
  • in response to an enemy virtual character existing on the left or right side and/or behind the movement path along which the virtual character moves to the first position, the client keeps the virtual character moving toward the first position along the first trajectory and displays the perspective picture after the observation perspective is changed based on the position of the enemy virtual character.
  • for example, the observation perspective may automatically be rotated 90 degrees to the left, so that the perspective picture displayed while the virtual character moves toward the first position includes the enemy virtual character.
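  • A possible way to compute such an automatic rotation, shown here as a top-down 2D sketch with hypothetical names and thresholds, is to measure the signed angle between the movement direction and the direction to the enemy and rotate the observation angle by that amount when the enemy is off to the side or behind.

```python
import math
from typing import Tuple

Vec2 = Tuple[float, float]

def signed_angle_degrees(move_dir: Vec2, to_enemy: Vec2) -> float:
    """Signed angle (degrees) from the moving direction to the enemy direction,
    positive to the left, negative to the right (top-down view), in [-180, 180)."""
    a = math.atan2(move_dir[1], move_dir[0])
    b = math.atan2(to_enemy[1], to_enemy[0])
    return (math.degrees(b - a) + 180.0) % 360.0 - 180.0

def auto_view_offset(character: Vec2, first_position: Vec2, enemy: Vec2) -> float:
    move_dir = (first_position[0] - character[0], first_position[1] - character[1])
    to_enemy = (enemy[0] - character[0], enemy[1] - character[1])
    angle = signed_angle_degrees(move_dir, to_enemy)
    # Only rotate when the enemy is clearly off to the side or behind.
    return angle if abs(angle) > 45.0 else 0.0

# Example: enemy directly to the left of the movement path -> rotate the view 90° left.
offset = auto_view_offset((0.0, 0.0), (10.0, 0.0), (0.0, 5.0))   # 90.0
```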
  • the method provided by this embodiment displays a virtual character with a hook prop and a first use control in a virtual environment, and controls the virtual character to move to the first position through the first trigger operation on the first use control; through the long press trigger operation on the first use control, the first use control is switched to the first roulette control; and through the direction selection operation on the first roulette control, the trajectory of the virtual character remains unchanged while it moves to the first position, but the viewing angle of the camera model is changed.
  • This application uses hidden controls to realize the combined operation of changing the perspective while the hook prop pulls the virtual character toward the first position, reducing the proportion of controls in the user interface and the occurrence of accidental touches, thereby improving the human-computer interaction efficiency of the virtual props and the user experience.
  • Figure 8 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • the method may be executed by a terminal in the system as shown in Figure 2 or by a client on the terminal.
  • the method includes:
  • Step 802 Display the virtual character located in the virtual environment and the first use control.
  • the virtual character has a hook prop.
  • a virtual environment is a virtual activity space provided by an application in a terminal during its running process, allowing virtual characters to perform various activities in the virtual activity space.
  • Hook props are items owned by virtual characters in the virtual world.
  • the hook props can be obtained by at least one of picking up, robbing, and purchasing methods, and this application does not limit this.
  • the first use control is the control used to trigger the use of the hook prop.
  • Step 804 In response to the first triggering operation on the first usage control, display the second usage control.
  • the first trigger operation includes at least one of a long press, a single click, a double click, a sliding operation, and a circle drawing operation, which is not limited in the embodiments of the present application.
  • the second use control is the control used to control the virtual character to jump in the air.
  • the first triggering operation on the first usage control is not only used to control the virtual character to use the hook prop, but also used to add and display the second usage control on the user interface.
  • the hook prop includes a launching device, a projecting component and a traction component.
  • the launching device is fixed on the limb of the virtual character
  • the traction component is connected to the launching device
  • the projecting component is connected to the traction component.
  • the client displays a picture of the virtual character using the launching device to eject the ejection component; in response to the ejection component hitting the first position and being fixed there, the client displays a picture of the traction component pulling the virtual character toward the first position.
  • in response to the first triggering operation on the first use control, the client displays the second use control while the virtual character launches the hook prop to the first position;
  • alternatively, in response to the first triggering operation on the first use control, the client displays the second use control while the virtual character is being pulled by the hook prop to move toward the first position.
  • the client highlights the second usage control in the user interface in response to a first triggering operation on the first usage control.
  • the highlighting method includes at least one of highlighting, inverting, bolding, adding background color, adding prompt lines, and using dynamic visual effects for display.
  • Step 806 In the process of the virtual character being pulled by the hook prop to move to the first position, in response to the first triggering operation on the second use control, display a picture of the virtual character taking the first position as a fulcrum and jumping into the air toward the second position.
  • specifically, in response to the first triggering operation on the second use control, a picture is displayed in which the hook prop, taking the first position as the fulcrum and switching from the first state to the second state, supports the virtual character jumping into the air to the second position, where the first state is different from the second state.
  • the traction component in the hook prop has a rigid state and a flexible state.
  • when the hook prop pulls the virtual character to move, the first state of the hook prop is the flexible state; when the hook prop is used to support the virtual character to jump in the air, the second state of the hook prop is the rigid state.
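  • The two states can be captured with a small state flag, as in the hedged sketch below (hypothetical names; the patent only specifies that the flexible state is used for traction and the rigid state for the air jump).

```python
from enum import Enum, auto

class TractionComponentState(Enum):
    FLEXIBLE = auto()   # "rope": pulls the character toward the first position
    RIGID = auto()      # "pole": supports the air jump with the first position as fulcrum

class HookProp:
    def __init__(self) -> None:
        self.state = TractionComponentState.FLEXIBLE

    def start_air_jump(self) -> None:
        # Triggered by the first triggering operation on the second use control.
        self.state = TractionComponentState.RIGID
```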
  • a virtual environment 901, a first use control 902 and a second use control 904 are displayed on the user interface.
  • the virtual environment 901 includes a virtual character 903, and the limbs of the virtual character 903 are equipped with hook props.
  • the client controls the virtual character 903 to launch the hook prop to the first position 905, and controls the hook prop to pull the virtual character 903 to move to the first position 905 in the "rope" state.
  • in response to the first triggering operation on the second use control 904, the hook prop takes the first position 905 as the fulcrum and, in the "pole" state, supports the virtual character 903 jumping into the air toward the second position 906.
  • Step 808 In response to the second trigger operation on the second use control, display a screen in which the second use control is switched to the second roulette control.
  • the second roulette control is a control used to change the aerial jump path of the virtual character.
  • in response to a long press trigger operation on the second use control, the client displays a screen in which the second use control is switched to the second roulette control; the long press trigger operation is a press operation whose trigger time is greater than a second threshold.
  • for example, the second threshold is set to 0.5 s; when the user's trigger time on the second use control is greater than 0.5 s, the second use control is switched to the second roulette control.
  • the screen in which the second use control is switched to the second roulette control includes at least one of canceling the display of the second use control and then displaying the second roulette control, and additionally displaying the second roulette control around the periphery of the second use control, but is not limited thereto; the embodiments of the present application do not limit this.
  • Step 810 In the process of the virtual character jumping in the air toward the second position with the first position as the fulcrum, in response to the direction selection operation on the second roulette control, display the air jumping screen after changing the air jumping path based on the direction selection operation.
  • the direction selection operation on the second roulette control includes at least one of sliding to rotate the roulette control and clicking the roulette control.
  • Perspective refers to the observation angle when observing in the virtual environment from the first-person perspective or third-person perspective of the virtual character.
  • in response to the direction selection operation on the second roulette control, a picture is displayed of the virtual character jumping into the air to the second position along a third trajectory corresponding to the observation perspective.
  • for example, the current observation angle of the virtual character corresponds to the 0 scale line of the second roulette control; by default, the virtual character takes the first position as the fulcrum and jumps toward the second position along the trajectory corresponding to the direction of the 0 scale line of the second roulette control.
  • if the direction selection operation selects, for example, the 15th tick mark of the second roulette control, the virtual character takes the first position as the fulcrum and jumps to the second position along the trajectory corresponding to the direction of that tick mark.
  • a virtual environment 1001 and a second usage control 1002 are displayed on the user interface.
  • the virtual environment 1001 includes a virtual character 1003, and the limbs of the virtual character 1003 are equipped with hook props.
  • in response to the first triggering operation on the second use control 1002, the hook prop takes the first position 1004 as the fulcrum and, in the "pole" state, supports the virtual character 1003 jumping toward the second position 1005 along the second trajectory 1007.
  • in response to the long press trigger operation on the second use control 1002, the client switches the second use control 1002 to the second roulette control 1006; while the virtual character 1003, taking the first position 1004 as the fulcrum, is jumping to the second position 1005 along the second trajectory 1007, in response to the direction selection operation on the second roulette control 1006, a picture is displayed of the virtual character 1003 jumping to the second position 1005 along the third trajectory 1008 corresponding to the observation perspective.
  • the first direction corresponding to the observation perspective is determined, and a picture is displayed of the virtual character jumping into the air toward the second position along a third trajectory tangent to the first direction.
  • for example, when the second roulette control is rotated to 30 degrees, the direction corresponding to 30 degrees on the second roulette control is the first direction 1103 corresponding to the observation angle of the camera model, and the virtual character jumps toward the second position 1102 along a third trajectory 1104 tangent to the first direction 1103.
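  • As a worked illustration of a trajectory tangent to the selected first direction (a 2D parabolic simplification with assumed speed and gravity values, not figures from the patent), the launch velocity can simply be aligned with the roulette angle:

```python
import math

def jump_trajectory(start, first_direction_deg, speed=10.0, gravity=9.8, steps=20, dt=0.1):
    """Sample a parabolic path whose initial velocity is tangent to the first
    direction selected on the roulette (angle measured from the horizontal)."""
    rad = math.radians(first_direction_deg)
    vx, vy = speed * math.cos(rad), speed * math.sin(rad)   # tangent at launch
    x0, y0 = start
    return [(x0 + vx * i * dt, y0 + vy * i * dt - 0.5 * gravity * (i * dt) ** 2)
            for i in range(steps)]

# Rotating the second roulette control to 30° launches the jump along a trajectory
# tangent to the 30° direction (compare the third trajectory 1104 in Figure 11).
path = jump_trajectory(start=(0.0, 0.0), first_direction_deg=30.0)
```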
  • when the virtual character takes the first position as a fulcrum and jumps to the second position, in response to the presence of an obstacle on the path along which the virtual character jumps to the second position, a picture is displayed of the virtual character avoiding the obstacle and continuing to jump toward the second position.
  • a collision box is used to detect collisions with three-dimensional virtual models in the virtual environment; in response to the collision box colliding with the three-dimensional virtual model of an obstacle, the collision point is obtained; the avoidance direction of the virtual character is determined according to the collision point; and the virtual character is controlled according to the avoidance direction to avoid the obstacle and continue to jump to the second position.
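  • The collision-box handling described above could look roughly like the following sketch; the avoidance rule (steer away from the collision point, then continue toward the second position) and all names are assumptions for illustration.

```python
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

def avoidance_direction(character_pos: Vec3, collision_point: Vec3) -> Vec3:
    """Steer away from the collision point reported by the collision box."""
    d = [c - p for c, p in zip(character_pos, collision_point)]
    length = sum(v * v for v in d) ** 0.5 or 1.0
    return (d[0] / length, d[1] / length, d[2] / length)

def step_jump(character_pos: Vec3, second_position: Vec3,
              collision_point: Optional[Vec3], step: float = 0.5) -> Vec3:
    # Normally head straight for the second position; if the collision box reports
    # a hit, blend in the avoidance direction before continuing toward the target.
    to_target = [t - p for t, p in zip(second_position, character_pos)]
    length = sum(v * v for v in to_target) ** 0.5 or 1.0
    direction = [v / length for v in to_target]
    if collision_point is not None:
        avoid = avoidance_direction(character_pos, collision_point)
        direction = [(a + b) / 2.0 for a, b in zip(direction, avoid)]
    return (character_pos[0] + step * direction[0],
            character_pos[1] + step * direction[1],
            character_pos[2] + step * direction[2])
```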
  • optionally, the display of the second use control is then canceled, or the second use control is suppressed.
  • the method provided by this embodiment displays a virtual character with a hook prop and a first use control in a virtual environment, and controls the virtual character to move to the first position through the first trigger operation on the first use control.
  • The second use control is displayed while the virtual character is pulled by the hook prop toward the first position; through the first triggering operation on the second use control, the virtual character takes the first position as the fulcrum and jumps toward the second position; through the second triggering operation on the second use control, the second use control is switched to the second roulette control; and through the direction selection operation on the second roulette control, the air jump screen after the air jump path is changed based on the direction selection operation is displayed.
  • this application realizes the combined operations of jumping in the air and changing routes while the hook props are pulling the virtual character to move to the first position, thereby reducing the proportion of controls in the user interface and the phenomenon of accidental touches. It improves the efficiency of human-computer interaction with virtual props and improves the user experience.
  • in an optional implementation, a virtual character located in the virtual environment and a right-hand use control are displayed, where the virtual character possesses a right-hand hook prop, and the right-hand use control is a control used to trigger the use of the right-hand hook prop.
  • a picture is displayed of the virtual character launching the right-hand hook prop to the right-hand position (the position where the right-hand hook prop finally arrives after being launched), and a picture of the virtual character being pulled by the right-hand hook prop to move to the right-hand position.
  • in the process of the virtual character being pulled by the right-hand hook prop to move to the right-hand position, in response to a drag operation on the right-hand use control toward the left (with a large amplitude), the right-hand use control is canceled and a left-hand use control is displayed; the display position of the left-hand use control is to the left of the display position of the right-hand use control.
  • in response to the first triggering operation on the left-hand use control, the virtual character is controlled to launch the left-hand hook prop to the left-hand position, and the fixed connection between the right-hand hook prop and the right-hand position is released; a picture is displayed of the virtual character being pulled by the left-hand hook prop to move toward the left-hand position.
  • that is, during the traction process, the virtual character can launch the left-hand hook prop; once the virtual character launches the left-hand hook prop, the fixed connection between the right-hand hook prop and the right-hand position is canceled, and the launched left-hand hook prop is fixedly connected to the left-hand position it finally reaches.
  • in this way, the arrival position can be changed during the traction process, which helps the virtual character adjust the arrival position in time.
  • the position of the left-hand control is set to the left of the right-hand control position, which is in line with the user's operating habits and further improves the user's operating experience.
  • a virtual character located in the virtual environment and a left-hand use control are displayed, the virtual character possesses a left-hand hook prop, and the left-hand use control is a control used to trigger the use of the left-hand hook prop.
  • the left-hand use control is a control used to trigger the use of the left-hand hook prop.
  • a screen is displayed in which the virtual character launches the left-hand hook prop to the left hand position (the position where the left-hand hook prop finally arrives after launching the left-hand hook prop), and the virtual character is hooked by the left hand. The prop is moving to the left hand position.
  • the left-hand control is canceled and the right-hand control is displayed; the right-hand control is The display position is to the right compared to the display position of the left-handed control.
  • the virtual character is controlled to launch the right-hand hook prop to the right-hand position, and the fixed connection relationship between the left-hand hook prop and the left-hand position is released; the virtual character is displayed to be pulled toward the right-hand hook prop. Picture of the right hand position moving.
  • the virtual character can launch the right hand hook prop.
  • when the virtual character launches the right-hand hook prop, the fixed connection between the left-hand hook prop and the left-hand position is cancelled, and the launched right-hand hook prop is fixedly connected to the right-hand position it finally reaches.
  • in this way, the arrival position can be changed during the towing process, helping the virtual character adjust its arrival position in time.
  • the position of the right-hand use control is set to the right of the position of the left-hand use control, which is in line with the user's operating habits and further improves the user's operating experience (an illustrative code sketch of this left/right control switching follows this summary).
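The left/right switching summarised above can be illustrated with a short sketch. The Python code below is purely illustrative and not the disclosed implementation: the class and method names (HookControlUI, on_drag, on_trigger) and the drag-amplitude threshold are assumptions introduced for this example.

    # Illustrative sketch only: hypothetical names, not the patented implementation.

    DRAG_THRESHOLD = 80  # assumed minimum drag amplitude, in screen pixels

    class HookControlUI:
        """Tracks which hand's use control is shown and which hook is anchored."""

        def __init__(self):
            self.visible_control = "right"   # the right-hand use control is shown first
            self.anchored_hook = None        # ("right" | "left", position) once fixed

        def on_drag(self, control, dx):
            """A leftward drag on the right-hand control swaps in the left-hand control,
            and a rightward drag does the reverse, mirroring the summary above."""
            if control == "right" and dx <= -DRAG_THRESHOLD:
                self.visible_control = "left"    # hide right-hand control, show left
            elif control == "left" and dx >= DRAG_THRESHOLD:
                self.visible_control = "right"   # hide left-hand control, show right

        def on_trigger(self, control, target_position):
            """First triggering operation: launch the newly shown hand's hook and
            release the fixed connection of the previously anchored hook."""
            if control != self.visible_control:
                return
            self.anchored_hook = (control, target_position)  # new fixed connection
            print(control + "-hand hook launched to", target_position)

    ui = HookControlUI()
    ui.on_drag("right", dx=-120)              # drag the right-hand control to the left
    ui.on_trigger("left", (10.0, 4.0, 2.0))   # launch the left-hand hook to the left-hand position

In this reading, a sufficiently large leftward drag only swaps which use control is visible; the next trigger both anchors the new hook and releases the previous fixed connection.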
  • Figure 12 is a flow chart of a method of using virtual props provided by an exemplary embodiment of the present application.
  • the method may be executed by a terminal in the system as shown in Figure 2 or by a client on the terminal.
  • the method includes the following steps (an illustrative code sketch of the flow follows step 1214):
  • Step 1201 Start.
  • Step 1202 Trigger the first usage control.
  • a virtual environment and a first use control are displayed on the user interface.
  • the virtual environment includes a virtual character.
  • the limbs of the virtual character are equipped with hook props.
  • the first use control is a control used to trigger the use of the hook props.
  • Step 1203 Launch the hook prop.
  • the client controls the virtual character to launch the hook prop to the first position, and controls the hook prop to pull the virtual character to move to the first position.
  • Step 1204 Long press to trigger the first use control.
  • Step 1205 Switch the first control to the first roulette control.
  • In response to the triggering duration of the first usage control being greater than the first threshold, the client switches the first usage control to the first roulette control.
  • Step 1206 Trigger the first roulette control.
  • step 1207 is performed in response to the direction selection operation on the first roulette control.
  • Step 1207 Change the viewing angle of the camera model.
  • In response to the direction selection operation on the first roulette control, the virtual character is controlled to move to the first position along a first trajectory, and the observation perspective of the camera model is changed based on the direction selection operation.
  • Step 1208 Trigger the second usage control.
  • In response to the first triggering operation on the first use control, the client displays the second use control in the user interface.
  • the second use control is a control used to control the virtual character to jump in the air.
  • In response to the first triggering operation on the second use control, the client performs step 1209.
  • Step 1209 Jump into the air towards the second position.
  • The client controls the virtual character to take the first position as a fulcrum and jump toward the second position along a second trajectory.
  • Step 1210 Long press to trigger the second usage control.
  • Step 1211 Switch the second usage control to the second roulette control.
  • In response to the triggering duration of the second usage control being greater than the second threshold, the client switches the second usage control to the second roulette control.
  • Step 1212 Trigger the second roulette control.
  • the client executes step 1213 in response to the direction selection operation on the second roulette control.
  • Step 1213 Change the aerial jumping path of the virtual character.
  • While the virtual character takes the first position as the fulcrum and jumps toward the second position along the second trajectory, in response to the direction selection operation on the second roulette control, a picture is displayed of the virtual character jumping toward the second position along a third trajectory corresponding to the observation perspective of the camera model.
  • Step 1214 End.
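The flow of Figure 12 can be read as a small state machine: launch, optional long press into a roulette control, view change during the pull, jump, and an optional path change during the jump. The sketch below only mirrors the ordering of steps 1201-1214 under assumed names and thresholds; the actual client logic, camera model and trajectories are not disclosed at this level of detail.

    # Simplified, assumption-laden sketch of the step 1201-1214 flow; names are hypothetical.

    FIRST_THRESHOLD = 0.5   # assumed long-press threshold for the first use control, in seconds
    SECOND_THRESHOLD = 0.5  # assumed long-press threshold for the second use control, in seconds

    class Client:
        def __init__(self):
            self.pulling = False      # hook prop is pulling the character (steps 1203-1207)
            self.jumping = False      # character is jumping in the air (steps 1209-1213)
            self.camera_yaw = 0.0

        # Steps 1202/1203: trigger the first use control -> launch the hook prop.
        def trigger_first_control(self, first_position):
            self.pulling = True
            print("hook launched to", first_position)

        # Steps 1204/1205: a long press switches the first use control to the first roulette control.
        def press_first_control(self, duration):
            return "first_roulette" if duration > FIRST_THRESHOLD else "first_use"

        # Steps 1206/1207: direction selection rotates the camera while the pull continues.
        def select_direction_on_first_roulette(self, delta_yaw):
            if self.pulling:
                self.camera_yaw += delta_yaw   # trajectory unchanged, only the view rotates

        # Steps 1208/1209: trigger the second use control -> jump toward the second position.
        def trigger_second_control(self, second_position):
            if self.pulling:
                self.pulling, self.jumping = False, True
                print("jumping toward", second_position, "with the first position as fulcrum")

        # Steps 1210/1211: a long press switches the second use control to the second roulette control.
        def press_second_control(self, duration):
            return "second_roulette" if duration > SECOND_THRESHOLD else "second_use"

        # Steps 1212/1213: direction selection changes the aerial jump path.
        def select_direction_on_second_roulette(self, delta_yaw):
            if self.jumping:
                self.camera_yaw += delta_yaw
                print("jump path switched to a third trajectory facing yaw", self.camera_yaw)

Read this way, the two long-press thresholds and the two roulette controls are symmetric: the first pair steers the view during the pull, the second pair steers the jump itself.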
  • Figure 13 shows a schematic structural diagram of a device for using virtual props provided by an exemplary embodiment of the present application.
  • the device can be implemented as all or part of the computer equipment through software, hardware, or a combination of both.
  • the device includes the following modules (an illustrative module skeleton follows the module descriptions):
  • the display module 1301 is used to display a virtual character located in the virtual environment and a first usage control.
  • the virtual character has a hook prop
  • the first usage control is a control used to trigger the use of the hook prop;
  • the display module 1301 is further configured to, in response to the first triggering operation on the first use control, display a picture of the virtual character launching the hook prop to the first position and a picture of the virtual character being pulled by the hook prop to move to the first position;
  • the display module 1301 is further configured to, in response to the second triggering operation on the first use control, display a screen in which the first use control is switched to the first roulette control;
  • the display module 1301 is also configured to, during the process of the virtual character being pulled by the hook prop to move to the first position, display, in response to a direction selection operation on the first roulette control, a perspective picture after the observation perspective is changed based on the direction selection operation.
  • the display module 1301 is further configured to display a screen in which the first use control is switched to the first roulette control in response to a long press trigger operation on the first use control;
  • the long press trigger operation is a press operation with a trigger time greater than the first threshold.
  • the display module 1301 is also configured to, while the hook prop pulls the virtual character to move toward the first position along a first trajectory, display, in response to the direction selection operation on the first roulette control, a perspective picture in which the virtual character keeps moving to the first position along the first trajectory and the observation perspective is changed based on the direction selection operation.
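Concretely, the direction selection on the first roulette control only rotates the observation perspective around the character; the pull toward the first position is left untouched. The sketch below illustrates that decoupling; the linear interpolation standing in for the first trajectory and the yaw/pitch fields are assumptions, not the disclosed camera model.

    # Illustrative decoupling of the pull trajectory and the observation perspective.

    def pull_position(start, first_position, t):
        """Linear interpolation stands in for the first trajectory (an assumption)."""
        return tuple(s + (f - s) * t for s, f in zip(start, first_position))

    class CameraModel:
        def __init__(self, yaw=0.0, pitch=0.0):
            self.yaw, self.pitch = yaw, pitch

        def apply_direction_selection(self, d_yaw, d_pitch=0.0):
            # Only the observation perspective changes; the character keeps moving
            # to the first position along the first trajectory.
            self.yaw += d_yaw
            self.pitch += d_pitch

    camera = CameraModel()
    for step in range(4):
        pos = pull_position((0.0, 0.0, 0.0), (9.0, 3.0, 0.0), t=step / 3)
        camera.apply_direction_selection(d_yaw=15.0)
        print(pos, "yaw:", camera.yaw)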
  • the hook prop includes a launching device, a shooting component and a traction component.
  • the launching device is fixed on the limb of the virtual character.
  • the traction component is connected to the launching device.
  • the shooting component is connected to the traction component.
  • the display module 1301 is further configured to, in response to the first triggering operation on the first use control, display a picture of the virtual character using the launching device to shoot the shooting component.
  • the display module 1301 is further configured to display, in response to the shooting component hitting the first position and being fixed there, a picture of the traction component pulling the virtual character to move toward the first position.
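The composition described above (a launching device fixed on a limb, a traction component connected to it, and a shooting component at the far end) can be modelled as a small data structure. The class and field names in the sketch below are assumptions made for illustration, not terms taken from the implementation.

    # Illustrative data model for the hook prop's parts; names are assumed.
    from dataclasses import dataclass, field
    from typing import Optional, Tuple

    Vec3 = Tuple[float, float, float]

    @dataclass
    class ShootingComponent:
        anchored_at: Optional[Vec3] = None     # the first position, once it hits and is fixed

    @dataclass
    class TractionComponent:
        length: float = 0.0                    # pay-out length of the rope between launcher and hook

    @dataclass
    class HookProp:
        limb: str                              # limb the launching device is fixed on, e.g. "right hand"
        traction: TractionComponent = field(default_factory=TractionComponent)
        shot: ShootingComponent = field(default_factory=ShootingComponent)

        def launch(self, target: Vec3):
            """The launching device shoots the shooting component toward the target;
            once it hits and is fixed, the traction component starts pulling."""
            self.shot.anchored_at = target

    hook = HookProp(limb="right hand")
    hook.launch((12.0, 3.0, 7.0))
    print(hook.shot.anchored_at)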
  • the display module 1301 is further configured to, during the process of the virtual character being pulled by the hook prop to move toward the first position, display a picture in which the virtual character keeps moving to the first position along the first trajectory and the observation perspective is changed based on the position of an enemy virtual character, the changed perspective picture containing the enemy virtual character.
  • the display module 1301 is further configured to display a second usage control in response to the first triggering operation on the first usage control, where the second usage control is a control used to control the virtual character to jump in the air.
  • the display module 1301 is also configured to, while the virtual character is pulled by the hook prop to move to the first position, display, in response to the first triggering operation on the second use control, a picture of the virtual character taking the first position as a fulcrum and jumping into the air toward the second position.
  • the display module 1301 is further configured to display a screen in which the second use control is switched to a second roulette control in response to a second triggering operation on the second use control.
  • the display module 1301 is also configured to, while the virtual character takes the first position as a fulcrum and jumps toward the second position, display, in response to the direction selection operation on the second roulette control, an air-jump picture after the air-jump path is changed based on the direction selection operation.
  • the display module 1301 is also configured to, during the process of the hook prop pulling the virtual character in a first state to move toward the first position, display, in response to the first triggering operation on the second use control, a picture of the hook prop taking the first position as a fulcrum and supporting the virtual character in a second state to jump into the air toward the second position, wherein the first state is different from the second state.
  • the display module 1301 is further configured to, in response to a long press triggering operation on the second use control, display a screen in which the second use control is switched to the second roulette control;
  • the long press trigger operation is a press operation with a trigger time greater than the second threshold.
  • the display module 1301 is also configured to, while the virtual character takes the first position as the fulcrum and jumps toward the second position along the second trajectory, display, in response to the direction selection operation on the second roulette control, a picture of the virtual character jumping in the air toward the second position along a third trajectory, where the second trajectory is different from the third trajectory.
  • the display module 1301 is also configured to display, in response to the direction selection operation on the second roulette control, a picture of the virtual character jumping toward the second position along the third trajectory corresponding to the observation perspective.
  • the display module 1301 is also configured to determine, when the virtual character is at the first position, a first direction corresponding to the observation perspective, and to display a picture of the virtual character jumping into the air toward the second position along the third trajectory that is tangent to the first direction.
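One way to read the tangency condition above: the third trajectory leaves the first position with an initial direction equal to the first direction given by the observation perspective, then bends toward the second position. The quadratic Bezier construction below is only a plausible illustration, not the disclosed algorithm; the blend factor and the use of numpy are assumptions.

    # Hypothetical construction of a "third trajectory" tangent to the viewing direction.
    import numpy as np

    def tangent_jump_curve(first_pos, second_pos, view_dir, blend=0.5, samples=8):
        """Quadratic Bezier from first_pos to second_pos whose start tangent is view_dir.

        The control point lies along the normalised viewing direction, so the curve's
        derivative at the first position is parallel to view_dir (the tangency above)."""
        p0 = np.asarray(first_pos, dtype=float)
        p2 = np.asarray(second_pos, dtype=float)
        d = np.asarray(view_dir, dtype=float)
        d = d / np.linalg.norm(d)
        p1 = p0 + d * blend * np.linalg.norm(p2 - p0)   # control point along the view direction
        ts = np.linspace(0.0, 1.0, samples)
        return [(1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2 for t in ts]

    for point in tangent_jump_curve((0, 0, 0), (10, 0, 5), view_dir=(1, 1, 0)):
        print(np.round(point, 2))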
  • the display module 1301 is further configured to display the second usage control during the process of the virtual character launching the hook prop to the first position in response to the first triggering operation on the first use control;
  • the second use control is displayed while the virtual character is being pulled by the hook prop to move toward the first position.
  • the display module 1301 is further configured to highlight the second usage control in the user interface in response to the first triggering operation on the first usage control;
  • the method of highlighting includes at least one of highlighting, inverting, bolding, adding background color, adding prompt lines, and presenting dynamic visual effects.
  • the display module 1301 is also configured to cancel the display of the second usage control when the virtual character reaches the first position;
  • the display module 1301 is also configured to, during the process of the virtual character taking the first position as a fulcrum and jumping toward the second position, display, in response to an obstacle existing on the jump path toward the second position, a picture of the virtual character avoiding the obstacle and continuing to jump toward the second position.
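The obstacle handling in the item above could be realised with a simple clearance check along the planned jump path. The sketch below is a hedged illustration only: spherical obstacles, straight-line sampling and an upward detour are all assumptions rather than the patented avoidance method.

    # Assumed obstacle-avoidance check along a sampled jump path; purely illustrative.
    import math

    def segment_points(a, b, n=16):
        """Sample n+1 points on the straight segment from a to b (a stand-in path)."""
        return [tuple(a[i] + (b[i] - a[i]) * t / n for i in range(3)) for t in range(n + 1)]

    def avoid_obstacles(path, obstacles, lift=1.5):
        """Raise any sample point that falls inside a spherical obstacle.

        obstacles: list of (center, radius). The character keeps heading toward the
        second position but detours over blocking geometry, as described above."""
        adjusted = []
        for p in path:
            for center, radius in obstacles:
                if math.dist(p, center) < radius:
                    p = (p[0], p[1] + lift, p[2])   # assumed detour: lift the point upward
                    break
            adjusted.append(p)
        return adjusted

    path = segment_points((0.0, 0.0, 0.0), (10.0, 2.0, 0.0))
    print(avoid_obstacles(path, obstacles=[((5.0, 1.0, 0.0), 1.0)]))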
  • the first control is a right-hand control
  • the hook prop is a right-hand hook prop
  • the first position is the right-hand position.
  • the display module 1301 is also configured to cancel the display of the right-hand control and display the left-hand control in response to the leftward drag operation on the right-hand control while the virtual character is pulled by the right-hand hook prop to move to the right-hand position; and, in response to the first triggering operation on the left-hand control, to control the virtual character to launch the left-hand hook prop to the left-hand position and release the fixed connection relationship between the right-hand hook prop and the right-hand position;
  • the right-hand control is used to trigger the use of the right-hand hook prop
  • the left-hand control is used to trigger the use of the left-hand hook prop.
  • the display position of the left-hand control is to the left compared to the display position of the right-hand control.
  • the first control is a left-hand control
  • the hook prop is a left-hand hook prop
  • the first position is the left-hand position
  • the display module 1301 is also configured to, while the virtual character is pulled by the left-hand hook prop to move toward the left-hand position, cancel the display of the left-hand control and display the right-hand control in response to the rightward drag operation on the left-hand control; and, in response to the first triggering operation on the right-hand control, to control the virtual character to launch the right-hand hook prop to the right-hand position and release the fixed connection relationship between the left-hand hook prop and the left-hand position;
  • the left-hand control is used to trigger the use of the left-hand hook prop
  • the right-hand control is used to trigger the use of the right-hand hook prop.
  • the display position of the right-hand control is to the right compared to the display position of the left-hand control.
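Viewed as software, the apparatus of Figure 13 is essentially a display module exposing the operations listed above. The skeleton below is a non-authoritative sketch; the method names and signatures are invented for illustration and do not correspond to any disclosed API.

    # Hypothetical skeleton of the apparatus of Figure 13; names are illustrative only.

    class DisplayModule:
        """Rough analogue of display module 1301: every operation renders a picture."""

        def show_character_and_first_control(self): ...
        def show_hook_launch_and_pull(self, first_position): ...
        def switch_first_control_to_roulette(self): ...
        def show_view_after_direction_selection(self, direction): ...
        def show_second_control(self, highlighted=True): ...
        def show_air_jump(self, second_position, trajectory): ...
        def swap_hand_controls(self, from_hand, to_hand): ...

    class VirtualPropApparatus:
        """The device groups the module(s); here only the display module is sketched."""

        def __init__(self):
            self.display_module = DisplayModule()   # corresponds to module 1301

        def on_first_trigger(self, first_position):
            self.display_module.show_hook_launch_and_pull(first_position)
            self.display_module.show_second_control()   # second use control appears during the pull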
  • FIG 14 shows a structural block diagram of a computer device 1400 provided by an exemplary embodiment of the present application.
  • the computer device 1400 can be a portable mobile terminal, such as a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, or an MP4 (Moving Picture Experts Group Audio Layer IV) player.
  • the computer device 1400 may also be called a user device, a portable terminal, or other names.
  • the computer device 1400 includes: a processor 1401 and a memory 1402.
  • the processor 1401 may include one or more processing cores, such as a 4-core processor, an 8-core processor, etc.
  • the processor 1401 can be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1401 may also include a main processor and a co-processor.
  • the main processor is a processor used to process data in the wake-up state, also called the CPU (Central Processing Unit); the coprocessor is a low-power processor used to process data in the standby state.
  • the processor 1401 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content that needs to be displayed on the display screen.
  • the processor 1401 may also include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • Memory 1402 may include one or more computer-readable storage media, which may be tangible and non-transitory. Memory 1402 may also include high-speed random access memory and non-volatile memory, such as one or more disk storage devices or flash memory storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 1402 is used to store at least one instruction, and the at least one instruction is configured to be executed by the processor 1401 to implement the method of using virtual props provided in the embodiments of the present application.
  • the computer device 1400 optionally further includes a peripheral device interface 1403.
  • the structure shown in FIG. 14 does not constitute a limitation on the computer device 1400; the device may include more or fewer components than shown, combine certain components, or adopt a different component arrangement.
  • An embodiment of the present application also provides a computer device.
  • the computer device includes a processor and a memory; at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the method of using virtual props provided by the above method embodiments.
  • Embodiments of the present application also provide a computer storage medium.
  • the computer-readable storage medium stores at least one computer program.
  • the at least one computer program is loaded and executed by the processor to implement the method of using virtual props provided by the above method embodiments.
  • Embodiments of the present application also provide a computer program product.
  • the computer program product includes a computer program.
  • the computer program is stored in a computer-readable storage medium.
  • a processor of the computer device reads the computer program from the computer-readable storage medium and executes it, so that the computer device executes the method of using virtual props provided by the above method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

Disclosed in the present application are a method and apparatus for using virtual objects, a device, a medium and a program product, belonging to the field of human-computer interaction. The method comprises the steps of: displaying a virtual character that carries a grappling-hook object and is located in a virtual environment, together with a first use control (302); in response to a first triggering operation on the first use control, displaying a picture of the virtual character launching the grappling-hook object to a first position and a picture of the virtual character being pulled by the grappling-hook object to move toward the first position (304); in response to a second triggering operation on the first use control, displaying a picture of the first use control being switched to a first roulette control (306); and in response to a direction selection operation on the first roulette control, displaying a viewing-angle picture after an observation viewing angle is changed on the basis of the direction selection operation (308). In the present application, by means of a hidden control, the viewing angle is switched while the virtual character is being pulled to move by the grappling-hook object, so that the proportion of the user interface occupied by the control is reduced.
PCT/CN2023/079804 2022-04-14 2023-03-06 Procédé et appareil de d'utilisation d'objets virtuels, dispositif, support et produit programme WO2023197777A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210391863.0A CN116943214A (zh) 2022-04-14 2022-04-14 虚拟道具的使用方法、装置、设备、介质及程序产品
CN202210391863.0 2022-04-14

Publications (1)

Publication Number Publication Date
WO2023197777A1 true WO2023197777A1 (fr) 2023-10-19

Family

ID=88328799

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/079804 WO2023197777A1 (fr) 2022-04-14 2023-03-06 Procédé et appareil de d'utilisation d'objets virtuels, dispositif, support et produit programme

Country Status (2)

Country Link
CN (1) CN116943214A (fr)
WO (1) WO2023197777A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130225289A1 (en) * 2010-09-30 2013-08-29 Kabushiki Kaisha Sega Dba Sega Corporation Image processing apparatus and computer-readable storage medium
US20170011554A1 (en) * 2015-07-01 2017-01-12 Survios, Inc. Systems and methods for dynamic spectating
CN111158469A (zh) * 2019-12-12 2020-05-15 广东虚拟现实科技有限公司 视角切换方法、装置、终端设备及存储介质
CN111589146A (zh) * 2020-04-27 2020-08-28 腾讯科技(深圳)有限公司 基于虚拟环境的道具操作方法、装置、设备及存储介质
CN112330823A (zh) * 2020-11-05 2021-02-05 腾讯科技(深圳)有限公司 虚拟道具的显示方法、装置、设备及可读存储介质
CN113633975A (zh) * 2021-08-19 2021-11-12 腾讯科技(深圳)有限公司 虚拟环境画面的显示方法、装置、终端及存储介质


Also Published As

Publication number Publication date
CN116943214A (zh) 2023-10-27

Similar Documents

Publication Publication Date Title
AU2021250929B2 (en) Virtual object control method and apparatus, device, and storage medium
JP7476235B2 (ja) 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム
CN110548288B (zh) 虚拟对象的受击提示方法、装置、终端及存储介质
WO2021244322A1 (fr) Procédé et appareil permettant de viser un objet virtuel, dispositif, et support de stockage
TWI831066B (zh) 虛擬場景中狀態切換方法、裝置、設備、媒體及程式產品
WO2022042435A1 (fr) Procédé et appareil permettant d'afficher une image d'environnement virtuel et dispositif et support de stockage
JP7325664B2 (ja) 仮想オブジェクトの制御方法及び装置、端末、並びに、コンピュータプログラム
JP2022535675A (ja) 仮想オブジェクトの制御方法並びにその、装置、端末及びコンピュータプログラム
WO2022156486A1 (fr) Procédé et appareil de placement d'articles virtuels, terminal, support de stockage et produit programme
CN113398601B (zh) 信息发送方法、信息发送装置、计算机可读介质及设备
CN112416196B (zh) 虚拟对象的控制方法、装置、设备及计算机可读存储介质
WO2022257653A1 (fr) Procédé et appareil d'affichage d'accessoire virtuel, dispositif électronique et support de stockage
WO2022121503A1 (fr) Procédé et appareil d'affichage d'accessoires de pré-commande, dispositif, support et produit
CN111359208A (zh) 游戏中标记信号生成的方法及装置、电子设备、存储介质
CN112691366A (zh) 虚拟道具的显示方法、装置、设备及介质
JP2023164787A (ja) 仮想環境の画面表示方法、装置、機器及びコンピュータプログラム
CN113546422A (zh) 虚拟资源的投放控制方法、装置、计算机设备及存储介质
WO2024098628A1 (fr) Procédé et appareil d'interaction de jeu, dispositif terminal et support de stockage lisible par ordinateur
WO2023197777A1 (fr) Procédé et appareil de d'utilisation d'objets virtuels, dispositif, support et produit programme
WO2022170892A1 (fr) Procédé et appareil de commande d'objet virtuel, dispositif, support de stockage, et produit de programme
CN112138392B (zh) 虚拟对象的控制方法、装置、终端及存储介质
TWI843042B (zh) 虛擬道具的投放方法、裝置、終端、儲存媒體及程式產品
CN116712733A (zh) 虚拟角色的控制方法、装置、电子设备及存储介质
CN115089968A (zh) 一种游戏中的操作引导方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23787425

Country of ref document: EP

Kind code of ref document: A1