WO2021244322A1 - Method, apparatus, device, and storage medium for aiming at a virtual object


Info

Publication number
WO2021244322A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual object
aiming
target
virtual
selection range
Prior art date
Application number
PCT/CN2021/095101
Other languages
English (en)
French (fr)
Inventor
万钰林
翁建苗
胡勋
粟山东
张勇
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to JP2021566345A (JP2022539289A)
Priority to EP21783116.3A (EP3950079A4)
Priority to KR1020217035482A (KR20210151861A)
Priority to US17/505,226 (US11893217B2)
Publication of WO2021244322A1
Priority to US18/398,972 (US20240143145A1)

Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 — Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 — Processing input control signals by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426 — Mapping input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A63F 13/50 — Controlling the output signals based on the game progress
    • A63F 13/53 — Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/533 — Additional visual information for prompting the player, e.g. by displaying a game menu
    • A63F 13/537 — Additional visual information using indicators, e.g. showing the condition of a game character on screen
    • A63F 13/5372 — Indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
    • A63F 13/55 — Controlling game characters or game objects based on the game progress
    • A63F 13/56 — Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 — Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 — Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0484 — Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 — Selection of displayed objects or displayed text elements
    • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30 — Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308 — Details of the user interface
    • A63F 2300/80 — Features specially adapted for executing a specific type of game
    • A63F 2300/807 — Role playing or strategy games

Definitions

  • the embodiments of the present application relate to the field of virtual environments, and in particular, to a method, apparatus, device, and storage medium for aiming at virtual objects.
  • a battle game is a game in which multiple virtual objects compete in the same virtual world.
  • the battle game is a multiplayer online tactical competitive game (Multiplayer Online Battle Arena, MOBA).
  • the first virtual object controlled by the user has directional skills.
  • the terminal will display a fan-shaped skill indicator.
  • the fan-shaped skill indicator includes a fan-shaped area at the foot of the first virtual object, and the symmetry axis of the fan-shaped area is the line of sight.
  • the user can drag the fan-shaped skill indicator to rotate around the first virtual object.
  • the candidate virtual object that is located in the fan-shaped area and closest to the line of sight is determined as the targeted virtual object, and the user controls the first virtual object to release a skill at the target virtual object.
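The related-art sector selection described above (a fan-shaped area whose symmetry axis is the line of sight, with the in-area candidate closest to that line chosen) might be sketched as follows; the function name, coordinate convention, and the 45° half-angle are illustrative assumptions, not taken from the patent.

```python
import math

def pick_target_in_sector(hero, aim_dir, candidates, radius, half_angle_deg=45.0):
    """Among candidates inside the fan-shaped area in front of `hero`,
    return the one closest to the aiming line (the sector's symmetry
    axis). Positions are (x, y) ground-plane coordinates; `aim_dir` is
    a unit vector along the aiming line.
    """
    best, best_off_axis = None, float("inf")
    for cand in candidates:
        dx, dy = cand[0] - hero[0], cand[1] - hero[1]
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > radius:
            continue  # outside the sector's radius
        cos_a = (dx * aim_dir[0] + dy * aim_dir[1]) / dist
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle > half_angle_deg:
            continue  # outside the fan's angular span
        off_axis = dist * math.sin(math.radians(angle))  # distance to the aiming line
        if off_axis < best_off_axis:
            best, best_off_axis = cand, off_axis
    return best
```

Note that this rule ignores how far a candidate is along the line, which is part of why active aiming with a sector indicator can feel imprecise.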
  • a method of aiming at a virtual object including:
  • a user interface is displayed, the user interface including a screen of a virtual environment, the screen being obtained by observing the virtual environment with a first virtual object as the observation center, the screen including the first virtual object and a second virtual object located in the virtual environment;
  • in response to an aiming instruction, displaying a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
  • controlling the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from the second virtual objects located in a target selection range, and the target selection range being a selection range determined with the aiming point as a reference.
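As a rough sketch of the selection step above, the following assumes the target selection range is a circle of a given radius around the aiming point and that the nearest in-range second virtual object is chosen; both are illustrative assumptions, since the exact range shape and selection rule are left to the embodiments.

```python
import math

def select_target(aiming_point, second_objects, range_radius):
    """Pick the target virtual object from the second virtual objects
    inside the target selection range. The range is assumed to be a
    circle of `range_radius` centered on the aiming point, and the
    object nearest the aiming point is assumed to win.
    """
    in_range = [obj for obj in second_objects
                if math.dist(aiming_point, obj) <= range_radius]
    if not in_range:
        return None  # no second virtual object in the selection range
    return min(in_range, key=lambda obj: math.dist(aiming_point, obj))
```

Because the range follows the aiming point rather than the first virtual object, the user can place it precisely on the intended target.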
  • a device for aiming at a virtual object comprising:
  • the display module is configured to display a user interface, the user interface including a screen of a virtual environment, the screen being obtained by observing the virtual environment with a first virtual object as the observation center, and the screen including a first virtual object and a second virtual object located in the virtual environment;
  • the display module is further configured to display a point-type aiming indicator in the virtual environment in response to an aiming instruction, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
  • the aiming module is used to control the first virtual object to aim at the target virtual object, the target virtual object being a virtual object selected from the second virtual objects located in the target selection range, and the target selection range being a selection range determined with the aiming point as a reference.
  • a computer device includes a processor and a memory.
  • the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for aiming at a virtual object described in the above aspect.
  • a computer-readable storage medium stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the method for aiming at a virtual object described in the above aspect.
  • a computer program product, which, when run on a computer device, causes the computer device to execute the method for aiming at a virtual object described in the above aspect.
  • FIG. 1 is a structural block diagram of a computer system provided by an exemplary embodiment of the present application.
  • FIG. 2 is a schematic diagram of a state synchronization technology provided by an exemplary embodiment of the present application.
  • FIG. 3 is a schematic diagram of a frame synchronization technology provided by an exemplary embodiment of the present application.
  • FIG. 4 is a schematic diagram of a roulette aiming control provided by an exemplary embodiment of the present application.
  • FIG. 5 is a schematic diagram of two triggering modes of a roulette aiming control provided by another exemplary embodiment of the present application.
  • FIG. 6 is a schematic diagram of an interface of a method for aiming at a virtual object according to another exemplary embodiment of the present application.
  • FIG. 7 is a flowchart of a method for aiming at a virtual object provided by an exemplary embodiment of the present application.
  • FIG. 8 is a flowchart of a method for aiming at a virtual object provided by another exemplary embodiment of the present application.
  • FIG. 9 is a schematic diagram of mapping in a method for aiming at a virtual object provided by another exemplary embodiment of the present application.
  • FIG. 10 is a schematic diagram of a target selection range provided by an exemplary embodiment of the present application.
  • FIG. 11 is a schematic diagram of a target selection range provided by another exemplary embodiment of the present application.
  • FIG. 12 is a schematic diagram of pre-aiming at a virtual object provided by another exemplary embodiment of the present application.
  • FIG. 13 is a flowchart of a method for aiming at a virtual object according to another exemplary embodiment of the present application.
  • FIG. 14 is a schematic diagram of prioritizing aiming according to distance provided by another exemplary embodiment of the present application.
  • FIG. 15 is a schematic diagram of prioritizing aiming according to the percentage of blood volume provided by another exemplary embodiment of the present application.
  • FIG. 16 is a schematic diagram of a target selection range provided by another exemplary embodiment of the present application.
  • FIG. 17 is a flowchart of a method for aiming at a virtual object according to another exemplary embodiment of the present application.
  • FIG. 18 is a schematic structural diagram of program classes provided by another exemplary embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of program classes provided by another exemplary embodiment of the present application.
  • FIG. 20 is a schematic diagram of a scene of aiming at a virtual object provided by another exemplary embodiment of the present application.
  • FIG. 21 is a schematic diagram of a scene of aiming at a virtual object provided by another exemplary embodiment of the present application.
  • FIG. 22 is a block diagram of an apparatus for aiming at a virtual object according to another exemplary embodiment of the present application.
  • FIG. 23 is a block diagram of a terminal provided by another exemplary embodiment of the present application.
  • Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal.
  • the virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional world, or a purely fictitious three-dimensional world.
  • the virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment.
  • the virtual environment is also used for a virtual environment battle between at least two virtual objects, and there are virtual resources available for the at least two virtual objects in the virtual environment.
  • the virtual environment includes a symmetrical lower-left corner area and an upper-right corner area; virtual objects belonging to two rival camps each occupy one of the areas, and take destroying the target building/base/crystal deep in the opponent's area as the victory goal.
  • Virtual object: a movable object in the virtual environment.
  • the movable object may be at least one of a virtual character, a virtual animal, and an animation character.
  • when the virtual environment is a three-dimensional virtual environment, the virtual object may be a three-dimensional virtual model; each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of its space.
  • a virtual object is a three-dimensional character constructed based on three-dimensional human skeleton technology.
  • the virtual object presents different appearances by wearing different skins.
  • the virtual object may also be implemented using a 2.5-dimensional or 2-dimensional model, which is not limited in the embodiment of the present application.
  • the virtual object is controlled by the user through the client, or the virtual object is controlled by the server.
  • Multiplayer online tactical competition refers to: in a virtual environment, different virtual teams belonging to at least two rival camps occupy their respective map areas and compete with a certain victory condition as the goal.
  • the victory conditions include, but are not limited to, at least one of: occupying strongholds or destroying the enemy camp's strongholds, killing virtual objects of the enemy camp, surviving in a specified scene and time, grabbing a certain resource, and surpassing the opponent's score within a specified time.
  • Tactical competition can be carried out in units of rounds, and the map of each round of tactical competition can be the same or different.
  • Each virtual team includes one or more virtual objects, such as 1, 2, 3, or 5.
  • MOBA game: a game in which several strongholds are provided in a virtual environment, and users in different camps control virtual objects to fight in the virtual environment, occupy strongholds, or destroy enemy camp strongholds. For example, a MOBA game can divide users into two rival camps and disperse the user-controlled virtual objects in the virtual environment to compete with each other, with destroying or occupying all of the enemy's strongholds as the victory condition. A MOBA game is played in rounds, and a round lasts from the moment the game starts to the moment the victory condition is fulfilled.
  • UI (User Interface) control: any visual control or element that can be seen on the user interface of an application, such as pictures, input boxes, text boxes, buttons, and labels.
  • Some UI controls respond to user operations; for example, the user triggers a skill control to control the master virtual object to release a skill.
  • the UI controls involved in the embodiments of this application include, but are not limited to: skill controls and mobile controls.
  • the candidate virtual object closest to the line of sight is determined as the target virtual object to be targeted.
  • as a result, the operation cost of active aiming in a MOBA game is high.
  • the embodiments of the present application provide a method, apparatus, device, and storage medium for aiming at a virtual object, which can improve the user's accuracy when actively aiming.
  • Fig. 1 shows a structural block diagram of a computer system to which the method for aiming a virtual object provided by an exemplary embodiment of the present application is applicable.
  • the computer system 100 includes: a first terminal 110, a server 120, and a second terminal 130.
  • the first terminal 110 installs and runs a client 111 supporting a virtual environment, and the client 111 may be a multiplayer online battle program.
  • the user interface of the client 111 is displayed on the screen of the first terminal 110.
  • the client can be any one of a military simulation program, a battle royale shooting game, a virtual reality (VR) application, an augmented reality (AR) program, a three-dimensional map program, a virtual reality game, an augmented reality game, a first-person shooting game (FPS), a third-person shooting game (TPS), a multiplayer online battle arena (MOBA) game, and a strategy game (SLG).
  • the client is a MOBA game as an example.
  • the first terminal 110 is a terminal used by the first user 112.
  • the first user 112 uses the first terminal 110 to control a first virtual object located in a virtual environment to perform activities.
  • the first virtual object may be referred to as the master virtual object of the first user 112.
  • the activities of the first virtual object include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing.
  • the first virtual object is a first virtual character, such as a simulated character or an animation character.
  • the second terminal 130 installs and runs a client 131 supporting a virtual environment, and the client 131 may be a multiplayer online battle program.
  • the user interface of the client 131 is displayed on the screen of the second terminal 130.
  • the client can be any of military simulation programs, battle royale shooting games, VR applications, AR programs, three-dimensional map programs, virtual reality games, augmented reality games, FPS, TPS, MOBA, and SLG.
  • the client is a MOBA game as an example.
  • the second terminal 130 is a terminal used by the second user 113.
  • the second user 113 uses the second terminal 130 to control a second virtual object located in the virtual environment to perform activities.
  • the second virtual object may be referred to as the master virtual object of the second user 113.
  • Illustratively, the second virtual object is a second virtual character, such as a simulated character or an animation character.
  • the first virtual character and the second virtual character are in the same virtual environment.
  • the first virtual character and the second virtual character can belong to the same camp, the same team, the same organization, have a friendship relationship, or have temporary communication permissions.
  • the first virtual character and the second virtual character may belong to different camps, different teams, different organizations, or have a hostile relationship.
  • the clients installed on the first terminal 110 and the second terminal 130 are the same, or the clients installed on the two terminals are the same type of client on different operating system platforms (Android or iOS).
  • the first terminal 110 may generally refer to one of multiple terminals, and the second terminal 130 may generally refer to another of multiple terminals. This embodiment only uses the first terminal 110 and the second terminal 130 as examples.
  • the device types of the first terminal 110 and the second terminal 130 are the same or different.
  • the device types include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
  • Only two terminals are shown in FIG. 1, but in different embodiments there are multiple other terminals 140 that can access the server 120. There are also one or more terminals 140 corresponding to the developer.
  • a development and editing platform supporting the virtual environment of the client is installed on the terminal 140.
  • the developer can edit and update the client on the terminal 140 and transmit the updated client installation package to the server 120 via a wired or wireless network, and the first terminal 110 and the second terminal 130 can download the client installation package from the server 120 to update the client.
  • the first terminal 110, the second terminal 130, and other terminals 140 are connected to the server 120 through a wireless network or a wired network.
  • the server 120 includes at least one of a server, multiple servers, a cloud computing platform, and a virtualization center.
  • the server 120 is used to provide background services for the clients supporting the three-dimensional virtual environment.
  • the server 120 is responsible for the main calculation work, and the terminal is responsible for the secondary calculation work; or, the server 120 is responsible for the secondary calculation work, and the terminal is responsible for the main calculation work; or, the server 120 and the terminal adopt a distributed computing architecture for collaborative calculation.
  • the server 120 includes a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (Input/Output Interface, I/O interface) 125.
  • the processor 122 is used to load instructions stored in the server 120 and to process data in the user account database 123 and the battle service module 124; the user account database 123 is used to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals 140, such as the avatar, nickname, and combat power index of a user account and the service area where the user account is located; the battle service module 124 is used to provide multiple battle rooms for users to compete in, such as 1v1, 3v3, and 5v5 battles;
  • the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 via a wireless network or a wired network.
  • the server 120 may adopt a synchronization technology to make the images of multiple clients consistent.
  • the synchronization technology adopted by the server 120 includes: state synchronization technology or frame synchronization technology.
  • the server 120 uses state synchronization technology to synchronize with multiple clients.
  • the battle logic runs in the server 120.
  • the server 120 sends the state synchronization result to all clients, such as clients 1 to 10.
  • for example, the client 1 sends a request to the server 120 for the virtual object 1 to release a frost skill; the server 120 determines whether the frost skill is allowed to be released and, if so, computes the damage value dealt to another virtual object 2. The server 120 then sends the skill release result to all clients, and all clients update their local data and interface presentation according to the skill release result.
  • the server 120 uses frame synchronization technology to synchronize with multiple clients.
  • the battle logic runs in each client.
  • Each client sends a frame synchronization request to the server, and the frame synchronization request carries the client's local data changes.
  • the server 120 forwards the frame synchronization request to all clients.
  • after each client receives the frame synchronization request, it processes the request according to its local combat logic and updates its local data and interface presentation.
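The frame synchronization flow above (battle logic on each client, server only forwarding frame requests) might be sketched like this; all class and method names are invented for illustration.

```python
class Client:
    """Toy client: battle logic runs locally; here we only record the
    frames received, to show that every client sees identical input."""
    def __init__(self):
        self.frames = []

    def apply_frame(self, frame):
        # Local combat logic would process `frame` here.
        self.frames.append(frame)


class FrameSyncServer:
    """Toy server: performs no battle computation, only forwards each
    frame synchronization request (carrying a client's local data
    changes) to every client."""
    def __init__(self, clients):
        self.clients = clients

    def on_frame_request(self, frame):
        for client in self.clients:
            client.apply_frame(frame)
```

Because every client replays the same ordered frame stream with the same deterministic logic, their local states stay consistent without the server ever knowing the battle state.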
  • the screen display method of the virtual environment provided by the embodiment of the present application will be described in combination with the above introduction of the virtual environment and the description of the implementation environment.
  • the following description takes as an example that the execution subject of the method is the client running on the terminal shown in FIG. 1.
  • the terminal runs a client, and the client is an application that supports a virtual environment.
  • the aiming refers to selecting one or more target virtual objects from multiple virtual objects.
  • the aiming point in the embodiments of the present application may be the point selected when one or more target virtual objects are selected from multiple virtual objects; similarly, the aiming line may also be called the selection line, the aiming instruction may also be called the selection instruction, and so on; details are not repeated here.
  • the user can control the first virtual object to release directional skills or directional attacks by controlling the aiming controls of the roulette wheel.
  • the roulette aiming control includes: a roulette area 40 and a joystick button 42.
  • the area of the roulette area 40 is larger than the area of the joystick button 42.
  • the joystick button 42 is movable within the roulette area 40.
  • the roulette area 40 is divided into an inner ring area 41 and an outer ring area 43.
  • the inner circle area 41 is also called a dead zone.
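A minimal sketch of the zone test implied by this layout, assuming the drag offset is measured from the wheel's centre and that the quick/active distinction follows the inner circle (dead zone) versus the outer ring; the clamped case and all names are illustrative assumptions.

```python
import math

def classify_aim_operation(drag, dead_zone_radius, wheel_radius):
    """Classify a drag offset (measured from the wheel centre) against
    the roulette control's zones: inside the inner circle (dead zone)
    the operation counts as a quick cast, inside the outer ring as
    active aiming, and beyond the wheel as active aiming clamped to
    the outer edge.
    """
    r = math.hypot(drag[0], drag[1])
    if r <= dead_zone_radius:
        return "quick_cast"
    if r <= wheel_radius:
        return "active_aim"
    return "active_aim_clamped"
```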
  • As can be seen from FIG. 5, according to the different positions of the user's aiming operation, the operation is divided into a quick trigger mode and an active aiming mode.
  • when the user's aiming operation stays within the inner circle area 41, the quick cast mode (also called quick cast or automatic cast) is triggered.
  • the quick cast mode means that the client will automatically select the target virtual object within the circular release range centered on the first virtual object according to the default attack object selection rule.
  • the client controls the first virtual object to release directional skills or directional attacks to the target virtual object.
  • when the user clicks the joystick button 42 and drags it to the outer circle area 43, the active aiming mode is triggered.
  • the active aiming mode provided by the embodiment of the present application refers to: according to the position of the joystick button 42 in the outer circle area 43, the point aiming indicator 44 and the range indicator 45 are mapped and displayed.
  • the point-type aiming indicator 44 is used to indicate the aiming point in the virtual environment, and the position of the point-type aiming indicator 44 corresponds to the position of the joystick button 42; the range indicator 45 is used to indicate the maximum range of a directional skill or directional attack; the range indicator 45 corresponds to the outer edge of the outer ring area 43 and is usually a circular release range.
  • the user can change the position of the joystick button 42 in the outer circle area 43 and thereby change the display position of the point-type aiming indicator 44 within the circular release range 45.
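The correspondence between the joystick button's position in the wheel and the point-type aiming indicator's position in the circular release range can be sketched as a linear mapping; the linearity is an assumption, since the text only states that the two positions correspond.

```python
def map_to_aiming_point(hero_pos, drag, wheel_radius, max_range):
    """Map the joystick button's offset inside the wheel to the aiming
    point in the virtual environment: the offset, as a fraction of the
    wheel radius, is scaled to the skill's maximum range around the
    first virtual object's ground position.
    """
    scale = max_range / wheel_radius
    return (hero_pos[0] + drag[0] * scale,
            hero_pos[1] + drag[1] * scale)
```

A drag to the wheel's outer edge (offset equal to the wheel radius) thus lands the aiming point exactly on the edge of the release range, matching the statement that the range indicator corresponds to the outer edge of the outer ring area.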
  • the client controls the first virtual object 51 to aim at the target virtual object 52 to release directional skills or directional attacks.
  • the target virtual object 52 is the virtual object closest to the aiming point indicated by the point-type aiming indicator 44, among virtual objects other than the first virtual object.
  • the first virtual object 51 jumps to the position where the target virtual object 52 is located, and releases a high-injury hack-and-slash attack on the target virtual object 52.
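The split described above between the dead zone (inner circle area 41) and the active-aiming region (outer circle area 43) can be sketched as a small decision function. This is a minimal illustration under assumed names and screen coordinates, not the patent's implementation.

```python
import math

def classify_aim_operation(down, drag, inner_radius):
    """Classify a joystick drag on the wheel aiming control.

    down/drag are (x, y) screen coordinates of the touch-down point and
    the current drag point; inner_radius is the dead-zone radius.
    All names are illustrative.
    """
    dx, dy = drag[0] - down[0], drag[1] - down[1]
    dist = math.hypot(dx, dy)
    if dist <= inner_radius:   # joystick still inside the inner circle 41
        return "quick_cast"    # dead zone: auto-select by default rule
    return "active_aiming"     # outer circle 43: map an aiming point
```

A drag shorter than the dead-zone radius triggers quick cast; a longer drag switches to active aiming.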
  • Fig. 7 shows a flowchart of a method for aiming at a virtual object provided by an exemplary embodiment of the present application.
  • the method is executed by the terminal (or client) shown in FIG. 1 as an example.
  • the method includes:
  • Step 702 Display a user interface.
  • the user interface includes a picture of a virtual environment, and the picture of the virtual environment includes a first virtual object and a second virtual object located in the virtual environment.
  • the screen of the virtual environment is a screen obtained by observing the virtual environment from the observation angle corresponding to the first virtual object.
  • the screen of the virtual environment is a two-dimensional screen displayed on the client after screen capture of the three-dimensional virtual environment.
  • the shape of the screen of the virtual environment is determined according to the shape of the display screen of the terminal, or according to the shape of the user interface of the client. Taking the rectangular screen of the terminal as an example, the screen of the virtual environment is also displayed as a rectangular screen.
  • a camera model bound to the first virtual object is set in the virtual environment.
  • the picture of the virtual environment is the picture taken by the camera model with a certain observation position in the virtual environment as the observation center.
  • the observation center is the center of the screen of the virtual environment. Taking a rectangular screen in the virtual environment as an example, the intersection of the diagonals of the rectangle in the screen in the virtual environment is the observation center.
  • the camera model bound to the first virtual object takes the first virtual object as the observation center, and the position of the first virtual object in the virtual environment is the observation position.
  • the observation position is the coordinate position in the virtual environment.
  • the observation position is a three-dimensional coordinate. Exemplarily, if the ground in the virtual environment is a horizontal plane, the height coordinate of the observation position is 0, and the observation position can be approximated as two-dimensional coordinates on the horizontal plane.
  • the first virtual object is a virtual object controlled by the client.
  • the client terminal controls the activity of the first virtual object in the virtual environment according to the received user operation (or man-machine operation).
  • the activities of the first virtual object in the virtual environment include: walking, running, jumping, climbing, getting down, attacking, releasing skills, picking up props, and sending messages.
  • a skill is an ability that is used or released by a virtual object to attack itself and/or other virtual objects, produce a debuff (negative) effect, or produce a buff (gain) effect.
  • skills include: directional skills and undifferentiated coverage skills.
  • the directional skill is a skill that is released toward the aimed direction or area or virtual object within the maximum range.
  • Indiscriminate coverage skills are skills that are released toward all areas within the maximum range.
  • skills include active skills and passive skills. Active skills are skills that are actively used or released by virtual objects, and passive skills are skills that are automatically triggered when passive conditions are met.
  • the directional skill mentioned in this embodiment is an active skill that the user controls the first virtual object to actively use or release, and the skill is released toward the targeted virtual object within the maximum range.
  • the directional attack mentioned in this embodiment is an ordinary attack that the user controls the first virtual object to actively use or release, and the attack is released toward the targeted virtual object within the maximum range.
  • Step 704 In response to the aiming instruction, display a point-type aiming indicator in the virtual environment, where the point-type aiming indicator is used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
  • the aiming instruction is triggered by the user's aiming operation (or skill release operation or attack operation). In one example, the aiming instruction is triggered by a drag operation on the aiming control of the roulette; in another example, the aiming instruction is triggered by a drag operation on the joystick button of the physical handle.
  • the trigger method is not limited.
  • the aiming instruction is an instruction triggered by a drag operation that exceeds the dead zone on the roulette aiming control as an example.
  • the "aiming operation” is the drag operation that exceeds the dead zone on the roulette aiming control. It can be seen from FIG. 6 that after receiving a drag operation on the roulette aiming control, the terminal displays the point aiming indicator 44 according to the position of the joystick button 42 in the outer circle area 43.
  • Step 706 Control the first virtual object to aim at the target virtual object, the target virtual object is a virtual object selected from the second virtual object located in the target selection range, and the target selection range is a selection determined based on the aiming point Scope.
  • the target selection range is the selection range determined based on the aiming point.
  • the target selection range is located on the ground plane of the virtual environment, the target selection range takes the first map point where the first virtual object is located as the rotation center, and the symmetry axis of the target selection range passes through the aiming point.
  • the target virtual object is a virtual object selected from the second virtual objects located in the target selection range.
  • the target virtual object is a virtual object selected according to the priority principle among the second virtual objects located in the target selection range.
  • the priority principle includes but is not limited to at least one of the following principles:
  • the candidate virtual object with the highest type priority is preferentially selected.
  • the aiming in this embodiment includes normal aiming and locked aiming.
  • Normal aiming: when the position of the aiming target (target virtual object) changes, aiming is automatically cancelled.
  • Locked aiming: when the position of the aiming target (target virtual object) changes, aiming is not cancelled.
  • When the first virtual object aims at the target virtual object by ordinary aiming and the target virtual object then moves, the first virtual object no longer aims at the target virtual object and will not use skills on it or perform ordinary attacks on it; if the user wants to continue aiming at the target virtual object, the aiming operation needs to be performed again.
  • the first virtual object after the first virtual object is aimed at the target virtual object in a locked aiming manner, the first virtual object will continuously aim at the target virtual object to perform skill release or ordinary attacks.
  • After the first virtual object aims at the target virtual object by locked aiming, when the target virtual object moves so that its position exceeds the attack range (aiming range) of the first virtual object, the client automatically controls the first virtual object to follow the target virtual object, so as to keep aiming at the target virtual object to attack.
  • The methods for ending locked aiming may include the following: stopping the aiming after the aiming time reaches a predetermined time; stopping the aiming after the target virtual object moves out of the aiming range of the first virtual object; stopping the aiming after the target virtual object or the first virtual object dies; and stopping the aiming of the target virtual object when the user performs the aiming operation again to aim at another virtual object.
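The end conditions for locked aiming listed above can be combined into a single predicate. This is a minimal sketch; every parameter name is illustrative rather than taken from the patent.

```python
import math

def should_end_locked_aim(elapsed, time_limit, self_pos, target_pos,
                          aim_radius, self_alive, target_alive, retargeted):
    """Return True when any of the locked-aim end conditions holds:
    time limit reached, target out of the aiming range, either object
    dead, or the user re-aimed at another virtual object."""
    timed_out = elapsed >= time_limit
    out_of_range = math.dist(self_pos, target_pos) > aim_radius
    return (timed_out or out_of_range
            or not self_alive or not target_alive or retargeted)
```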
  • The method provided in this embodiment uses the aiming point to simulate the click position of a mouse on a computer, and selects a virtual object as the targeted virtual object from the second virtual objects located in the target selection range determined based on the aiming point. Since the target selection range is determined based on the aiming point, and the pointing accuracy of the aiming point is better than that of an aiming line, active aiming selects targets more stably and is less likely to select the wrong target. This improves the user's accuracy in active aiming, reduces the time spent selecting targets, reduces operating costs, and improves the efficiency of human-computer interaction and the user's operating experience. In addition, it also gives client-side designers more skill design space.
  • the above-mentioned aiming instruction is triggered by the roulette aiming control.
  • the user interface displayed by the client includes: a roulette aiming control, and the roulette aiming control is displayed superimposed on the screen of the virtual environment.
  • the roulette aiming control 40 includes a roulette area and a rocker button 42.
  • the above step 704 includes the following steps 704a to 704d, as shown in FIG. 8:
  • Step 704a in response to the aiming instruction, calculate an offset vector from the activation point to the offset point;
  • the drag operation triggers the touch screen in the terminal to report a series of touch instructions to the CPU, including but not limited to: a touch start instruction, at least one touch movement instruction, and a touch end instruction.
  • Each touch instruction carries the real-time touch coordinates of the user's finger on the touch screen.
  • a series of touch commands triggered by the drag operation can all be regarded as aiming commands.
  • a touch command triggered by a drag operation in a non-dead zone area can be regarded as an aiming command.
  • the activation point 91 refers to the center position of the roulette area.
  • In some embodiments, the center position of the roulette area is fixed; in other embodiments, the center position of the roulette area changes dynamically: the position where the touch screen detects the finger falling is set as the center position of the roulette area.
  • the position of the joystick will shift from the activation point 91 to the offset point 92.
  • the first coordinate of the activation point is recorded in the client, the second coordinate of the offset point is read from the aiming instruction, and the offset vector is calculated according to the second coordinate and the first coordinate.
  • the offset vector is a vector from the activation point 91 to the offset point 92, the first coordinate and the second coordinate are both coordinates of the plane where the touch screen is located, and the offset vector is a vector located on the plane where the touch screen is located.
  • Step 704b Calculate the aiming vector according to the offset vector
  • the aiming vector is a vector pointing from the first map point 93 where the first virtual object is located to the aiming point 94.
  • the ratio of the length L1 of the offset vector to the radius of the wheel R1 is equal to the ratio of the length L2 of the aiming vector to the aiming radius R2.
  • the roulette radius R1 is the radius of the roulette area
  • the aiming radius R2 is the maximum aiming distance of the first virtual object when actively aiming.
  • the aiming radius R2 is equal to the maximum range x of the directional skill (or directional attack).
  • the aiming radius R2 is equal to the sum of the maximum range x of the directional skill (or directional attack) and the pre-aim distance y. In this embodiment, the latter is taken as an example. Even if the second virtual object is outside the maximum range of the directional skill, the aim can be locked in advance.
  • the offset angle a1 is equal to the aiming angle a2, where a1 is the offset angle of the offset vector relative to the horizontal direction, and a2 is the offset angle of the aiming vector relative to the x-axis in the virtual environment.
  • the aiming vector is the vector in the virtual environment.
  • the aiming vector is a vector on a plane in the virtual environment.
  • Step 704c Calculate the aiming point according to the aiming vector and the first map point where the first virtual object is located;
  • the client adds the first map point and the aiming vector to calculate the aiming point.
  • the aiming point is a point located on the ground plane of the virtual environment.
  • Step 704d: a point-type aiming indicator is displayed on the aiming point in the virtual environment.
  • The method provided in this embodiment accurately maps the aiming point 94 from the offset point 92, so that using the roulette aiming control feels similar to clicking with a mouse on a computer, which improves aiming accuracy during active aiming.
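The mapping of steps 704a to 704c (offset vector, then aiming vector, then aiming point) can be sketched as follows, assuming 2D coordinates for both the touch screen and the ground plane; the function and parameter names are illustrative, not from the patent.

```python
import math

def aiming_point(down, drag, wheel_radius, aim_radius, hero_pos):
    """Map a joystick offset to an aiming point on the ground plane.

    down/drag: activation point and offset point on the touch screen;
    wheel_radius: R1, radius of the roulette area;
    aim_radius: R2, maximum aiming distance;
    hero_pos: first map point of the first virtual object.
    """
    # Step 704a: offset vector from activation point to offset point.
    ox, oy = drag[0] - down[0], drag[1] - down[1]
    length = math.hypot(ox, oy)
    if length == 0:
        return hero_pos
    # Step 704b: L1 / R1 == L2 / R2 and the angle is preserved, so the
    # aiming vector is the offset direction scaled by (L1 / R1) * R2.
    scale = (min(length, wheel_radius) / wheel_radius) * aim_radius / length
    # Step 704c: aiming point = first map point + aiming vector.
    return (hero_pos[0] + ox * scale, hero_pos[1] + oy * scale)
```

Dragging halfway across the wheel therefore places the aiming point halfway along the aiming radius, in the same direction as the drag.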
  • the target selection range is the selection range determined based on the aiming point.
  • the target selection range is located on the ground plane of the virtual environment, the target selection range takes the first map point where the first virtual object is located as the rotation center, and the symmetry axis of the target selection range passes through the aiming point.
  • the target selection range is an axisymmetric figure.
  • the target selection range is at least one of a fan shape, a semicircle shape, and a circle shape.
  • the target selection range is the range obtained by mixing at least two geometric figures.
  • the geometric figures include: square, rhombus, triangle, circle, and sector.
  • the target selection range 95 is based on the first map point 93 where the first virtual object is located as the rotation center, and the symmetry axis of the target selection range 95 passes through the aiming point 94.
  • the target selection range 95 is a range where a circle and a semicircle are mixed.
  • the target selection range 95 is based on the first map point 93 where the first virtual object is located as the rotation center, and the symmetry axis of the target selection range 95 passes through the aiming point 94.
  • the target selection range 95 is a range in which a circle and a sector are mixed.
  • the first virtual object has a maximum range 96 when performing a directional skill or a directional attack
  • the maximum range 96 may be a circular range whose center is the first map point 93 where the first virtual object is located.
  • the target selection range 95 includes: a pre-aiming area located outside the maximum range 96 and an aiming area located within the maximum range 96.
  • the above-mentioned aiming instruction is triggered by the roulette aiming control.
  • the user interface displayed by the client includes: a roulette aiming control, and the roulette aiming control is displayed superimposed on the screen of the virtual environment.
  • the roulette aiming control 40 includes a roulette area and a rocker button 42.
  • the above step 706 includes the following steps 706a to 706c, as shown in FIG. 13:
  • Step 706a List the second virtual object located in the target selection range as a candidate virtual object
  • The client lists all the second virtual objects within the circular range centered on the first map point 93 with the aiming radius R2 as the radius as initial candidate virtual objects. A filter is then used to filter the initial candidate virtual objects: the second virtual objects located outside the target selection range are filtered out, and the second virtual objects located in the target selection range are retained as candidate virtual objects.
  • the candidate virtual object also needs to meet legal conditions.
  • the legal conditions include but are not limited to: the candidate virtual object and the first virtual object do not belong to the same camp, the candidate cannot be a virtual object of a specific type (such as buildings, large and small dragons, wards), and cannot be a virtual object in a specific state (invisible, unselectable), and so on.
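Step 706a plus the legal-condition check can be sketched as a two-stage filter. The range and legality predicates are caller-supplied stand-ins for the shape filter and filter rules described in the text; all names and data shapes are illustrative.

```python
import math

def candidate_objects(units, hero, aim_radius, in_target_range, is_legal):
    """Collect candidate virtual objects for target selection.

    units: list of dicts with a "pos" key; hero is the first virtual
    object; in_target_range / is_legal are predicates (assumptions).
    """
    # Initial candidates: every other unit within the aiming radius R2
    # of the hero's first map point.
    initial = [u for u in units
               if u is not hero
               and math.dist(u["pos"], hero["pos"]) <= aim_radius]
    # Keep only units inside the target selection range that also meet
    # the legal conditions (camp, type, state, ...).
    return [u for u in initial if in_target_range(u["pos"]) and is_legal(u)]
```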
  • Step 706b Select the target virtual object from the candidate virtual objects according to the priority principle
  • the candidate virtual object closest to the aiming point is preferentially selected.
  • Suppose candidate virtual object A and candidate virtual object B exist at the same time, the straight-line distance between candidate virtual object A and the aiming point is a first distance, and the straight-line distance between candidate virtual object B and the aiming point is a second distance. If the first distance is smaller than the second distance, candidate virtual object A is preferentially selected as the target virtual object.
  • the candidate virtual object with the lowest percentage of HP is selected first.
  • Suppose candidate virtual object A and candidate virtual object B exist at the same time, the HP percentage of candidate virtual object A is 43%, and the HP percentage of candidate virtual object B is 80%; candidate virtual object A is preferentially selected as the target virtual object.
  • the candidate virtual object with the lowest absolute HP is selected first. Suppose candidate virtual object A and candidate virtual object B exist at the same time, candidate virtual object A has 1200 HP and candidate virtual object B has 801 HP; candidate virtual object B is preferentially selected as the target virtual object.
  • the candidate virtual object with the highest type priority is preferentially selected.
  • Suppose candidate virtual object A and candidate virtual object B exist at the same time, the type of candidate virtual object A is hero, and the type of candidate virtual object B is creep. Since the hero type has a higher priority than the creep type, candidate virtual object A is preferentially selected as the target virtual object.
  • A primary priority principle and a secondary priority principle can be set. When the primary priority principle yields no selection result, or yields more than one selection result, the secondary priority principle is used for selection. For example, selection is first performed according to the distance-priority principle; when two candidate virtual objects are at the same distance from the aiming point, they are further selected according to the type-priority principle to obtain the final target virtual object.
  • Alternatively, different priority principles are weighted and summed to calculate a priority score for each candidate virtual object, and the candidate virtual object with the highest priority score is selected as the final target virtual object.
  • If the priority score is calculated according to the type of the virtual object and the distance to the aiming point, heroes located around the aiming point will be selected before minions.
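The weighted-sum variant described above can be sketched as follows; the weights and type-priority values are illustrative assumptions, not values from the patent.

```python
import math

def pick_target(candidates, aim_point, w_type=10.0, w_dist=1.0):
    """Pick the candidate with the highest weighted priority score:
    higher type priority raises the score, distance to the aiming
    point lowers it. Returns None if there are no candidates."""
    type_priority = {"hero": 2, "creep": 1}  # illustrative values

    def score(c):
        return (w_type * type_priority.get(c["type"], 0)
                - w_dist * math.dist(c["pos"], aim_point))

    return max(candidates, key=score, default=None)
```

With these weights, a hero slightly farther from the aiming point still outscores a nearby creep, matching the behavior described above.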
  • Step 706c Control the first virtual object to aim at the target virtual object.
  • Step 707 Display the selected special effect on the target virtual object
  • the selected special effect includes at least one of the following special effects: a first selected mark is displayed on a second map point where the target virtual object is located, and a second selected mark is displayed above the target virtual object.
  • For example, the first selected mark is a circular light effect displayed at the feet of the target virtual object, and the second selected mark is a circular light-beam special effect displayed above the target virtual object's head. This embodiment does not limit the specific form of the selected special effect.
  • In response to the second map point where the target virtual object is located being in the pre-aiming area, the first selected special effect is displayed on the target virtual object; in response to the second map point where the target virtual object is located being in the aiming area, the second selected special effect is displayed on the target virtual object. The first selected special effect is different from the second selected special effect; for example, the colors of the first selected special effect and the second selected special effect are different.
  • Step 708 in response to receiving the last aiming instruction, control the first virtual object to release a directional skill or a directional attack on the target virtual object.
  • The method provided in this embodiment selects the targeted virtual object from among the second virtual objects located in the target selection range through the priority principle, and can accurately select the aiming target that the user desires, improving the user's operation fault tolerance and providing an automatic aiming solution with a certain degree of intelligence.
  • The first selected special effect or the second selected special effect is displayed on the target virtual object, so that the user can clearly know whether the target virtual object is in the aimed state or in the pre-aim locked state, improving the efficiency of human-computer interaction and enhancing the amount of information conveyed by the selected special effects.
  • The target selection range 95 includes a first selection range 951 (sector) and a second selection range 952 (semicircle); the first selection range 951 and the second selection range 952 partially overlap, but the priority of the first selection range 951 is greater than the priority of the second selection range 952.
  • The second virtual object located in the first selection range 951 is preferentially set as the candidate virtual object; in response to the absence of a second virtual object in the first selection range 951, the second virtual object located in the second selection range 952 is set as the candidate virtual object.
  • The second virtual object that meets the legal condition in the first selection range 951 is preferentially set as the candidate virtual object; in response to the absence of a second virtual object that meets the legal condition in the first selection range 951, the second virtual object that meets the legal condition in the second selection range 952 is set as the candidate virtual object.
  • the first selection range 951 corresponds to a first priority rule
  • the second selection range 952 corresponds to a second priority rule.
  • the first priority rule is different from the second priority rule.
  • the first priority rule is HP percentage priority
  • the second priority rule is distance priority.
  • In response to the candidate virtual object belonging to the first selection range 951, the target virtual object is selected from the candidate virtual objects according to the first priority rule; in response to the candidate virtual object belonging to the second selection range 952, the target virtual object is selected from the candidate virtual objects according to the second priority rule.
  • the above method of aiming at the virtual object includes:
  • Step 801 the joystick button of the roulette aiming control is pressed and dragged
  • the touch screen reports the touch start event to the CPU, and the client records the first coordinate in the touch start event as the activation point DownPos.
  • the touch screen reports the touch movement event to the CPU according to the sampling frequency, and the client records the second coordinate in the most recent touch movement event as the offset point DragPos.
  • Step 802 Calculate the corresponding aiming point FocusPoint of the joystick button after dragging in the virtual environment
  • the wheel radius (the maximum drag range) in the wheel aiming control is MaxDragRadius
  • the first map point of the first hero controlled by the user in the virtual environment is HeroPos
  • the maximum range of directional skills is X. The following proportional relationship is used to calculate the offset position of the aiming point relative to the first map point:
  • FocusPoint = HeroPos + Normalize(DragPos − DownPos) × (|DragPos − DownPos| / MaxDragRadius) × X
  • where Normalize stands for normalization.
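Using the variable names from steps 801 and 802, the proportional mapping can be sketched as follows. The clamp of the drag length to MaxDragRadius is an added assumption (the joystick button cannot leave the wheel), not stated in the text.

```python
import math

def focus_point(DownPos, DragPos, HeroPos, MaxDragRadius, X):
    """FocusPoint = HeroPos
         + Normalize(DragPos - DownPos)
           * (|DragPos - DownPos| / MaxDragRadius) * X
    Variable names follow steps 801-802; 2D tuples are assumed."""
    dx, dy = DragPos[0] - DownPos[0], DragPos[1] - DownPos[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return HeroPos
    nx, ny = dx / length, dy / length                    # Normalize
    ratio = min(length, MaxDragRadius) / MaxDragRadius   # clamp to wheel edge
    return (HeroPos[0] + nx * ratio * X, HeroPos[1] + ny * ratio * X)
```

A drag of half the wheel radius thus places FocusPoint at half the maximum range X from HeroPos, along the drag direction.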
  • Step 803 Call the enemy search interface according to the skill information (the skill tree ID, the aiming point, the maximum range, the pre-aim range outside the maximum range, etc.);
  • the skill tree ID is the identification of the directional skill.
  • the maximum range is the maximum range of the directional skill, usually a circular range.
  • the maximum range is represented by the above-mentioned maximum range X.
  • the pre-aiming range outside the maximum range is denoted by Y.
  • Y can be individually configured for each directional skill by planning.
  • Step 804 Obtain other virtual objects around the first hero (maximum range + pre-aim range) and store them in the candidate virtual object list;
  • the search interface is centered on the first map point where the first hero is located, and within a circle defined by (X+Y) as the radius, all other heroes within the circle will be added to the target list.
  • X is the radius of the maximum range of the directional skill;
  • Y is the difference between the radius of the pre-aim range and the radius of the maximum range;
  • the pre-aim range is an annular range nested around the outside of the maximum range.
  • Step 805 traverse the candidate virtual object list, and delete objects that do not meet the filter
  • the planner will configure a filter ID for each directional skill.
  • This filter ID is also the legal condition that the target of the directional skill needs to meet.
  • For example: the candidate must belong to a different camp from the first hero, cannot be a virtual object of a specific type (for example, buildings, large and small dragons, wards), and cannot be a virtual object in a specific state (invisible, unselectable), and so on.
  • The client traverses the candidate virtual objects in the candidate virtual object list, checks whether each one meets the filter rules, and deletes the candidate virtual objects that do not meet the filter from the candidate virtual object list.
  • Step 806 Invoke the search tree to find a suitable second hero.
  • the structure of the search tree is shown in Figure 18.
  • All nodes in the search tree inherit from the BaseSelector node.
  • the BaseSelector node mainly has two methods: Configure and Select (with signature BattleActor Select, where BattleActor refers to a candidate virtual object). Specifically:
  • the Configure function is used to initialize the Selector subclass's own data according to the planned configuration table data.
  • For example, the BranchSelector node needs to configure multiple branches, so the data configured by Configure is the ids of several branch Selectors; as another example, the ShapeFilter node needs to configure the shape field of the target selection range in Configure, such as circle or sector, together with parameters such as the radius of the circle and the angle of the sector.
  • the input parameter of the BattleActor Select function is the candidate virtual object list List<BattleActor>, and the return value is the filtered candidate virtual object BattleActor; its actual behavior differs depending on the implementation of the Selector subclass.
  • the BaseSelector node includes three core derived subclasses: LinkedSelector, BranchSelector and PrioritySelector.
  • LinkedSelector: its core is a next parameter, which indicates the next required filter, forming a chain structure. It has many subclasses, and most of them are essentially filters (Filter).
  • In the Select function, the candidate virtual objects BattleActor that do not meet the legal rules are deleted, and the List<BattleActor> with those objects removed is passed to the next Selector; this is how a filter is implemented.
  • The ShapeSelector corresponding to the target selection range configures the required shape and parameters in Configure; its Select function determines whether the candidate virtual objects in List<BattleActor> are within the shape corresponding to the target selection range, and deletes from List<BattleActor> the candidate virtual objects that are not within the target selection range. The other Filters work in the same way.
  • For example, BuffTypeFilter deletes candidate virtual objects that carry a certain type of additional-effect buff, and IDSelector deletes candidate virtual objects that carry a certain buff id, which handles cases where a certain skill cannot hit the same enemy a second time.
  • Many other specific filters can be used, such as CanKillFilter, which guarantees that the current skill will kill when released; IDFilter, which filters out a certain virtual object; and BuffTypeFilter, which filters out virtual objects with a certain buff. There are many implementations of Filter that will not be repeated here.
  • BranchSelector: its main function is to handle the screening of candidate virtual objects when there are multiple selection ranges. For example, as shown in Fig. 16, it is necessary to first check the sector-shaped first selection range 951 and then the second selection range 952, so the BranchSelector is used. Several Selector IDs are configured in the configuration table; in the Configure function, the member variable selectors is initialized according to the configured Selector IDs. In the Select function, the parameter List<BattleActor> actors is temporarily stored, and then the BaseSelectors in selectors are used one by one: the temporarily stored List<BattleActor> is passed as a parameter, and the Select function is called to determine whether a candidate virtual object BattleActor is returned.
  • PrioritySelector: planners use this Selector to sort the filtered List<BattleActor> and select the appropriate target virtual object BattleActor. Planners need to configure priority rules in the table, such as HP priority, distance priority, or percentage-HP priority. In the Select function, the list List<BattleActor> is sorted according to the configured priority rules, and the first element of the list is returned; if the list is empty, NULL is returned.
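The three core selector nodes can be sketched as a tiny class hierarchy. This is an illustrative reimplementation of the described behavior (chain filtering, branch fallback, and priority sorting), not the actual engine code; table-driven Configure loading is omitted and actors are plain dicts.

```python
class PrioritySelector:
    """Sort by a configured priority rule and return the first
    candidate, or None (the NULL case) if the list is empty."""
    def __init__(self, key):
        self.key = key  # e.g. lowest HP first, or distance to aiming point

    def select(self, actors):
        return sorted(actors, key=self.key)[0] if actors else None

class LinkedSelector:
    """Filter node: drop actors failing a predicate, then delegate
    the remaining list to the next selector in the chain."""
    def __init__(self, keep, next_selector):
        self.keep, self.next = keep, next_selector

    def select(self, actors):
        return self.next.select([a for a in actors if self.keep(a)])

class BranchSelector:
    """Try each branch with a copy of the actor list and return the
    first non-None result (first selection range, then second)."""
    def __init__(self, branches):
        self.branches = branches

    def select(self, actors):
        for branch in self.branches:
            result = branch.select(list(actors))
            if result is not None:
                return result
        return None
```

In the Fig. 19 style of tree, the first branch is a shape filter chained to an HP-priority selector, and the second branch is a distance-priority selector over the fallback range.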
  • For example, the priority rule in the first selection range 951 can be set to lowest HP first, and the priority rule in the second selection range 952 to shortest distance to the aiming point first; the structure of the entire search tree is then as shown in Figure 19.
  • In one example, the client obtains 3 candidate virtual objects and puts the three candidate virtual objects into the 90-degree sector shape filter corresponding to the first selection range 951. Object 3 is not in the first selection range 951, so object 3 is deleted, leaving only object 1 and object 2. The filter then sorts object 1 and object 2 according to the priority principle; because HP is prioritized in the first selection range 951, the client's sorting shows that object 2 has lower HP, so object 2 is returned as the final target virtual object, and the attack finally targets object 2.
  • the client will first obtain 3 candidate virtual objects, and then put the three candidate virtual objects into the 90-degree fan-shaped filter corresponding to the first selection range 951, assuming object 1, object 2, and Object 3 is not in the first selection range 951, so the three candidate virtual objects will be deleted; then the client returns to the second branch of the BranchSelector corresponding to the second selection range 952, or three candidate virtual objects.
  • Three candidate virtual objects are put into the 180-degree semicircle shape filter corresponding to the second selection range 951, no candidate virtual objects are deleted, and then the distances of object 1, object 2, and object 3 to the aiming point are sorted, which can be known from the distance , Object 1 is the closest to the aiming point, so object 1 is returned as the final target virtual object.
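The two walkthroughs above can be reproduced with a small sketch. This is an illustrative Python approximation, not code from the patent: the hero sits at the origin aiming along the positive x-axis, and the geometry, HP values, and field names are all assumptions made for the example.

```python
import math

def in_sector(pos, hero, aim, half_angle_deg, radius):
    """True if pos is within `radius` of hero and within ±half_angle of the hero→aim direction."""
    dx, dy = pos[0] - hero[0], pos[1] - hero[1]
    if math.hypot(dx, dy) > radius:
        return False
    aim_angle = math.atan2(aim[1] - hero[1], aim[0] - hero[0])
    diff = abs((math.atan2(dy, dx) - aim_angle + math.pi) % (2 * math.pi) - math.pi)
    return math.degrees(diff) <= half_angle_deg

def select_target(candidates, hero, aim, radius):
    """First branch: 90° sector, lowest HP wins; fallback branch: 180° semicircle, closest to aim wins."""
    sector = [c for c in candidates if in_sector(c["pos"], hero, aim, 45, radius)]
    if sector:
        return min(sector, key=lambda c: c["hp"])          # HP priority in the first range
    semicircle = [c for c in candidates if in_sector(c["pos"], hero, aim, 90, radius)]
    if semicircle:
        return min(semicircle, key=lambda c: math.dist(c["pos"], aim))  # distance priority
    return None
```

With two objects in the 90-degree sector, the lower-HP one is chosen (the Fig. 20 case); with none in the sector, the semicircle branch picks the object closest to the aiming point (the Fig. 21 case).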
  • Fig. 22 is a block diagram of a device for aiming at a virtual object provided by an exemplary embodiment of the present application.
  • the device includes:
  • the display module 901 is configured to display a user interface, the user interface including a picture of a virtual environment, the picture being obtained by observing the virtual environment with a first virtual object as the observation center and including a first virtual object and a second virtual object located in the virtual environment;
  • the display module 901 is further configured to, in response to an aiming instruction, display a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
  • the aiming module 902 is used to control the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among the second virtual objects located in the target selection range, and the target selection range being a selection range determined with the aiming point as a reference.
  • the aiming module 902 is configured to filter the second virtual objects located in the target selection range into candidate virtual objects; select the target virtual object from the candidate virtual objects according to a priority principle; and control the first virtual object to aim at the target virtual object.
  • the target selection range includes a first selection range and a second selection range, the priority of the first selection range being greater than that of the second selection range;
  • the aiming module 902 is configured to, in response to the second virtual object being present in the first selection range, preferentially filter the second virtual object in the first selection range as the candidate virtual object; and, in response to the second virtual object not being present in the first selection range, filter the second virtual object located in the second selection range as the candidate virtual object.
  • the first selection range corresponds to a first priority rule
  • the second selection range corresponds to a second priority rule
  • the aiming module 902 is configured to, in response to the candidate virtual object belonging to the first selection range, select the target virtual object from the candidate virtual objects according to the first priority rule; and, in response to the candidate virtual object belonging to the second selection range, select the target virtual object from the candidate virtual objects according to the second priority rule.
  • the target selection range is located on the ground plane of the virtual environment, the target selection range is centered on a first map point where the first virtual object is located, and the target The symmetry axis of the selected range passes through the aiming point.
  • the first virtual object has a maximum range
  • the target selection range includes a pre-aiming area located outside the maximum range
  • the priority principle includes at least one of the following principles:
  • preferentially selecting the candidate virtual object closest to the aiming point; preferentially selecting the candidate virtual object with the lowest HP percentage; preferentially selecting the candidate virtual object with the lowest absolute HP; preferentially selecting the candidate virtual object with the highest type priority.
  • the display module 901 is further configured to display a selected special effect on the target virtual object, the selected special effect including at least one of the following: a first selected indicator displayed on the second map point where the target virtual object is located, and a second selected indicator displayed above the target virtual object.
  • the first virtual object has a maximum range
  • the target selection range includes a pre-aiming area located outside the maximum range, and an aiming area located within the maximum range;
  • the display module 901 is further configured to display a first selected special effect on the target virtual object in response to the second map point where the target virtual object is located being in the pre-aiming area; and to display a second selected special effect on the target virtual object in response to the second map point where the target virtual object is located being in the aiming area, the first selected special effect being different from the second selected special effect.
  • the user interface includes a roulette aiming control, the roulette aiming control including a roulette area and a joystick button; the aiming instruction carries an offset point to which the joystick button has been offset from an activation point in the roulette area, the activation point being the center of the roulette area;
  • the display module 901 is further configured to calculate an offset vector from the activation point to the offset point in response to the aiming instruction; calculate an aiming vector according to the offset vector, the aiming vector being a vector pointing from the first map point where the first virtual object is located to the aiming point, wherein the ratio of the offset vector to the wheel radius is equal to the ratio of the aiming vector to the aiming radius, the wheel radius being the radius of the roulette area and the aiming radius being the maximum aiming distance of the first virtual object when actively aiming; calculate the aiming point according to the aiming vector and the first map point where the first virtual object is located; and display the point-type skill indicator on the aiming point in the virtual environment.
  • It should be noted that the device for aiming at a virtual object provided in the above embodiment is illustrated only by the division of the above functional modules. In practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the device for aiming at a virtual object provided in the foregoing embodiment and the embodiments of the method for aiming at a virtual object belong to the same concept; for the specific implementation process, refer to the method embodiments, which will not be repeated here.
  • the present application also provides a terminal.
  • the terminal includes a processor and a memory, the memory storing at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the method for aiming at a virtual object provided by the foregoing method embodiments. It should be noted that the terminal may be the terminal provided in Figure 23 below.
  • FIG. 23 shows a structural block diagram of a terminal 2300 provided by an exemplary embodiment of the present application.
  • the terminal 2300 may be: a smart phone, a tablet computer, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop computer, or a desktop computer.
  • the terminal 2300 may also be called user equipment, portable terminal, laptop terminal, desktop terminal and other names.
  • the terminal 2300 includes a processor 2301 and a memory 2302.
  • the processor 2301 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
  • the processor 2301 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 2301 may also include a main processor and a coprocessor.
  • the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 2301 may be integrated with a GPU (Graphics Processing Unit, image processor), and the GPU is used for rendering and drawing content that needs to be displayed on the display screen.
  • the processor 2301 may also include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • the memory 2302 may include one or more computer-readable storage media, which may be non-transitory.
  • the memory 2302 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 2302 is used to store at least one instruction, and the at least one instruction is executed by the processor 2301 to implement the method for aiming at a virtual object provided in the method embodiments of the present application.
  • the terminal 2300 further includes: a peripheral device interface 2303 and at least one peripheral device.
  • the processor 2301, the memory 2302, and the peripheral device interface 2303 may be connected by a bus or a signal line.
  • Each peripheral device can be connected to the peripheral device interface 2303 through a bus, a signal line, or a circuit board.
  • the peripheral device includes: at least one of a radio frequency circuit 2304, a touch screen 2305, a camera 2306, an audio circuit 2307, a positioning component 2308, and a power supply 2309.
  • Those skilled in the art can understand that the structure shown in FIG. 23 does not constitute a limitation on the terminal 2300, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • the memory further includes one or more programs stored in the memory, the one or more programs containing instructions for performing the method for aiming at a virtual object provided in the embodiments of the present application.
  • the present application provides a computer-readable storage medium in which at least one instruction is stored, the at least one instruction being loaded and executed by the processor to implement the method for aiming at a virtual object provided by the foregoing method embodiments.
  • the present application also provides a computer program product which, when run on a computer, causes the computer to execute the method for aiming at a virtual object provided by the foregoing method embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

A method, apparatus, device, and medium for aiming at a virtual object, comprising: displaying a user interface, the user interface including a picture of a virtual environment, the picture including a first virtual object and a second virtual object located in the virtual environment; in response to an aiming instruction, displaying a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate an aiming point selected by the aiming operation on the ground plane of the virtual environment; and controlling the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among second virtual objects located within a target selection range, the target selection range being a selection range determined with the aiming point as a reference.

Description

Method, apparatus, device, and storage medium for aiming at a virtual object
This application claims priority to Chinese patent application No. 202010508239.5, entitled "Method, apparatus, device, and medium for aiming at a virtual object", filed with the Chinese Patent Office on June 5, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the field of virtual environments, and in particular to a method, apparatus, device, and storage medium for aiming at a virtual object.
Background
A battle game is a game in which multiple virtual objects compete in the same virtual world. The battle game may be a Multiplayer Online Battle Arena (MOBA) game.
In a typical MOBA game, the user-controlled first virtual object possesses a directional skill. When this directional skill is used, the terminal displays a fan-shaped skill indicator, which includes a fan-shaped region at the feet of the first virtual object, the axis of symmetry of the fan-shaped region being an aiming line. The user can drag the fan-shaped skill indicator to rotate it around the first virtual object; the candidate virtual object that is located in the fan-shaped region and closest to the aiming line is determined as the aimed-at target virtual object, and the user then controls the first virtual object to cast the skill on the target virtual object.
Summary
According to one aspect of this application, a method for aiming at a virtual object is provided, the method comprising:
displaying a user interface, the user interface including a picture of a virtual environment, the picture of the virtual environment being a picture obtained by observing the virtual environment with a first virtual object as the observation center, and the picture of the virtual environment including a first virtual object and a second virtual object located in the virtual environment;
in response to an aiming instruction, displaying a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate an aiming point selected by the aiming operation on the ground plane of the virtual environment;
controlling the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among second virtual objects located within a target selection range, the target selection range being a selection range determined with the aiming point as a reference.
According to another aspect of this application, an apparatus for aiming at a virtual object is provided, the apparatus comprising:
a display module, configured to display a user interface, the user interface including a picture of a virtual environment, the picture being obtained by observing the virtual environment with a first virtual object as the observation center and including a first virtual object and a second virtual object located in the virtual environment;
the display module being further configured to, in response to an aiming instruction, display a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
an aiming module, configured to control the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among second virtual objects located within a target selection range, the target selection range being a selection range determined with the aiming point as a reference.
According to another aspect of this application, a computer device is provided, the computer device comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for aiming at a virtual object described above.
According to another aspect of this application, a computer-readable storage medium is provided, storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the method for aiming at a virtual object described above.
According to another aspect of this application, a computer program product is provided which, when run on a computer device, causes the computer device to execute the method for aiming at a virtual object described above.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the accompanying drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a structural block diagram of a computer system provided by an exemplary embodiment of this application;
Fig. 2 is a schematic diagram of a state synchronization technique provided by an exemplary embodiment of this application;
Fig. 3 is a schematic diagram of a frame synchronization technique provided by an exemplary embodiment of this application;
Fig. 4 is a schematic diagram of a roulette aiming control provided by an exemplary embodiment of this application;
Fig. 5 is a schematic diagram of two trigger modes of a roulette aiming control provided by another exemplary embodiment of this application;
Fig. 6 is a schematic interface diagram of a method for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 7 is a flowchart of a method for aiming at a virtual object provided by an exemplary embodiment of this application;
Fig. 8 is a flowchart of a method for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 9 is a mapping schematic diagram of a method for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 10 is a schematic diagram of a target selection range provided by an exemplary embodiment of this application;
Fig. 11 is a schematic diagram of a target selection range provided by another exemplary embodiment of this application;
Fig. 12 is a schematic diagram of pre-aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 13 is a flowchart of a method for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 14 is a schematic diagram of aiming with distance priority provided by another exemplary embodiment of this application;
Fig. 15 is a schematic diagram of aiming with HP-percentage priority provided by another exemplary embodiment of this application;
Fig. 16 is a schematic diagram of a target selection range provided by another exemplary embodiment of this application;
Fig. 17 is a flowchart of a method for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 18 is a structural schematic diagram of program classes provided by another exemplary embodiment of this application;
Fig. 19 is a structural schematic diagram of program classes provided by another exemplary embodiment of this application;
Fig. 20 is a schematic diagram of a scene of aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 21 is a schematic diagram of a scene of aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 22 is a block diagram of an apparatus for aiming at a virtual object provided by another exemplary embodiment of this application;
Fig. 23 is a block diagram of a terminal provided by another exemplary embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, embodiments of this application are further described in detail below with reference to the accompanying drawings.
First, the terms involved in the embodiments of this application are briefly introduced:
Virtual environment: the virtual environment displayed (or provided) when an application runs on a terminal. The virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional world, or a purely fictional three-dimensional world. The virtual environment may be any one of a two-dimensional virtual environment, a 2.5-dimensional virtual environment, and a three-dimensional virtual environment. The virtual environment is also used for virtual-environment battles between at least two virtual objects, and contains virtual resources available to the at least two virtual objects. The virtual environment includes a symmetric lower-left region and upper-right region; virtual objects belonging to two hostile camps each occupy one of the regions, and take destroying a target building/stronghold/base/crystal deep in the opposing region as the victory objective.
Virtual object: a movable object in the virtual environment. The movable object may be at least one of a virtual character, a virtual animal, and an anime character. When the virtual environment is a three-dimensional virtual environment, the virtual object may be a three-dimensional virtual model; each virtual object has its own shape and volume in the three-dimensional virtual environment and occupies part of the space in it. The virtual object is a three-dimensional character built on three-dimensional human skeleton technology, and presents different appearances by wearing different skins. In some implementations, the virtual object may also be implemented using a 2.5-dimensional or 2-dimensional model, which is not limited in the embodiments of this application. Illustratively, the virtual object is controlled by a user through a client, or is controlled by a server.
Multiplayer online battle arena means: in a virtual environment, different virtual teams belonging to at least two hostile camps occupy their respective map regions and compete with a certain victory condition as the objective. The victory conditions include, but are not limited to, at least one of: capturing strongholds or destroying the strongholds of the hostile camp, killing the virtual objects of the hostile camp, surviving in a specified scene and time, seizing a certain resource, and outscoring the opponent within a specified time. The battle arena may be played in rounds, and the map of each round may be the same or different. Each virtual team includes one or more virtual objects, for example 1, 2, 3, or 5.
MOBA game: a game that provides several strongholds in a virtual environment, where users in different camps control virtual objects to battle in the virtual environment, capture strongholds, or destroy the strongholds of the hostile camp. For example, a MOBA game may divide users into two hostile camps, disperse the user-controlled virtual objects in the virtual environment to compete with each other, and take destroying or capturing all enemy strongholds as the victory condition. A MOBA game is played in rounds; the duration of one round of a MOBA game is from the moment the game starts to the moment the victory condition is met.
User interface (UI) control: any visible control or element on the user interface of an application, such as pictures, input boxes, text boxes, buttons, and labels. Some UI controls respond to user operations; for example, a skill control controls the master virtual object to cast a skill. The user triggers the skill control to control the master virtual object to cast the skill. The UI controls involved in the embodiments of this application include, but are not limited to: skill controls and movement controls.
In a typical MOBA game, the candidate virtual object closest to the aiming line is determined as the aimed-at target virtual object. However, the cost of an active aiming operation in a MOBA game is high: when the intended attack target and other targets lie on the same straight line as the aiming line, it is difficult to precisely select the intended target, the operation takes a long time, and the probability of error is high; targets outside the skill range cannot be selected, targets cannot be aimed at in advance, and the operating experience is poor.
The embodiments of this application provide a method, apparatus, device, and medium for aiming at a virtual object, which can improve the accuracy of the user during active aiming.
Fig. 1 shows a structural block diagram of a computer system to which the method for aiming at a virtual object provided by an exemplary embodiment of this application is applicable. The computer system 100 includes: a first terminal 110, a server 120, and a second terminal 130.
The first terminal 110 has a client 111 supporting a virtual environment installed and running; the client 111 may be a multiplayer online battle program. When the first terminal runs the client 111, the user interface of the client 111 is displayed on the screen of the first terminal 110. The client may be any one of a military simulation program, a battle royale shooting game, a Virtual Reality (VR) application, an Augmented Reality (AR) program, a three-dimensional map program, a virtual reality game, an augmented reality game, a First-Person Shooting game (FPS), a Third-Person Shooting game (TPS), a Multiplayer Online Battle Arena game (MOBA), and a Simulation Game (SLG). In this embodiment, the client being a MOBA game is taken as an example. The first terminal 110 is the terminal used by the first user 112; the first user 112 uses the first terminal 110 to control the first virtual object located in the virtual environment to perform activities, and the first virtual object may be called the first virtual object of the first user 112. The activities of the first virtual object include, but are not limited to, at least one of: adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, and throwing. Illustratively, the first virtual object is a first virtual character, such as a simulated human character or an anime character.
The second terminal 130 has a client 131 supporting a virtual environment installed and running; the client 131 may be a multiplayer online battle program. When the second terminal 130 runs the client 131, the user interface of the client 131 is displayed on the screen of the second terminal 130. The client may be any one of a military simulation program, a battle royale shooting game, a VR application, an AR program, a three-dimensional map program, a virtual reality game, an augmented reality game, an FPS, a TPS, a MOBA, and an SLG; in this embodiment, the client being a MOBA game is taken as an example. The second terminal 130 is the terminal used by the second user 113; the second user 113 uses the second terminal 130 to control the second virtual object located in the virtual environment to perform activities, and the second virtual object may be called the first virtual object of the second user 113. Illustratively, the second virtual object is a second virtual character, such as a simulated human character or an anime character.
The first virtual character and the second virtual character are in the same virtual environment. The first virtual character and the second virtual character may belong to the same camp, the same team, or the same organization, have a friend relationship, or have temporary communication permissions. The first virtual character and the second virtual character may also belong to different camps, different teams, or different organizations, or have a hostile relationship.
The clients installed on the first terminal 110 and the second terminal 130 are the same, or the clients installed on the two terminals are the same type of client on different operating system platforms (Android or iOS). The first terminal 110 may generally refer to one of multiple terminals, and the second terminal 130 may generally refer to another of the multiple terminals; this embodiment takes only the first terminal 110 and the second terminal 130 as examples. The device types of the first terminal 110 and the second terminal 130 are the same or different, and include at least one of: a smart phone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
Only two terminals are shown in Fig. 1, but in different embodiments there are multiple other terminals 140 that can access the server 120. There are also one or more terminals 140 that are terminals of developers; a development and editing platform for the client supporting the virtual environment is installed on the terminal 140, and a developer can edit and update the client on the terminal 140 and transmit the updated client installation package to the server 120 via a wired or wireless network. The first terminal 110 and the second terminal 130 can download the client installation package from the server 120 to update the client.
The first terminal 110, the second terminal 130, and the other terminals 140 are connected to the server 120 via a wireless or wired network.
The server 120 includes at least one of one server, multiple servers, a cloud computing platform, and a virtualization center. The server 120 is used to provide background services for the client supporting the three-dimensional virtual environment. The server 120 undertakes the primary computing work and the terminals undertake the secondary computing work; or the server 120 undertakes the secondary computing work and the terminals undertake the primary computing work; or the server 120 and the terminals perform collaborative computing using a distributed computing architecture.
In an illustrative example, the server 120 includes a processor 122, a user account database 123, a battle service module 124, and a user-oriented Input/Output Interface (I/O interface) 125. The processor 122 is used to load instructions stored in the server 121 and process the data in the user account database 123 and the battle service module 124; the user account database 123 is used to store data of the user accounts used by the first terminal 110, the second terminal 130, and the other terminals 140, such as user account avatars, user account nicknames, user account combat power indices, and the service region where the user account is located; the battle service module 124 is used to provide multiple battle rooms for users to battle in, such as 1v1, 3v3, and 5v5 battles; the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 via a wireless or wired network.
The server 120 may employ a synchronization technique to keep the picture presentation consistent among multiple clients. Illustratively, the synchronization techniques employed by the server 120 include: a state synchronization technique or a frame synchronization technique.
State synchronization technique
In the embodiment based on Fig. 1, the server 120 synchronizes with multiple clients using the state synchronization technique. In the state synchronization technique, as shown in Fig. 2, the battle logic runs in the server 120. When a state change occurs to a virtual object in the virtual environment, the server 120 sends the state synchronization result to all clients, for example clients 1 to 10.
In an illustrative example, client 1 sends a request to the server 120 requesting that virtual object 1 cast a frost skill; the server 120 then determines whether the frost skill is allowed to be cast and, if so, what the damage value to the other virtual object 2 is. The server 120 then sends the skill casting result to all clients, and all clients update their local data and interface presentation according to the skill casting result.
Frame synchronization technique
In the embodiment based on Fig. 1, the server 120 synchronizes with multiple clients using the frame synchronization technique. In the frame synchronization technique, as shown in Fig. 3, the battle logic runs in each client. Each client sends a frame synchronization request to the server, the request carrying the client's local data changes. After receiving a frame synchronization request, the server 120 forwards it to all clients. After receiving the frame synchronization request, each client processes it according to its local battle logic and updates its local data and interface presentation.
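As a rough illustration of the frame synchronization relay described above (not code from the patent; the class and field names are hypothetical), the server merely forwards each frame request, and every client applies it with its own deterministic battle logic:

```python
class Client:
    """Each client runs the battle logic locally and applies every forwarded frame request."""
    def __init__(self):
        self.state = {}

    def apply(self, request):
        # Deterministic local battle logic: identical inputs yield identical state on every client.
        self.state[request["actor"]] = request["action"]

class FrameSyncServer:
    """The server runs no battle logic; it only forwards each frame request to all clients."""
    def __init__(self, clients):
        self.clients = clients

    def on_frame_request(self, request):
        for client in self.clients:
            client.apply(request)
```

Because all clients apply the same request stream with the same logic, their states stay consistent without the server computing any battle results.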
Combining the above introduction to the virtual environment and the description of the implementation environment, the picture display method for the virtual environment provided by the embodiments of this application is described, taking the execution subject of the method as the client running on the terminal shown in Fig. 1 as an example. The terminal runs a client, which is an application supporting the virtual environment.
Illustratively, the method for aiming at a virtual object provided by this application is applied to a MOBA game as an example.
It should be noted that aiming refers to selecting one or more target virtual objects from multiple virtual objects. On this basis, the aiming point in the embodiments of this application may be the selection point when one or more target virtual objects are selected from multiple virtual objects; similarly, the aiming line may also be called a selection line, the aiming instruction may also be called a selection instruction, and so on, which will not be enumerated here.
In a MOBA game, as shown in Fig. 4, the user can operate the roulette aiming control to make the first virtual object cast a directional skill or a directional attack. The roulette aiming control includes: a roulette area 40 and a joystick button 42. The area of the roulette area 40 is larger than that of the joystick button 42. The joystick button 42 can be moved within the roulette area 40. The roulette area 40 is divided into an inner-circle area 41 and an outer-circle area 43. The inner-circle area 41 is also called the dead zone.
As can be seen from Fig. 5, depending on where the user's aiming operation acts, there are two modes: a quick trigger mode and an active aiming mode.
When the user taps the joystick button 42 in the inner-circle area 41, the quick cast mode (also called quick casting or automatic casting) is triggered. Quick cast mode means: the client automatically selects the target virtual object within a circular cast range centered on the first virtual object according to the default attack-target selection rule. When the user's finger leaves the joystick button 42, the client controls the first virtual object to cast the directional skill or directional attack on the target virtual object.
When the user presses the joystick button 42 and drags it into the outer-circle area 43, the active aiming mode is triggered. As shown in Fig. 6, the active aiming mode provided by the embodiments of this application means: according to the position of the joystick button 42 in the outer-circle area 43, a point-type aiming indicator 44 and a range indicator 45 are displayed by mapping. The point-type aiming indicator 44 is used to indicate the aiming point of the aiming operation in the virtual environment, and its position corresponds to the position of the joystick button 42; the range indicator 45 is used to indicate the maximum range of the directional skill or directional attack, corresponds to the outer edge of the outer-circle area 43, and is usually a circular cast range. The user can change the position of the joystick button 42 in the outer-circle area 43 and thereby change the display position of the point-type aiming indicator 44 within the circular cast range 45. When the user's finger leaves the joystick button 42, the client controls the first virtual object 51 to aim at the target virtual object 52 and cast the directional skill or directional attack. The target virtual object 52 is the virtual object closest to the aiming point 44 among the virtual objects other than the first virtual object.
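As a minimal sketch of the two trigger modes described above (quick cast inside the dead zone, active aiming beyond it), assuming hypothetical pixel radii for the inner and outer circles:

```python
import math

DEAD_ZONE_RADIUS = 20.0  # radius of the inner-circle area 41 (hypothetical value, in pixels)
WHEEL_RADIUS = 80.0      # radius of the roulette area 40 (hypothetical value, in pixels)

def trigger_mode(down_pos, drag_pos):
    """Classify a gesture on the roulette aiming control by how far the joystick was dragged."""
    offset = math.dist(down_pos, drag_pos)
    if offset <= DEAD_ZONE_RADIUS:
        return "quick_cast"   # auto-select a target in the circular cast range
    return "active_aiming"    # map the joystick offset to an aiming point
```

A tap or tiny drag stays inside the dead zone and triggers quick casting; dragging into the outer-circle area switches to active aiming.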
In one example, the first virtual object 51 jumps to the position of the target virtual object 52 and unleashes a high-damage slashing attack on the target virtual object 52.
Fig. 7 shows a flowchart of a method for aiming at a virtual object provided by an exemplary embodiment of this application. This embodiment is described taking the method being executed by the terminal (or client) shown in Fig. 1 as an example. The method includes:
Step 702: display a user interface, the user interface including a picture of a virtual environment, the picture including a first virtual object and a second virtual object located in the virtual environment;
The picture of the virtual environment is a picture obtained by observing the virtual environment from the observation angle corresponding to the first virtual object. Illustratively, the picture of the virtual environment is a two-dimensional picture displayed on the client after picture capture of the three-dimensional virtual environment. Illustratively, the shape of the picture of the virtual environment is determined according to the shape of the terminal's display screen, or according to the shape of the client's user interface. Taking the terminal's display screen being rectangular as an example, the picture of the virtual environment is also displayed as a rectangular picture.
A camera model bound to the first virtual object is set in the virtual environment. The picture of the virtual environment is a picture captured by the camera model with a certain observation position in the virtual environment as the observation center. The observation center is the center of the picture of the virtual environment. Taking the picture of the virtual environment being rectangular as an example, the intersection of the rectangle's diagonals is the observation center. Generally, the camera model bound to the first virtual object takes the first virtual object as the observation center, and the position of the first virtual object in the virtual environment is the observation position. The observation position is a coordinate position in the virtual environment. When the virtual environment is a three-dimensional virtual environment, the observation position is a three-dimensional coordinate. Illustratively, if the ground in the virtual environment is a horizontal plane, the height coordinate of the observation position is 0, and the observation position can be approximately represented as a two-dimensional coordinate on the horizontal plane.
The first virtual object is a virtual object controlled by the client. The client controls the activities of the first virtual object in the virtual environment according to received user operations (also called human-computer operations). Illustratively, the activities of the first virtual object in the virtual environment include: walking, running, jumping, climbing, lying prone, attacking, casting skills, picking up props, and sending messages.
A skill is an ability used or cast by a virtual object that attacks, produces a debuff effect, or produces a buff effect on itself and/or other virtual objects. Divided by range, skills include: directional skills and indiscriminate coverage skills. A directional skill is a skill cast toward an aimed direction, region, or virtual object within the maximum range. An indiscriminate coverage skill is a skill cast toward all regions within the maximum range. Divided by type, skills include active skills and passive skills. An active skill is a skill actively used or cast by a virtual object, and a passive skill is a skill automatically triggered when a passive condition is met.
Illustratively, the directional skill mentioned in this embodiment is an active skill actively used or cast by the first virtual object under the user's control, and the skill is cast toward the aimed-at target virtual object within the maximum range.
Illustratively, the directional attack mentioned in this embodiment is a normal attack actively used or cast by the first virtual object under the user's control, and the attack is a normal attack cast toward the aimed-at target virtual object within the maximum range.
Step 704: in response to an aiming instruction, display a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
The aiming instruction is triggered by the user's aiming operation (or skill-cast operation or attack operation). In one example, the aiming instruction is triggered by a drag operation on the roulette aiming control; in another example, the aiming instruction is triggered by a drag operation on the joystick button of a physical gamepad. This application does not limit the manner of triggering the aiming instruction.
In this embodiment, the aiming instruction being an instruction triggered by a drag operation on the roulette aiming control that goes beyond the dead zone is taken as an example; in this case the "aiming operation" is the drag operation on the roulette aiming control beyond the dead zone. As can be seen from Fig. 6, after receiving the drag operation on the roulette aiming control, the terminal displays the point-type aiming indicator 44 by mapping according to the position of the joystick button 42 in the outer-circle area 43.
Step 706: control the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among second virtual objects located within a target selection range, the target selection range being a selection range determined with the aiming point as a reference.
The target selection range is a selection range determined with the aiming point as a reference. The target selection range lies on the ground plane of the virtual environment, takes the first map point where the first virtual object is located as the rotation center, and the axis of symmetry of the target selection range passes through the aiming point.
The target virtual object is a virtual object selected from among the second virtual objects located within the target selection range, selected according to a priority principle. The priority principle includes, but is not limited to, at least one of the following principles:
preferentially selecting the candidate virtual object closest to the aiming point;
preferentially selecting the candidate virtual object with the lowest HP percentage;
preferentially selecting the candidate virtual object with the lowest absolute HP;
preferentially selecting the candidate virtual object with the highest type priority.
Illustratively, aiming in this embodiment includes normal aiming and locked aiming. Normal aiming: when the position of the aimed-at target (target virtual object) changes, aiming is automatically cancelled. Locked aiming: when the position of the aimed-at target (target virtual object) changes, aiming is not cancelled.
Illustratively, after the first virtual object aims at the target virtual object by normal aiming, if the target virtual object moves and its position changes, the first virtual object no longer aims at the target virtual object and does not use a skill on or perform a normal attack against it; if the user wants to continue aiming at the target virtual object, another aiming operation is required.
Illustratively, after the first virtual object aims at the target virtual object by locked aiming, the first virtual object continuously aims at the target virtual object to cast skills or perform normal attacks. In one embodiment, after the first object aims at the target virtual object by locked aiming, when the position of the target virtual object changes so that it moves beyond the attack range (aiming range) of the first virtual object, the client also automatically controls the first virtual object to follow the target virtual object, so as to continue aiming at and attacking it. Illustratively, locked aiming may end in the following ways: stopping locked aiming after the aiming duration reaches a predetermined duration; stopping aiming after the target virtual object moves out of the aiming range of the first virtual object; stopping aiming after the target virtual object or the first virtual object dies; and stopping aiming at the target virtual object when the user performs another aiming operation to aim at a different virtual object.
In summary, in the method provided by this embodiment, the aiming point is used to simulate the click position of a mouse on a PC, and a virtual object is selected as the aimed-at target virtual object from among the second virtual objects located within the target selection range determined with the aiming point as a reference. Since the target selection range is determined with the aiming point as a reference, and the pointing precision of an aiming point is better than that of an aiming line, active target selection is more stable and the wrong target is less likely to be selected. This improves the user's accuracy during active aiming, reduces the time spent selecting a target, lowers the operating cost, and improves the efficiency of human-computer interaction and the user's operating experience. In addition, it gives client designers more room for skill design.
In an embodiment based on the embodiment of Fig. 7, the aiming instruction is triggered by the roulette aiming control. The user interface displayed by the client includes a roulette aiming control superimposed on the picture of the virtual environment. As shown in Fig. 4, the roulette aiming control 40 includes a roulette area and a joystick button 42. Step 704 includes the following steps 704a to 704d, as shown in Fig. 8:
Step 704a: in response to the aiming instruction, calculate an offset vector pointing from the activation point to the offset point;
The drag operation causes the touch screen in the terminal to report a series of touch instructions to the CPU, including but not limited to: one touch start instruction, at least one touch move instruction, and one touch end instruction. Each touch instruction carries the real-time touch coordinates of the user's finger on the touch screen. The series of touch instructions triggered by the drag operation can all be regarded as aiming instructions. Alternatively, the touch instructions triggered by the drag operation outside the dead zone can all be regarded as aiming instructions.
With reference to Fig. 9, the activation point 91 refers to the center position of the roulette area. In some embodiments, the center position of the roulette area is fixed; in other embodiments, the center position of the roulette area changes dynamically: when the user's right thumb lands, the finger-landing position detected by the touch screen is set as the center position of the roulette area.
When the user's right thumb drags the joystick button within the roulette area, the joystick position is offset from the activation point 91 to the offset point 92. The client records the first coordinate of the activation point, reads the second coordinate of the offset point from the aiming instruction, and calculates the offset vector from the second coordinate and the first coordinate.
The offset vector is the vector pointing from the activation point 91 to the offset point 92; the first coordinate and the second coordinate are both coordinates in the plane of the touch screen, and the offset vector is a vector in the plane of the touch screen.
Step 704b: calculate an aiming vector according to the offset vector;
The aiming vector is the vector pointing from the first map point 93 where the first virtual object is located to the aiming point 94.
The ratio of the length L1 of the offset vector to the wheel radius R1 equals the ratio of the length L2 of the aiming vector to the aiming radius R2. The wheel radius R1 is the radius of the roulette area, and the aiming radius R2 is the maximum aiming distance of the first virtual object when actively aiming. In some embodiments, the aiming radius R2 equals the maximum range distance x of the directional skill (or directional attack). In other embodiments, the aiming radius R2 equals the sum of the maximum range distance x of the directional skill (or directional attack) and the pre-aiming distance y. This embodiment takes the latter as an example, so that even if a second virtual object is outside the maximum range of the directional skill, it can be aim-locked in advance.
α1 equals the aiming angle α2. α1 is the offset angle of the offset vector relative to the horizontal direction, and α2 is the offset angle of the aiming vector relative to the x-axis in the virtual environment.
The aiming vector is a vector in the virtual environment. When the virtual environment is a three-dimensional virtual environment, the aiming vector is a vector on a plane in the virtual environment.
Step 704c: calculate the aiming point according to the aiming vector and the first map point where the first virtual object is located;
The client adds the aiming vector to the first map point to obtain the aiming point. The aiming point is a point on the ground plane of the virtual environment.
Step 704d: display the point-type skill indicator on the aiming point in the virtual environment.
In summary, in the method provided by this embodiment, by accurately mapping the offset point 92 to the aiming point 94, the user can obtain an operation similar to a mouse click on a PC when using the roulette aiming control, improving aiming precision during active aiming.
In an embodiment based on the embodiment of Fig. 7, the target selection range is a selection range determined with the aiming point as a reference. The target selection range lies on the ground plane of the virtual environment, takes the first map point where the first virtual object is located as the rotation center, and its axis of symmetry passes through the aiming point. The target selection range is an axisymmetric figure.
The target selection range is at least one of a fan shape, a semicircle, and a circle. Alternatively, the target selection range is a range obtained by mixing at least two geometric figures, the geometric figures including: squares, rhombuses, triangles, circles, and fan shapes.
In the example shown in Fig. 10, the target selection range 95 takes the first map point 93 where the first virtual object is located as the rotation center, and the axis of symmetry of the target selection range 95 passes through the aiming point 94. The target selection range 95 is a mixture of a circle and a semicircle.
In the example shown in Fig. 11, the target selection range 95 takes the first map point 93 where the first virtual object is located as the rotation center, and the axis of symmetry of the target selection range 95 passes through the aiming point 94. The target selection range 95 is a mixture of a circle and a fan shape.
Illustratively, the first virtual object has a maximum range 96 when performing a directional skill or directional attack, and the maximum range 96 may be a circular range centered on the first map point 93 where the first virtual object is located. The target selection range 95 includes: a pre-aiming area outside the maximum range 96, and an aiming area within the maximum range 96.
As shown in Fig. 12, this design allows the user to lock onto a more distant target virtual object in the pre-aiming area in advance.
In an embodiment based on the embodiment of Fig. 7, the aiming instruction is triggered by the roulette aiming control. The user interface displayed by the client includes a roulette aiming control superimposed on the picture of the virtual environment. As shown in Fig. 4, the roulette aiming control 40 includes a roulette area and a joystick button 42. Step 706 includes the following steps 706a to 706c, as shown in Fig. 13:
Step 706a: list the second virtual objects located within the target selection range as candidate virtual objects;
The client lists all second virtual objects within the circle centered on the first map point 93 with the aiming radius R2 as radius as the initial candidate virtual objects. A filter is then used to filter the initial candidates: second virtual objects outside the target selection range are filtered out, and the second virtual objects within the target selection range are retained as candidate virtual objects.
Candidate virtual objects must also satisfy legality conditions, which include but are not limited to: the candidate virtual object does not belong to the same camp as the first virtual object, must not be a virtual object of a specific type (such as a building, a major or minor dragon, or a ward), must not be a virtual object in a specific state (invisible, unselectable), and so on.
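A sketch of the two-stage candidate filtering in step 706a above (range circle, then legality conditions); the field names and the set of excluded types are illustrative assumptions, not the patent's data model:

```python
import math

EXCLUDED_TYPES = {"building", "dragon", "ward"}  # assumed specific types that cannot be targeted

def gather_candidates(hero_pos, aim_radius, hero_camp, objects):
    """Keep second virtual objects within the aiming radius that satisfy the legality conditions."""
    candidates = []
    for obj in objects:
        if obj["camp"] == hero_camp:                          # must not be in the same camp
            continue
        if obj["type"] in EXCLUDED_TYPES:                     # must not be an excluded type
            continue
        if obj.get("invisible") or obj.get("unselectable"):   # must not be in a special state
            continue
        if math.dist(hero_pos, obj["pos"]) > aim_radius:      # must lie inside the selection circle
            continue
        candidates.append(obj)
    return candidates
```

The shape-specific filtering (fan, semicircle) would then run on the surviving candidates as a second pass.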
Step 706b: select the target virtual object from the candidate virtual objects according to a priority principle;
The priority principle includes at least one of the following principles:
1. Distance priority principle:
preferentially select the candidate virtual object closest to the aiming point. As shown in Fig. 14, candidate virtual objects A and B both exist among the candidates; the straight-line distance between candidate A and the aiming point is a first distance, and the straight-line distance between candidate B and the aiming point is a second distance. When the first distance is smaller than the second distance, candidate virtual object A is preferentially selected as the target virtual object.
2. HP-percentage priority principle:
preferentially select the candidate virtual object with the lowest HP percentage. As shown in Fig. 15, candidate virtual objects A and B both exist among the candidates; the HP percentage of candidate A is 43% and that of candidate B is 80%, so candidate virtual object A is preferentially selected as the target virtual object.
3. Absolute-HP priority principle:
preferentially select the candidate virtual object with the lowest absolute HP. For example, candidate virtual objects A and B both exist among the candidates; the HP of candidate A is 1200 points and that of candidate B is 801 points, so candidate virtual object B is preferentially selected as the target virtual object.
4. Type priority principle:
preferentially select the candidate virtual object with the highest type priority. For example, candidate virtual objects A and B both exist among the candidates; candidate A is of type hero and candidate B is of type minion; the priority of a hero is greater than that of a minion, so candidate virtual object A is preferentially selected as the target virtual object.
When the priority principle includes at least two different priority principles, a primary priority principle and a secondary priority principle may be set; after selection by the primary priority principle, if there is no selection result or more than one selection result, the secondary priority principle is used for selection. For example, selection is first performed by the distance priority principle; when two candidate virtual objects are the same distance from the aiming point, selection is then performed by the type priority principle to obtain the final target virtual object.
When the priority principle includes at least two different priority principles, a priority score may also be calculated for each candidate virtual object by a weighted sum over the different priority principles, and the candidate virtual object with the highest priority score is selected as the final target virtual object. For example, the priority score is calculated according to the type of the virtual object and its distance to the aiming point, so that a hero near the aiming point is selected in preference to a minion.
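The weighted-sum variant described above can be sketched as follows; the weight values are illustrative assumptions, not values from the patent:

```python
import math

TYPE_WEIGHT = {"hero": 2.0, "minion": 1.0}  # assumed type priorities: heroes outrank minions

def priority_score(candidate, aim_point):
    """Combine type priority and distance to the aiming point into one score (higher wins)."""
    distance = math.dist(candidate["pos"], aim_point)
    return TYPE_WEIGHT[candidate["type"]] * 100.0 - distance

def pick_target(candidates, aim_point):
    return max(candidates, key=lambda c: priority_score(c, aim_point)) if candidates else None
```

With these weights, a hero somewhat farther from the aiming point still outscores a nearer minion, matching the example in the text.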
Step 706c: control the first virtual object to aim at the target virtual object.
Step 707: display a selected special effect on the target virtual object;
The selected special effect includes at least one of the following: displaying a first selected indicator on the second map point where the target virtual object is located, and displaying a second selected indicator above the target virtual object.
Illustratively, the first selected indicator is a circular glow effect displayed at the feet of the target virtual object, and the second selected indicator is a circular light-pillar effect displayed above the target virtual object's head; this embodiment does not limit the specific form of the selected special effect.
Illustratively, in response to the second map point where the target virtual object is located being in the pre-aiming area, a first selected special effect is displayed on the target virtual object; in response to the second map point where the target virtual object is located being in the aiming area, a second selected special effect is displayed on the target virtual object, the first and second selected special effects being different. For example, the first and second selected special effects differ in color.
Step 708: in response to receiving the last aiming instruction, control the first virtual object to cast the directional skill or directional attack on the target virtual object.
In summary, in the method provided by this embodiment, the aimed-at target virtual object is selected from the second virtual objects within the target selection range by the priority principle, so that the aiming target the user expects can be selected precisely, improving the user's operational fault tolerance and providing an automatic aiming solution with a certain degree of intelligence.
In the method provided by this embodiment, in response to the second map point where the target virtual object is located being in the aiming area, a second selected special effect is displayed on the target virtual object, which lets the user clearly know whether the target virtual object is in the aimed state or the pre-aim locked state, improving human-computer interaction efficiency and enriching the information conveyed by the selected special effect.
In an embodiment based on the embodiment of Fig. 13, as shown in Fig. 16, the target selection range 95 includes a first selection range 951 (fan-shaped) and a second selection range 952 (semicircular). The first selection range 951 and the second selection range 952 partially overlap, but the priority of the first selection range 951 is greater than that of the second selection range 952. In response to a second virtual object being present in the first selection range 951, the second virtual object in the first selection range 951 is preferentially set as the candidate virtual object; in response to no second virtual object being present in the first selection range 951, the second virtual object located in the second selection range 952 is set as the candidate virtual object.
In the case where candidate virtual objects must satisfy legality conditions, in response to a second virtual object satisfying the legality conditions being present in the first selection range 951, the second virtual object in the first selection range 951 that satisfies the legality conditions is preferentially set as the candidate virtual object; in response to no second virtual object satisfying the legality conditions being present in the first selection range 951, the second virtual object in the second selection range 952 that satisfies the legality conditions is set as the candidate virtual object.
In one design, the first selection range 951 corresponds to a first priority rule and the second selection range 952 corresponds to a second priority rule. The first and second priority rules are different; for example, the first priority rule is HP-percentage priority and the second priority rule is distance priority.
In response to a candidate virtual object belonging to the first selection range 951, the target virtual object is selected from the candidate virtual objects according to the first priority rule; in response to a candidate virtual object belonging to the second selection range 952, the target virtual object is selected from the candidate virtual objects according to the second priority rule.
Taking the first virtual object casting a directional skill on the target virtual object as an example, as shown in Fig. 17, the above method for aiming at a virtual object includes:
Step 801: the joystick button of the roulette aiming control is pressed and dragged;
When the joystick button is pressed, the touch screen reports a touch start event to the CPU, and the client records the first coordinate in the touch start event as the activation point DownPos.
When the joystick button is dragged, the touch screen reports touch move events to the CPU at the sampling frequency, and the client records the second coordinate in the most recent touch move event as the offset point DragPos.
Step 802: calculate the aiming point FocusPoint in the virtual environment corresponding to the dragged joystick button;
Let the wheel radius (the maximum drag range) in the roulette aiming control be MaxDragRadius, the first map point where the user-controlled first hero is located in the virtual environment be HeroPos, and the maximum range radius of the directional skill be X. The offset position of the aiming point relative to the first map point is calculated using the following proportional relationship:
|DragPos - DownPos| / MaxDragRadius = |FocusPoint - HeroPos| / X;
In addition, the orientation of the aiming point FocusPoint relative to the first map point HeroPos needs to be calculated. Illustratively, the screen center point (0, 0) is first mapped to the position ScreenCenter2SencePos in the three-dimensional virtual environment; the position ScreenCenter2SencePos is also the observation center of the camera model. Then, the position of the screen center point (0, 0) plus the offset vector (DragPos - DownPos) is mapped to the reference point ScreenDrag2ScenePos. In the three-dimensional virtual environment, the orientation from the observation center ScreenCenter2SencePos to the reference point ScreenDrag2ScenePos is the orientation from the first map point HeroPos to the aiming point FocusPoint. Combining the above gives the following formula:
FocusPoint = HeroPos + (|DragPos - DownPos| / MaxDragRadius) * X * Normalize(ScreenDrag2ScenePos - ScreenCenter2SencePos).
Here Normalize denotes normalization.
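The FocusPoint formula above can be reproduced directly. This sketch works in 2D and treats the two screen-to-scene mapping results as given inputs rather than computing the camera projection:

```python
import math

def focus_point(hero_pos, down_pos, drag_pos, max_drag_radius, max_range,
                screen_center_scene, screen_drag_scene):
    """FocusPoint = HeroPos + (|DragPos - DownPos| / MaxDragRadius) * X * Normalize(dir)."""
    ratio = math.dist(down_pos, drag_pos) / max_drag_radius
    dx = screen_drag_scene[0] - screen_center_scene[0]
    dy = screen_drag_scene[1] - screen_center_scene[1]
    length = math.hypot(dx, dy)  # Normalize: divide the direction vector by its length
    return (hero_pos[0] + ratio * max_range * dx / length,
            hero_pos[1] + ratio * max_range * dy / length)
```

Dragging the joystick to the full wheel radius places the aiming point at the full range X from the hero in the mapped direction; a half drag lands at half that distance.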
Step 803: call the enemy-search interface according to the skill information (parameters such as the skill tree ID, the aiming point, the maximum range, and the pre-aiming range outside the maximum range);
The skill tree ID is the identifier of the directional skill. The maximum range is the maximum range of the directional skill, usually a circular range, represented by the maximum range radius X above. The pre-aiming range outside the maximum range is represented by Y. Y can be configured individually for each directional skill by the game designers.
Step 804: obtain the other virtual objects around the first hero (maximum range + pre-aiming range) and store them in the candidate virtual object list;
Within the circular range centered on the first map point where the first hero is located and with (X + Y) as the radius, the enemy-search interface adds all other heroes inside the circular range to the target list. X is the radius of the maximum range of the directional skill, and Y is the difference between the radius of the pre-aiming range and the radius of the maximum range; the pre-aiming range is a ring-shaped range nested outside the maximum range, as shown in Fig. 9.
Step 805: traverse the candidate virtual object list and delete objects that do not match the filter;
The game designers configure a filter ID for each directional skill; this filter ID represents the legality conditions that a cast target of the directional skill must satisfy, such as being a virtual object belonging to a different camp from the first hero, not being a virtual object of a specific type (such as a building, a major or minor dragon, or a ward), and not being a virtual object in a specific state (invisible, unselectable).
The client traverses the candidate virtual objects in the candidate virtual object list, checks whether each matches the filter rules, and deletes from the list the candidates that do not match the filter.
Step 806: call the search tree to find the appropriate second hero.
The structure of the search tree is shown in Fig. 18. All nodes in the search tree inherit from the BaseSelector node, which has two main function methods, Configure and BattleActor Select, where BattleActor refers to a candidate virtual object. Specifically:
The Configure function is used to initialize the data of a Selector subclass according to the table data configured by the game designers. For example, the BranchSelector node needs multiple branches configured, so the data configured by Configure is the IDs of the several branch Selectors; as another example, the ShapeFilter node needs the shape field of the target selection range configured, such as circle or fan, along with parameters such as the circle radius and the fan angle.
The input parameter of the BattleActor Select function is the candidate virtual object list List<BattleActor>, and the return parameter is the filtered candidate virtual object BattleActor, but its actual behavior differs according to the Selector subclass implementation.
The BaseSelector node has three core derived subclasses: LinkedSelector, BranchSelector, and PrioritySelector.
LinkedSelector: its core is a next parameter indicating the next required filter, thus forming a chain structure. It has many subclasses, most of which are filters. In the Select function, candidate virtual objects BattleActor that do not meet the legality rules are deleted, and the List<BattleActor> with those candidates removed is passed to the next Selector, thereby implementing a filter. For example, the ShapeSelector corresponding to the target selection range configures the required shape and parameters in Configure; its Select function checks one by one whether the candidates in List<BattleActor> are within the shape of the target selection range and deletes from List<BattleActor> the candidates that are not. Other filters work the same way: for example, BuffTypeFilter deletes candidates that have a certain type of additional effect (buff), and IDSelector deletes candidates that have a certain buff ID, which is used to handle a skill that must not hit the same enemy a second time. Besides the above, there can be many other specific filters, such as CanKillFilter, which guarantees the current skill cast will secure a kill; IDFilter, used to select a particular virtual object; and BuffTypeFilter, used to select virtual objects with a certain buff. The many other Filter implementations are not described again here.
BranchSelector: its main function is to handle the screening of candidate virtual objects when there are multiple selection ranges. For example, in Fig. 16, the fan-shaped first selection range 951 needs to be checked first and then the second selection range 952, which is what the BranchSelector is used for. Several Selector IDs are configured in the configuration table; in the Configure function, the member variable selectors is initialized according to the configured Selector IDs. In the Select function, the parameter List<BattleActor> actors is first stored temporarily; then each BaseSelector in selectors is used in turn with the temporarily stored List<BattleActor> as the parameter, and its Select function is called to determine whether a candidate virtual object BattleActor is returned. If a candidate BattleActor is returned, a candidate already satisfies the rule of the first selection range 951, and there is no need to traverse the subsequent selectors of the second selection range 952; if nothing is returned, the next BaseSelector in selectors, corresponding to the second selection range 952, is called.
PrioritySelector: the game designers use this Selector to sort the already filtered List<BattleActor> and pick the appropriate target virtual object BattleActor. The designers configure priority rules in the table, such as HP priority, distance priority, and HP-percentage priority; in the Select function, the List<BattleActor> is sorted according to the configured priority rule and the first element of the list is returned; if the list is empty, NULL is returned.
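A minimal Python sketch of the selector hierarchy above (the patent's classes are C#-style; the table-driven Configure step is omitted here and replaced with constructor arguments, so the shapes and priority keys are passed in as plain callables):

```python
class BaseSelector:
    """All search-tree nodes derive from this; select() takes a candidate list, returns one or None."""
    def select(self, actors):
        raise NotImplementedError

class ShapeFilter(BaseSelector):
    """A LinkedSelector-style filter: drop actors outside the shape, pass the rest to `next`."""
    def __init__(self, in_shape, next_selector):
        self.in_shape, self.next = in_shape, next_selector

    def select(self, actors):
        return self.next.select([a for a in actors if self.in_shape(a)])

class PrioritySelector(BaseSelector):
    """Sort the filtered list by the configured priority key and return the first, else None."""
    def __init__(self, key):
        self.key = key

    def select(self, actors):
        return min(actors, key=self.key) if actors else None

class BranchSelector(BaseSelector):
    """Try each branch on a saved copy of the list; the first branch returning an actor wins."""
    def __init__(self, selectors):
        self.selectors = selectors

    def select(self, actors):
        saved = list(actors)
        for selector in self.selectors:
            result = selector.select(list(saved))
            if result is not None:
                return result
        return None
```

Composing a BranchSelector whose first branch is a fan-shape filter chained to an HP PrioritySelector, and whose second branch is a semicircle filter chained to a distance PrioritySelector, reproduces the Fig. 19 search tree.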
Through the combined use of the above Selectors, very complex enemy-search logic can be implemented. Illustratively, as in Fig. 16, when the first selection range 951 needs to be checked before the second selection range 952, the priority rule within the first selection range 951 can be set to HP and the priority rule within the second selection range 952 to distance from the aiming point; the structure of the whole search tree is then as shown in Fig. 19.
In the example shown in Fig. 20, the client first obtains 3 candidate virtual objects and puts them into the 90-degree fan-shaped filter corresponding to the first selection range 951. Suppose object 3 is not in the first selection range 951, so object 3 is deleted, leaving only object 1 and object 2. The selector then sorts object 1 and object 2 according to the priority principle; because HP has priority in the first selection range 951, the client's sorting shows that object 2 has lower HP, so object 2 is returned as the final aimed-at target virtual object, and the enemy search yields object 2.
In the example shown in Fig. 21, the client first obtains 3 candidate virtual objects and puts them into the 90-degree fan-shaped filter corresponding to the first selection range 951. Suppose object 1, object 2, and object 3 are all outside the first selection range 951, so the three candidates are deleted. The client then falls back to the second branch of the BranchSelector, corresponding to the second selection range 952, again with the three candidate virtual objects. The client puts the three candidates into the 180-degree semicircular filter corresponding to the second selection range 952; no candidate is deleted, and the candidates are sorted by the distances of object 1, object 2, and object 3 to the aiming point. From the distances it can be seen that object 1 is closest to the aiming point, so object 1 is returned as the final aimed-at target virtual object.
The following are apparatus embodiments of this application; for details not described in detail in the apparatus embodiments, refer to the above method embodiments.
Fig. 22 is a block diagram of an apparatus for aiming at a virtual object provided by an exemplary embodiment of this application. The apparatus includes:
a display module 901, configured to display a user interface, the user interface including a picture of a virtual environment, the picture being obtained by observing the virtual environment with a first virtual object as the observation center and including a first virtual object and a second virtual object located in the virtual environment;
the display module 901 being further configured to, in response to an aiming instruction, display a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate the aiming point selected by the aiming operation on the ground plane of the virtual environment;
an aiming module 902, configured to control the first virtual object to aim at a target virtual object, the target virtual object being a virtual object selected from among second virtual objects located within a target selection range, the target selection range being a selection range determined with the aiming point as a reference.
In one example of this embodiment, the aiming module 902 is configured to filter the second virtual objects located within the target selection range into candidate virtual objects; select the target virtual object from the candidate virtual objects according to a priority principle; and control the first virtual object to aim at the target virtual object.
In one example of this embodiment, the target selection range includes a first selection range and a second selection range, the priority of the first selection range being greater than that of the second selection range;
the aiming module 902 being configured to, in response to the second virtual object being present in the first selection range, preferentially filter the second virtual object in the first selection range as the candidate virtual object; and, in response to the second virtual object not being present in the first selection range, filter the second virtual object located in the second selection range as the candidate virtual object.
In one example of this embodiment, the first selection range corresponds to a first priority rule, and the second selection range corresponds to a second priority rule;
the aiming module 902 being configured to, in response to the candidate virtual object belonging to the first selection range, select the target virtual object from the candidate virtual objects according to the first priority rule; and, in response to the candidate virtual object belonging to the second selection range, select the target virtual object from the candidate virtual objects according to the second priority rule.
In one example of this embodiment, the target selection range lies on the ground plane of the virtual environment, takes the first map point where the first virtual object is located as the rotation center, and the axis of symmetry of the target selection range passes through the aiming point.
In one example of this embodiment, the first virtual object has a maximum range, and the target selection range includes a pre-aiming area located outside the maximum range.
In one example of this embodiment, the priority principle includes at least one of the following principles:
preferentially selecting the candidate virtual object closest to the aiming point;
preferentially selecting the candidate virtual object with the lowest HP percentage;
preferentially selecting the candidate virtual object with the lowest absolute HP;
preferentially selecting the candidate virtual object with the highest type priority.
In one example of this embodiment, the display module 901 is further configured to display a selected special effect on the target virtual object, the selected special effect including at least one of the following: a first selected indicator displayed on the second map point where the target virtual object is located, and a second selected indicator displayed above the target virtual object.
In one example of this embodiment, the first virtual object has a maximum range, and the target selection range includes a pre-aiming area located outside the maximum range and an aiming area located within the maximum range;
the display module 901 being further configured to display a first selected special effect on the target virtual object in response to the second map point where the target virtual object is located being in the pre-aiming area, and to display a second selected special effect on the target virtual object in response to the second map point where the target virtual object is located being in the aiming area, the first selected special effect being different from the second selected special effect.
In one example of this embodiment, the user interface includes: a roulette aiming control, the roulette aiming control including a roulette area and a joystick button; the aiming instruction carries an offset point to which the joystick button has been offset from an activation point in the roulette area, the activation point being the center of the roulette area;
the display module 901 being further configured to calculate an offset vector from the activation point to the offset point in response to the aiming instruction; calculate the aiming vector according to the offset vector, the aiming vector being a vector pointing from the first map point where the first virtual object is located to the aiming point, wherein the ratio of the offset vector to the wheel radius equals the ratio of the aiming vector to the aiming radius, the wheel radius being the radius of the roulette area and the aiming radius being the maximum aiming distance of the first virtual object when actively aiming; calculate the aiming point according to the aiming vector and the first map point where the first virtual object is located; and display the point-type skill indicator on the aiming point in the virtual environment.
It should be noted that the apparatus for aiming at a virtual object provided by the above embodiment is illustrated only by the division of the above functional modules; in practical applications, the above functions may be allocated to different functional modules as required, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for aiming at a virtual object provided by the above embodiment and the embodiments of the method for aiming at a virtual object belong to the same concept; for the specific implementation process, refer to the method embodiments, which will not be repeated here.
本申请还提供了一种终端,该终端包括处理器和存储器,存储器中存储有至少一条指令,至少一条指令由处理器加载并执行以实现上述各个方法实施例提供的瞄准虚拟对象的方法。需要说明的是,该终端可以是如下图23所提供的终端。
图23示出了本申请一个示例性实施例提供的终端2300的结构框图。该终端2300可以是:智能手机、平板电脑、MP3播放器(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器、笔记本电脑或台式电脑。终端2300还可能被称为用户设备、便携式终端、膝上型终端、台式终端等其他名称。
Generally, the terminal 2300 includes a processor 2301 and a memory 2302.
The processor 2301 may include one or more processing cores, for example, a 4-core processor or an 8-core processor. The processor 2301 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 2301 may also include a main processor and a coprocessor. The main processor is a processor for processing data in an awake state, also referred to as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in a standby state. In some embodiments, the processor 2301 may be integrated with a GPU (Graphics Processing Unit), the GPU being responsible for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 2301 may further include an AI (Artificial Intelligence) processor, the AI processor being configured to handle computing operations related to machine learning.
The memory 2302 may include one or more computer-readable storage media, which may be non-transitory. The memory 2302 may further include a high-speed random access memory and a non-volatile memory, such as one or more magnetic disk storage devices or flash storage devices. In some embodiments, the non-transitory computer-readable storage medium in the memory 2302 is configured to store at least one instruction, the at least one instruction being executed by the processor 2301 to implement the method for aiming at a virtual object provided in the method embodiments of this application.
In some embodiments, the terminal 2300 further includes a peripheral interface 2303 and at least one peripheral. The processor 2301, the memory 2302, and the peripheral interface 2303 may be connected by a bus or signal lines. Each peripheral may be connected to the peripheral interface 2303 by a bus, a signal line, or a circuit board. Specifically, the peripherals include at least one of a radio frequency circuit 2304, a touch display screen 2305, a camera 2306, an audio circuit 2307, a positioning component 2308, and a power supply 2309.
A person skilled in the art can understand that the structure shown in FIG. 23 does not constitute a limitation on the terminal 2300, which may include more or fewer components than shown, combine some components, or adopt a different component arrangement.
The memory further includes one or more programs, which are stored in the memory and include instructions for performing the method for aiming at a virtual object provided in the embodiments of this application.
This application provides a computer-readable storage medium storing at least one instruction, the at least one instruction being loaded and executed by the processor to implement the method for aiming at a virtual object provided in the foregoing method embodiments.
This application further provides a computer program product that, when run on a computer, causes the computer to perform the method for aiming at a virtual object provided in the foregoing method embodiments.
The sequence numbers of the foregoing embodiments of this application are merely for description and do not imply preference among the embodiments.
A person of ordinary skill in the art can understand that all or some of the steps of the foregoing embodiments may be implemented by hardware, or by a program instructing relevant hardware. The program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disc, or the like.
The foregoing descriptions are merely embodiments of this application and are not intended to limit this application. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of this application shall fall within the protection scope of this application.

Claims (13)

  1. A method for aiming at a virtual object, performed by a terminal, the method comprising:
    displaying a user interface, the user interface including a picture of a virtual environment, the picture of the virtual environment including a first virtual object and second virtual objects located in the virtual environment;
    in response to an aiming instruction, displaying a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate an aiming point selected by the aiming operation on the ground plane of the virtual environment;
    controlling the first virtual object to aim at a target virtual object, the target virtual object being one virtual object selected from second virtual objects located within a target selection range, the target selection range being a selection range determined with reference to the aiming point.
  2. The method according to claim 1, wherein the controlling the first virtual object to aim at a target virtual object comprises:
    filtering the second virtual objects located within the target selection range as candidate virtual objects;
    selecting the target virtual object from the candidate virtual objects according to a priority principle;
    controlling the first virtual object to aim at the target virtual object.
  3. The method according to claim 2, wherein the target selection range includes a first selection range and a second selection range, the first selection range having a higher priority than the second selection range;
    the filtering the second virtual objects located within the target selection range as candidate virtual objects comprises:
    in response to the second virtual objects existing within the first selection range, preferentially filtering the second virtual objects within the first selection range as the candidate virtual objects;
    in response to no second virtual object existing within the first selection range, filtering the second virtual objects located within the second selection range as the candidate virtual objects.
  4. The method according to claim 3, wherein the first selection range corresponds to a first priority principle and the second selection range corresponds to a second priority principle;
    the selecting the target virtual object from the candidate virtual objects according to a priority principle comprises:
    in response to the candidate virtual objects belonging to the first selection range, selecting the target virtual object from the candidate virtual objects according to the first priority principle;
    in response to the candidate virtual objects belonging to the second selection range, selecting the target virtual object from the candidate virtual objects according to the second priority principle.
  5. The method according to any one of claims 1 to 4, wherein the target selection range is located on the ground plane of the virtual environment, the target selection range takes the first map point where the first virtual object is located as its center of rotation, and the axis of symmetry of the target selection range passes through the aiming point.
  6. The method according to any one of claims 1 to 4, wherein the first virtual object has a maximum firing range, and the target selection range includes a pre-aiming region located outside the maximum firing range.
  7. The method according to claim 2, wherein the priority principle includes at least one of the following principles:
    preferentially selecting the candidate virtual object closest to the aiming point;
    preferentially selecting the candidate virtual object with the lowest health-point percentage;
    preferentially selecting the candidate virtual object with the lowest absolute health-point value;
    preferentially selecting the candidate virtual object with the highest type priority.
  8. The method according to claim 2, wherein after the selecting the target virtual object from the candidate virtual objects according to a priority principle, the method further comprises:
    displaying a selection effect on the target virtual object, the selection effect including at least one of the following: displaying a first selection indicator on the second map point where the target virtual object is located, and displaying a second selection indicator above the target virtual object.
  9. The method according to claim 8, wherein the first virtual object has a maximum firing range, and the target selection range includes a pre-aiming region located outside the maximum firing range and an aiming region located within the maximum firing range;
    the displaying a selection effect on the target virtual object comprises:
    in response to the second map point where the target virtual object is located being within the pre-aiming region, displaying a first selection effect on the target virtual object;
    in response to the second map point where the target virtual object is located being within the aiming region, displaying a second selection effect on the target virtual object, the first selection effect differing from the second selection effect.
  10. The method according to any one of claims 1 to 3, wherein the user interface includes a wheel aiming control, the wheel aiming control including a wheel region and a joystick button; the aiming instruction carries an offset point to which the joystick button has been offset from an activation point within the wheel region, the activation point being the center of the wheel region;
    the displaying, in response to an aiming instruction, a point-type aiming indicator in the virtual environment comprises:
    in response to the aiming instruction, calculating an offset vector pointing from the activation point to the offset point;
    calculating an aiming vector according to the offset vector, the aiming vector being a vector pointing from the first map point where the first virtual object is located to the aiming point, where the ratio of the offset vector to a wheel radius equals the ratio of the aiming vector to an aiming radius, the wheel radius being the radius of the wheel region and the aiming radius being the maximum aiming distance of the first virtual object during active aiming;
    calculating the aiming point according to the aiming vector and the first map point where the first virtual object is located;
    displaying the point-type skill indicator on the aiming point in the virtual environment.
  11. An apparatus for aiming at a virtual object, the apparatus comprising:
    a display module, configured to display a user interface, the user interface including a picture of a virtual environment, the picture of the virtual environment being a picture obtained by observing the virtual environment with a first virtual object as the observation center, the picture of the virtual environment including the first virtual object and second virtual objects located in the virtual environment;
    the display module being further configured to display, in response to an aiming instruction, a point-type aiming indicator in the virtual environment, the point-type aiming indicator being used to indicate an aiming point selected by the aiming operation on the ground plane of the virtual environment;
    an aiming module, configured to control the first virtual object to aim at a target virtual object, the target virtual object being one virtual object selected from second virtual objects located within a target selection range, the target selection range being a selection range determined with reference to the aiming point.
  12. A computer device, comprising a processor and a memory, the memory storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for aiming at a virtual object according to any one of claims 1 to 10.
  13. A computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, the at least one instruction, the at least one program, the code set, or the instruction set being loaded and executed by the processor to implement the method for aiming at a virtual object according to any one of claims 1 to 10.
PCT/CN2021/095101 2020-06-05 2021-05-21 Method, apparatus, device, and storage medium for aiming at virtual object WO2021244322A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2021566345A JP2022539289A (ja) 2020-06-05 2021-05-21 Virtual object aiming method, apparatus, and program
EP21783116.3A EP3950079A4 (en) 2020-06-05 2021-05-21 METHOD AND DEVICE FOR TARGETING A VIRTUAL OBJECT, DEVICE AND STORAGE MEDIA
KR1020217035482A KR20210151861A (ko) 2020-06-05 2021-05-21 Method, apparatus, device, and storage medium for aiming at virtual object
US17/505,226 US11893217B2 (en) 2020-06-05 2021-10-19 Method and apparatus for aiming at virtual object, device, and storage medium
US18/398,972 US20240143145A1 (en) 2020-06-05 2023-12-28 Method and apparatus for aiming at virtual object, device, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010508239.5A CN111672119B (zh) 2020-06-05 2020-06-05 Method, apparatus, device, and medium for aiming at virtual object
CN202010508239.5 2020-06-05

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/505,226 Continuation US11893217B2 (en) 2020-06-05 2021-10-19 Method and apparatus for aiming at virtual object, device, and storage medium

Publications (1)

Publication Number Publication Date
WO2021244322A1 true WO2021244322A1 (zh) 2021-12-09

Family

ID=72435185

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/095101 WO2021244322A1 (zh) 2021-05-21 Method, apparatus, device, and storage medium for aiming at virtual object

Country Status (6)

Country Link
US (2) US11893217B2 (zh)
EP (1) EP3950079A4 (zh)
JP (1) JP2022539289A (zh)
KR (1) KR20210151861A (zh)
CN (1) CN111672119B (zh)
WO (1) WO2021244322A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7469378B2 (ja) 2022-05-24 2024-04-16 Nintendo Co., Ltd. Game program, game device, game system, and game processing method

Families Citing this family (9)

Publication number Priority date Publication date Assignee Title
CN111672119B (zh) 2020-06-05 2023-03-10 Tencent Technology (Shenzhen) Co., Ltd. Method, apparatus, device, and medium for aiming at virtual object
CN112569598A (zh) * 2020-12-22 2021-03-30 Shanghai Hode Information Technology Co., Ltd. Target object control method and apparatus
CN112717403B (zh) * 2021-01-22 2022-11-29 Tencent Technology (Shenzhen) Co., Ltd. Virtual object control method and apparatus, electronic device, and storage medium
CN115400419A (zh) * 2021-05-28 2022-11-29 NCSOFT Corporation Method and apparatus for aiming at object in game
CN113304478B (zh) * 2021-06-22 2022-07-29 Tencent Technology (Shenzhen) Co., Ltd. Skill indicator control method and apparatus, storage medium, and electronic device
CN113521737A (zh) * 2021-07-26 2021-10-22 NetEase (Hangzhou) Network Co., Ltd. Virtual character locking method and apparatus, computer device, and storage medium
CN113577766B (zh) * 2021-08-05 2024-04-02 Baidu Online Network Technology (Beijing) Co., Ltd. Object processing method and apparatus
CN113730909B (zh) * 2021-09-14 2023-06-20 Tencent Technology (Shenzhen) Co., Ltd. Aiming position display method and apparatus, electronic device, and storage medium
CN114344880A (zh) * 2022-01-10 2022-04-15 Tencent Technology (Shenzhen) Co., Ltd. Crosshair control method and apparatus in virtual scene, electronic device, and storage medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105194873A (zh) * 2015-10-10 2015-12-30 Tencent Technology (Shenzhen) Co., Ltd. Information processing method, terminal, and computer storage medium
US9766040B2 (en) * 2015-01-09 2017-09-19 Evrio, Inc. Relative aiming point display
CN107450812A (zh) * 2017-06-26 2017-12-08 NetEase (Hangzhou) Network Co., Ltd. Virtual object control method and apparatus, storage medium, and electronic device
CN108310772A (zh) * 2018-01-22 2018-07-24 Tencent Technology (Shenzhen) Co., Ltd. Attack operation execution method and apparatus, storage medium, and electronic device
CN109513209A (zh) * 2018-11-22 2019-03-26 NetEase (Hangzhou) Network Co., Ltd. Virtual object processing method and apparatus, electronic device, and storage medium
CN111672119A (zh) * 2020-06-05 2020-09-18 Tencent Technology (Shenzhen) Co., Ltd. Method, apparatus, device, and medium for aiming at virtual object

Family Cites Families (15)

Publication number Priority date Publication date Assignee Title
JP2000167239A (ja) * 1998-12-02 2000-06-20 Square Co Ltd Game device, recording medium, and character action control method
JP2001009156A (ja) * 1999-06-30 2001-01-16 Square Co Ltd Computer-readable recording medium, game display control method, and game device
US7594847B1 (en) * 2002-10-11 2009-09-29 Microsoft Corporation Squad command interface for console-based video game
JP3888542B2 (ja) * 2002-12-05 2007-03-07 Nintendo Co., Ltd. Game device and game program
US9704350B1 (en) * 2013-03-14 2017-07-11 Harmonix Music Systems, Inc. Musical combat game
US10350487B2 (en) * 2013-06-11 2019-07-16 We Made Io Co., Ltd. Method and apparatus for automatically targeting target objects in a computer game
CN104898953B (zh) 2015-06-16 2016-10-26 Shenzhen Tencent Computer Systems Co., Ltd. Touchscreen-based control method and apparatus
KR101869819B1 (ko) * 2016-04-11 2018-06-21 Dexint Games Inc. Method for controlling automatic attack of user character
CN105930081A (zh) * 2016-04-19 2016-09-07 Shanghai Douwu Network Technology Co., Ltd. Method and device for performing operation on touch terminal
CN107899241B (zh) * 2017-11-22 2020-05-22 NetEase (Hangzhou) Network Co., Ltd. Information processing method and apparatus, storage medium, and electronic device
JP7245605B2 (ja) * 2018-02-13 2023-03-24 Bandai Namco Entertainment Inc. Game system, game providing method, and program
CN108379839B (zh) * 2018-03-23 2019-12-10 NetEase (Hangzhou) Network Co., Ltd. Control response method, apparatus, and terminal
CN109828663A (zh) * 2019-01-14 2019-05-31 Beijing 7invensun Technology Co., Ltd. Method and apparatus for determining aiming area and operation method for aiming at target object
CN109847336B (zh) 2019-02-26 2021-08-06 Tencent Technology (Shenzhen) Co., Ltd. Virtual scene display method and apparatus, electronic device, and storage medium
CN109865282B (zh) * 2019-03-05 2020-03-17 NetEase (Hangzhou) Network Co., Ltd. Information processing method and apparatus in mobile terminal, medium, and electronic device


Non-Patent Citations (1)

Title
See also references of EP3950079A4 *


Also Published As

Publication number Publication date
KR20210151861A (ko) 2021-12-14
CN111672119B (zh) 2023-03-10
US11893217B2 (en) 2024-02-06
EP3950079A4 (en) 2022-07-13
EP3950079A1 (en) 2022-02-09
US20220035515A1 (en) 2022-02-03
US20240143145A1 (en) 2024-05-02
CN111672119A (zh) 2020-09-18
JP2022539289A (ja) 2022-09-08

Similar Documents

Publication Publication Date Title
WO2021244322A1 (zh) Method, apparatus, device, and storage medium for aiming at virtual object
JP7476235B2 (ja) Virtual object control method, apparatus, device, and computer program
JP7375043B2 (ja) Screen display method, apparatus, device, and computer program for virtual environment
CN111672116B (zh) Method, apparatus, terminal, and storage medium for controlling virtual object to release skill
JP7325664B2 (ja) Virtual object control method and apparatus, terminal, and computer program
JP7492611B2 (ja) Data processing method, apparatus, computer device, and computer program in virtual scene
TWI789088B (zh) Method, apparatus, device, medium, and program product for virtual object to release skill
JP2023164787A (ja) Screen display method, apparatus, device, and computer program for virtual environment
JP2022552752A (ja) Screen display method and apparatus for virtual environment, computer device, and program
CN112691366A (zh) Virtual prop display method, apparatus, device, and medium
CN114225372B (zh) Virtual object control method, apparatus, terminal, storage medium, and program product
JP2024074915A (ja) Virtual object aiming method, apparatus, and program
TWI843042B (zh) Virtual prop delivery method, apparatus, terminal, storage medium, and program product
US20230040506A1 (en) Method and apparatus for controlling virtual character to cast skill, device, medium, and program product
CN116688501A (zh) Virtual object control method, apparatus, device, medium, and program product

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2021783116

Country of ref document: EP

Effective date: 20211014

ENP Entry into the national phase

Ref document number: 20217035482

Country of ref document: KR

Kind code of ref document: A

ENP Entry into the national phase

Ref document number: 2021566345

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE