WO2021213026A1 - Virtual object control method, apparatus, device, and storage medium - Google Patents
Virtual object control method, apparatus, device, and storage medium
- Publication number
- WO2021213026A1 (PCT/CN2021/079592)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- target
- virtual
- control
- trigger signal
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/214—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
- A63F13/218—Input arrangements for video game devices characterised by their sensors, purposes or types using pressure sensors, e.g. generating a signal proportional to the pressure applied by the player
- A63F13/40—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
- A63F13/42—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/422—Processing input control signals of video game devices by mapping the input signals into game commands automatically for the purpose of assisting the player, e.g. automatic braking in a driving game
- A63F13/426—Processing input control signals of video game devices by mapping the input signals into game commands involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
- A63F13/45—Controlling the progress of the video game
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/53—Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
- A63F13/537—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators, e.g. showing the condition of a game character on screen
- A63F13/5372—Controlling the output signals based on the game progress involving additional visual information provided to the game scene using indicators for tagging characters, objects or locations in the game scene, e.g. displaying a circle under the character controlled by the player
- A63F13/55—Controlling game characters or game objects based on the game progress
- A63F13/56—Computing the motion of game characters with respect to other game characters, game objects or elements of the game scene, e.g. for simulating the behaviour of a group of virtual soldiers or for path finding
- A63F13/58—Controlling game characters or game objects based on the game progress by computing conditions of game characters, e.g. stamina, strength, motivation or energy level
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/822—Strategy games; Role-playing games
- A63F13/90—Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1056—Features of games using an electronically generated display characterized by input arrangements for converting player-generated signals into game device control signals involving pressure sensitive buttons
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Definitions
- the embodiments of the present application relate to the field of computer technology, and in particular, to a method, apparatus, device, and storage medium for controlling virtual objects.
- the user quickly releases the skill to the virtual object by clicking the skill control.
- the virtual object is either a default virtual object determined by the server according to client data, or a virtual object the user actively selects by dragging the skill control before the skill is released.
- the presentation layer of the application specially marks the virtual object targeted by the released skill and displays the mark in the virtual environment screen.
- the user can only determine whether the targeted object is the expected virtual object during the skill release process, and cannot know, before releasing the skill, which virtual object the skill will act on; the skill may therefore be released on the wrong target, resulting in a waste of skill resources.
- the embodiments of the present application provide a method, apparatus, device, and storage medium for controlling virtual objects, which enable a user to know, through a mark, the target virtual object that an operation will act on before performing the operation, thereby improving the control efficiency and control accuracy of virtual objects.
- the technical solution is as follows:
- an embodiment of the present application provides a method for controlling a virtual object.
- the method is applied to a terminal, and the method includes:
- a game interface is displayed, the game interface including a first virtual object, at least one second virtual object, and a first control; the first virtual object and the second virtual object are located in a virtual world and belong to different camps; the first control is used to control the first virtual object to use virtual props to change target attribute values of other virtual objects;
- an embodiment of the present application provides a virtual object control device, the device includes:
- a display module is configured to display a game interface, the game interface including a first virtual object, at least one second virtual object, and a first control; the first virtual object and the second virtual object are located in a virtual world and belong to different camps; the first control is used to control the first virtual object to use virtual props to change target attribute values of other virtual objects;
- a first determining module is configured to determine a target virtual object from the at least one second virtual object, and mark the target virtual object in a predetermined manner;
- a receiving module is configured to receive a first trigger signal acting on the first control;
- a first control module is configured to, in response to the first trigger signal meeting an automatic control condition, control the first virtual object to use virtual props to change the target attribute value of the target virtual object.
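As a sketch only, the flow realized by these modules — a marked target, a trigger signal on the first control, and an attribute-value change when the automatic control condition is met — might look like the following. The class, the damage amount, and the tap-duration threshold are illustrative assumptions, not taken from the application:

```python
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    camp: int
    hp: int  # the "target attribute value" changed by the virtual prop (illustrative)

# Hypothetical automatic control condition: a quick tap (short press)
# on the first control triggers the automatic behavior.
AUTO_CONTROL_MAX_PRESS = 0.2  # seconds; illustrative threshold

def on_first_control_trigger(first_obj: VirtualObject,
                             target_obj: VirtualObject,
                             press_duration: float,
                             damage: int = 50) -> bool:
    """If the first trigger signal meets the automatic control condition,
    control first_obj to use its virtual prop on the marked target."""
    if target_obj is None or target_obj.camp == first_obj.camp:
        return False  # only objects of a different camp can be targets
    if press_duration <= AUTO_CONTROL_MAX_PRESS:
        target_obj.hp -= damage  # change the target attribute value
        return True
    return False
```

A long press would fall through to other handling (for example, manual target selection), which this sketch leaves out.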
- an embodiment of the present application provides a computer device, the computer device including a processor and a memory; at least one program is stored in the memory, and the at least one program is loaded and executed by the processor to implement the control method of the virtual object as described in the above aspect.
- an embodiment of the present application provides a computer-readable storage medium in which at least one program is stored, and the at least one program is loaded and executed by a processor to implement the control method of the virtual object as described in the above aspects.
- a computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the terminal reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the terminal executes the virtual object control method provided in the various optional implementation manners of the foregoing aspects.
- the target virtual object in the game interface is marked in a predetermined manner, and upon receiving a first trigger signal acting on the first control that meets the automatic control condition, the first virtual object is controlled to change the target attribute value of the target virtual object. Because the target virtual object is marked before the first virtual object changes its target attribute value, the user can identify the target virtual object through the mark without performing an operation; if the target virtual object does not match the expected object, the user can change the target virtual object through other operations.
- if the target virtual object matches the expected object, the operation can be performed quickly through the first control, which improves the control efficiency and control accuracy of the virtual object; at the same time, the need to confirm and mark the operation object while executing the operation is avoided, which reduces the operation execution delay and thereby improves the efficiency of human-computer interaction.
- Fig. 1 is a schematic diagram of an implementation environment provided according to an exemplary embodiment of the present application.
- Fig. 2 is a flowchart of a method for controlling a virtual object according to an exemplary embodiment of the present application.
- Fig. 3 is a schematic diagram of a game interface provided according to an exemplary embodiment of the present application.
- Fig. 4 is a schematic diagram of target virtual object marking provided according to an exemplary embodiment of the present application.
- Fig. 5 is a schematic diagram of a game interface provided according to another exemplary embodiment of the present application.
- Fig. 6 is a flowchart of a method for controlling a virtual object according to another exemplary embodiment of the present application.
- Fig. 7 is a schematic diagram of determining a candidate virtual object according to an exemplary embodiment of the present application.
- Fig. 8 is a schematic diagram of a first control provided according to an exemplary embodiment of the present application.
- Fig. 9 is a schematic diagram of a game interface provided according to another exemplary embodiment of the present application.
- Fig. 10 is a schematic diagram of a game interface provided according to another exemplary embodiment of the present application.
- Fig. 11 is a flowchart of a method for controlling a virtual object according to another exemplary embodiment of the present application.
- Fig. 12 is a schematic diagram of a game interface provided according to another exemplary embodiment of the present application.
- Fig. 13 is a flowchart of a method for controlling a virtual object according to another exemplary embodiment of the present application.
- Fig. 14 is a structural block diagram of a virtual object control device provided according to an exemplary embodiment of the present application.
- Fig. 15 is a structural block diagram of a terminal provided according to an exemplary embodiment of the present application.
- Fig. 16 is a structural block diagram of a server provided according to an exemplary embodiment of the present application.
- the "plurality" mentioned herein means two or more.
- "And/or" describes the association relationship of the associated objects, indicating that three types of relationships can exist; for example, A and/or B can mean: A exists alone, A and B exist at the same time, or B exists alone.
- the character "/" generally indicates an "or" relationship between the associated objects before and after it.
- Virtual world: the virtual world displayed (or provided) when the application runs on the terminal.
- the virtual world may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional world, or a purely fictitious three-dimensional world.
- the virtual world can be any of a two-dimensional virtual world, a 2.5-dimensional virtual world, and a three-dimensional virtual world.
- the virtual world is also used for a virtual world battle between at least two virtual objects, and there are virtual resources available for the at least two virtual objects in the virtual world.
- the virtual world includes a symmetrical lower-left corner area and an upper-right corner area; virtual objects belonging to two rival camps each occupy one of the areas, and take destroying the target building/base/crystal deep in the opponent's area as the victory goal.
- Virtual object: a movable object in the virtual world.
- the movable object may be at least one of a virtual character, a virtual animal, and an animation character.
- when the virtual world is a three-dimensional virtual world, the virtual object may be a three-dimensional model; each virtual object has its own shape and volume in the three-dimensional virtual world and occupies a part of its space.
- the virtual object is a three-dimensional character constructed based on three-dimensional human bone technology, and the virtual object realizes different external images by wearing different skins.
- the virtual object may also be implemented using a 2.5-dimensional or 2-dimensional model, which is not limited in the embodiment of the present application.
- Multiplayer online tactical competitive game: a game in which, in a virtual world, different virtual teams belonging to at least two rival camps occupy their respective map areas and compete with a certain victory condition as the goal.
- the victory conditions include but are not limited to at least one of: occupying a stronghold or destroying the enemy camp's stronghold, killing virtual objects of the enemy camp, ensuring one's own survival in a specified scene and time, grabbing a certain resource, and surpassing the opponent's score within a specified time.
- Tactical competition can be carried out in units of rounds, and the map of each round of tactical competition can be the same or different.
- Each virtual team includes one or more virtual objects, such as 1, 2, 3, or 5, etc.
- Virtual props: props that virtual objects can use in the virtual environment, including virtual weapons that can cause damage to other virtual objects, such as pistols, rifles, sniper rifles, daggers, knives, swords, axes, and ropes; supply props such as bullets; defensive props such as shields, armor, and armored vehicles; virtual props displayed through the hands when a virtual object releases a skill, such as virtual beams and virtual shock waves; and parts of the virtual object's body, such as hands and legs.
- the virtual props in the embodiment of the present application refer to the props equipped by default by the virtual object.
- UI control: any visual control or element that can be seen on the user interface of the application, such as a picture, input box, text box, button, or label. Some UI controls respond to user operations.
- FIG. 1 shows a schematic diagram of an implementation environment provided by an embodiment of the present application.
- the implementation environment may include: a first terminal 110, a server 120, and a second terminal 130.
- the first terminal 110 installs and runs an application program 111 supporting the virtual world, and the application program 111 may be a multiplayer online battle program.
- the user interface of the application program 111 is displayed on the screen of the first terminal 110.
- the application program 111 may be any of a military simulation program, a MOBA game, a battle royale shooting game, and a simulation strategy game (Simulation Game, SLG).
- the application 111 is a MOBA game as an example.
- the first terminal 110 is a terminal used by the first user 112.
- the first user 112 uses the first terminal 110 to control a first virtual object located in the virtual world to perform activities.
- the first virtual object may be referred to as the master virtual object of the first user 112.
- the activities of the first virtual object include, but are not limited to, at least one of adjusting body posture, crawling, walking, running, riding, flying, jumping, driving, picking up, shooting, attacking, throwing, and releasing skills.
- the first virtual object is a first virtual character, such as a simulated character or an animation character.
- the second terminal 130 installs and runs an application program 131 supporting the virtual world, and the application program 131 may be a multiplayer online battle program.
- the client can be any one of a military simulation program, a MOBA game, a battle royale shooting game, and an SLG game.
- the application 131 is a MOBA game as an example.
- the second terminal 130 is the terminal used by the second user 132.
- the second user 132 uses the second terminal 130 to control a second virtual object located in the virtual world to perform activities.
- the second virtual object may be referred to as the master virtual object of the second user 132.
- the second virtual object is a second virtual character, such as a simulated character or an animation character.
- the first virtual object and the second virtual object are in the same virtual world.
- the first virtual object and the second virtual object may belong to the same camp, the same team, the same organization, have a friend relationship, or have temporary communication permissions.
- the first virtual object and the second virtual object may belong to different camps, different teams, different organizations, or have a hostile relationship.
- the applications installed on the first terminal 110 and the second terminal 130 are the same, or the applications installed on the two terminals are the same type of application on different operating system platforms (Android or iOS).
- the first terminal 110 may generally refer to one of multiple terminals, and the second terminal 130 may generally refer to another of multiple terminals. This embodiment only uses the first terminal 110 and the second terminal 130 as examples.
- the device types of the first terminal 110 and the second terminal 130 are the same or different.
- the device types include at least one of a smartphone, a tablet computer, an e-book reader, an MP3 player, an MP4 player, a laptop computer, and a desktop computer.
- only two terminals are shown in Fig. 1; in different embodiments, there are multiple other terminals that can access the server 120.
- there are one or more terminals corresponding to the developer; a development and editing platform supporting virtual world applications is installed on such a terminal, where the developer can edit and update the application and transmit the updated application installation package to the server 120 via a wired or wireless network.
- the first terminal 110 and the second terminal 130 can download the application installation package from the server 120 to update the application.
- the first terminal 110, the second terminal 130, and other terminals are connected to the server 120 through a wireless network or a wired network.
- the server 120 includes at least one of a server, a server cluster composed of multiple servers, a cloud computing platform, and a virtualization center.
- the server 120 is used to provide background services for applications supporting the three-dimensional virtual world.
- the server 120 is responsible for the main calculation work and the terminal for the secondary calculation work; or the server 120 is responsible for the secondary calculation work and the terminal for the main calculation work; or the server 120 and the terminal adopt a distributed computing architecture for collaborative calculation.
- the server 120 includes a memory 121, a processor 122, a user account database 123, a battle service module 124, and a user-oriented input/output interface (Input/Output Interface, I/O interface) 125.
- the processor 122 is used to load instructions stored in the server 120 and process data in the user account database 123 and the battle service module 124; the user account database 123 is used to store data of the user accounts used by the first terminal 110, the second terminal 130, and other terminals, such as the avatar, nickname, and combat power index of a user account and the service area where it is located; the battle service module 124 is used to provide battles, such as 3V3 battles, 5V5 battles, etc.; the user-oriented I/O interface 125 is used to establish communication and exchange data with the first terminal 110 and/or the second terminal 130 through a wireless network or a wired network.
- FIG. 2 shows a flowchart of a method for controlling a virtual object provided by an exemplary embodiment of the present application.
- the method is described using, as an example, its application in the first terminal 110, the second terminal 130, or another terminal in the implementation environment shown in Fig. 1.
- the method includes the following steps:
- Step 201 Display a game interface, which includes a first virtual object, at least one second virtual object, and a first control.
- the first virtual object and the second virtual object are located in the virtual world and belong to different camps; the first control is used to control the first virtual object to use virtual props to change the target attribute values of other virtual objects.
- the game interface includes a virtual world screen and a control layer located on the virtual world screen;
- the virtual world screen includes a first virtual object and at least one second virtual object; the first virtual object belongs to the first camp, and the second virtual object belongs to the second camp.
- the two camps are in a hostile relationship.
- the second virtual object includes virtual objects in the second camp controlled by other terminals, as well as virtual objects in the second camp controlled by the server, such as server-controlled soldiers and virtual buildings that can be conquered.
- the second virtual object also includes a virtual object belonging to a third camp, and the third camp is controlled by the server, such as monsters in the virtual world.
- the virtual world is a virtual world with an arbitrary boundary shape.
- the first virtual object is located within the visible range of the game interface.
- the first virtual object is located at the visual center of the virtual world picture, that is, at the center of the virtual world picture obtained by observing the virtual world from a third-person perspective.
- the angle of view refers to the angle of observation when the virtual character is observed in the virtual world from the first person perspective or the third person perspective.
- the angle of view is the angle when the virtual character is observed through the camera model in the virtual world.
- the camera model automatically follows the virtual object in the virtual world, that is, when the position of the virtual object in the virtual world changes, the camera model follows the position of the virtual object in the virtual world and changes at the same time, and the camera The model is always within the preset distance range of the virtual object in the virtual world.
- the relative position of the camera model and the virtual object does not change.
- the embodiment of the present application takes the third-person perspective as an example for description.
- the camera model is located behind the virtual object (for example, behind the head and shoulders of a virtual character).
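The camera-follow behavior described above — the camera model keeping a fixed relative position to the virtual object so the object stays at the visual center of the picture — can be sketched as follows; the offset vector is an illustrative assumption, not a value from the application:

```python
def follow_camera(object_pos, offset=(0.0, 8.0, -6.0)):
    """Return the camera model's position: the virtual object's position plus
    a fixed offset, so the relative position of camera and object never
    changes as the object moves through the virtual world."""
    return tuple(p + o for p, o in zip(object_pos, offset))
```

When the virtual object's position changes, recomputing `follow_camera` each frame keeps the camera within the preset distance range of the object, as the embodiment describes.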
- the control layer of the game interface includes a first control for controlling the first virtual object to use virtual props to change the target attribute value of other virtual objects.
- the first control is used for basic operations controlling the first virtual object, and can also be called a common attack control.
- FIG. 3 shows a game interface.
- the virtual world screen of the game interface contains the first virtual object 301 and the second virtual object 302, as well as virtual environments such as buildings, plants, and roads within the field of view;
- the control layer of the game interface includes the first control 303, other skill controls 304, and a direction control 305 for controlling the movement and direction change of the first virtual object 301.
- the user can trigger such controls by clicking, dragging and so on.
- the control layer also includes a map control 306, which is used to display the virtual world.
- the control layer also includes a control 307 used to display information such as the match record and the duration of the game, as well as UI controls for other functions such as game settings, voice calls, and message sending.
- Step 202 Determine a target virtual object from at least one second virtual object, and mark the target virtual object in a predetermined manner.
- the terminal searches for the second virtual object in real time, determines the second virtual object that meets the preset condition as the target virtual object, and marks the target virtual object.
- the preset condition may be at least one of the following: the second virtual object qualifies as a common attack target; the distance between the second virtual object and the first virtual object meets a preset distance condition (for example, it is the closest to the first virtual object); an attribute value of the second virtual object meets a preset attribute value condition (for example, the second virtual object has the lowest life value or defense value); and the camp to which the second virtual object belongs is a preset target camp.
- the preset condition may also be a condition preset by the user, for example, setting a priority to attack a certain type of virtual object. The embodiment of the present application does not limit the method of selecting the target virtual object.
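The preset conditions above can be combined into a single selection routine. The sketch below is a minimal, hypothetical illustration (the `VirtualObject` fields, camp labels, and tie-break order are assumptions, not part of the embodiment): it keeps only hostile, living candidates within a distance limit, then prefers the nearest one, breaking distance ties by lowest remaining life value.

```python
import math
from dataclasses import dataclass

@dataclass
class VirtualObject:
    name: str
    x: float
    y: float
    hp: int     # remaining life value
    camp: str   # camp the object belongs to

def select_target(first, candidates, hostile_camps, max_distance):
    """Pick the nearest candidate from a hostile camp within max_distance;
    break distance ties by lowest remaining life value (both conditions are
    examples from the preset-condition list above)."""
    def dist(o):
        return math.hypot(o.x - first.x, o.y - first.y)
    eligible = [o for o in candidates
                if o.camp in hostile_camps and dist(o) <= max_distance and o.hp > 0]
    if not eligible:
        return None  # no target virtual object at the current moment
    return min(eligible, key=lambda o: (dist(o), o.hp))
```

As the text notes, the actual condition set (and its priority order) could equally be user-configured, e.g. "attack a certain type of virtual object first".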
- predetermined ways of marking the target virtual object include: highlighting the character image of the target virtual object and/or the edge of its target attribute information bar, changing the color of information carried by the target virtual object, adding a special mark near the character image of the target virtual object (for example, directly below it), and displaying a ray pointing from the first virtual object to the target virtual object.
- FIG. 4 shows a way of marking the target virtual object 401.
- after determining the target virtual object 401 from the second virtual objects, the terminal adds a highlight effect to the outer edge of the target attribute information bar 402 above the target virtual object 401 and displays a positioning mark 403 directly below the target virtual object 401.
- a first virtual object 501, a second virtual object 502a, and a second virtual object 502b are displayed on the game interface.
- the terminal determines, from the two second virtual objects, that the second virtual object 502a is the target virtual object, and marks the second virtual object 502a in the aforementioned predetermined manner.
- Step 203 Receive a first trigger signal acting on the first control.
- the first trigger signal is generated when the user performs a trigger operation on the first control.
- the user can trigger the first trigger signal by clicking, dragging, or the like on the first control.
- Step 204 In response to the first trigger signal meeting the automatic control condition, control the first virtual object to use the virtual prop to change the target attribute value of the target virtual object.
- in order to implement different control operations on the first virtual object, the terminal is preset with control instructions for different first trigger signals; different operations performed by the user on the first control generate different first trigger signals, so that the first virtual object is controlled to perform the corresponding operation according to the corresponding control instruction.
- automatic control conditions are preset in the terminal.
- when the first trigger signal generated by the user's touch operation on the first control meets the automatic control condition, the control instruction corresponding to the first trigger signal is: release the skill on the pre-marked target virtual object, thereby controlling the first virtual object to use the virtual prop to change the target attribute value of the target virtual object.
- based on the first trigger signal, the terminal controls the first virtual object to change the target attribute value of the target virtual object; the target attribute value includes at least one of the remaining health value (also called the remaining life value), the remaining energy value, the remaining mana value, and other attribute values.
- the first virtual object using the virtual prop to change the target attribute value of the target virtual object can be expressed as the first virtual object using the virtual prop to attack the target virtual object, thereby reducing the target virtual object's remaining life value.
- in summary, the target virtual object in the game interface is marked in a predetermined manner, and when a first trigger signal on the first control that meets the automatic control condition is received, the first virtual object is controlled to change the target attribute value of the target virtual object. Because the target virtual object is marked before the first virtual object is controlled to change its target attribute value, the user can learn from the mark, before operating, which object the operation will affect: if the target virtual object does not match the desired object, other operations can be used to change it; if it does match, the operation can be executed quickly through the first control. This improves the control efficiency and control accuracy of the virtual object.
- FIG. 6 shows a flowchart of a method for controlling a virtual object provided by another exemplary embodiment of the present application.
- the method is used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example for description.
- the method includes the following steps:
- Step 601 Display a game interface.
- the game interface includes a first virtual object, at least one second virtual object, and a first control.
- for the implementation manner of step 601, reference may be made to step 201 above, and details are not described herein again in the embodiment of the present application.
- Step 602 in response to the at least one second virtual object not containing the actively selected virtual object, obtain first object information of the first virtual object and second object information of the at least one second virtual object.
- the first control has two functions: automatically controlling the first virtual object to attack, and actively selecting an attack target for the first virtual object to attack; the attack priority of an actively selected attack target is higher than that of the attack target determined by the automatic control operation.
- the actively selected virtual object is the virtual object selected by the user by triggering the first control.
- the object information is used to characterize the state and position of the virtual object.
- the position in the object information is used to indicate the position of the virtual object in the virtual world
- the state in the object information is used to indicate the current attribute values of the virtual object.
- the first object information includes the coordinates of the first virtual object in the virtual world, and the second object information includes at least one of the coordinates of the second virtual object in the virtual world and state information such as its defense value, remaining life value, and remaining energy value.
- the terminal performs an automatic search according to the first object information and the second object information, so as to determine the target virtual object from the at least one second virtual object.
- the determined target virtual object is the release target of the preset attack skill (the attack skill that can be released by triggering the first control); correspondingly, it is necessary to ensure that the preset attack skill can act on the target virtual object, or can cause the expected damage to it. Whether this skill release effect can be achieved is related to the position information, remaining life value, defense value, and remaining energy value of the first virtual object and the second virtual objects. Therefore, in a possible implementation manner, when the terminal needs to select a target virtual object from a plurality of second virtual objects, it first obtains the object information of the first virtual object and of each second virtual object, so as to select the best target virtual object from the plurality of second virtual objects based on the object information.
- Step 603 Determine a target virtual object from at least one second virtual object according to the first object information and the second object information.
- if the second object information of a second virtual object meets the preset condition, the terminal determines that second virtual object as the target virtual object; if there is no second virtual object whose second object information meets the preset condition, there is no target virtual object.
- the number of target virtual objects is one.
- this application only takes one target virtual object as an example for description and does not limit the number of target virtual objects; if two or more virtual objects meet the preset conditions, the number of target virtual objects can correspondingly be two or more.
- the number of selected target virtual objects can also be determined by the attack type corresponding to the preset attack skill.
- if the preset attack skill can only act on a single virtual object, only a single target virtual object can be selected.
- if the preset attack skill can act on two or more virtual objects, the terminal can select two or more target virtual objects that meet the preset conditions.
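The dependency of the target count on the skill's attack type can be sketched as follows. This is a hypothetical illustration (the dict shape and the lowest-attribute-first ordering are assumptions): given the maximum number of targets the preset attack skill supports, it returns up to that many candidates, lowest target attribute value first.

```python
def select_targets(candidates, max_targets):
    """Return up to max_targets candidates, lowest target attribute value
    (here: remaining life value 'hp') first; max_targets comes from the
    preset attack skill's attack type, e.g. 1 for single-target skills."""
    return sorted(candidates, key=lambda c: c["hp"])[:max_targets]
```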
- the first object information includes a first position and a first range
- the second object information includes a second position
- the first position is the position of the first virtual object in the virtual world
- the second position is the position of the second virtual object in the virtual world
- the first range is the use range of the virtual item.
- step 603 includes the following steps one to three:
- Step 1 Determine a second range according to the first range, and the second range is greater than the first range.
- the second range is the range in which the terminal searches for the target virtual object.
- in order to include second virtual objects near the first virtual object in the search range, the second range is set to be larger than the first range.
- the first range and the second range are both circular ranges, or both are fan-shaped regions in which the first virtual object faces a predetermined direction and a predetermined angle.
- the terminal sets the radius of the second range to the radius of the first range + k.
- for example, if the first range is a circular area, the second range is set to have the same center with a radius 2 meters larger than that of the first range: if the first range is the area in the virtual world covered by a circle with the first position as the center and a radius of 5 meters, the second range is determined to be the area covered by a circle with the first position as the center and a radius of 7 meters; or, if the first range is a fan-shaped area in the virtual world with the first position as the vertex, a radius of 5 meters, an angle of 45°, and facing the front of the first virtual object, the second range is determined to be a circular area in the virtual world with the first position as the center and a radius of 7 meters.
- the target virtual object is the skill release target of the preset attack skill, that is, the target virtual object needs to be within the preset skill release range; therefore, in a possible implementation manner, the second range may be determined based on the skill release range of the preset attack skill.
- the second range may be less than or equal to the skill release range.
- different attack skills can correspond to different skill release ranges, and correspondingly, different attack skills can set different second ranges.
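Steps like the "radius + k" rule and the skill-release-range cap above can be combined in one small helper. This is a sketch under stated assumptions (the parameter names and the default k = 2 meters mirror the example above; the cap implements the "second range may be less than or equal to the skill release range" option):

```python
def second_range_radius(first_range_radius, k=2.0, skill_release_radius=None):
    """Enlarge the search radius by k meters (the +2 m example above) and,
    when a skill release range is given, cap the second range at it."""
    radius = first_range_radius + k
    if skill_release_radius is not None:
        radius = min(radius, skill_release_radius)
    return radius
```

With a 5-meter first range, this yields the 7-meter second range used in the example; a skill whose release range is only 6 meters would cap the search at 6 meters.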
- Step 2 Determine the second virtual object located in the second range as the candidate virtual object according to the first position and the second position.
- the terminal determines the second virtual object located in the second range as the candidate virtual object, and then determines the target virtual object from the candidate virtual objects according to other conditions.
- for example, the virtual objects in the current virtual world include a first virtual object 701, a second virtual object 702a, and a second virtual object 702b; the terminal determines the second range 704 according to the first range 703, and on detecting that the second virtual object 702a is located in the second range 704, determines the second virtual object 702a as a candidate virtual object.
- Step 3 Determine the candidate virtual object that meets the selection condition as the target virtual object.
- the selection condition includes at least one of the following: the distance to the first virtual object is the smallest, the target attribute value is the lowest, and it belongs to the target camp.
- the terminal determines a target virtual object that meets the selection condition from the candidate virtual objects.
- the target virtual object is the priority attack target at the current moment, automatically determined by the terminal; if no candidate virtual object meets the selection condition, the terminal determines that no target virtual object exists at the current moment.
- the target camp includes a second camp and a third camp, where the second camp is a camp that has a hostile relationship with the first camp to which the first virtual object belongs, and the third camp is a camp of virtual objects, such as monsters, controlled by the server.
- if the candidate virtual objects that meet the selection conditions include both candidate virtual objects belonging to the second camp and candidate virtual objects belonging to the third camp, the terminal preferentially determines the target virtual object from the candidate virtual objects belonging to the second camp; if the target virtual object does not exist among the candidate virtual objects belonging to the second camp, the target virtual object is determined from the candidate virtual objects belonging to the third camp.
- the killing efficiency may be related to the remaining life value and remaining defense value of the target virtual object. For example, if the life value of candidate virtual object A is higher than that of candidate virtual object B, then when the same skill operation is released on the two candidate virtual objects, the probability of killing candidate virtual object B is obviously greater than that of killing candidate virtual object A. Therefore, in order to improve the killing efficiency of the first virtual object, candidate virtual object B may be determined as the target virtual object; that is, the selected target virtual object has the lowest target attribute value. The target attribute value may include the remaining life value, the remaining energy value, the defense value, and other attribute values.
- the hit rate of the attack skill on the target virtual object may also be related to the distance between the target virtual object and the first virtual object: the farther the distance, the lower the hit rate. Therefore, in order to further improve the hit rate of the preset attack skill, having the minimum distance to the first virtual object is taken as one of the preset conditions for selecting the target virtual object.
- the selection conditions may further include: whether the preset attack skill can act on the target virtual object, whether the attack has a probability of being invalidated when acting on the target virtual object, and so on.
- if the second virtual object 702a satisfies the selection condition, it is determined that the second virtual object 702a is the target virtual object; if the second virtual object 702a does not meet the selection condition, it is determined that there is currently no target virtual object.
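Steps two and three above (range filtering, then camp-priority selection) can be sketched together. This is an illustrative combination, not the embodiment's exact logic: the camp labels, dict fields, and the tie-break order (camp priority, then distance, then lowest attribute value) are assumptions drawn from the selection conditions described above.

```python
import math

def pick_target(first_pos, candidates, second_radius):
    """Keep candidates inside the second range, then prefer second-camp
    (hostile player) candidates over third-camp (server-controlled monster)
    ones, breaking ties by distance and lowest remaining life value."""
    def dist(c):
        return math.hypot(c["x"] - first_pos[0], c["y"] - first_pos[1])
    in_range = [c for c in candidates if dist(c) <= second_radius]
    if not in_range:
        return None  # no target virtual object at the current moment
    camp_rank = {"second": 0, "third": 1}  # second camp attacked first
    return min(in_range, key=lambda c: (camp_rank[c["camp"]], dist(c), c["hp"]))
```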
- Step 604 Mark the target virtual object in a first predetermined manner.
- the first predetermined manner includes highlighting the character image of the target virtual object and/or the edge of its target attribute information bar, changing the color of information carried by the target virtual object, adding a special mark near the character image of the target virtual object (for example, directly below it), displaying a ray pointing from the first virtual object to the target virtual object, and so on.
- Step 605 In response to the at least one second virtual object containing the actively selected virtual object, the actively selected virtual object is determined as the target virtual object.
- the actively selected virtual object is the target of the next attack selected by the user through the first control; it may be the same as or different from the target virtual object determined by the terminal's automatic search, and the attack priority of the actively selected virtual object is higher than that of the target virtual object automatically searched and determined by the terminal. Therefore, when the second virtual objects contain an actively selected virtual object, the terminal directly determines the actively selected virtual object as the target virtual object, and the process of determining candidate virtual objects is not performed.
- the terminal directly determines the second virtual object 702b as the target virtual object.
- Step 606 Mark the target virtual object in a second predetermined manner.
- the second predetermined manner is the same as the first predetermined manner; or the second predetermined manner is different from the first predetermined manner, and the marking effect of the second predetermined manner is more prominent than that of the first predetermined manner.
- the second predetermined method is different from the first predetermined method of marking.
- the second predetermined manner uses a different marking position from the first predetermined manner, or the same marking position but a different color; for example, the first predetermined manner is to highlight the edge of the target attribute information bar of the target virtual object, and the second predetermined manner is to add a positioning mark directly below the character image of the target virtual object.
- Step 607 Receive a first trigger signal acting on the first control.
- for the implementation manner of step 607, reference may be made to step 203 above, and details are not described herein again in the embodiment of the present application.
- Step 608 In response to the touch end position corresponding to the first trigger signal being located in the first automatic control area, it is determined that the first trigger signal meets the automatic control condition.
- the first control includes a first automatic control area and a first active control area, with no intersection between them; a trigger operation in the first automatic control area is used to trigger a rapid attack on the target virtual object, and a trigger operation in the first active control area is used to trigger the user's independent selection of the target virtual object.
- for example, the first control is a circular control, the first automatic control area is a circular area, and the first active control area is an annular area surrounding the first automatic control area; or the first automatic control area is the semicircular area on the left side of the first control and the first active control area is the semicircular area on the right side of the first control, which is not limited in the embodiment of the present application.
- FIG. 8 shows a schematic diagram of a first control.
- the first control is a circular control, where the first automatic control area 801 is a circular area with the center of the first control as the center and a radius smaller than the radius of the first control.
- the first active control area 802 is the remaining area of the first control outside the first automatic control area 801; when the touch end position is located in the first automatic control area 801, the terminal determines that the first trigger signal meets the automatic control condition. That is, the user can control the first virtual object to change the target attribute value of the target virtual object by quickly clicking the first control.
- Step 609 In response to the first trigger signal meeting the automatic control condition, control the first virtual object to use the virtual prop to change the target attribute value of the target virtual object.
- the terminal determines whether the first trigger signal meets the automatic control condition according to the touch end position: when the touch end position is in the first automatic control area, the automatic control condition is determined to be satisfied; when it is in the first active control area or an area outside the first control, the automatic control condition is determined not to be satisfied.
- the first control includes a first automatic control area 901 and a first active control area 902; when the user clicks the first control and the touch end position 903 is located in the first automatic control area 901, that is, when the finger is lifted at the touch end position 903 shown in the figure, the terminal determines that the first trigger signal meets the automatic control condition.
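The touch-end classification above is a simple hit test on the circular control. The sketch below assumes the circle-within-circle layout of Figure 8 (inner circle = automatic area, surrounding ring = active area); the function name and radii are illustrative, not from the embodiment:

```python
import math

def classify_touch_end(center, inner_radius, outer_radius, touch_end):
    """Map a touch end position on the circular first control to a condition:
    inside the inner circle -> automatic control condition,
    inside the surrounding ring -> active control condition,
    outside the control -> neither."""
    d = math.hypot(touch_end[0] - center[0], touch_end[1] - center[1])
    if d <= inner_radius:
        return "automatic"
    if d <= outer_radius:
        return "active"
    return "none"
```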
- Step 610 In response to the touch end position corresponding to the first trigger signal being located in the first active control area, it is determined that the first trigger signal meets the active control condition.
- the user can actively select the target virtual object through a touch operation on the first control.
- the terminal obtains the touch end position of the first trigger signal, and on determining that the touch end position is located in the first active control area, determines that the first trigger signal meets the active control condition, so as to determine the target virtual object that the user needs to select based on the touch end position.
- the user can press and hold the first control with a finger while dragging, and, according to the position of the first virtual object and the intended attack target, stop the finger at the corresponding position in the first active control area to complete the process of actively selecting a virtual object.
- the first control includes a first automatic control area 1001 and a first active control area 1002; when the touch end position 1003 is located in the first active control area 1002, the terminal determines that the first trigger signal meets the active control condition.
- Step 611 Determine the second virtual object mapped to the touch end position as the actively selected virtual object.
- the terminal determines the mapping position in the virtual world according to the real-time position of the user's finger in the first active control area, and determines the actively selected virtual object according to the second virtual object mapped from the touch end position.
- the terminal can mark the mapping position of the touch end position in the virtual world with a ray, fan-shaped area, etc. in the game interface.
- when the user's touch operation corresponds to a second virtual object in the virtual world, that second virtual object is determined as the actively selected virtual object, and step 606 is executed to mark the actively selected virtual object.
- the range from which the user can actively select an object is the range of the virtual world included in the game interface.
- the terminal maps the center point 1004 of the first control to the position of the first virtual object 1005, determines the mapping position of the touch end position 1003 in the game interface, and in the game interface The line between the two mapped positions is displayed.
- when the touch end position 1003 is mapped to the position of the second virtual object 1006, the second virtual object 1006 is determined as the actively selected virtual object and is marked.
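The mapping described above, with the control's center point anchored at the first virtual object's position, can be sketched as follows. This is a hypothetical illustration: the linear `scale` factor and the `snap_radius` used to decide when a second virtual object counts as "mapped to" the touch end position are assumed tuning parameters, not specified by the embodiment.

```python
import math

def map_touch_to_world(control_center, touch_end, first_obj_pos, scale):
    """Map the touch offset from the control center onto the virtual world,
    with the control center anchored at the first virtual object's position."""
    dx = (touch_end[0] - control_center[0]) * scale
    dy = (touch_end[1] - control_center[1]) * scale
    return (first_obj_pos[0] + dx, first_obj_pos[1] + dy)

def nearest_object(world_pos, objects, snap_radius):
    """Treat a second virtual object as mapped to the touch end position
    when it lies within snap_radius of the mapped world position."""
    best, best_d = None, snap_radius
    for o in objects:
        d = math.hypot(o["x"] - world_pos[0], o["y"] - world_pos[1])
        if d <= best_d:
            best, best_d = o, d
    return best
```

The line displayed in the game interface between the two mapped positions (center point 1004 to touch end 1003) would simply connect `first_obj_pos` and the returned world position.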
- Step 612 Control the first virtual object to use the virtual prop to change the target attribute value of the actively selected virtual object.
- the terminal controls the first virtual object to use the virtual prop to change the target attribute value of the actively selected virtual object.
- after controlling the first virtual object to complete the operation of changing the target attribute value of the actively selected virtual object, the terminal still keeps the mark on the actively selected virtual object, determines the actively selected virtual object as the target virtual object, and continues to control the first virtual object to change the target attribute value of the actively selected virtual object.
- if the actively selected virtual object is outside the use range of the virtual prop, the terminal controls the first virtual object to move toward the actively selected virtual object, and when the actively selected virtual object is within the use range of the virtual prop of the first virtual object, controls the first virtual object to use the virtual prop to change the target attribute value of the actively selected virtual object.
- in summary, the first control is divided into regions so that the user can actively select a virtual object.
- the actively selected virtual object is directly determined as the target virtual object and marked.
- the second virtual object that meets the preset conditions is determined as the target virtual object and marked.
- the user can control the first virtual object to change the target attribute value of the target virtual object through quick operation;
- the target virtual object is marked before the touch operation, so that the user can grasp the object of the operation to change the target attribute value in advance.
- by controlling the first virtual object to move or aim, the user can select other virtual objects, which avoids the situation where an operation fails to achieve the expected effect because the target virtual object differs from the expected virtual object and the operation needs to be readjusted, thereby improving operation efficiency.
- the control layer of the game interface also includes other controls for controlling the first virtual object to release target skills on the target virtual object.
- the method for controlling the virtual object further includes the following steps:
- Step 205 in response to receiving the second trigger signal acting on the second control, control the first virtual object to release the target skill on the target virtual object.
- the game interface further includes a second control, and the second control is used to control the first virtual object to release the target skill to other virtual objects.
- the game interface includes at least one second control 304 for controlling the first virtual object to release target skills to other virtual objects.
- the second control includes a second automatic control area and a second active control area. There is no intersection between the second automatic control area and the second active control area.
- Step 205 includes the following steps one and two:
- Step 1 In response to the touch end position corresponding to the second trigger signal being located in the second automatic control area, the skill release rule of the target skill is acquired.
- the terminal first obtains the skill release rule of the target skill; the skill release rule includes the type of the skill release target, the skill release range, the operations required by the skill, and so on.
- the second control 1200 includes a second automatic control area 1201 and a second active control area 1202.
- when the touch end position is located in the second automatic control area 1201, the terminal acquires the skill release rule of skill 3.
- Step 2 In response to the target virtual object conforming to the skill release rule, the first virtual object is controlled to release the target skill to the target virtual object.
- the target virtual object is the target virtual object automatically searched and determined by the terminal, or the actively selected virtual object selected by the user by triggering the first control.
- the terminal may select the target virtual object based on the skill release rule.
- the terminal controls the first virtual object 1204 to release skill 3 on the target virtual object 1205; if the target virtual object 1205 is outside the release range of skill 3, the terminal controls the first virtual object 1204 to move toward the target virtual object 1205, and when the target virtual object 1205 is within the release range of skill 3, controls the first virtual object 1204 to release skill 3.
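The move-then-release behavior just described can be sketched per game tick. This is an assumed simplification (straight-line movement, a fixed per-tick `speed`, 2D positions); it advances the first virtual object toward the target until the target is within the skill's release range, then reports that the skill can be released:

```python
import math

def step_toward_and_release(first_pos, target_pos, release_range, speed):
    """One movement tick: if the target is outside the release range, move
    the first virtual object toward it by at most `speed`; return the new
    position and whether the skill can now be released."""
    dx, dy = target_pos[0] - first_pos[0], target_pos[1] - first_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= release_range:
        return first_pos, True  # already in range: release the skill
    step = min(speed, dist - release_range)  # do not overshoot the range edge
    nx = first_pos[0] + dx / dist * step
    ny = first_pos[1] + dy / dist * step
    released = math.hypot(target_pos[0] - nx, target_pos[1] - ny) <= release_range
    return (nx, ny), released
```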
- Step 206 in response to the target attribute value of the target virtual object reaching the attribute value threshold, re-determine the target virtual object from the at least one second virtual object.
- the target attribute value of the target virtual object reaching the attribute value threshold includes at least one of the following situations: the remaining life value of the target virtual object reaches a life value threshold (for example, when the life value threshold is 0 and the remaining life value of the target virtual object is 0, the target attribute value has reached the attribute value threshold); or the position of the target virtual object is outside the range displayed on the game interface.
- taking the target attribute value as the remaining life value as an example, if the remaining life value of the target virtual object is 0, the target virtual object has been killed and cannot continue to be the target virtual object; the terminal needs to re-obtain the object information of the first virtual object and the remaining second virtual objects, so as to determine a new target virtual object from the second virtual objects based on the object information.
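The step 206 trigger can be expressed as a small predicate. This sketch assumes a life value threshold of 0 (the example above) and models the game interface's displayed range as an axis-aligned rectangle, which is an assumption of this illustration:

```python
def needs_retarget(target, view_bounds):
    """Return True when the target virtual object must be re-determined:
    its remaining life value has reached the threshold (0 here), or its
    position is outside the range displayed on the game interface."""
    if target["hp"] <= 0:
        return True  # target killed: restart the automatic search
    xmin, ymin, xmax, ymax = view_bounds
    return not (xmin <= target["x"] <= xmax and ymin <= target["y"] <= ymax)
```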
- in summary, the second control is divided into the second automatic control area and the second active control area, so that the user can quickly control the first virtual object to release the target skill on the target virtual object, which simplifies the operation steps of some skills and saves the user's operating time.
- the MOBA game includes a presentation layer and a logic layer.
- FIG. 13 shows a flowchart of a virtual object control method provided by another exemplary embodiment of the present application.
- the method is used in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1 or other terminals in the implementation environment as an example for description. The method includes the following steps:
- Step 1301 The presentation layer obtains the end position of the touch operation.
- the presentation layer obtains the touch operation in real time and obtains the end position of the touch operation when it detects that the user lifts the finger.
- Step 1302 The presentation layer judges whether the target virtual object meets the skill release rule.
- the presentation layer acquires the target skill corresponding to the touch operation; the target skill includes the basic skill corresponding to the first control and the special skill corresponding to the second control. According to the type of the target skill and the skill release rule, the presentation layer judges whether the target virtual object meets the skill release rule, and when the target virtual object meets the skill release rule, step 1303 is continued.
- Step 1303 When the target virtual object meets the skill release rule, the presentation layer sends skill release information to the logic layer.
- even when the presentation layer judges that the target virtual object satisfies the skill release rule, further judgment by the logic layer is needed, so as to avoid erroneous judgments by the presentation layer caused by picture delay or by cheating behavior of the user.
- Step 1304 The logic layer judges whether the target virtual object meets the skill release rule.
- Step 1305 The logic layer sends the judgment result to the presentation layer.
- the logic layer sends the judgment result to the presentation layer. If the result indicates that the target virtual object does not meet the skill release rule, the subsequent steps are not executed.
- in the game interface, the displayed result is that there is no response after the user triggers the control, and the target skill is not released.
- Step 1306 When the logic layer determines that the skill is allowed to be released, the presentation layer sends a skill release request to the server.
- Step 1307 The server forwards the skill release request.
- the server receives the skill release request sent by the terminal presentation layer, obtains the target terminal according to the skill release information in the skill release request, and forwards the skill release request to the logic layer of the target terminal.
- the target terminals are all the terminals participating in the current game.
- Step 1308 The logic layer performs skill release calculation processing.
- when the logic layer receives the skill release request forwarded by the server, it determines to perform the skill release operation and performs skill release calculation processing to obtain the skill release result, such as the target attribute value of the target virtual object after the skill is released.
- Step 1309 The logic layer sends a skill release instruction.
- Step 1310 The presentation layer performs the skill release presentation.
- the presentation layer renders the skill release effect in the game interface according to the skill release instruction of the logic layer.
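The presentation-layer/logic-layer double check in steps 1301 to 1310 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: all names (`PresentationLayer`, `LogicLayer`, the distance-based release rule, and the range value) are assumptions, and the server round trip of steps 1303 to 1308 is collapsed into a direct call.

```python
# Hypothetical sketch of the two-stage skill-release check (steps 1301-1310).
# All class/function names and the distance-based rule are illustrative.

class LogicLayer:
    """Authoritative game state; re-validates what the presentation layer saw."""

    def __init__(self, release_range):
        self.release_range = release_range

    def meets_release_rule(self, caster_pos, target_pos, target_alive):
        # The logic layer repeats the check to guard against stale frames
        # or client-side tampering (steps 1304-1305).
        dx = target_pos[0] - caster_pos[0]
        dy = target_pos[1] - caster_pos[1]
        in_range = (dx * dx + dy * dy) ** 0.5 <= self.release_range
        return target_alive and in_range


class PresentationLayer:
    """UI-side layer; does a fast local pre-check before involving the logic layer."""

    def __init__(self, logic):
        self.logic = logic

    def on_touch_end(self, caster_pos, target_pos, target_alive):
        # Step 1302: local pre-check with the same rule.
        if not self.logic.meets_release_rule(caster_pos, target_pos, target_alive):
            return "no_response"   # steps 1305/1309: nothing happens on screen
        # Steps 1303-1308 go through the server in the real flow; here the
        # logic layer's verdict stands in for that round trip.
        return "release_skill"     # step 1310: render the release presentation


logic = LogicLayer(release_range=5.0)
ui = PresentationLayer(logic)
print(ui.on_touch_end((0, 0), (3, 4), target_alive=True))   # in range -> release
print(ui.on_touch_end((0, 0), (6, 8), target_alive=True))   # out of range -> no response
```

The point of the structure is that the presentation layer's check only filters obviously invalid taps; the logic layer's repeated check is the one that counts.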
- FIG. 14 is a structural block diagram of a virtual object control device provided by an exemplary embodiment of the present application.
- the device may be set in the first terminal 110 or the second terminal 130 in the implementation environment shown in FIG. 1, or in other terminals in the implementation environment.
- the device includes:
- the display module 1401 is configured to display a game interface, the game interface includes a first virtual object, at least one second virtual object, and a first control, the first virtual object and the second virtual object are located in the virtual world , And the first virtual object and the second virtual object belong to different camps, and the first control is used to control the first virtual object to use virtual props to change target attribute values of other virtual objects;
- the first determining module 1402 is configured to determine a target virtual object from at least one of the second virtual objects, and mark the target virtual object in a predetermined manner;
- the receiving module 1403 is configured to receive a first trigger signal acting on the first control
- the first control module 1404 is configured to control the first virtual object to use virtual props to change the target attribute value of the target virtual object in response to the first trigger signal meeting the automatic control condition.
- the first determining module 1402 includes:
- the first obtaining unit is configured to obtain first object information of the first virtual object and second object information of at least one of the second virtual objects, in response to at least one of the second virtual objects not including an actively selected virtual object, wherein the actively selected virtual object is a virtual object selected through the first control, and the object information is used to characterize the state and position of the virtual object;
- a first determining unit configured to determine the target virtual object from at least one of the second virtual objects according to the first object information and the second object information;
- the first marking unit is used to mark the target virtual object in a first predetermined manner.
- the first object information includes a first position and a first range, and the second object information includes a second position, where the first position is the position of the first virtual object in the virtual world, the second position is the position of the second virtual object in the virtual world, and the first range is the use range of the virtual prop;
- the first determining unit is further configured to:
- determine the candidate virtual object that meets a selection condition as the target virtual object, the selection condition including at least one of the following: the smallest distance to the first virtual object, the lowest target attribute value, and belonging to the target camp.
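The candidate filtering and selection described above (widen the prop's first range into a second range, take the second virtual objects inside it as candidates, then apply a selection condition) can be sketched as follows. The function name, the dict layout, and the 1.5x widening factor are illustrative assumptions; only the smallest-distance condition is implemented, with the other listed conditions noted in a comment.

```python
# Hypothetical sketch of target selection: second range > first range,
# candidates = second virtual objects inside the second range, target =
# candidate meeting a selection condition (smallest distance used here).
import math

def pick_target(first_pos, first_range, enemies, widen=1.5):
    """enemies: list of dicts with 'pos' and 'hp'. Returns a target or None."""
    second_range = first_range * widen   # the second range is larger than the first
    candidates = [
        e for e in enemies
        if math.dist(first_pos, e["pos"]) <= second_range
    ]
    if not candidates:
        return None
    # Selection condition: minimum distance to the first virtual object.
    # (Lowest target attribute value, or membership in the target camp,
    # are the other conditions the text lists.)
    return min(candidates, key=lambda e: math.dist(first_pos, e["pos"]))

enemies = [
    {"pos": (4.0, 0.0), "hp": 80},
    {"pos": (2.0, 0.0), "hp": 50},
    {"pos": (20.0, 0.0), "hp": 10},   # outside the second range (7.5)
]
print(pick_target((0.0, 0.0), first_range=5.0, enemies=enemies))
```

Selecting from a range slightly larger than the prop's actual use range lets the mark land on an enemy the player is about to be able to hit, which matches the pre-marking behavior the document emphasizes.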
- the first determining module 1402 includes:
- a second determining unit configured to determine the actively selected virtual object as the target virtual object in response to at least one of the second virtual objects including the actively selected virtual object
- the second marking unit is used to mark the target virtual object in a second predetermined manner.
- the first control includes a first automatic control area and a first active control area, and there is no intersection between the first automatic control area and the first active control area;
- the device also includes:
- the second determining module is configured to determine that the first trigger signal meets the automatic control condition in response to the touch end position corresponding to the first trigger signal being located in the first automatic control area.
- the device further includes:
- a third determining module configured to determine that the first trigger signal meets the active control condition in response to the touch end position corresponding to the first trigger signal being located in the first active control area;
- a fourth determining module configured to determine the second virtual object mapped to the touch end position as an actively selected virtual object
- the second control module is configured to control the first virtual object to use virtual props to change the target attribute value of the actively selected virtual object.
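Routing a trigger signal by its touch end position, as the modules above describe, can be sketched as follows. This is an illustrative sketch only: the rectangle geometry and all names are assumptions; the document only requires that the automatic and active control areas do not intersect.

```python
# Hypothetical sketch: the first control is split into a non-overlapping
# automatic area and active area, and the touch end position of the
# trigger signal decides which condition the signal meets.

def classify_trigger(touch_end, auto_area, active_area):
    """Each area is (x_min, y_min, x_max, y_max); the two areas are disjoint."""
    def inside(pt, rect):
        x, y = pt
        x0, y0, x1, y1 = rect
        return x0 <= x <= x1 and y0 <= y <= y1

    if inside(touch_end, auto_area):
        return "automatic"   # act on the pre-marked target virtual object
    if inside(touch_end, active_area):
        return "active"      # the object mapped at the end position is actively selected
    return "none"

AUTO = (0, 0, 10, 10)
ACTIVE = (20, 0, 40, 20)     # disjoint from AUTO by construction
print(classify_trigger((5, 5), AUTO, ACTIVE))    # automatic
print(classify_trigger((30, 10), AUTO, ACTIVE))  # active
```

Because the two areas never intersect, a single end position can satisfy at most one of the automatic and active control conditions, which is what makes the one-gesture control unambiguous.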
- the game interface further includes a second control, and the second control is used to control the first virtual object to release target skills to other virtual objects;
- the device also includes:
- the third control module is configured to control the first virtual object to release the target skill on the target virtual object in response to receiving the second trigger signal acting on the second control.
- the second control includes a second automatic control area and a second active control area, and there is no intersection between the second automatic control area and the second active control area;
- the third control module includes:
- the second acquiring unit is configured to acquire the skill release rule of the target skill in response to the touch end position corresponding to the second trigger signal being located in the second automatic control area;
- the control unit is configured to control the first virtual object to release the target skill on the target virtual object in response to the target virtual object conforming to the skill release rule.
- the device further includes:
- the fifth determining module is configured to re-determine the target virtual object from at least one of the second virtual objects in response to the target attribute value of the target virtual object reaching the attribute value threshold.
- the target attribute value of the target virtual object reaching the attribute value threshold includes: the remaining life value of the target virtual object reaching the life value threshold, and the position of the target virtual object being located outside the range displayed by the game interface.
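The re-selection trigger described above can be sketched as follows. The function name, dict layout, and rectangle viewport are illustrative assumptions; the document's conditions are a remaining life value reaching the threshold (e.g. 0, i.e. killed, as in the earlier example) and the target's position leaving the range displayed by the game interface.

```python
# Hypothetical sketch: drop the current target (and let the fifth
# determining module re-select) when its remaining life value reaches
# the threshold or it leaves the displayed range.

def needs_new_target(target, view_rect, hp_threshold=0):
    """target: dict with 'pos' and 'hp'; view_rect: (x0, y0, x1, y1)."""
    x, y = target["pos"]
    x0, y0, x1, y1 = view_rect
    off_screen = not (x0 <= x <= x1 and y0 <= y <= y1)
    return target["hp"] <= hp_threshold or off_screen

view = (0, 0, 100, 100)
print(needs_new_target({"pos": (50, 50), "hp": 0}, view))    # killed -> True
print(needs_new_target({"pos": (150, 50), "hp": 80}, view))  # off screen -> True
print(needs_new_target({"pos": (50, 50), "hp": 80}, view))   # keep target -> False
```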
- the virtual object control device marks the target virtual object in the game interface in a predetermined manner, and when it receives a first trigger signal that acts on the first control and meets the automatic control condition, it controls the first virtual object to change the target attribute value of the target virtual object. Because the target virtual object is marked before the first virtual object is controlled to change its target attribute value, the user can learn the operation target from the mark before operating: if the target virtual object does not match the expected object, the user can change the target virtual object through other operations, and if it does match, the user can quickly perform the operation through the first control, which improves the control efficiency and control accuracy of the virtual object. At the same time, this avoids the need to confirm and mark the operation object during execution of the operation, which reduces the operation execution delay and improves the efficiency of human-computer interaction.
- FIG. 15 shows a structural block diagram of a terminal provided by an embodiment of the present application.
- the terminal 1500 includes a processor 1501 and a memory 1502.
- the processor 1501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on.
- the processor 1501 can be implemented in at least one hardware form among digital signal processing (DSP), field-programmable gate array (FPGA), and programmable logic array (PLA).
- the processor 1501 may also include a main processor and a coprocessor.
- the main processor is a processor used to process data in the awake state, also called a central processing unit (CPU); the coprocessor is a low-power processor used to process data in the standby state.
- the processor 1501 may be integrated with a graphics processing unit (GPU), and the GPU is used for rendering and drawing content that needs to be displayed on the display screen.
- the processor 1501 may further include an artificial intelligence (AI) processor, and the AI processor is used to process computing operations related to machine learning.
- the memory 1502 may include one or more computer-readable storage media, which may be non-transitory.
- the memory 1502 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices and flash memory storage devices.
- the non-transitory computer-readable storage medium in the memory 1502 is used to store at least one instruction, at least one program, a code set, or an instruction set, which is configured to be executed by the processor 1501 to implement the virtual object control method provided in the method embodiments of the present application.
- the terminal 1500 may optionally further include: a peripheral device interface 1503 and at least one peripheral device.
- the processor 1501, the memory 1502, and the peripheral device interface 1503 may be connected by a bus or a signal line.
- Each peripheral device can be connected to the peripheral device interface 1503 through a bus, a signal line, or a circuit board.
- the peripheral device may include: at least one of a communication interface 1504, a display screen 1505, an audio circuit 1506, a camera component 1507, a positioning component 1508, and a power supply 1509.
- the structure shown in FIG. 15 does not constitute a limitation on the terminal 1500; the terminal may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
- FIG. 16 shows a schematic structural diagram of a server provided by an embodiment of the present application. Specifically:
- the server 1600 includes a central processing unit (CPU) 1601, a system memory 1604 including a random access memory (RAM) 1602 and a read-only memory (ROM) 1603, and a system bus 1605 connecting the system memory 1604 and the central processing unit 1601.
- the server 1600 also includes a basic input/output (I/O) system 1606 that helps transfer information between components in the computer, and a mass storage device 1607 for storing an operating system 1613, application programs 1614, and other program modules 1615.
- the basic input/output system 1606 includes a display 1608 for displaying information and an input device 1609 such as a mouse and a keyboard for the user to input information.
- the display 1608 and the input device 1609 are both connected to the central processing unit 1601 through the input and output controller 1610 connected to the system bus 1605.
- the basic input/output system 1606 may also include an input and output controller 1610 for receiving and processing input from multiple other devices such as a keyboard, a mouse, or an electronic stylus.
- the input and output controller 1610 also provides output to a display screen, a printer, or other types of output devices.
- the mass storage device 1607 is connected to the central processing unit 1601 through a mass storage controller (not shown) connected to the system bus 1605.
- the mass storage device 1607 and its associated computer-readable medium provide non-volatile storage for the server 1600. That is to say, the mass storage device 1607 may include a computer-readable medium (not shown) such as a hard disk or a CD-ROM (Compact Disc Read-Only Memory) drive.
- the computer-readable media may include computer storage media and communication media.
- Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media include RAM, ROM, erasable programmable read-only memory (EPROM), flash memory or other solid-state storage technologies, CD-ROM, digital video disc (DVD) or other optical storage, tape cartridges, magnetic tape, magnetic disk storage, or other magnetic storage devices.
- the aforementioned system memory 1604 and mass storage device 1607 may be collectively referred to as memory.
- the server 1600 may also operate by being connected, through a network such as the Internet, to a remote computer on the network. That is, the server 1600 may be connected to the network 1612 through the network interface unit 1611 connected to the system bus 1605; in other words, the network interface unit 1611 may also be used to connect to other types of networks or remote computer systems (not shown).
- the memory also includes at least one instruction, at least one program, a code set, or an instruction set, which is stored in the memory and configured to be executed by one or more processors to implement the above virtual object control method.
- a computer device is also provided.
- the computer device may be a terminal or a server.
- the computer device includes a processor and a memory.
- the memory stores at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by the processor to implement the virtual object control method described above.
- the embodiments of the present application also provide a computer-readable storage medium storing at least one instruction, and the at least one instruction is loaded and executed by a processor to implement the above virtual object control method.
- the embodiments of the present application also provide a computer program product or computer program.
- the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
- the processor of the computer device reads the computer instruction from the computer-readable storage medium, and the processor executes the computer instruction, so that the computer device executes the virtual object control method provided in the various optional implementation manners of the foregoing aspects.
- the functions described in the embodiments of the present application may be implemented by hardware, software, firmware, or any combination thereof. When implemented by software, these functions can be stored in a computer-readable storage medium or transmitted as one or more instructions or codes on the computer-readable storage medium.
- the computer-readable storage medium includes a computer storage medium and a communication medium, where the communication medium includes any medium that facilitates the transfer of a computer program from one place to another.
- the storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
Abstract
Description
Claims (22)
- A virtual object control method, applied to a terminal, the method comprising: displaying a game interface, the game interface including a first virtual object, at least one second virtual object, and a first control, the first virtual object and the second virtual object being located in a virtual world and belonging to different camps, the first control being used to control the first virtual object to use a virtual prop to change target attribute values of other virtual objects; determining a target virtual object from the at least one second virtual object, and marking the target virtual object in a predetermined manner; receiving a first trigger signal acting on the first control; and in response to the first trigger signal meeting an automatic control condition, controlling the first virtual object to use the virtual prop to change the target attribute value of the target virtual object.
- The method according to claim 1, wherein the determining a target virtual object from the at least one second virtual object and marking the target virtual object in a predetermined manner comprises: in response to the at least one second virtual object not including an actively selected virtual object, obtaining first object information of the first virtual object and second object information of the at least one second virtual object, wherein the actively selected virtual object is a virtual object selected through the first control, and object information is used to characterize the state and position of a virtual object; determining the target virtual object from the at least one second virtual object according to the first object information and the second object information; and marking the target virtual object in a first predetermined manner.
- The method according to claim 2, wherein the first object information includes a first position and a first range, the second object information includes a second position, the first position is the position of the first virtual object in the virtual world, the second position is the position of the second virtual object in the virtual world, and the first range is the use range of the virtual prop; and the determining the target virtual object from the at least one second virtual object according to the first object information and the second object information comprises: determining a second range according to the first range, the second range being larger than the first range; determining, according to the first position and the second position, the second virtual objects located within the second range as candidate virtual objects; and determining a candidate virtual object that meets a selection condition as the target virtual object, the selection condition including at least one of the following: the smallest distance to the first virtual object, the lowest target attribute value, and belonging to a target camp.
- The method according to claim 2, wherein the determining a target virtual object from the at least one second virtual object and marking the target virtual object in a predetermined manner comprises: in response to the at least one second virtual object including the actively selected virtual object, determining the actively selected virtual object as the target virtual object; and marking the target virtual object in a second predetermined manner.
- The method according to any one of claims 1 to 4, wherein the first control includes a first automatic control area and a first active control area, and there is no intersection between the first automatic control area and the first active control area; and after the receiving a first trigger signal acting on the first control, the method comprises: in response to a touch end position corresponding to the first trigger signal being located in the first automatic control area, determining that the first trigger signal meets the automatic control condition.
- The method according to claim 5, wherein after the receiving a first trigger signal acting on the first control, the method further comprises: in response to the touch end position corresponding to the first trigger signal being located in the first active control area, determining that the first trigger signal meets an active control condition; determining the second virtual object mapped to the touch end position as the actively selected virtual object; and controlling the first virtual object to use the virtual prop to change the target attribute value of the actively selected virtual object.
- The method according to any one of claims 1 to 4, wherein the game interface further includes a second control, the second control being used to control the first virtual object to release a target skill to other virtual objects; and after the determining a target virtual object from the at least one second virtual object and marking the target virtual object in a predetermined manner, the method further comprises: in response to receiving a second trigger signal acting on the second control, controlling the first virtual object to release the target skill on the target virtual object.
- The method according to claim 7, wherein the second control includes a second automatic control area and a second active control area, and there is no intersection between the second automatic control area and the second active control area; and the controlling the first virtual object to release the target skill on the target virtual object in response to receiving a second trigger signal acting on the second control comprises: in response to a touch end position corresponding to the second trigger signal being located in the second automatic control area, obtaining a skill release rule of the target skill; and in response to the target virtual object conforming to the skill release rule, controlling the first virtual object to release the target skill on the target virtual object.
- The method according to any one of claims 1 to 4, further comprising: in response to the target attribute value of the target virtual object reaching an attribute value threshold, re-determining the target virtual object from the at least one second virtual object.
- The method according to claim 9, wherein the target attribute value of the target virtual object reaching the attribute value threshold includes: the remaining life value of the target virtual object reaching a life value threshold, and the position of the target virtual object being located outside the range displayed by the game interface.
- A virtual object control apparatus, comprising: a display module, configured to display a game interface, the game interface including a first virtual object, at least one second virtual object, and a first control, the first virtual object and the second virtual object being located in a virtual world and belonging to different camps, the first control being used to control the first virtual object to use a virtual prop to change target attribute values of other virtual objects; a first determining module, configured to determine a target virtual object from the at least one second virtual object and mark the target virtual object in a predetermined manner; a receiving module, configured to receive a first trigger signal acting on the first control; and a first control module, configured to control, in response to the first trigger signal meeting an automatic control condition, the first virtual object to use the virtual prop to change the target attribute value of the target virtual object.
- The apparatus according to claim 11, wherein the first determining module comprises: a first obtaining unit, configured to obtain, in response to the at least one second virtual object not including an actively selected virtual object, first object information of the first virtual object and second object information of the at least one second virtual object, wherein the actively selected virtual object is a virtual object selected through the first control, and object information is used to characterize the state and position of a virtual object; a first determining unit, configured to determine the target virtual object from the at least one second virtual object according to the first object information and the second object information; and a first marking unit, configured to mark the target virtual object in a first predetermined manner.
- The apparatus according to claim 12, wherein the first object information includes a first position and a first range, the second object information includes a second position, the first position is the position of the first virtual object in the virtual world, the second position is the position of the second virtual object in the virtual world, and the first range is the use range of the virtual prop; and the first determining unit is further configured to: determine a second range according to the first range, the second range being larger than the first range; determine, according to the first position and the second position, the second virtual objects located within the second range as candidate virtual objects; and determine a candidate virtual object that meets a selection condition as the target virtual object, the selection condition including at least one of the following: the smallest distance to the first virtual object, the lowest target attribute value, and belonging to a target camp.
- The apparatus according to claim 12, wherein the first determining module comprises: a second determining unit, configured to determine, in response to the at least one second virtual object including the actively selected virtual object, the actively selected virtual object as the target virtual object; and a second marking unit, configured to mark the target virtual object in a second predetermined manner.
- The apparatus according to any one of claims 11 to 14, wherein the first control includes a first automatic control area and a first active control area, and there is no intersection between the first automatic control area and the first active control area; and the apparatus further comprises: a second determining module, configured to determine, in response to a touch end position corresponding to the first trigger signal being located in the first automatic control area, that the first trigger signal meets the automatic control condition.
- The apparatus according to claim 15, further comprising: a third determining module, configured to determine, in response to the touch end position corresponding to the first trigger signal being located in the first active control area, that the first trigger signal meets an active control condition; a fourth determining module, configured to determine the second virtual object mapped to the touch end position as the actively selected virtual object; and a second control module, configured to control the first virtual object to use the virtual prop to change the target attribute value of the actively selected virtual object.
- The apparatus according to any one of claims 11 to 14, wherein the game interface further includes a second control, the second control being used to control the first virtual object to release a target skill to other virtual objects; and the apparatus further comprises: a third control module, configured to control, in response to receiving a second trigger signal acting on the second control, the first virtual object to release the target skill on the target virtual object.
- The apparatus according to claim 17, wherein the second control includes a second automatic control area and a second active control area, and there is no intersection between the second automatic control area and the second active control area; and the third control module comprises: a second obtaining unit, configured to obtain, in response to a touch end position corresponding to the second trigger signal being located in the second automatic control area, a skill release rule of the target skill; and a control unit, configured to control, in response to the target virtual object conforming to the skill release rule, the first virtual object to release the target skill on the target virtual object.
- The apparatus according to any one of claims 11 to 14, further comprising: a fifth determining module, configured to re-determine, in response to the target attribute value of the target virtual object reaching an attribute value threshold, the target virtual object from the at least one second virtual object.
- The apparatus according to claim 19, wherein the target attribute value of the target virtual object reaching the attribute value threshold includes: the remaining life value of the target virtual object reaching a life value threshold, and the position of the target virtual object being located outside the range displayed by the game interface.
- A computer device, comprising a processor and a memory, the memory storing at least one program, the at least one program being loaded and executed by the processor to implement the virtual object control method according to any one of claims 1 to 10.
- A computer-readable storage medium, storing at least one program, the at least one program being loaded and executed by a processor to implement the virtual object control method according to any one of claims 1 to 10.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
AU2021240132A AU2021240132A1 (en) | 2020-04-23 | 2021-03-08 | Virtual object control method and apparatus, device, and storage medium |
KR1020217035074A KR20210143300A (ko) | 2020-04-23 | 2021-03-08 | 가상 객체 제어 방법 및 장치, 디바이스, 및 저장 매체 |
EP21778333.1A EP3936207A4 (en) | 2020-04-23 | 2021-03-08 | METHOD AND APPARATUS FOR CONTROLLING VIRTUAL OBJECT, DEVICE AND STORAGE MEDIA |
CA3133467A CA3133467A1 (en) | 2020-04-23 | 2021-03-08 | Virtual object control method and apparatus, device, and storage medium |
SG11202111219TA SG11202111219TA (en) | 2020-04-23 | 2021-03-08 | Virtual object control method and apparatus, device, and storage medium |
JP2021566600A JP7476235B2 (ja) | 2020-04-23 | 2021-03-08 | 仮想オブジェクトの制御方法、装置、デバイス及びコンピュータプログラム |
US17/530,382 US20220072428A1 (en) | 2020-04-23 | 2021-11-18 | Virtual object control method and apparatus, device, and storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010328506.0 | 2020-04-23 | ||
- CN202010328506.0A CN111589126B (zh) | 2020-04-23 | Virtual object control method, apparatus, device, and storage medium |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/530,382 Continuation US20220072428A1 (en) | 2020-04-23 | 2021-11-18 | Virtual object control method and apparatus, device, and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
- WO2021213026A1 (zh) | 2021-10-28 |
Family
ID=72183258
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/079592 WO2021213026A1 (zh) | 2020-04-23 | 2021-03-08 | 虚拟对象的控制方法、装置、设备及存储介质 |
Country Status (8)
Country | Link |
---|---|
US (1) | US20220072428A1 (zh) |
EP (1) | EP3936207A4 (zh) |
KR (1) | KR20210143300A (zh) |
CN (1) | CN111589126B (zh) |
AU (1) | AU2021240132A1 (zh) |
CA (1) | CA3133467A1 (zh) |
SG (1) | SG11202111219TA (zh) |
WO (1) | WO2021213026A1 (zh) |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- CN111249730B (zh) * | 2020-01-15 | 2021-08-24 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual object control method, apparatus, device, and readable storage medium |
- CN111589126B (zh) * | 2020-04-23 | 2023-07-04 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual object control method, apparatus, device, and storage medium |
- US11731037B2 (en) * | 2020-09-11 | 2023-08-22 | Riot Games, Inc. | Rapid target selection with priority zones |
- CN112121428B (zh) * | 2020-09-18 | 2023-03-24 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual character object control method and apparatus, and storage medium |
- CN112516583A (zh) * | 2020-12-11 | 2021-03-19 | NetEase (Hangzhou) Network Co., Ltd. | Data processing method and apparatus in a game, and electronic terminal |
- CN112546627B (zh) * | 2020-12-22 | 2024-04-09 | NetEase (Hangzhou) Network Co., Ltd. | Route guidance method, apparatus, storage medium, and computer device |
- CN112494955B (zh) * | 2020-12-22 | 2023-10-03 | Tencent Technology (Shenzhen) Co., Ltd. | Skill release method, apparatus, terminal, and storage medium for a virtual object |
- CN113797536B (zh) * | 2021-10-08 | 2023-06-23 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, device, and storage medium for controlling an object in a virtual scene |
- CN114185434A (zh) * | 2021-12-09 | 2022-03-15 | Lianshang (Xinchang) Network Technology Co., Ltd. | Information processing method and apparatus for a virtual object |
- CN114870393A (zh) * | 2022-04-14 | 2022-08-09 | Beijing Zitiao Network Technology Co., Ltd. | Skill release method, apparatus, computer device, and storage medium |
- CN114860148B (zh) * | 2022-04-19 | 2024-01-16 | Beijing Zitiao Network Technology Co., Ltd. | Interaction method, apparatus, computer device, and storage medium |
- CN117618919A (zh) * | 2022-08-12 | 2024-03-01 | Tencent Technology (Chengdu) Co., Ltd. | Virtual prop modification processing method, apparatus, electronic device, and storage medium |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100009733A1 (en) * | 2008-07-13 | 2010-01-14 | Sony Computer Entertainment America Inc. | Game aim assist |
- CN107398071A * | 2017-07-19 | 2017-11-28 | NetEase (Hangzhou) Network Co., Ltd. | Game target selection method and apparatus |
- WO2019044131A1 (ja) * | 2017-09-04 | 2019-03-07 | Bandai Co., Ltd. | Game device, program, and game system |
- CN110064193A * | 2019-04-29 | 2019-07-30 | NetEase (Hangzhou) Network Co., Ltd. | Manipulation control method and apparatus for virtual objects in a game, and mobile terminal |
- CN110413171A * | 2019-08-08 | 2019-11-05 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, device, and medium for controlling a virtual object to perform a shortcut operation |
- CN110448891A * | 2019-08-08 | 2019-11-15 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, and storage medium for controlling a virtual object to operate a remote virtual prop |
- CN110743168A * | 2019-10-21 | 2020-02-04 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual object control method in a virtual scene, computer device, and storage medium |
- CN111589126A * | 2020-04-23 | 2020-08-28 | Tencent Technology (Shenzhen) Co., Ltd. | Virtual object control method, apparatus, device, and storage medium |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
- JP3685836B2 (ja) * | 1995-02-28 | 2005-08-24 | Namco Ltd. | Three-dimensional shooting game device |
- JP2001149640A (ja) * | 1999-09-16 | 2001-06-05 | Sega Corp. | Game machine, game processing method, and recording medium recording a program |
- US10659288B2 (en) * | 2013-02-21 | 2020-05-19 | Gree, Inc. | Method for controlling server device, recording medium, server device, terminal device, and system |
- US9205337B2 (en) * | 2013-03-04 | 2015-12-08 | Gree, Inc. | Server device, method for controlling the same, computer readable recording medium, and game system |
- JP6661275B2 (ja) * | 2015-03-05 | 2020-03-11 | BANDAI NAMCO Entertainment Inc. | Program and server system |
- CN104915117B (zh) * | 2015-06-16 | 2017-03-22 | Shenzhen Tencent Computer Systems Co., Ltd. | Method and apparatus for controlling interaction with a virtual target |
- JP6632819B2 (ja) * | 2015-06-30 | 2020-01-22 | BANDAI NAMCO Entertainment Inc. | Program, game device, and server system |
- US20170072317A1 (en) * | 2015-09-16 | 2017-03-16 | Gree, Inc. | Non-transitory computer readable medium, method of controlling a game, and information processing device |
- JP5911632B1 (ja) * | 2015-10-05 | 2016-04-27 | GREE, Inc. | Program, game control method, and information processing device |
- JPWO2018225163A1 (ja) * | 2017-06-06 | 2019-06-27 | Square Enix Co., Ltd. | Video game processing program and video game processing system |
- CN107583271B (zh) * | 2017-08-22 | 2020-05-22 | NetEase (Hangzhou) Network Co., Ltd. | Interaction method and apparatus for selecting a target in a game |
- CN107837529B (zh) * | 2017-11-15 | 2019-08-27 | Tencent Technology (Shanghai) Co., Ltd. | Object selection method, apparatus, terminal, and storage medium |
- CN108310771A (zh) * | 2018-01-16 | 2018-07-24 | Tencent Technology (Shenzhen) Co., Ltd. | Task execution method and apparatus, storage medium, and electronic device |
- KR101975542B1 (ko) * | 2018-11-07 | 2019-05-07 | Netmarble Corp. | Method and apparatus for providing a game strategy guide |
- CN109865282B (zh) * | 2019-03-05 | 2020-03-17 | NetEase (Hangzhou) Network Co., Ltd. | Information processing method and apparatus in a mobile terminal, medium, and electronic device |
- CN110141864B (zh) * | 2019-04-30 | 2022-08-23 | Shenzhen Tencent Domain Computer Network Co., Ltd. | Automatic game testing method, apparatus, and terminal |
- CN117482507A (zh) * | 2019-07-19 | 2024-02-02 | Tencent Technology (Shenzhen) Co., Ltd. | Method, apparatus, and terminal for sending reminder information in a multiplayer online battle program |
-
2020
- 2020-04-23 CN CN202010328506.0A patent/CN111589126B/zh active Active
-
2021
- 2021-03-08 SG SG11202111219TA patent/SG11202111219TA/en unknown
- 2021-03-08 EP EP21778333.1A patent/EP3936207A4/en active Pending
- 2021-03-08 AU AU2021240132A patent/AU2021240132A1/en not_active Abandoned
- 2021-03-08 WO PCT/CN2021/079592 patent/WO2021213026A1/zh unknown
- 2021-03-08 CA CA3133467A patent/CA3133467A1/en active Pending
- 2021-03-08 KR KR1020217035074A patent/KR20210143300A/ko not_active Application Discontinuation
- 2021-11-18 US US17/530,382 patent/US20220072428A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
AU2021240132A1 (en) | 2021-11-11 |
EP3936207A4 (en) | 2022-07-06 |
JP2022533051A (ja) | 2022-07-21 |
CA3133467A1 (en) | 2021-10-23 |
CN111589126B (zh) | 2023-07-04 |
KR20210143300A (ko) | 2021-11-26 |
SG11202111219TA (en) | 2021-11-29 |
EP3936207A1 (en) | 2022-01-12 |
US20220072428A1 (en) | 2022-03-10 |
CN111589126A (zh) | 2020-08-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
- WO2021213026A1 (zh) | Virtual object control method, apparatus, device, and storage medium | |
- WO2021208614A1 (zh) | Virtual object control method, apparatus, device, and storage medium | |
- WO2021244322A1 (zh) | Method, apparatus, device, and storage medium for aiming at a virtual object | |
- CN111672116B (zh) | Method, apparatus, terminal, and storage medium for controlling a virtual object to release a skill | |
- US20230068653A1 (en) | Method and apparatus for controlling virtual object to use virtual prop, terminal, and medium | |
- CN112138384B (zh) | Method, apparatus, terminal, and storage medium for using a virtual throwing prop | |
- US11931653B2 (en) | Virtual object control method and apparatus, terminal, and storage medium | |
- US20220379214A1 (en) | Method and apparatus for a control interface in a virtual environment | |
- WO2022156486A1 (zh) | Virtual prop delivery method, apparatus, terminal, storage medium, and program product | |
- KR102645535B1 (ko) | Method and apparatus for controlling a virtual object in a virtual scene, device, and storage medium | |
- WO2022227958A1 (zh) | Virtual vehicle display method, apparatus, device, and storage medium | |
- WO2023142617A1 (zh) | Ray display method, apparatus, device, and storage medium based on a virtual scene | |
- KR20220042299A (ko) | Method and apparatus for displaying a picture of a virtual environment, device, and medium | |
- JP2023164787A (ja) | Virtual environment screen display method, apparatus, device, and computer program | |
- JP7476235B2 (ja) | Virtual object control method, apparatus, device, and computer program | |
- CN114210062A (zh) | Virtual prop use method, apparatus, terminal, storage medium, and program product | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
ENP | Entry into the national phase |
Ref document number: 20217035074 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021778333 Country of ref document: EP Effective date: 20211008 |
|
ENP | Entry into the national phase |
Ref document number: 2021566600 Country of ref document: JP Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2021240132 Country of ref document: AU Date of ref document: 20210308 Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21778333 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |