CN113082688A - Method and device for controlling virtual character in game, storage medium and equipment - Google Patents


Info

Publication number
CN113082688A
Authority
CN
China
Prior art keywords
virtual character
virtual
game
control
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110352615.0A
Other languages
Chinese (zh)
Other versions
CN113082688B (en)
Inventor
胡佳胜
刘勇成
胡志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202110352615.0A
Publication of CN113082688A
Application granted
Publication of CN113082688B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/214: Input arrangements characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads
    • A63F 13/2145: Input arrangements for locating contacts on a surface, the surface being also a display device, e.g. touch screens
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/53: Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/55: Controlling game characters or game objects based on the game progress
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10: Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/1068: Input arrangements specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad
    • A63F 2300/1075: Input arrangements specially adapted to detect the point of contact of the player using a touch screen

Abstract

An embodiment of the present application discloses a method, an apparatus, a storage medium, and a device for controlling a virtual character in a game. The method comprises the following steps: detecting a manipulation state of a virtual object by a first virtual character in the game; determining, according to the manipulation state, a first target interaction action corresponding to a designated interaction control; and, in response to a first touch operation on the designated interaction control, controlling a target virtual character in the game to perform the first target interaction action. Because a single control is bound to different actions depending on the detected state, operation complexity is reduced and operation efficiency is improved.

Description

Method and device for controlling virtual character in game, storage medium and equipment
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a storage medium, and a device for controlling a virtual character in a game.
Background
With the development and popularization of electronic device technology, more and more terminal games, such as ball games, have emerged. In such a game, a player controls a virtual character to perform different interaction actions by tapping different skill buttons. However, the number of interaction actions in a game is large, and each interaction action needs its own skill button, so the number of skill buttons is correspondingly large. In addition, the button layout differs from game to game, so the player has to learn the layout of each game and find the corresponding button for every operation. As a result, operation is complex and operation efficiency is low.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, a storage medium, and a device for controlling a virtual character in a game, which can reduce operation complexity and improve operation efficiency.
An embodiment of the present application provides a method for controlling a virtual character in a game, wherein a graphical user interface is provided by an electronic device and includes a designated interaction control. The method comprises the following steps:
detecting a manipulation state of a virtual object by a first virtual character in the game;
determining, according to the manipulation state, a first target interaction action corresponding to the designated interaction control; and
in response to a first touch operation on the designated interaction control, controlling a target virtual character in the game to perform the first target interaction action.
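As a minimal sketch of the three steps above, the following Python fragment shows how a single designated interaction control could be bound to different actions depending on the detected manipulation state. The state names and action names are illustrative, borrowed from the basketball example later in the description, and are not defined by the patent.

```python
# Hypothetical mapping from the first (opposing) virtual character's
# manipulation state of the virtual object to the interaction action
# currently bound to the single designated interaction control.
STATE_TO_ACTION = {
    "dribbling": "steal",
    "passing": "intercept",
    "shooting": "block",
}

def determine_first_target_action(manipulation_state):
    # Step 2: choose the first target interaction action from the state.
    return STATE_TO_ACTION.get(manipulation_state, "follow")

def on_first_touch(manipulation_state):
    # Step 3: on a touch of the designated control, the target virtual
    # character performs the action determined for the current state.
    action = determine_first_target_action(manipulation_state)
    return f"target character performs: {action}"
```

One control thus stands in for several skill buttons, which is the claimed source of the reduced operation complexity.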
Optionally, before the detecting of the manipulation state of the virtual object by the first virtual character in the game, the method further includes:
in response to a second touch operation on the designated interaction control, controlling the target virtual character to perform a preset interaction action corresponding to the designated interaction control.
Optionally, the preset interaction action comprises:
moving along with the first virtual character.
Optionally, the controlling, in response to the second touch operation on the designated interaction control, the target virtual character to move along with the first virtual character includes:
in response to the second touch operation on the designated interaction control, detecting a touch pressure of the second touch operation;
determining a following speed corresponding to the touch pressure; and
controlling the target virtual character to move along with the first virtual character at the following speed.
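The pressure-to-speed step could be realized with any monotonic mapping. The linear interpolation below is an assumption for illustration; the embodiment only requires that the following speed correspond to the touch pressure, and the speed bounds used here are hypothetical.

```python
def following_speed(touch_pressure, min_speed=2.0, max_speed=6.0):
    # Clamp the reported pressure to [0, 1], then interpolate linearly
    # between a minimum and a maximum following speed (illustrative values).
    p = max(0.0, min(1.0, touch_pressure))
    return min_speed + p * (max_speed - min_speed)
```

Harder presses thereby make the target character follow the ball handler more quickly, without any extra on-screen control.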
Optionally, the detecting of the manipulation state of the virtual object by the first virtual character in the game includes:
receiving the control instruction most recently sent by a terminal corresponding to the first virtual character; and
determining the manipulation state of the virtual object by the first virtual character according to the control instruction.
Optionally, the detecting of the manipulation state of the virtual object by the first virtual character in the game includes:
detecting a motion direction of the virtual object; and
determining the manipulation state of the virtual object by the first virtual character according to the motion direction.
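One way to infer the manipulation state from the virtual object's motion direction, sketched here for a basketball game: a ball moving toward the basket is likely a shot, a moving ball headed elsewhere is likely a pass, and a (near-)stationary ball is being dribbled. The geometry and thresholds are illustrative assumptions, not taken from the patent.

```python
import math

def state_from_motion(velocity, ball_pos, basket_pos, cos_threshold=0.9):
    vx, vy = velocity
    speed = math.hypot(vx, vy)
    if speed < 1e-6:
        # Ball is not travelling on its own: treat as dribbling.
        return "dribbling"
    bx = basket_pos[0] - ball_pos[0]
    by = basket_pos[1] - ball_pos[1]
    dist = math.hypot(bx, by)
    # Cosine similarity between the motion direction and the
    # direction from the ball to the basket.
    cos = (vx * bx + vy * by) / (speed * dist)
    return "shooting" if cos >= cos_threshold else "passing"
```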
Optionally, the determining, according to the manipulation state, of the first target interaction action corresponding to the designated interaction control includes:
detecting whether the first virtual character is paired with the target virtual character; and
if so, determining, according to the manipulation state, the first target interaction action corresponding to the designated interaction control.
Optionally, the detecting whether the first virtual character is paired with the target virtual character includes:
detecting whether the distance between the target virtual character and the first virtual character is smaller than a preset distance threshold;
if so, determining that the first virtual character is paired with the target virtual character; and
if not, determining that the first virtual character is not paired with the target virtual character.
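The pairing test above reduces to a distance comparison. A sketch, with a hypothetical threshold value:

```python
import math

PRESET_DISTANCE_THRESHOLD = 3.0  # illustrative value, not from the patent

def is_paired(target_pos, first_pos, threshold=PRESET_DISTANCE_THRESHOLD):
    # The two characters are paired when the distance between them is
    # smaller than the preset distance threshold.
    dx = target_pos[0] - first_pos[0]
    dy = target_pos[1] - first_pos[1]
    return math.hypot(dx, dy) < threshold
```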
Optionally, before responding to the first touch operation on the designated interaction control, the method further includes:
determining an icon corresponding to the first target interaction action; and
displaying the designated interaction control on the graphical user interface in the form of the icon.
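Displaying the control as the icon of the currently determined action could be as simple as a lookup. The action names and file names below are placeholders, not assets defined by the patent.

```python
# Hypothetical action-to-icon table for the designated interaction control.
ACTION_ICONS = {
    "steal": "icon_steal.png",
    "intercept": "icon_intercept.png",
    "block": "icon_block.png",
}

def icon_for_action(first_target_action, default="icon_default.png"):
    # Pick the icon that the designated interaction control should show
    # for the currently determined first target interaction action.
    return ACTION_ICONS.get(first_target_action, default)
```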
Optionally, the method further comprises:
when the target virtual character is detected to be located in a preset area of the game scene, detecting a motion state of the virtual object;
when the motion state is detected to be a preset state, determining a second target interaction action corresponding to the designated interaction control; and
in response to a third touch operation on the designated interaction control, controlling the target virtual character to perform the second target interaction action.
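The two conditions of this optional branch, the target character standing inside a preset area and the virtual object being in a preset motion state, combine into a simple guard. The concrete state and action names below are assumptions in the spirit of the basketball example (e.g. rebounding under the backboard).

```python
def second_target_action(in_preset_area, motion_state,
                         preset_state="falling", action="rebound"):
    # Bind a second target interaction action to the designated control
    # only when both conditions hold; otherwise the control keeps its
    # previous binding (signalled here by returning None).
    if in_preset_area and motion_state == preset_state:
        return action
    return None
```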
Optionally, the method further comprises:
when the virtual object is detected to be manipulated by a second virtual character in the game, in response to a fourth touch operation on the designated interaction control, controlling the target virtual character to perform an interaction action corresponding to the fourth touch operation.
Optionally, the method further comprises:
when the virtual object is detected to be manipulated by the target virtual character, in response to a fifth touch operation on the designated interaction control, controlling the target virtual character to perform, on the virtual object, an interaction action corresponding to the fifth touch operation.
An embodiment of the present application further provides a device for controlling a virtual character in a game, wherein a graphical user interface is provided by an electronic device and includes a designated interaction control. The device includes:
a detection module, configured to detect a manipulation state of a virtual object by a first virtual character in the game;
a determining module, configured to determine, according to the manipulation state, a first target interaction action corresponding to the designated interaction control; and
a control module, configured to, in response to a first touch operation on the designated interaction control, control a target virtual character in the game to perform the first target interaction action.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program, the computer program being adapted to be loaded by a processor to perform the steps of the method for controlling a virtual character in a game according to any of the above embodiments.
An embodiment of the present application further provides an electronic device comprising a memory and a processor, wherein the memory stores a computer program, and the processor, by calling the computer program stored in the memory, performs the steps of the method for controlling a virtual character in a game according to any of the above embodiments.
The method, device, storage medium, and equipment for controlling a virtual character in a game provided by the embodiments of the present application detect the manipulation state of a virtual object by a first virtual character in the game; determine, according to the manipulation state, a first target interaction action corresponding to a designated interaction control; and, in response to a first touch operation on the designated interaction control, control a target virtual character in the game to perform the first target interaction action. Because the designated interaction control, combined with the first virtual character's manipulation state of the virtual object, can direct the target virtual character to perform different interaction actions, operation complexity is reduced and operation efficiency is improved.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required for describing the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a system schematic diagram of a control device for a virtual character in a game according to an embodiment of the present application.
Fig. 2 is a schematic flow chart of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 3 is a schematic view of a first application scenario of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 4 is a schematic view of a second application scenario of the method for controlling a virtual character in a game according to the embodiment of the present application.
Fig. 5 is a schematic view of a third application scenario of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 6 is a schematic view of a fourth application scenario of the method for controlling a virtual character in a game according to the embodiment of the present application.
Fig. 7 is a schematic view of a fifth application scenario of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 8 is a schematic view of a sixth application scenario of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 9 is another schematic flow chart of a method for controlling a virtual character in a game according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of a control device for a virtual character in a game according to an embodiment of the present application.
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiments of the present application provide a method and a device for controlling a virtual character in a game, a storage medium, and an electronic device. Specifically, the method for controlling a virtual character in a game according to the embodiments of the present application may be executed by an electronic device, where the electronic device may be a terminal or a server. The terminal may be a terminal device such as a smartphone, a tablet computer, a notebook computer, a touch-screen device, a game console, a personal computer (PC), or a personal digital assistant (PDA), and may further run a client, which may be a game application client, a browser client carrying a game program, an instant messaging client, or the like. The server may be an independent physical server, a server cluster or distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, CDN, and big data and artificial intelligence platforms.
For example, when the method for controlling a virtual character in a game runs on the terminal, the terminal device stores a game application program and presents a virtual scene in the game screen. The terminal device interacts with the user through a graphical user interface; for example, the user downloads, installs, and runs the game application through the terminal device. The terminal device may provide the graphical user interface to the user in a variety of ways; for example, the interface may be rendered on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen and a processor: the touch display screen presents the graphical user interface, which includes the game screen, and receives operation instructions generated by the user acting on the graphical user interface; the processor runs the game, generates the graphical user interface, responds to the operation instructions, and controls the display of the graphical user interface on the touch display screen.
For example, when the method for controlling a virtual character in a game is executed on a server, the game may be a cloud game. Cloud gaming is a gaming mode based on cloud computing. In this mode, the entity that runs the game application is separated from the entity that presents the game screen: the storage and execution of the method for controlling the virtual character are completed on a cloud game server, while the game screen is presented by a cloud game client. The cloud game client is mainly used for receiving and sending game data and presenting the game screen; for example, it may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, while the device that performs the game data processing is the cloud game server. During play, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as game screens, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game screen.
Referring to fig. 1, fig. 1 is a system schematic diagram of a control device for a virtual character in a game according to an embodiment of the present disclosure. The system may include a plurality of terminals 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information related to the game environment may be continuously stored in the databases 3000 when different users play the multiplayer game online.
The embodiments of the present application provide a method for controlling a virtual character in a game, which can be executed by a terminal or a server. The embodiments are described here taking the case in which the method is executed by a terminal. The terminal includes a touch display screen and a processor, where the touch display screen presents a graphical user interface and receives operation instructions generated by the user acting on that interface. When the user operates the graphical user interface through the touch display screen, the graphical user interface can, in response to a received operation instruction, control content local to the terminal or control content on a peer server. For example, the operation instructions generated by the user include an instruction to launch the game application, and the processor is configured to launch the game application upon receiving that instruction. Further, the processor is configured to render and draw the game's graphical user interface on the touch display screen. The touch display screen is a multi-touch-sensitive screen capable of sensing touch or slide operations performed at multiple points on the screen at the same time. The user performs touch operations on the graphical user interface with a finger, and when the graphical user interface detects a touch operation, the virtual character in the game's graphical user interface is controlled to perform the action corresponding to that operation. The game may include a virtual scene drawn on the graphical user interface.
In addition, the virtual scene of the game may include one or more virtual characters controlled by a user (or player). The virtual scene may also include one or more obstacles, such as railings, ravines, and walls, that limit the movement of virtual objects, for example, restricting the movement of one or more virtual characters to a particular area of the virtual scene. Optionally, the virtual scene also includes one or more elements, such as skills, points, character health, and energy, that assist the player, provide virtual services, or increase scores related to the player's performance. The graphical user interface may also present one or more indicators that provide instructional information to the player. For example, a game may include a player-controlled virtual character and one or more other virtual characters (such as enemy characters). In one embodiment, the other virtual characters are controlled by other players of the game. Alternatively, one or more other virtual characters may be computer controlled, such as by robots using artificial intelligence (AI) algorithms, to implement a human-versus-machine mode. For example, virtual characters possess various skills or abilities that the player uses to achieve goals, and a virtual character may possess one or more props, tools, and the like that can be used to eliminate other objects from the game. Such skills or abilities may be activated by the player using one of a plurality of preset touch operations on the terminal's touch display screen. The processor may be configured to present the corresponding game screen in response to the operation instruction generated by the user's touch operation.
Referring to fig. 2 to 8, fig. 2 is a schematic flowchart of a method for controlling a virtual character in a game according to an embodiment of the present application, and fig. 3 to 8 are application scenarios of the method for controlling a virtual character in a game according to the embodiment of the present application.
In this embodiment, the executing terminal is a first terminal, which may be an electronic device. The first terminal provides a graphical user interface that includes a virtual scene, with the virtual object and a plurality of virtual characters located in the virtual scene. The virtual scene is a virtual environment provided when the game runs on the terminal; the virtual environment may be a simulation of the real world, a semi-simulated and semi-fictional three-dimensional environment, or a purely fictional three-dimensional environment. A virtual object is an object, such as a virtual basketball, virtual volleyball, or virtual soccer ball, that virtual characters can manipulate and move within the virtual environment. The virtual characters can be divided into at least two types. The first type is the target virtual character, i.e., the virtual character controlled by the first terminal; there is generally one target virtual character. The second type is the second virtual character, a virtual character controlled by another terminal and located in a different camp from the target virtual character; that is, a second virtual character is an opponent of the target virtual character, and there may be one or more second virtual characters. A third type, the third virtual character, is also a virtual character controlled by another terminal but located in the same camp as the target virtual character; that is, a third virtual character is a teammate of the target virtual character, and there may be one or more third virtual characters.
The graphical user interface further includes a designated interaction control. When the first user, i.e., the user of the first terminal, triggers the designated interaction control, the target virtual character can be controlled to perform different interaction actions, where an interaction action is an action of some skill that the target virtual character can perform and does not include movement (such as walking or running). The graphical user interface may further include a movement control; when the first user triggers different regions of the movement control, the target virtual character is controlled to move in different directions. For example, triggering the top region of the movement control moves the target virtual character forward; triggering the bottom region makes it turn around and move backward; triggering the left region moves it leftward; and triggering the right region moves it rightward. The graphical user interface may also include other function controls, such as a control for recording video or a control for ending the game; note, however, that the graphical user interface has only one designated interaction control.
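The four-region movement control described above can be modelled by comparing the touch point's offset from the control's centre along each axis. Screen y is assumed to grow downward, and the region names follow the paragraph above; the dominant-axis rule is an illustrative assumption.

```python
def movement_direction(touch_x, touch_y, center_x, center_y):
    dx = touch_x - center_x
    dy = touch_y - center_y
    # The dominant axis of the offset decides the region;
    # ties go to the horizontal axis in this sketch.
    if abs(dx) >= abs(dy):
        return "right" if dx >= 0 else "left"
    # Screen coordinates: y grows downward, so a negative offset is the
    # top region of the control.
    return "forward" if dy < 0 else "backward"
```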
For example, as shown in fig. 3, a virtual object 11 and six virtual characters 12, 13, 14, 15, 16, and 17 are displayed on the graphical user interface 100, where the virtual character 12 is the target virtual character, the virtual characters 13, 14, and 15 are second virtual characters, and the virtual characters 16 and 17 are third virtual characters; that is, the virtual characters 12, 16, and 17 are located in one camp, and the virtual characters 13, 14, and 15 are located in the other camp. The lower right corner of the graphical user interface 100 displays the designated interaction control 18, and the lower left corner displays the movement control 19.
Step 101: detecting a manipulation state of a virtual object by a first virtual character in the game.
In this embodiment, the first virtual character is a second virtual character; that is, the first virtual character and the target virtual character are located in different camps. When the virtual object is controlled by the first virtual character, the first virtual character's side is the attacking side and the target virtual character's side is the defending side. As shown in fig. 3, the virtual character 13 manipulates the virtual object 11, so the virtual character 13 is the first virtual character; the virtual characters 13, 14, and 15 are the attacking side, and the virtual characters 12, 16, and 17 are the defending side.
While attacking, the first virtual character may manipulate the virtual object in different ways. For example, in a basketball game the virtual object is a virtual basketball, and the manipulation state of the virtual object by the first virtual character may include dribbling, passing, shooting, and the like.
The manipulation state of the virtual object by the first virtual character can be detected in a variety of ways. In one embodiment, the detecting of the manipulation state of the virtual object by the first virtual character in the game includes: receiving the control instruction most recently sent by a terminal corresponding to the first virtual character; and determining the manipulation state of the virtual object by the first virtual character according to the control instruction.
The terminal corresponding to the first virtual character may be a second terminal, and a second user can control the actions of the first virtual character through the second terminal, thereby controlling the virtual object through the first virtual character. The graphical user interface of the second terminal may likewise be provided with only one designated interaction control; the second user can trigger the second terminal to generate different control instructions through different operations on the designated interaction control, and the first virtual character exerts different controls on the virtual object according to the different control instructions, so that the virtual object is in different control states. For example, in a basketball game, the second user clicking the designated interaction control on the second terminal triggers the second terminal to generate a pass action instruction; long-pressing the designated interaction control triggers the second terminal to generate a dribbling action instruction; sliding the designated interaction control triggers the second terminal to generate a shooting action instruction; and so on.
The first terminal can acquire the current control state of the first virtual character on the virtual object by obtaining the control instruction most recently generated by the second terminal. For example, the second user triggers the second terminal to generate a shooting action instruction, and the second terminal controls the first virtual character to execute a shooting action according to that instruction; the first terminal obtains the control instruction most recently generated by the second terminal (namely the shooting action instruction) and determines from it that the current control state of the first virtual character on the virtual object is shooting.
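The instruction-to-state determination described above amounts to a lookup from the most recently received control instruction to a manipulation state. The sketch below illustrates this; the instruction names and state labels are illustrative assumptions, not identifiers from this application.

```python
# Hypothetical mapping from the last control instruction generated by the
# second terminal to the first virtual character's manipulation state.
INSTRUCTION_TO_STATE = {
    "pass_instruction": "passing",
    "dribble_instruction": "dribbling",
    "shoot_instruction": "shooting",
}

def manipulation_state_from_instruction(last_instruction: str) -> str:
    """Return the manipulation state implied by the most recent instruction."""
    return INSTRUCTION_TO_STATE.get(last_instruction, "unknown")
```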
In another embodiment, the detecting the manipulation state of the virtual object by the first virtual character in the game includes: detecting a direction of motion of the virtual object; and determining the control state of the first virtual character on the virtual object according to the motion direction.
Since the movement directions of the virtual objects are different when the first virtual character controls the virtual objects to be in different control states, the current control state of the first virtual character on the virtual objects can be detected by detecting the current movement direction of the virtual objects. For example, in a basketball game, when a first virtual character shoots, the movement direction of the virtual object is the basket direction; when the first virtual character passes, the movement direction of the virtual object is the teammate direction of the first virtual character; when the first virtual character dribbles, the movement direction of the virtual object is upward or downward. As shown in fig. 5, the current movement direction B of the virtual object 11 is the basket 20 direction, and the first terminal determines that the current manipulation state of the virtual object 11 by the first virtual character 13 is shooting.
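The motion-direction variant can be sketched as follows, assuming a 2D court plane plus a vertical axis: vertical-only motion indicates dribbling, motion towards the basket indicates shooting, and motion towards a teammate indicates passing. The angle tolerance and all names are illustrative assumptions.

```python
import math

def classify_by_motion_direction(ball_vel, ball_pos, basket_pos, teammate_positions,
                                 angle_tol_deg=15.0):
    """Infer the manipulation state from the ball's current motion direction.

    ball_vel is (vx, vy, vz): vx, vy in the court plane, vz vertical.
    ball_pos, basket_pos, and teammate_positions are court-plane (x, y) points.
    """
    vx, vy, vz = ball_vel
    # Dribbling: the ball moves straight up or down, with no travel in the court plane.
    if math.hypot(vx, vy) < 1e-6 and abs(vz) > 0:
        return "dribbling"

    def planar_angle_to(target):
        # Angle between the planar velocity and the direction from ball to target.
        tx, ty = target[0] - ball_pos[0], target[1] - ball_pos[1]
        dot = vx * tx + vy * ty
        norm = math.hypot(vx, vy) * math.hypot(tx, ty)
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

    if planar_angle_to(basket_pos) <= angle_tol_deg:
        return "shooting"   # moving towards the basket
    if any(planar_angle_to(p) <= angle_tol_deg for p in teammate_positions):
        return "passing"    # moving towards a teammate
    return "unknown"
```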
The first terminal may detect, in real time, the control state of the virtual object by the first virtual character after detecting that the virtual object is controlled by the first virtual character, or may detect, after executing a preset interaction action by the control target virtual character, the control state of the virtual object by the first virtual character.
Specifically, before the detecting the manipulation state of the virtual object by the first virtual character in the game, the method further includes: and responding to a second touch operation aiming at the specified interaction control, and controlling the target virtual character to execute a preset interaction action corresponding to the specified interaction control.
The second touch operation may include a click operation (such as a single finger click or a simultaneous click with multiple fingers), a long-time press operation, a sliding operation, and the like, and the operation type may be set by the first user in advance. The second touch operation of the first user may be one operation or multiple operations, and the operation types of the multiple operations may be the same or different. For example, when the target virtual character is a defending party, the first user performs a click operation on the specified interaction control, and the first terminal responds to the click operation to control the target virtual character to execute a preset interaction action corresponding to the specified interaction control.
In order to ensure that the target virtual character can subsequently execute the target interaction action in time, the first user can first perform the second touch operation on the specified interaction control, and the first terminal responds to the second touch operation by controlling the target virtual character to enter a staring state, that is, a state of closely guarding the first virtual character. The preset interaction action may be a defensive action, such as controlling the target virtual character to open both hands for defense.
When the first virtual character moves, the preset interaction action may further include following the movement of the first virtual character; that is, the first terminal controls the target virtual character to move along with the movement of the first virtual character.
As shown in fig. 4, before the first user performs the second touch operation, the target virtual character 12 is located at the position point a, and after the first user performs the second touch operation, the target virtual character 12 enters a staring state, moves from the position point a to the side of the virtual character 13, performs a defensive action, and moves following the movement of the virtual character 13.
And controlling the defense speed of the target virtual role according to the touch strength of the first user during the second touch operation. Specifically, the controlling, by responding to the second touch operation for the specified interaction control, the target virtual character to move along with the first virtual character includes: responding to the second touch operation aiming at the specified interactive control, and detecting the touch pressure of the second touch operation; determining a following speed corresponding to the touch pressure; and controlling the target virtual character to move along with the first virtual character according to the following speed.
When the touch force of the first user differs, the touch pressure of the second touch operation detected by the first terminal differs, and the following speed of the target virtual character is controlled to differ accordingly. The first terminal may preset a correspondence table between touch pressure and following speed, and after detecting the touch pressure of the second touch operation, query the table to determine the corresponding following speed. The first terminal may also preset a plurality of gears, that is, a one-to-one correspondence between a plurality of pressure ranges and a plurality of following speeds; after detecting the touch pressure of the second touch operation, it first determines the pressure range in which the touch pressure falls, and then determines the following speed corresponding to that range. The larger the touch pressure, the larger the following speed.
The following speed may be a moving speed at which the target virtual character moves to the side of the first virtual character when the target virtual character enters the staring state. For example, before the target virtual character enters the staring state, the target virtual character is far away from the first virtual character, the following speed corresponding to the touch pressure can be determined according to the touch pressure of the second touch operation, then the target virtual character is controlled to move to the side of the first virtual character according to the following speed, and then the moving speed of the target virtual character is matched with the moving speed of the first virtual character, namely when the first virtual character moves, the target virtual character follows to move at the same speed, and when the first virtual character does not move, the target virtual character does not move.
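The gear-based variant described above can be sketched as a lookup over pressure ranges, where heavier presses map to faster following speeds. The concrete ranges and speed values below are illustrative assumptions.

```python
# Hypothetical pressure "gears": (min pressure, max pressure, following speed).
# Larger touch pressure maps to a larger following speed, as described above.
PRESSURE_GEARS = [
    (0.0, 0.3, 2.0),
    (0.3, 0.6, 4.0),
    (0.6, 1.0, 6.0),
]

def following_speed(touch_pressure: float) -> float:
    """Return the following speed for the detected touch pressure."""
    for low, high, speed in PRESSURE_GEARS:
        if low <= touch_pressure <= high:
            return speed
    return PRESSURE_GEARS[-1][2]  # clamp out-of-range pressure to the fastest gear
```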
And 102, determining a first target interaction action corresponding to the specified interaction control according to the control state.
Because the target virtual character is the defending party, when different control states of the first virtual character on the virtual object are detected, different first target interaction actions can be determined for the corresponding defense. In basketball-type games, for example, interaction actions for defense include snapping (stealing the ball), capping (blocking the shot), and the like. When the control state of the first virtual character on the virtual object is detected to be shooting, the first target interaction is determined to be the capping action; and when the control state of the first virtual character on the virtual object is detected to be dribbling or passing, the first target interaction is determined to be the snap action.
After determining the first target interaction, a different icon may be displayed on the graphical user interface to prompt the first user. Specifically, the method further comprises: determining an icon corresponding to the first target interactive action; displaying the designated interaction control on the graphical user interface in the form of the icon.
For example, when the first target interaction is determined to be the capping action, the specified interaction control is displayed on the graphical user interface as the icon of the capping action, reminding the user that touching the specified interaction control at this moment triggers the capping action; when the first target interaction is determined to be the snap action, the specified interaction control is displayed on the graphical user interface as the icon of the snap action, reminding the user that touching the specified interaction control at this moment triggers the snap action.
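The state-to-action mapping, together with the icon swapped onto the specified interaction control, can be sketched as a single lookup. The action labels and icon file names below are illustrative assumptions.

```python
# Hypothetical mapping from the detected control state to the first target
# interaction action and the icon displayed on the specified interaction control.
STATE_TO_ACTION = {
    "shooting": ("cap", "icon_cap.png"),     # block the shot
    "dribbling": ("snap", "icon_snap.png"),  # attempt a steal
    "passing": ("snap", "icon_snap.png"),    # attempt a steal
}

def first_target_interaction(control_state):
    """Return (action, icon) for the control state, or (None, None) if unmapped."""
    return STATE_TO_ACTION.get(control_state, (None, None))
```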
In a confrontation game such as a basketball game, the two camps have the same number of characters, and one-on-one defense can be realized by pairing. Specifically, the determining, according to the control state, a first target interaction action corresponding to the specified interaction control includes: detecting whether the first virtual character is paired with the target virtual character; and if so, determining the first target interaction action corresponding to the specified interaction control according to the control state.
Whether the first virtual character is paired with the target virtual character can be detected in various ways. In one embodiment, the pairing relationship of the virtual characters in the two camps is set in advance. For example, if there are three virtual characters in each of the two camps, a one-to-one pairing relationship is set between the three virtual characters in one camp and the three virtual characters in the other camp. After the first user performs the first touch operation on the specified interaction control, the first terminal can detect whether the first virtual character is paired with the target virtual character according to the preset pairing relationship. As shown in fig. 3, if the target virtual character 12 is paired with the first virtual character 13, the virtual character 16 is paired with the virtual character 14, and the virtual character 17 is paired with the virtual character 15, the first terminal can detect that the first virtual character 13 is paired with the target virtual character 12.
In another embodiment, on the basis of the preset pairing relationship, the pairing relationship can be updated in real time according to the positional relationship of the virtual characters in the two camps. Specifically, the detecting whether the first virtual character is paired with the target virtual character includes: detecting whether the distance between the target virtual character and the first virtual character is smaller than a preset distance threshold; if so, determining that the first virtual character is paired with the target virtual character; if not, determining that the first virtual character is not paired with the target virtual character.
After the first user performs the first touch operation on the specified interaction control, the first terminal can also detect the distance between the target virtual character and the first virtual character. If, in the camp of the target virtual character, only the target virtual character is closest to the first virtual character, the target virtual character is paired with the first virtual character; if at least two virtual characters in the camp of the target virtual character are equally close to the first virtual character, whether the first virtual character is paired with the target virtual character is detected according to the preset pairing relationship; and if the target virtual character is not the closest to the first virtual character within its camp, the target virtual character is not paired with the first virtual character.
As shown in fig. 3, among the current target virtual character 12, virtual character 16, and virtual character 17, the target virtual character 12 is closest in distance to the first virtual character 13, and thus the target virtual character 12 is determined to be paired with the first virtual character 13.
When it is determined that the first virtual character is paired with the target virtual character, the first target interaction action is determined; when it is determined that the first virtual character is not paired with the target virtual character, the first terminal performs no subsequent processing.
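The combined pairing rule above (unique nearest defender first, preset pairing as a tie-breaker) can be sketched as follows; all identifiers and the data layout are illustrative assumptions.

```python
import math

def is_paired(target_id, first_id, defender_positions, attacker_pos, preset_pairs):
    """Decide whether the target virtual character is paired with the first one.

    defender_positions: {defender_id: (x, y)} for the target character's camp.
    attacker_pos: (x, y) of the first virtual character.
    preset_pairs: {defender_id: attacker_id}, the preset pairing relationship.
    """
    dist = {d: math.dist(p, attacker_pos) for d, p in defender_positions.items()}
    nearest = min(dist.values())
    nearest_ids = [d for d, v in dist.items() if math.isclose(v, nearest)]
    if nearest_ids == [target_id]:
        return True                      # target is uniquely the closest defender
    if target_id in nearest_ids:         # tie: fall back to the preset pairing
        return preset_pairs.get(target_id) == first_id
    return False                         # some other defender is closer
```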
And 103, responding to the first touch operation aiming at the specified interactive control, and controlling the target virtual character in the game to execute the first target interactive action.
In this embodiment, the first touch operation may include a click operation, a long-press operation, a sliding operation, and the like, and the operation type may be set by the first user in a user-defined manner in advance. The first touch operation of the first user may be one operation or multiple operations, and the operation types of the multiple operations may be the same or different.
The first terminal responds to the first touch operation and can control the target virtual character to execute different interactive actions. For example, in a basketball game, when the first target interaction is determined as the capping action, the target virtual character is controlled to execute the capping action; and when the first target interaction is determined to be the emergency action, controlling the target virtual role to execute the emergency action.
The first terminal can automatically control the target virtual character to move to the vicinity of the virtual object, and execute corresponding first target interaction action on the virtual object. In order to increase the interest of the game, the first terminal may also control the target virtual character to move to a position near the virtual object in combination with the operation of the first user on the movement control, and then perform a corresponding first target interaction action on the virtual object.
As shown in fig. 6, when it is detected that the current control state of the first virtual character 13 on the virtual object 11 is shooting, it is determined that the first target interaction corresponding to the specified interaction control is the capping action, and in response to the first touch operation on the specified interaction control, the target virtual character 12 is controlled to move to the vicinity of the virtual object 11 and execute the capping action.
Further, the method further comprises: when the target virtual character is detected to be located in a preset area in a game scene, detecting the motion state of the virtual object; when the motion state is detected to be in a preset state, determining a second target interaction action corresponding to the specified interaction control; and responding to a third touch operation aiming at the specified interactive control, and controlling the target virtual character to execute the second target interactive action. The third touch operation may include a click operation, a long-press operation, a sliding operation, and the like.
Whether the target virtual character is a defending party or an attacking party, as long as the target virtual character is located in a preset area, the first terminal needs to detect the motion state of the virtual object, and if the motion state of the virtual object meets the preset state, a second target interaction action is determined so as to control the target virtual character to execute the second target interaction action after responding to a third touch operation aiming at the specified interaction control; and if the motion state of the virtual object does not meet the preset state, the first terminal controls the target virtual character to execute corresponding interactive actions according to the processing flow of other areas, for example, the target virtual character is controlled to execute the first target interactive actions according to different control states of the first virtual character on the virtual object.
For example, in a basketball game, the preset area is a partial area below the rim in which grabbing the backboard (rebounding) can be realized, the preset state is the state in which the basketball rebounds after hitting the rim without entering the basket, and the interaction action corresponding to the preset state is grabbing the backboard. As shown in fig. 7, the target virtual character 12 is located in the preset area 21, and the virtual object 11 moves towards the basket 20 and rebounds after hitting the basket 20. During the rebound, the first user performs the third touch operation on the designated interactive control 18; the first terminal detects that the current motion state of the virtual object 11 is the preset state and controls the target virtual character 12 to grab the backboard.
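The rebound condition above, standing in the preset area while the ball is in the preset (rebounding) state, can be sketched as a simple predicate; the rectangular area representation and the state label are illustrative assumptions.

```python
def can_grab_rebound(character_pos, preset_area, ball_state):
    """True when the target character stands in the preset area under the rim
    and the ball is rebounding off the rim without having scored."""
    (xmin, ymin), (xmax, ymax) = preset_area   # axis-aligned rectangle corners
    x, y = character_pos
    in_area = xmin <= x <= xmax and ymin <= y <= ymax
    return in_area and ball_state == "rebounding"
```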
Further, the method further comprises: when the virtual object is detected to be controlled by a second virtual character in the game, responding to a fourth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fourth touch operation. The fourth touch operation may include a click operation, a long-press operation, a sliding operation, and the like.
In this embodiment, the second virtual character is a third virtual character; that is, the second virtual character and the target virtual character are located in the same camp. When the virtual object is controlled by the second virtual character, the second virtual character and the target virtual character are the attacking party, and the first virtual character is the defending party. As shown in fig. 8, the virtual character 16 controls the virtual object 11, so the virtual character 16 is the second virtual character; the target virtual character 12, the virtual character 16, and the virtual character 17 are the attacking party, and the virtual character 13, the virtual character 14, and the virtual character 15 are the defending party.
When the second virtual character attacks, the target virtual character can cooperate with the second virtual character in the attack; in a basketball game, the cooperative actions of the target virtual character may include setting a screen (blocking and releasing), calling for the ball, and the like. The first user can control the target virtual character to execute different interaction actions through different operations on the specified interaction control. For example, while the first user long-presses the specified interaction control, the target virtual character can be controlled to execute the screen action so as to prevent the opposing virtual character from stealing the ball; when the first user releases the specified interaction control, the target virtual character no longer executes the screen action and can move freely. When the first user performs a sliding operation on the specified interaction control, the target virtual character can be controlled to execute the ball-calling action so as to instruct the second virtual character to pass the virtual object to the target virtual character.
In some embodiments, a character attribute of the virtual character controlling the virtual object may be detected, and the interaction action corresponding to the specified interaction control may be determined according to the character attribute and the control state. For example, when the virtual character controlling the virtual object is detected to be the first virtual character, the capping action, the snap action, or the like is determined according to the control state of the first virtual character on the virtual object; and when the virtual character controlling the virtual object is detected to be the second virtual character, the blocking and releasing action, the ball-calling action, or the like is determined according to the control state of the second virtual character on the virtual object.
Further, the method further comprises: and when the virtual object is detected to be controlled by the target virtual character, controlling the target virtual character to execute corresponding interaction action aiming at the virtual object.
When the virtual object is controlled by the target virtual character, the second virtual character and the target virtual character are the attacking party, and the first virtual character is the defending party. The first user need not perform any operation on the specified interaction control; the first terminal automatically controls the target virtual character to execute an interaction action for the virtual object. For example, in a basketball game, the interaction action is dribbling: the first user can operate only the movement control to control the target virtual character to move, and the first terminal automatically controls the target virtual character to dribble while the target virtual character is stationary or moving.
Further, the method further comprises: when the virtual object is detected to be controlled by the target virtual character, responding to a fifth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fifth touch operation aiming at the virtual object. The fifth touch operation may include a click operation, a long-press operation, a sliding operation, and the like.
The first user can trigger the first terminal to control the target virtual character to execute different interaction actions on the virtual object through different operations on the specified interaction control. For example, in a basketball game, when the first user clicks the specified interaction control, the target virtual character can be controlled to execute a pass action with respect to the virtual object; when the first user long-presses the specified interaction control, the target virtual character can be controlled to execute a shooting action with respect to the virtual object; and when the first user slides the specified interaction control, the target virtual character can be controlled to execute a breakthrough action, and so on.
All the above technical solutions can be combined arbitrarily to form the optional embodiments of the present application, and are not described herein again.
The method for controlling the virtual character in the game provided by the embodiment of the application detects the control state of a first virtual character in the game on a virtual object; determining a first target interaction action corresponding to the specified interaction control according to the control state; and responding to a first touch operation aiming at the specified interactive control, and controlling a target virtual character in the game to execute the first target interactive action. According to the method and the device, the target virtual character can be controlled to execute different interaction actions by combining the control state of the first virtual character on the virtual object through the appointed interaction control, so that the operation complexity is reduced, and the operation efficiency is improved.
Referring to fig. 9, fig. 9 is another schematic flow chart of a method for controlling a virtual character in a game according to an embodiment of the present application. The specific process of the method can be as follows:
step 201, in response to at least one operation on the designated interactive control, detecting a role relationship between the virtual character currently controlling the virtual basketball and the target virtual character; if the role relationship is opponent, executing step 202; if the role relationship is teammate, executing step 205; and if the role relationship is self, executing step 206.
The execution terminal is the first terminal, and the target virtual character refers to the virtual character controlled by the first terminal. If the virtual character currently controlling the virtual basketball and the target virtual character are in different camps, the role relationship is opponent; if they are in the same camp but are not the same character, the role relationship is teammate; and if the virtual character currently controlling the virtual basketball is the target virtual character itself, the role relationship is self.
Step 202, detecting the current control state of the virtual basketball by the virtual character, if the control state is dribbling or passing, executing step 203, and if the control state is shooting, executing step 204.
And step 203, controlling the target virtual character to execute a snap action aiming at the virtual basketball.
And step 204, controlling the target virtual character to perform a cap action aiming at the virtual basketball.
And step 205, controlling the target virtual character to execute a blocking-in action or a ball-taking action according to the operation type of the at least one operation.
The operation types comprise clicking operation, long-time pressing operation, sliding operation and the like, the operation types are different, and the actions executed by the first terminal control target virtual character are different.
And step 206, controlling the target virtual character to execute a pass action, a shooting action or a breakthrough action aiming at the virtual basketball according to the operation type of the at least one operation.
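The flow of steps 201 to 206 can be sketched as a single dispatcher keyed on the role relationship, with the control state deciding defensive actions and the operation type deciding offensive ones. The action and operation labels are illustrative assumptions.

```python
def on_interaction_control(relationship, control_state=None, operation=None):
    """Dispatch the designated interaction control (steps 201-206 above)."""
    if relationship == "opponent":                        # step 201 -> 202
        if control_state in ("dribbling", "passing"):
            return "snap"                                 # step 203: steal
        if control_state == "shooting":
            return "cap"                                  # step 204: block
    elif relationship == "teammate":                      # step 205
        return {"long_press": "screen",
                "slide": "call_for_ball"}.get(operation)
    elif relationship == "self":                          # step 206
        return {"click": "pass",
                "long_press": "shoot",
                "slide": "breakthrough"}.get(operation)
    return None
```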
All the above technical solutions can be combined arbitrarily to form the optional embodiments of the present application, and are not described herein again.
According to the control method for the virtual character in the game provided by the embodiment of the application, the target virtual character can be controlled to execute the corresponding interaction action in any state through the specified interaction control, reducing operation complexity and improving operation efficiency.
In order to better implement the method for controlling a virtual character in a game according to the embodiment of the present application, an embodiment of the present application further provides a device for controlling a virtual character. Referring to fig. 10, fig. 10 is a schematic structural diagram of a control device of a virtual character according to an embodiment of the present application. The control device 300 for the virtual character may include:
a detection module 301, configured to detect a control state of a virtual object by a first virtual character in the game;
a determining module 302, configured to determine, according to the control state, a first target interaction action corresponding to the specified interaction control;
the control module 303 is configured to respond to a first touch operation for the specified interaction control, and control the in-game target virtual character to execute the first target interaction action.
Optionally, the apparatus further comprises an interaction control module, the interaction control module is configured to:
and responding to a second touch operation aiming at the specified interaction control, and controlling the target virtual character to execute a preset interaction action corresponding to the specified interaction control.
Optionally, the preset interaction includes:
following the first avatar movement.
Optionally, the interaction control module is further configured to:
responding to the second touch operation aiming at the specified interactive control, and detecting the touch pressure of the second touch operation;
determining a following speed corresponding to the touch pressure;
and controlling the target virtual character to move along with the first virtual character according to the following speed.
Optionally, the detection module 301 is further configured to:
receiving a control instruction which is sent by a terminal corresponding to the first virtual role last time;
and determining the control state of the first virtual role on the virtual object according to the control instruction.
Optionally, the detection module 301 is further configured to:
detecting a direction of motion of the virtual object;
and determining the control state of the first virtual character on the virtual object according to the motion direction.
Optionally, the determining module 302 is further configured to:
detecting whether the first virtual character is paired with the target virtual character;
and if so, determining a first target interaction action corresponding to the specified interaction control according to the control state.
Optionally, the determining module 302 is further configured to:
detecting whether the distance between the target virtual character and the first virtual character is smaller than a preset distance threshold value;
if so, determining that the first virtual role is paired with the target virtual role;
if not, determining that the first virtual role is not paired with the target virtual role.
Optionally, the apparatus further comprises a display module, the display module is configured to:
determining an icon corresponding to the first target interactive action;
displaying the designated interaction control on the graphical user interface in the form of the icon.
Optionally, the apparatus further comprises a status detection module, wherein the status detection module is configured to:
when the target virtual character is detected to be located in a preset area in a game scene, detecting the motion state of the virtual object;
when the motion state is detected to be in a preset state, determining a second target interaction action corresponding to the specified interaction control;
and responding to a third touch operation aiming at the specified interactive control, and controlling the target virtual character to execute the second target interactive action.
Optionally, the apparatus further comprises a first interaction control module, configured to:
when the virtual object is detected to be controlled by a second virtual character in the game, responding to a fourth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fourth touch operation.
Optionally, the apparatus further comprises a second interaction control module, configured to:
when the virtual object is detected to be controlled by the target virtual character, responding to a fifth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fifth touch operation aiming at the virtual object.
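Taken together, the two interaction control modules dispatch on which character currently controls the virtual object. A sketch of that dispatch (the action names are invented for illustration and are not from the patent):

```python
def action_on_control_touch(object_controller, target_character,
                            second_character):
    """Choose the interaction action triggered by a touch on the
    designated interaction control, based on the object's controller."""
    if object_controller is second_character:
        # Virtual object held by an opposing (second) character:
        # the target character acts against that character.
        return "press"
    if object_controller is target_character:
        # The target character controls the object itself:
        # the action is performed on the virtual object.
        return "dribble"
    return "default"
```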
Any of the above technical solutions may be combined to form optional embodiments of the present application; they are not described again here.
The control device for a virtual character in a game provided by the embodiment of the present application detects the control state of a first virtual character in the game on a virtual object, determines a first target interaction action corresponding to the specified interaction control according to the control state, and, in response to a first touch operation for the specified interaction control, controls a target virtual character in the game to execute the first target interaction action. In this way, the specified interaction control, combined with the control state of the first virtual character on the virtual object, can control the target virtual character to execute different interaction actions, which reduces operation complexity and improves operation efficiency.
Correspondingly, an embodiment of the present application further provides an electronic device, which may be a terminal or a server; the terminal may be a terminal device such as a smart phone, a tablet computer, a notebook computer, a touch screen device, a game machine, a personal computer (PC), or a personal digital assistant (PDA). As shown in fig. 11, fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. Those skilled in the art will appreciate that the electronic device structure shown in the figure does not constitute a limitation of the electronic device, which may include more or fewer components than shown, combine certain components, or arrange the components differently.
The processor 401 is a control center of the electronic device 400, connects various parts of the whole electronic device 400 by using various interfaces and lines, performs various functions of the electronic device 400 and processes data by running or loading software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby performing overall monitoring of the electronic device 400.
In this embodiment, the processor 401 in the electronic device 400 loads instructions corresponding to the processes of one or more application programs into the memory 402, and the processor 401 runs the application programs stored in the memory 402 to implement the following functions:
detecting the control state of a first virtual character in the game on a virtual object; determining a first target interaction action corresponding to the specified interaction control according to the control state; and responding to a first touch operation aiming at the specified interactive control, and controlling a target virtual character in the game to execute the first target interactive action.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 11, the electronic device 400 further includes: touch-sensitive display screen 403, radio frequency circuit 404, audio circuit 405, input unit 406 and power 407. The processor 401 is electrically connected to the touch display screen 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power source 407. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 11 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The touch display screen 403 may be used to display a graphical user interface and to receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as various graphical user interfaces of the electronic device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and to generate corresponding operation instructions that drive the corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 401; it can also receive and execute commands sent by the processor 401. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 401 to determine the type of the touch event, and the processor 401 then provides a corresponding visual output on the display panel according to that type.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 403 to realize input and output functions. However, in some embodiments, the touch panel and the display panel can be implemented as two separate components to perform the input and output functions. In that case, the touch display screen 403 may also serve as a part of the input unit 406 to implement the input function.
In the embodiment of the present application, a game application is executed by the processor 401 to generate a graphical user interface on the touch display screen 403, where a virtual scene on the graphical user interface includes a skill control area, and the skill control area includes a designated interaction control. The touch display screen 403 is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface.
The radio frequency (RF) circuit 404 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other electronic devices and exchange signals with them.
The audio circuit 405 may be used to provide an audio interface between the user and the electronic device through a speaker and a microphone. On one hand, the audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 405 receives and converts into audio data; the audio data is then processed by the processor 401 and transmitted, for example, to another electronic device via the RF circuit 404, or output to the memory 402 for further processing. The audio circuit 405 may also include an earphone jack to allow a peripheral headset to communicate with the electronic device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to supply power to the various components of the electronic device 400. Optionally, the power supply 407 may be logically connected to the processor 401 through a power management system, so that charging, discharging, and power consumption management are handled by the power management system. The power supply 407 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and any other such components.
Although not shown in fig. 11, the electronic device 400 may further include a camera, a sensor, a wireless fidelity (Wi-Fi) module, a Bluetooth module, and the like, which are not described in detail here.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be performed by instructions, or by related hardware controlled by instructions; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium in which a plurality of computer programs are stored; the computer programs can be loaded by a processor to execute the steps in any of the methods for controlling a virtual character in a game provided by the embodiments of the present application. For example, the computer program may perform the following steps:
detecting the control state of a first virtual character in the game on a virtual object; determining a first target interaction action corresponding to the specified interaction control according to the control state; and responding to a first touch operation aiming at the specified interactive control, and controlling a target virtual character in the game to execute the first target interactive action.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer programs stored in the storage medium can execute the steps in any of the methods for controlling a virtual character in a game provided in the embodiments of the present application, they can achieve the beneficial effects of any such method; for details, see the foregoing embodiments, which are not repeated here.
The method, apparatus, storage medium, and electronic device for controlling a virtual character in a game provided in the embodiments of the present application have been described in detail above. Specific examples have been used herein to explain the principles and implementations of the present application; the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.

Claims (15)

1. A method for controlling a virtual character in a game, wherein a graphical user interface is provided through an electronic device, the graphical user interface comprises a designated interaction control, and the method comprises the following steps:
detecting the control state of a first virtual character in the game on a virtual object;
determining a first target interaction action corresponding to the specified interaction control according to the control state;
and responding to a first touch operation aiming at the specified interactive control, and controlling a target virtual character in the game to execute the first target interactive action.
2. The method for controlling a virtual character in a game according to claim 1, further comprising, before the detecting of the control state of the first virtual character in the game on the virtual object:
and responding to a second touch operation aiming at the specified interaction control, and controlling the target virtual character to execute a preset interaction action corresponding to the specified interaction control.
3. The method for controlling a virtual character in a game according to claim 2, wherein the preset interactive action comprises:
moving following the first virtual character.
4. The method for controlling a virtual character in a game according to claim 3, wherein the controlling the target virtual character to move following the first virtual character in response to the second touch operation for the specified interactive control comprises:
responding to the second touch operation aiming at the specified interactive control, and detecting the touch pressure of the second touch operation;
determining a following speed corresponding to the touch pressure;
and controlling the target virtual character to move along with the first virtual character according to the following speed.
5. The method for controlling a virtual character in a game according to claim 1, wherein the detecting the control state of the first virtual character in the game on the virtual object comprises:
receiving the control instruction most recently sent by the terminal corresponding to the first virtual character;
and determining the control state of the first virtual character on the virtual object according to the control instruction.
6. The method for controlling a virtual character in a game according to claim 1, wherein the detecting the control state of the first virtual character in the game on the virtual object comprises:
detecting a direction of motion of the virtual object;
and determining the control state of the first virtual character on the virtual object according to the motion direction.
7. The method for controlling a virtual character in a game according to claim 1, wherein the determining the first target interaction action corresponding to the designated interaction control according to the control state comprises:
detecting whether the first virtual character is paired with the target virtual character;
and if so, determining a first target interaction action corresponding to the specified interaction control according to the control state.
8. The method for controlling a virtual character in a game according to claim 7, wherein the detecting whether the first virtual character is paired with the target virtual character comprises:
detecting whether the distance between the target virtual character and the first virtual character is less than a preset distance threshold;
if so, determining that the first virtual character is paired with the target virtual character;
if not, determining that the first virtual character is not paired with the target virtual character.
9. The method for controlling a virtual character in a game according to claim 1, further comprising, before responding to the first touch operation for the specified interactive control:
determining an icon corresponding to the first target interactive action;
displaying the designated interaction control on the graphical user interface in the form of the icon.
10. The method for controlling a virtual character in a game according to claim 1, further comprising:
when the target virtual character is detected to be located in a preset area in a game scene, detecting the motion state of the virtual object;
when the motion state is detected to be in a preset state, determining a second target interaction action corresponding to the specified interaction control;
and responding to a third touch operation aiming at the specified interactive control, and controlling the target virtual character to execute the second target interactive action.
11. The method for controlling a virtual character in a game according to claim 1, further comprising:
when the virtual object is detected to be controlled by a second virtual character in the game, responding to a fourth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fourth touch operation.
12. The method for controlling a virtual character in a game according to claim 1, further comprising:
when the virtual object is detected to be controlled by the target virtual character, responding to a fifth touch operation aiming at the specified interaction control, and controlling the target virtual character to execute an interaction action corresponding to the fifth touch operation aiming at the virtual object.
13. An apparatus for controlling a virtual character in a game, wherein a graphical user interface is provided via an electronic device, the graphical user interface including a designated interaction control, the apparatus comprising:
the detection module is used for detecting the control state of a first virtual character in the game on a virtual object;
the determining module is used for determining a first target interaction action corresponding to the specified interaction control according to the control state;
and the control module is used for responding to the first touch operation aiming at the specified interactive control and controlling the target virtual character in the game to execute the first target interactive action.
14. A computer-readable storage medium, characterized in that it stores a computer program adapted to be loaded by a processor for performing the steps in the method for controlling a virtual character in a game according to any one of claims 1 to 12.
15. An electronic device, characterized in that the electronic device comprises a memory in which a computer program is stored and a processor that executes the steps in the method for controlling a virtual character in a game according to any one of claims 1 to 12 by calling the computer program stored in the memory.
CN202110352615.0A 2021-03-31 2021-03-31 Method and device for controlling virtual character in game, storage medium and equipment Active CN113082688B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110352615.0A CN113082688B (en) 2021-03-31 2021-03-31 Method and device for controlling virtual character in game, storage medium and equipment


Publications (2)

Publication Number Publication Date
CN113082688A true CN113082688A (en) 2021-07-09
CN113082688B CN113082688B (en) 2024-02-13

Family

ID=76672953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110352615.0A Active CN113082688B (en) 2021-03-31 2021-03-31 Method and device for controlling virtual character in game, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN113082688B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113476823A (en) * 2021-07-13 2021-10-08 网易(杭州)网络有限公司 Virtual object control method and device, storage medium and electronic equipment
CN113577763A (en) * 2021-07-29 2021-11-02 网易(杭州)网络有限公司 Control method and device for game role
CN113750533A (en) * 2021-09-07 2021-12-07 网易(杭州)网络有限公司 Information display method and device in game and electronic equipment
CN116688493A (en) * 2023-07-31 2023-09-05 厦门真有趣信息科技有限公司 Interactive control method for football game

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017185190A (en) * 2016-03-31 2017-10-12 株式会社コロプラ Game program, method, and information processor including touch screen
CN107433036A (en) * 2017-06-21 2017-12-05 网易(杭州)网络有限公司 The choosing method and device of object in a kind of game
CN107551537A (en) * 2017-08-04 2018-01-09 网易(杭州)网络有限公司 The control method and device of virtual role, storage medium, electronic equipment in a kind of game
CN109173250A (en) * 2018-08-27 2019-01-11 广州要玩娱乐网络技术股份有限公司 More character control methods, computer storage medium and terminal
CN110215691A (en) * 2019-07-17 2019-09-10 网易(杭州)网络有限公司 The control method for movement and device of virtual role in a kind of game
CN112245918A (en) * 2020-11-13 2021-01-22 腾讯科技(深圳)有限公司 Control method and device of virtual role, storage medium and electronic equipment



Also Published As

Publication number Publication date
CN113082688B (en) 2024-02-13

Similar Documents

Publication Publication Date Title
CN113082688B (en) Method and device for controlling virtual character in game, storage medium and equipment
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN113398566A (en) Game display control method and device, storage medium and computer equipment
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN115888101A (en) Virtual role state switching method and device, storage medium and electronic equipment
CN115212572A (en) Control method and device of game props, computer equipment and storage medium
CN115193064A (en) Virtual object control method and device, storage medium and computer equipment
CN114159789A (en) Game interaction method and device, computer equipment and storage medium
CN116139483A (en) Game function control method, game function control device, storage medium and computer equipment
CN114522429A (en) Virtual object control method and device, storage medium and computer equipment
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN117482523A (en) Game interaction method, game interaction device, computer equipment and computer readable storage medium
CN116870472A (en) Game view angle switching method and device, computer equipment and storage medium
CN115212566A (en) Virtual object display method and device, computer equipment and storage medium
CN116474367A (en) Virtual lens control method and device, storage medium and computer equipment
CN116999835A (en) Game control method, game control device, computer equipment and storage medium
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN116943200A (en) Virtual character control method, device, computer equipment and storage medium
CN116139484A (en) Game function control method, game function control device, storage medium and computer equipment
CN117861213A (en) Game skill processing method, game skill processing device, computer equipment and storage medium
CN115317893A (en) Virtual resource processing method and device, computer equipment and storage medium
CN115212567A (en) Information processing method, information processing device, computer equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant