CN114272607A - Game-based command interaction method and device and electronic equipment


Info

Publication number
CN114272607A
CN114272607A
Authority
CN
China
Prior art keywords
virtual object
command
touch
control
touch operation
Prior art date
Legal status
Pending
Application number
CN202111667941.7A
Other languages
Chinese (zh)
Inventor
许展豪
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202111667941.7A
Publication of CN114272607A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application disclose a game-based command interaction method and apparatus, and an electronic device. The method is applied to a terminal device that includes a first touch screen and a second touch screen, and includes the following steps: in response to a first touch operation and a second touch operation acting on the first touch screen, displaying, on the second touch screen, a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation; in response to an operation of moving the first cursor to a first virtual object and moving the second cursor to a second virtual object, generating a command control between the first virtual object and the second virtual object; and in response to a third touch operation acting on the command control, controlling the first virtual object or the second virtual object to perform an action. The scheme saves command time, improves command efficiency, avoids user misoperation, and thereby improves the user's game experience.

Description

Game-based command interaction method and device and electronic equipment
Technical Field
Embodiments of the present application relate to the technical field of games, and in particular to a game-based command interaction method and apparatus, and an electronic device.
Background
With the popularization of smart devices, more and more users play games on smart terminal devices, and game genres have become increasingly rich; common genres include, for example, turn-based games, role-playing games, and rhythm games.
The turn-based game was among the earliest genres to make widespread use of networking: all players act in turns, and each player can operate only during his or her own turn. In a turn-based game, cooperation among players is often the key to victory, so in-battle command is a very important link; and because the time of each turn is limited, many operations must race against the clock. Existing command methods generally pop up a battle command by long-pressing a character, or direct the battle through a combination of typing and voice.
However, long-pressing or typing is time-consuming, and because the time of each turn is limited, the user's operation easily times out; commands given by voice may be unclear, which further causes user misoperation and degrades the user's game experience.
Disclosure of Invention
Embodiments of the present application provide a game-based command interaction method and apparatus, and an electronic device, so as to improve the efficiency and accuracy of a user's commands.
In a first aspect, an embodiment of the present application provides a game-based command interaction method, applied to a terminal device that includes a first touch screen and a second touch screen, the method including:
responding to a first touch operation and a second touch operation acting on a first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on a second touch screen;
generating a command control between a first virtual object and a second virtual object in response to moving the first cursor to the first virtual object and moving the second cursor to the second virtual object;
and responding to a third touch operation acted on the command control to control the first virtual object or the second virtual object to act.
Optionally, the responding to a third touch operation applied to the command control to control the action of the first virtual object or the second virtual object includes:
responding to a first sliding operation from the command control to the direction of the first virtual object, and generating and displaying a first command item control acting on the first virtual object;
and in response to a second sliding operation continuous with the first sliding operation, an initial touch point of which is located in the first command item control and a final touch point of which is located at a first target command item in the first command item control, selecting the first target command item and controlling the second virtual object to perform, on the first virtual object, the action corresponding to the first target command item.
Optionally, the responding to a third touch operation applied to the command control to control the action of the first virtual object or the second virtual object includes:
responding to a third sliding operation from the command control to the direction of the second virtual object, and generating and displaying a second command item control acting on the second virtual object;
and in response to a fourth sliding operation continuous with the third sliding operation, an initial touch point of which is located in the second command item control and a final touch point of which is located at a second target command item in the second command item control, selecting the second target command item and controlling the first virtual object to perform, on the second virtual object, the action corresponding to the second target command item.
Optionally, the generating a first command item control acting on the first virtual object includes:
and generating the first command item control according to attribute information of the first virtual object, wherein the first command item control includes any one or more of a general-attack command item, a drawing command item, a point-and-disc command item, a point-and-mix command item, a nature-and-sound command item, and a protection command item.
Optionally, the responding to a first touch operation and a second touch operation on the first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on the second touch screen includes:
when touch operation of two or more touch points exists on the first touch screen, determining the first detected touch operation as a first touch operation, and determining the second detected touch operation as a second touch operation;
responding to the first touch operation, and displaying a first cursor corresponding to the touch point of the first touch operation on the second touch screen;
and responding to the second touch operation, and displaying a second cursor corresponding to the touch point of the second touch operation on the second touch screen.
Optionally, after the generating a command control between the first virtual object and the second virtual object, further includes:
responding to a fifth sliding operation from the command control to the direction of the first virtual object or the second virtual object, and generating and displaying a command item control corresponding to the first virtual object or the second virtual object;
and in response to a sixth sliding operation, continuous with the fifth sliding operation, from the command item control corresponding to the first virtual object or the second virtual object back to the command control, ceasing to display the command item control corresponding to the first virtual object or the second virtual object.
Optionally, the method further includes:
and when the third touch operation has been performed on the command control, dismissing the first cursor corresponding to the first touch operation and the second cursor corresponding to the second touch operation.
Optionally, before generating the command control between the first virtual object and the second virtual object, the method further includes:
generating a connecting line between the first virtual object and the second virtual object according to a preset style;
generating a command control between the first virtual object and the second virtual object, including:
and generating a command control at the middle position of the connecting line.
Optionally, the first touch screen is located on the back side of the terminal device, the second touch screen is located on the front side of the terminal device, and the first touch screen and the second touch screen are used for obtaining touch operations of the same user.
In a second aspect, an embodiment of the present application provides a command interaction apparatus based on a game, which is applied to a terminal device, where the terminal device includes a first touch screen and a second touch screen, and the apparatus includes:
the display module is used for responding to a first touch operation and a second touch operation which act on the first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on the second touch screen;
a response module for generating a command control between a first virtual object and a second virtual object in response to the operation of moving the first cursor to the first virtual object and moving the second cursor to the second virtual object;
the response module is further configured to control the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes computer-executable instructions stored in the memory to implement the game-based command interaction method as described in the first aspect above and in various possible designs of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which computer-executable instructions are stored, and when a processor executes the computer-executable instructions, the method for command interaction based on games according to the first aspect and various possible designs of the first aspect is implemented.
In a fifth aspect, embodiments of the present application provide a computer program product, which includes a computer program, and when the computer program is executed by a processor, the method for game-based command interaction according to the first aspect and various possible designs of the first aspect is implemented.
Embodiments of the present application can run on a terminal device that includes a first touch screen and a second touch screen, and provide a game-based command interaction method and apparatus, and an electronic device. With this scheme, in response to a first touch operation and a second touch operation acting on the first touch screen, a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation are displayed on the second touch screen. Then, in response to an operation of moving the first cursor to a first virtual object and moving the second cursor to a second virtual object, a command control is generated between the first virtual object and the second virtual object. After the command control is generated, the first virtual object or the second virtual object can be controlled in response to a third touch operation acting on the command control. In this way, touch operations on the first touch screen are converted into cursors on the second touch screen, and when the cursors rest on the two virtual objects, a command control for directing the two virtual objects is generated automatically. This saves command time, improves command efficiency, avoids user misoperation, and thereby improves the user's game experience.
Drawings
To explain the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic diagram of an application scenario of a game-based command interaction method according to an embodiment of the present application;
FIG. 2 is a schematic flowchart of a game-based command interaction method according to an embodiment of the present application;
FIG. 3 is an application diagram of a command control generation process according to an embodiment of the present application;
FIG. 4 is an application diagram of a command control generation process according to another embodiment of the present application;
FIG. 5 is an application diagram of a command process according to an embodiment of the present application;
FIG. 6 is a schematic diagram of an application of a command process according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a game-based command interaction apparatus according to an embodiment of the present application;
FIG. 8 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments herein without inventive effort fall within the protection scope of the present application.
The terms "first," "second," "third," "fourth," and the like in the description, in the claims, and in the above drawings (if any) are used to distinguish between similar elements and are not necessarily used to describe a particular sequential or chronological order. It is to be understood that data so used are interchangeable under appropriate circumstances, so that the embodiments of the application described herein can be practiced in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
During a game, players are often required to cooperate with one another in order to improve the winning rate. When players cooperate, each usually needs to express his or her own ideas to improve coordination; that is, each player may direct the course of the game, and because the time of each round is limited, many operations must race against the clock. Existing command methods generally pop up a battle command by long-pressing a character, or direct the battle through a combination of typing and voice. However, long-pressing or typing is time-consuming, and because the time of each turn is limited, the user's operation easily times out; commands given by voice may be unclear, which further causes user misoperation and degrades the user's game experience.
To address this technical problem, touch operations on the first touch screen are converted into cursors on the second touch screen, and when the cursors rest on two virtual objects, a command control for directing the actions of the two virtual objects is generated automatically. This saves command time, improves command efficiency, avoids user misoperation, and thereby achieves the technical effect of improving the user's game experience.
The game-based command interaction method in the embodiment of the application can be operated on local terminal equipment or a cloud interaction system.
The cloud interaction system includes a cloud server and user equipment and is used to run cloud applications. Cloud applications run separately from the user equipment.
In an alternative embodiment, cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the body that runs the game program is separated from the body that presents the game picture: the storage and execution of the object control method are completed on a cloud game server, while the cloud game client is used to receive and send data and to present the game picture. For example, the cloud game client may be a display device with a data transmission function close to the user side, such as a mobile terminal, a television, a computer, or a handheld computer; the game data, however, is processed by the cloud game server in the cloud. When a game is played, the player operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game picture, and returns the data to the cloud game client over the network; finally, the cloud game client decodes the data and outputs the game picture.
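The round trip described above can be sketched as follows. This is a minimal illustration only: the class and method names are assumptions for the sketch, not an API from the patent or any real cloud-gaming platform, and the "encoding" is stand-in byte serialization rather than real video compression.

```python
class CloudGameServer:
    """Runs the game logic and returns an encoded frame for each input."""

    def __init__(self):
        self.state = {"round": 0}

    def handle(self, instruction: str) -> bytes:
        # Run the game according to the received operation instruction.
        self.state["round"] += 1
        frame = f"round={self.state['round']} op={instruction}"
        # Encode/compress the rendered picture before returning it.
        return frame.encode("utf-8")


class CloudGameClient:
    """Sends operation instructions, then decodes and presents the frame."""

    def __init__(self, server: CloudGameServer):
        self.server = server  # in reality, a network connection

    def play(self, instruction: str) -> str:
        encoded = self.server.handle(instruction)  # send + receive
        return encoded.decode("utf-8")             # decode and display


client = CloudGameClient(CloudGameServer())
print(client.play("attack"))  # → round=1 op=attack
```

The point of the split is visible in the sketch: all game state lives in the server object, and the client holds nothing but a handle for sending inputs and decoding frames.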
In an alternative embodiment, the local terminal device stores the game program and is used to present the game picture. The local terminal device interacts with the player through a graphical user interface; that is, the game program is conventionally downloaded, installed, and run on an electronic device. The local terminal device may provide the graphical user interface to the player in a variety of ways: for example, the interface may be rendered on a display screen of the terminal, or provided to the player through holographic projection. For example, the local terminal device may include a display screen for presenting the graphical user interface, which includes the game picture, and a processor for running the game, generating the graphical user interface, and controlling the display of the graphical user interface on the display screen.
FIG. 1 is a schematic diagram of an application scenario of a game-based command interaction method according to an embodiment of the present application; the method can be applied to a terminal device that includes a first touch screen and a second touch screen. Part (a) of FIG. 1 is an application diagram of the first touch screen of the terminal device. In this embodiment, the first touch screen includes a first touch area 101, and the user can perform touch operations in the first touch area 101; in particular, the user may perform the first touch operation and the second touch operation in the first touch area 101 with the fingers of the left and right hands. Part (b) of FIG. 1 is an application diagram of the second touch screen of the terminal device. In this embodiment, the second touch screen displays the application scene of a running game, which may include virtual objects, terrain, decorations, and the like; for example, the virtual objects may include virtual object A and virtual object B.
In addition, the second touch screen may also display a first cursor 102 corresponding to the first touch operation and a second cursor 103 corresponding to the second touch operation. The first cursor moves according to the position of the first touch operation, and the second cursor moves according to the position of the second touch operation. The first cursor and the second cursor may be represented by different shapes; for example, they may be a five-pointed star, a palm shape, an arrow, a square, a solid circle, and the like.
The technical solution of the present application will be described in detail below with specific examples. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments.
Fig. 2 is a schematic flowchart of a game-based command interaction method according to an embodiment of the present application, where the method according to the present embodiment may be executed by a terminal device, and the terminal device includes a first touch screen and a second touch screen. As shown in fig. 2, the method of this embodiment may include:
s201: and responding to a first touch operation and a second touch operation acting on the first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on the second touch screen.
In this embodiment, the terminal device is a mobile phone or tablet computer with multiple touch screens, for example a folding-screen terminal or a dual-screen terminal, and its screens may be operated with a finger or a stylus. Specifically, the terminal device may include a first touch screen and a second touch screen, where the first touch screen is located on the back side of the terminal device, the second touch screen is located on the front side of the terminal device, and the first touch screen and the second touch screen are used to obtain touch operations of the same user.
As an example, the second touch screen may be the screen on the front side of the terminal device (the front screen) and the first touch screen the screen on the back side (the back screen), with the user holding the terminal device in both hands. The user can then operate the terminal in the conventional manner: the thumbs of both hands control a virtual object on the front screen to move, release skills, and so on, while the other fingers of both hands (e.g., the index, middle, ring, and little fingers) can perform sliding operations on the back screen to correspondingly slide the cursors displayed on the front screen.
In addition, the area of the first touch screen may be the same as the area of the second touch screen. The first touch operation may be a pressing operation, a clicking operation, a sliding operation, and the like. The second touch operation may also be a press operation, a click operation, a slide operation, or the like.
Further, responding to a first touch operation and a second touch operation acting on the first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on the second touch screen may specifically include:
when the touch operation of two or more touch points exists on the first touch screen, determining the first detected touch operation as a first touch operation, and determining the second detected touch operation as a second touch operation.
And responding to the first touch operation, and displaying a first cursor corresponding to the touch point of the first touch operation on the second touch screen.
And responding to the second touch operation, and displaying a second cursor corresponding to the touch point of the second touch operation on the second touch screen.
Specifically, when the user operates on the first touch screen, multiple contact points may exist at the same time, since the fingers of both hands (for example, the index finger and the middle finger) can all touch the first screen. To avoid the cursors failing to display normally because of confusion among multiple finger operations, and because multiple touch points may be recognized inaccurately, only two touch points are needed in this embodiment: the first detected touch operation is determined as the first touch operation (corresponding to a first touch point), and the first cursor corresponding to it is displayed on the second touch screen; the second detected touch operation is determined as the second touch operation (corresponding to a second touch point), and the second cursor corresponding to it is displayed on the second touch screen. Touch operations other than the first touch operation and the second touch operation may receive no response.
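The assignment rule above can be sketched as a small dispatcher: the first two touches in detection order become the first and second touch operations, and any further contacts are ignored. The function name and the list-of-ids representation are assumptions for the sketch.

```python
def assign_touch_operations(detected_touches):
    """Return (first, second) touch points, ignoring extra contacts.

    `detected_touches` lists touch-point ids in the order they were
    detected on the first touch screen.
    """
    if len(detected_touches) < 2:
        return None  # need at least two touch points to show both cursors
    first, second = detected_touches[0], detected_touches[1]
    # Touch operations other than the first and second get no response.
    return first, second


# index finger detected first, middle finger second; ring finger ignored
assert assign_touch_operations(["index", "middle", "ring"]) == ("index", "middle")
assert assign_touch_operations(["index"]) is None
```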
Further, the terminal device may be a folding-screen device. The first screen and the second screen of a folding-screen device belong to the same panel and are distinguished only by the folding central axis; by rotating about the folding central axis, the second touch screen can be adjusted to the front side and the first touch screen to the back side. Based on this characteristic, the position at which a cursor is displayed on the second touch screen can be computed from the perpendicular distance between the contact point of the first touch operation on the first touch screen and the folding central axis. Accordingly, for the first cursor, the position of the touch point of the first touch operation and the perpendicular distance between that touch point and the folding central axis may first be obtained; a first cursor position on the second touch screen is then obtained from the first touch point position and the perpendicular distance, and the first cursor is displayed at that position. The display principle of the second cursor is similar to that of the first cursor and is not repeated here.
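A minimal sketch of this fold-axis conversion, assuming a simple reflection across the folding central axis (the patent only states that the cursor position is derived from the touch point and its perpendicular distance to the axis; the exact mirroring convention and the coordinate system here are illustrative assumptions):

```python
def touch_to_cursor(touch_x: float, touch_y: float, fold_axis_x: float):
    """Map a back-screen touch point to a front-screen cursor position.

    The perpendicular distance from the touch point to the folding
    central axis is preserved; the point is reflected to the other
    side of the axis, while the coordinate along the axis is unchanged.
    """
    distance = touch_x - fold_axis_x   # signed distance to the fold axis
    cursor_x = fold_axis_x - distance  # mirrored across the fold
    return cursor_x, touch_y


# a touch 3 units behind the axis maps to a cursor 3 units in front of it
assert touch_to_cursor(8.0, 3.0, fold_axis_x=5.0) == (2.0, 3.0)
```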
In this embodiment, because the touch operations are performed on the first touch screen of the terminal device, the selection of a virtual object does not need to be performed through the second touch screen. Selecting a virtual object therefore cannot be confused with other touch operations on the second touch screen, the occlusion that occurs when a virtual object is dragged on the second touch screen is avoided, and the accuracy of determining the virtual object is improved.
S202: in response to an operation to move the first cursor to the first virtual object and to move the second cursor to the second virtual object, a command control is generated between the first virtual object and the second virtual object.
In this embodiment, the user can control the movement of the cursor in the second touch screen by moving the position of the contact point in the first touch screen. Specifically, the first cursor may be moved to the position of the first virtual object, and the second cursor may be moved to the position of the second virtual object, so that the command control of the user command action may be generated between the first virtual object and the second virtual object.
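Step S202 amounts to a pair of hit tests: the command control appears once each cursor rests on its virtual object. A minimal sketch, assuming axis-aligned bounding boxes for the objects (the patent does not specify the hit-test geometry, and all names here are illustrative):

```python
def cursor_on_object(cursor, obj_rect):
    """Axis-aligned hit test: is the cursor inside the object's box?"""
    (x, y), (left, top, right, bottom) = cursor, obj_rect
    return left <= x <= right and top <= y <= bottom


def should_generate_command_control(cursor1, obj1_rect, cursor2, obj2_rect):
    """True when the first cursor rests on the first virtual object and
    the second cursor rests on the second virtual object."""
    return (cursor_on_object(cursor1, obj1_rect)
            and cursor_on_object(cursor2, obj2_rect))


# first cursor over object A, second cursor over object B → control appears
assert should_generate_command_control((1, 1), (0, 0, 2, 2),
                                       (9, 9), (8, 8, 10, 10))
```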
The shape of the command control can be set according to the practical application scene, and can be a flag shape, a triangle shape and the like for example, and is not limited specifically here.
For example, FIG. 3 is an application diagram of a command control generation process according to an embodiment of the present application. As shown in (a) of FIG. 3, in this embodiment two virtual objects, a first virtual object A and a second virtual object B, are located on the second touch screen, and a first cursor C and a second cursor D are displayed between them; the first cursor C can move toward the first virtual object A, and the second cursor D can move toward the second virtual object B. As shown in (b) of FIG. 3, once the first cursor C has been moved onto the first virtual object A and the second cursor D onto the second virtual object B, a flag-shaped command control is generated between the first virtual object A and the second virtual object B.
Additionally, prior to generating the command control between the first virtual object and the second virtual object, the method may further comprise:
and generating a connecting line between the first virtual object and the second virtual object according to a preset style.
Correspondingly, generating a command control between the first virtual object and the second virtual object may include: and generating a command control at the middle position of the connecting line.
Specifically, when the command control is generated between the first virtual object and the second virtual object, a connecting line may first be generated between them, and the command control may then be generated on the connecting line. Of course, the connecting line may also be generated at the same time as the command control; that is, in response to the operation of moving the first cursor to the first virtual object and moving the second cursor to the second virtual object, the command control and the connecting line are generated between the two virtual objects. The connecting line connects the first virtual object and the second virtual object; its style can be preset, and the style to be used can be selected when the game starts. Alternatively, a default style may be set and used directly if the user does not select a style at the start of the game. Allowing the style of the connecting line to be set further improves the user's experience.
In addition, the color, width, border style and the like of the connecting line can be customized according to the actual application scenario, and are not specifically limited here.
For example, fig. 4 is an application schematic diagram of the command control generation process provided in another embodiment of the present application. As shown in fig. 4, in this embodiment, when the first cursor C is moved to the position of the first virtual object A and the second cursor D is moved to the position of the second virtual object B, a gray connecting line may be generated between virtual object A and virtual object B, and a flag-shaped command control is generated at the midpoint of the line.
S203: In response to a third touch operation acting on the command control, control the first virtual object or the second virtual object to perform an action.
In this embodiment, after the command control is generated, the user may use it either to control the first virtual object to perform a related action on the second virtual object, or to control the second virtual object to perform a related action on the first virtual object. Which virtual object the action is performed on is determined by the direction of the sliding operation: the slide is performed toward the virtual object that is to receive the action. For example, if a related action is to be performed on the first virtual object, the sliding operation is performed in the direction of the first virtual object; if a related action is to be performed on the second virtual object, the sliding operation is performed in the direction of the second virtual object.
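The direction test described above can be sketched as a small helper (all names are illustrative assumptions; a real implementation would work on live touch events rather than a single release point). Comparing the slide vector against the directions of the two objects with dot products picks the object the slide points toward:

```python
def slide_target(control_pos, release_pos, pos_a, pos_b):
    """Decide which virtual object a slide starting at the command
    control is directed toward, by comparing the slide vector with
    the direction vectors toward each object (larger dot product wins)."""
    dx, dy = release_pos[0] - control_pos[0], release_pos[1] - control_pos[1]
    ax, ay = pos_a[0] - control_pos[0], pos_a[1] - control_pos[1]
    bx, by = pos_b[0] - control_pos[0], pos_b[1] - control_pos[1]
    dot_a = dx * ax + dy * ay  # alignment with the direction of object A
    dot_b = dx * bx + dy * by  # alignment with the direction of object B
    return "A" if dot_a >= dot_b else "B"
```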
Further, in one implementation, controlling the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control may specifically include:
In response to a first sliding operation from the command control toward the first virtual object, generating and displaying a first command item control acting on the first virtual object.
In response to a second sliding operation continuous with the first sliding operation, of which the initial touch point is located in the first command item control and the final touch point is located on a first target command item, selecting the first target command item and controlling the second virtual object to perform the action corresponding to the first target command item on the first virtual object.
Specifically, when a related action needs to be performed on the first virtual object, a first sliding operation may be performed in the direction of the first virtual object, and the terminal device generates and displays, in response, a first command item control acting on the first virtual object. The first command item control contains command items related to the first virtual object; the user can slide across the command items, the command item at the position where the touch point finally stops is determined to be the first target command item, and the second virtual object is controlled to perform the action corresponding to the first target command item on the first virtual object.
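Selecting the command item where the touch point finally stops amounts to a hit test against the displayed items. A sketch under the assumption that each item occupies a circular region (the patent does not specify item geometry, and the names here are illustrative):

```python
def pick_command_item(items, release_point):
    """Hit-test the final touch point of the continuing slide against the
    command items' circular regions; the item whose region contains the
    release point (nearest center on overlap) becomes the target command
    item. Returns None when the release point lies in no item's region."""
    best, best_d = None, None
    for name, (cx, cy, radius) in items.items():
        d = ((release_point[0] - cx) ** 2 + (release_point[1] - cy) ** 2) ** 0.5
        if d <= radius and (best_d is None or d < best_d):
            best, best_d = name, d
    return best
```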
Further, generating the first command item control acting on the first virtual object may specifically include: and generating a first command item control according to the attribute information of the first virtual object.
The first command item control may contain any one or more of a general attack command item, a bloodletting command item, a point-dish command item, a point-mixing command item, a nature-sound command item and a protection command item. The attribute information of the first virtual object that may determine the command items includes, for example, constitution, strength and magic power.
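How the attribute information "determines" the command items is left open here; the thresholds and the attribute-to-item mapping below are purely illustrative assumptions, sketching one way such a control could be populated per object:

```python
def build_command_items(attributes):
    """Derive the command items shown in a command item control from a
    virtual object's attribute information. The thresholds and the
    attribute-to-item mapping are illustrative only; the description
    states just that attributes such as constitution, strength and
    magic power may determine the command items."""
    items = ["general attack"]  # assumed to be always available
    if attributes.get("strength", 0) > 50:
        items.append("bloodletting")
    if attributes.get("magic_power", 0) > 50:
        items.append("nature sound")
    if attributes.get("constitution", 0) > 50:
        items.append("protection")
    return items
```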
For example, fig. 5 is an application schematic diagram of a command process provided in an embodiment of the present application. In this embodiment, the second virtual object B commands the first virtual object A. As shown in part (a) of fig. 5, the terminal device may, in response to a sliding operation in the direction of the first virtual object A, generate and display a first command item control acting on the first virtual object A; the first command item control contains six command items: a general attack command item, a bloodletting command item, a point-dish command item, a point-mixing command item, a nature-sound command item and a protection command item. As shown in part (b) of fig. 5, the user may slide across the positions of the six command items and finally release the touch at the bloodletting command item, which controls the second virtual object B to perform the bloodletting action on the first virtual object A.
In another implementation, controlling the first virtual object or the second virtual object to act in response to the third touch operation acting on the command control may specifically include:
In response to a third sliding operation from the command control toward the second virtual object, generating and displaying a second command item control acting on the second virtual object.
In response to a fourth sliding operation continuous with the third sliding operation, of which the initial touch point is located in the second command item control and the final touch point is located on a second target command item, selecting the second target command item and controlling the first virtual object to perform the action corresponding to the second target command item on the second virtual object.
Specifically, when a related action needs to be performed on the second virtual object, a third sliding operation may be performed in the direction of the second virtual object, and the terminal device generates and displays, in response, a second command item control acting on the second virtual object. The second command item control contains command items related to the second virtual object; the user can slide across the command items, the command item at the position where the touch point finally stops is determined to be the second target command item, and the first virtual object is controlled to perform the action corresponding to the second target command item on the second virtual object.
Further, generating a second command item control acting on the second virtual object may specifically include: and generating a second commanding item control according to the attribute information of the second virtual object.
The second command item control may contain any one or more of a general attack command item, a bloodletting command item, a point-dish command item, a point-mixing command item, a nature-sound command item and a protection command item. The attribute information of the second virtual object that may determine the command items includes, for example, constitution, strength and magic power.
In addition, the command items in the first command item control may be the same as or different from the command items in the second command item control.
For example, fig. 6 is an application schematic diagram of a command process provided in another embodiment of the present application. In this embodiment, the first virtual object A commands the second virtual object B. As shown in part (a) of fig. 6, the terminal device may, in response to a sliding operation in the direction of the second virtual object B, generate and display a second command item control acting on the second virtual object B; the second command item control contains six command items: a general attack command item, a bloodletting command item, a point-dish command item, a point-mixing command item, a nature-sound command item and a protection command item. As shown in part (b) of fig. 6, the user may slide across the positions of the six command items and finally release the touch at the bloodletting command item, which controls the first virtual object A to perform the bloodletting action on the second virtual object B.
After the above scheme is adopted, a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation are displayed on the second touch screen in response to the first and second touch operations acting on the first touch screen. A command control is then generated between the first virtual object and the second virtual object in response to the operation of moving the first cursor to the first virtual object and the second cursor to the second virtual object. Once the command control has been generated, the first or second virtual object can be controlled to act in response to a third touch operation acting on the command control. By converting touch operations on the first touch screen into cursors on the second touch screen, and automatically generating a command control for the actions of the two virtual objects when the cursors are positioned on them, virtual objects can be commanded with simple and convenient operations. This saves commanding time, improves commanding efficiency, avoids user misoperation, and thus improves the user's game experience.
Based on the method of fig. 2, the present specification also provides some specific embodiments of the method, which are described below.
Further, in another embodiment, after generating the command control between the first virtual object and the second virtual object, the method may further comprise:
and responding to a fifth sliding operation from the command control to the direction of the first virtual object or the second virtual object, and generating and displaying a command item control corresponding to the first virtual object or the second virtual object.
And responding to a sixth sliding operation which is continuous with the fifth sliding operation and is from the command item control corresponding to the first virtual object or the second virtual object to the command control, and canceling to display the command item control corresponding to the first virtual object or the second virtual object.
In this embodiment, if the user wants to cancel the operation after touching the command control, the user can, without releasing the touch, slide back into the area where the command control is located, which cancels the triggering of the command control.
In addition, the triggering of the command control can also be cancelled by sliding in a direction other than those in which the first and second virtual objects are located.
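Both cancellation paths described here — releasing back on the command control, and sliding off in a direction matching neither object — can be sketched as one predicate. The control radius and angle tolerance are assumed values, not specified by the description:

```python
import math

def is_cancel(release_pos, control_pos, pos_a, pos_b,
              control_radius=0.5, max_angle_deg=30.0):
    """Return True when the slide should cancel the command control:
    either the touch is released back inside the control's area, or the
    slide points away from both virtual objects (its angle to each
    object's direction exceeds the tolerance)."""
    def angle_between(u, v):
        du, dv = math.hypot(*u), math.hypot(*v)
        if du == 0 or dv == 0:
            return 0.0
        cos_t = max(-1.0, min(1.0, (u[0] * v[0] + u[1] * v[1]) / (du * dv)))
        return math.degrees(math.acos(cos_t))

    slide = (release_pos[0] - control_pos[0], release_pos[1] - control_pos[1])
    if math.hypot(*slide) <= control_radius:
        return True  # slid back onto the command control
    to_a = (pos_a[0] - control_pos[0], pos_a[1] - control_pos[1])
    to_b = (pos_b[0] - control_pos[0], pos_b[1] - control_pos[1])
    return (angle_between(slide, to_a) > max_angle_deg and
            angle_between(slide, to_b) > max_angle_deg)
```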
Furthermore, in another embodiment, the method may further include:
and when the third touch operation is executed on the command control, releasing a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation.
In this embodiment, the user generally operates on the second touch screen by using a thumb or an index finger, and operates on the first touch screen by using a middle finger, an index finger or other fingers, so that when the third touch operation is performed on the command control, it is indicated that the user has operated the first virtual object or the second virtual object, and therefore, the first cursor corresponding to the first touch operation and the second cursor corresponding to the second touch operation can be released to release the fingers of the user, the cursor does not need to be kept on the virtual object all the time, which is convenient for the user to command the game to run more flexibly, and improves the game experience of the user.
Based on the same idea, an embodiment of the present specification further provides a device corresponding to the foregoing method. Fig. 7 is a schematic structural diagram of a game-based command interaction device provided in an embodiment of the present application, applied to a terminal device that includes a first touch screen and a second touch screen. As shown in fig. 7, the device provided in this embodiment may include:
the display module 701 is configured to respond to a first touch operation and a second touch operation acting on a first touch screen, and display a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on a second touch screen.
In this embodiment, the display module 701 is further configured to:
When touch operations at two or more touch points exist on the first touch screen, the first detected touch operation is determined to be the first touch operation, and the second detected touch operation is determined to be the second touch operation.
And responding to the first touch operation, and displaying a first cursor corresponding to the touch point of the first touch operation on the second touch screen.
And responding to the second touch operation, and displaying a second cursor corresponding to the touch point of the second touch operation on the second touch screen.
The first touch screen is located on the back side of the terminal device, the second touch screen is located on the front side of the terminal device, and the first touch screen and the second touch screen are used for obtaining touch operation of the same user.
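The display module's behavior can be sketched as follows: the first two detected back-screen touches become the first and second touch operations, and each touch point is mapped to a front-screen cursor. Mirroring the x axis is an assumption (the description only states that corresponding cursors are displayed on the second touch screen), chosen because it keeps left/right intuitive when the two screens face opposite directions:

```python
def assign_touches(touch_events):
    """Given touch points detected on the back (first) touch screen in
    detection order, the first becomes the first touch operation and the
    second becomes the second touch operation; later touches are ignored."""
    first = touch_events[0] if len(touch_events) >= 1 else None
    second = touch_events[1] if len(touch_events) >= 2 else None
    return first, second

def back_to_front(point, screen_width):
    """Map a back-screen touch point to a front-screen cursor position
    by mirroring the x axis (an assumed coordinate convention)."""
    x, y = point
    return (screen_width - x, y)
```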
A response module 702, configured to generate a command control between a first virtual object and a second virtual object in response to the operation of moving the first cursor to the first virtual object and moving the second cursor to the second virtual object.
In this embodiment, the response module 702 is further configured to:
and generating a connecting line between the first virtual object and the second virtual object according to a preset style.
And generating a command control at the middle position of the connecting line.
The response module 702 is further configured to control the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control.
In this embodiment, in an implementation manner, the response module 702 is further configured to:
and responding to a first sliding operation from the command control to the direction of the first virtual object, and generating and displaying a first command item control acting on the first virtual object.
Responding to a first target command item of which the initial touch point is located in the first command item control and the final touch point is located in the first command item control, selecting the first target command item and controlling the second virtual object to execute an action corresponding to the first target command item on the first virtual object through a second sliding operation which is continuous with the first sliding operation.
Further, the response module 702 is further configured to:
and generating a first command item control according to the attribute information of the first virtual object, wherein the first command item control comprises any one or more of a general attack command item, a drawing command item, a point and disc command item, a point and mix command item, a nature and sound command item and a protection command item.
In another implementation, the response module 702 is further configured to:
and responding to a third sliding operation from the command control to the direction of the second virtual object, and generating and displaying a second command item control acting on the second virtual object.
Responding to a second target command item of which the initial touch point is located in the second command item control and the final touch point is located in the second command item control, selecting the second target command item and controlling the first virtual object to execute an action corresponding to the second target command item on the second virtual object through a fourth sliding operation which is continuous with the third sliding operation.
Moreover, in another embodiment, the response module 702 is further configured to:
and responding to a fifth sliding operation from the command control to the direction of the first virtual object or the second virtual object, and generating and displaying a command item control corresponding to the first virtual object or the second virtual object.
And responding to a sixth sliding operation which is continuous with the fifth sliding operation and is from the command item control corresponding to the first virtual object or the second virtual object to the command control, and canceling to display the command item control corresponding to the first virtual object or the second virtual object.
Moreover, in another embodiment, the response module 702 is further configured to:
and when the third touch operation is executed on the command control, releasing a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation.
The apparatus provided in the embodiment of the present application can implement the method of the embodiment shown in fig. 2, and the implementation principle and the technical effect are similar, which are not described herein again.
Fig. 8 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 8, the device 800 of this embodiment includes a processor 801 and a memory 802 communicatively coupled to the processor; the processor 801 and the memory 802 are connected by a bus 803.
In a specific implementation process, the processor 801 executes the computer-executable instructions stored in the memory 802, so that the processor 801 executes the game-based command interaction method in the above method embodiment.
For a specific implementation process of the processor 801, reference may be made to the above method embodiments, which have similar implementation principles and technical effects, and details of this embodiment are not described herein again.
In the embodiment shown in fig. 8, it should be understood that the Processor may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the present invention may be embodied directly in a hardware processor, or in a combination of hardware and software modules within the processor.
The memory may comprise high speed RAM memory and may also include non-volatile storage NVM, such as at least one disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The embodiment of the application also provides a computer-readable storage medium in which computer-executable instructions are stored; when a processor executes the computer-executable instructions, the game-based command interaction method of the above method embodiments is implemented.
The embodiment of the present application further provides a computer program product, which includes a computer program; when the computer program is executed by a processor, the game-based command interaction method described above is implemented.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an Application Specific Integrated Circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
Those of ordinary skill in the art will understand that: all or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The program may be stored in a computer-readable storage medium. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.

Claims (13)

1. A command interaction method based on a game is applied to terminal equipment, wherein the terminal equipment comprises a first touch screen and a second touch screen, and the method comprises the following steps:
responding to a first touch operation and a second touch operation acting on a first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on a second touch screen;
generating a command control between a first virtual object and a second virtual object in response to moving the first cursor to the first virtual object and moving the second cursor to the second virtual object;
and responding to a third touch operation acted on the command control to control the first virtual object or the second virtual object to act.
2. The method of claim 1, wherein controlling the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control comprises:
in response to a first sliding operation from the command control toward the first virtual object, generating and displaying a first command item control acting on the first virtual object;
in response to a second sliding operation continuous with the first sliding operation, of which the initial touch point is located in the first command item control and the final touch point is located on a first target command item, selecting the first target command item and controlling the second virtual object to perform the action corresponding to the first target command item on the first virtual object.
3. The method of claim 1, wherein controlling the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control comprises:
in response to a third sliding operation from the command control toward the second virtual object, generating and displaying a second command item control acting on the second virtual object;
in response to a fourth sliding operation continuous with the third sliding operation, of which the initial touch point is located in the second command item control and the final touch point is located on a second target command item, selecting the second target command item and controlling the first virtual object to perform the action corresponding to the second target command item on the second virtual object.
4. The method of claim 2, wherein generating the first command item control that acts on the first virtual object comprises:
and generating a first command item control according to the attribute information of the first virtual object, wherein the first command item control comprises any one or more of a general attack command item, a drawing command item, a point and disc command item, a point and mix command item, a nature and sound command item and a protection command item.
5. The method of claim 1, wherein displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on a second touch screen in response to the first touch operation and the second touch operation on the first touch screen comprises:
when touch operations at two or more touch points exist on the first touch screen, determining the first detected touch operation as the first touch operation, and the second detected touch operation as the second touch operation;
responding to the first touch operation, and displaying a first cursor corresponding to the touch point of the first touch operation on the second touch screen;
and responding to the second touch operation, and displaying a second cursor corresponding to the touch point of the second touch operation on the second touch screen.
6. The method of any of claims 1-5, further comprising, after the generating a command control between the first virtual object and the second virtual object:
in response to a fifth sliding operation from the command control toward the first virtual object or the second virtual object, generating and displaying the command item control corresponding to that virtual object;
in response to a sixth sliding operation continuous with the fifth sliding operation, from the command item control corresponding to the first or second virtual object back to the command control, canceling the display of that command item control.
7. The method according to any one of claims 1-5, further comprising:
and when the third touch operation is executed on the command control, releasing a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation.
8. The method of any of claims 1-5, wherein prior to generating a command control between the first virtual object and the second virtual object, the method further comprises:
generating a connecting line between the first virtual object and the second virtual object according to a preset style;
generating a command control between the first virtual object and the second virtual object, including:
and generating a command control at the middle position of the connecting line.
9. The method according to any one of claims 1 to 5, wherein the first touch screen is located on a back side of the terminal device, the second touch screen is located on a front side of the terminal device, and the first touch screen and the second touch screen are used for obtaining touch operations of the same user.
10. A command interaction device based on a game is applied to terminal equipment, the terminal equipment comprises a first touch screen and a second touch screen, and the device comprises:
the display module is used for responding to a first touch operation and a second touch operation which act on the first touch screen, and displaying a first cursor corresponding to the first touch operation and a second cursor corresponding to the second touch operation on the second touch screen;
a response module for generating a command control between a first virtual object and a second virtual object in response to the operation of moving the first cursor to the first virtual object and moving the second cursor to the second virtual object;
the response module is further configured to control the first virtual object or the second virtual object to act in response to a third touch operation acting on the command control.
11. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor;
the memory stores computer-executable instructions;
the processor executes the computer-executable instructions stored by the memory to implement the game-based command interaction method of any of claims 1 to 9.
12. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the game-based command interaction method of any one of claims 1 to 9.
13. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the game-based command interaction method of any of claims 1 to 9.
CN202111667941.7A 2021-12-30 2021-12-30 Game-based command interaction method and device and electronic equipment Pending CN114272607A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111667941.7A CN114272607A (en) 2021-12-30 2021-12-30 Game-based command interaction method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111667941.7A CN114272607A (en) 2021-12-30 2021-12-30 Game-based command interaction method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114272607A true CN114272607A (en) 2022-04-05

Family

ID=80879556

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111667941.7A Pending CN114272607A (en) 2021-12-30 2021-12-30 Game-based command interaction method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114272607A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024055596A1 (en) * 2022-09-13 2024-03-21 网易(杭州)网络有限公司 Interaction control method and apparatus for skill cast, and electronic device


Similar Documents

Publication Publication Date Title
CN107648847B (en) Information processing method and device, storage medium and electronic equipment
CN108804013B (en) Information prompting method and device, electronic equipment and storage medium
WO2017054465A1 (en) Information processing method, terminal and computer storage medium
US20090062004A1 (en) Input Terminal Emulator for Gaming Devices
CN111773705A (en) Interaction method and device in game scene
CN113069759B (en) Scene processing method and device in game and electronic equipment
CN112494939B (en) Processing method and device for decorating props, storage medium and electronic equipment
CN110772789A (en) Method and device for controlling skill in game, storage medium and terminal equipment
CN113350779A (en) Game virtual character action control method and device, storage medium and electronic equipment
CN111643903B (en) Control method and device of cloud game, electronic equipment and storage medium
CN111467791A (en) Target object control method, device and system
CN112891936A (en) Virtual object rendering method and device, mobile terminal and storage medium
CN107626105B (en) Game picture display method and device, storage medium and electronic equipment
CN111841001A (en) Information processing method, device, equipment and storage medium in game
CN114272607A (en) Game-based command interaction method and device and electronic equipment
CN113262476A (en) Position adjusting method and device of operation control, terminal and storage medium
CN111408128B (en) Interaction method and device in game and electronic equipment
CN114504808A (en) Information processing method, information processing apparatus, storage medium, processor, and electronic apparatus
CN113680062A (en) Information viewing method and device in game
CN112221123B (en) Virtual object switching method and device, computer equipment and storage medium
CN112755510A (en) Mobile terminal cloud game control method, system and computer readable storage medium
JP2021145888A (en) Program, terminal, and game system
CN113680047B (en) Terminal operation method, device, electronic equipment and storage medium
CN114442898B (en) Information processing method, device, electronic equipment and readable medium
JP6978540B2 (en) Programs, terminals, and game systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination