CN115645919A - Game control method and device, electronic equipment and readable storage medium

Info

Publication number
CN115645919A
Authority
CN
China
Prior art keywords
virtual keyboard
virtual
control
game
keyboard
Prior art date
Legal status
Pending
Application number
CN202211352501.7A
Other languages
Chinese (zh)
Inventor
沈天嘉
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202211352501.7A
Publication of CN115645919A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a game control method, a game control device, an electronic device and a computer-readable storage medium. According to the method and the device, a terminal device can display a graphical user interface, the graphical user interface includes a game screen of at least part of a virtual scene and an object operation control, and the virtual scene includes a virtual object; a virtual keyboard is displayed on the graphical user interface in response to a keyboard call-out operation; the object operation control is displayed above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and, in response to a touch operation on the object operation control displayed above the virtual keyboard, the virtual object is controlled to perform the action corresponding to the object operation control. According to the method and the device, the virtual object can still be controlled while the called-out keyboard is used for text input, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency.

Description

Game control method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of game technologies, and in particular, to a method and an apparatus for controlling a game, an electronic device, and a computer-readable storage medium.
Background
With the development and popularization of computer device technology, more and more terminal games have emerged. Currently, text communication is an important mode of interaction among players, and in games such as MOBA (Multiplayer Online Battle Arena) games, a virtual keyboard usually needs to be called out for text input so as to realize interaction among players. However, because the player cannot directly perform game operations after calling out the virtual keyboard, the player has to repeat the following sequence in order to control a character during text input: call out the keyboard to input text, retract the keyboard, operate the character, call out the keyboard again to input text, retract the keyboard again, and so on. Obviously, this process of controlling a character during text input is operationally complex, takes a long time, and has low operation efficiency.
Disclosure of Invention
The embodiment of the application provides a game control method, a game control device, an electronic device and a computer-readable storage medium, which make it possible to still control a virtual object while the called-out keyboard is used for text input, reduce the operation complexity and operation time of controlling a character during text input, and improve game operation efficiency.
In a first aspect, an embodiment of the present application provides a game control method, in which a terminal device displays a graphical user interface, the graphical user interface includes a game screen of at least part of a virtual scene and an object operation control, and the virtual scene includes a virtual object, the method including:
displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
and in response to a touch operation on the object operation control displayed above the virtual keyboard, controlling the virtual object to perform an action corresponding to the object operation control.
In a second aspect, an embodiment of the present application further provides a game control apparatus, where a terminal device displays a graphical user interface, the graphical user interface includes a game screen of at least part of a virtual scene and an object operation control, and the virtual scene includes a virtual object, the game control apparatus including:
a first display unit for displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
a second display unit configured to display the object operation control above a display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
the control unit is used for responding to the touch operation of the object operation control on the virtual keyboard, and controlling the virtual object to perform the action corresponding to the object operation control.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory and a processor, where the memory stores a plurality of instructions, and the processor loads the instructions from the memory to execute the steps of any one of the game control methods provided by the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium, where a plurality of instructions are stored, and the instructions are suitable for being loaded by a processor to execute steps in any one of the methods for game control provided by the embodiments of the present application.
In the embodiments of the present application, a virtual keyboard is displayed on the graphical user interface in response to a keyboard call-out operation; the object operation control is displayed above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and, in response to a touch operation on the object operation control displayed above the virtual keyboard, the virtual object is controlled to perform the action corresponding to the object operation control. Because the object operation control can be displayed above the display hierarchy of the virtual keyboard after the virtual keyboard is called out, the object operation control originally covered by the virtual keyboard can be operated by touch, so that the virtual object can still be controlled while the called-out keyboard is used for text input, and the player can quickly switch back to the conversation input state after controlling the virtual object, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency. Therefore, on the one hand, the requirement that a player send custom text for interaction is met; on the other hand, interruption of game participation during text input is reduced, and the game experience is improved to a certain extent.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required in the description of the embodiments are briefly introduced below. It is obvious that the drawings described below are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without creative effort.
FIG. 1 is a schematic diagram of a system for game control provided by an embodiment of the present application;
FIG. 2 is a flow chart illustrating an embodiment of a method for game control provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a scenario of a graphical user interface provided in an embodiment of the present application before calling out a virtual keyboard;
FIG. 4 is a schematic diagram of an application scenario of the graphical user interface after calling out the virtual keyboard provided in the embodiment of the present application;
FIG. 5 is a schematic diagram of another application scenario of the graphical user interface after calling out the virtual keyboard according to the embodiment of the present application;
fig. 6 is a schematic application scenario illustrating an object operation control displayed above a display level of a virtual keyboard according to an embodiment of the present application;
FIG. 7 is a schematic diagram of another application scenario in which an object operation control is displayed above a display level of a virtual keyboard according to an embodiment of the present application;
FIG. 8 is a schematic diagram of another application scenario in which an object operation control is displayed above a display level of a virtual keyboard according to an embodiment of the present application;
fig. 9 is a schematic view of another application scenario in which an object operation control is displayed above a display level of a virtual keyboard according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of a game control apparatus provided in an embodiment of the present application;
fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. Meanwhile, in the description of the embodiments of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not to be construed as indicating or implying relative importance. Thus, features defined as "first" and "second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
The embodiment of the application provides a game control method and device, electronic equipment and a computer-readable storage medium. Specifically, the method for controlling a game according to the embodiment of the present application may be executed by an electronic device, where the electronic device may be a terminal or a server. The terminal can be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and the terminal can also include a client, which can be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, cloud functions, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the game control method is operated on a terminal, a terminal device stores a game application and is used for presenting a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the game control method is executed on a server, it can be implemented as a cloud game. Cloud gaming refers to a game mode based on cloud computing. In the running mode of a cloud game, the running body of the game application program is separated from the body that presents the game screen, and the storage and running of the game control method are completed on a cloud game server. The game screen is presented at a cloud game client, which is mainly used for receiving and sending game data and presenting the game screen; for example, the cloud game client may be a display device with a data transmission function near the user side, such as a mobile terminal, a television, a computer, a palmtop computer, or a personal digital assistant, but the terminal device that performs game data processing is the cloud game server at the cloud end. When a game is played, the user operates the cloud game client to send an operation instruction to the cloud game server; the cloud game server runs the game according to the operation instruction, encodes and compresses data such as the game screen, and returns the data to the cloud game client through the network; finally, the cloud game client decodes the data and outputs the game screen.
Referring to fig. 1, fig. 1 is a schematic diagram of a game control system according to an embodiment of the present application. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. In addition, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points on one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information about game environments may be continuously stored in the databases 3000 while different users play a multiplayer game online.
The game control method provided by the embodiment of the application can be executed by a terminal or a server. The embodiment of the present application is described by taking an example in which a game control method is executed by a terminal. The terminal comprises a touch display screen and a processor, wherein the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operating instructions generated by the user acting on the graphical user interface include instructions for launching the game application, and the processor is configured to launch the game application after receiving the instructions provided by the user to launch the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed simultaneously at a plurality of points on the screen. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like. Wherein the game may include a virtual scene of the game drawn on a graphical user interface. Further, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that a game player uses to achieve a goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that may be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
The following detailed description is made with reference to the accompanying drawings. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments. Although a logical order is shown in the flowcharts, in some cases, the steps shown or described may be performed in an order different from that shown in the figures.
As shown in fig. 2, the specific flow of the game control method may include the following steps 201 to 203, where:
201. In response to a keyboard call-out operation, a virtual keyboard is displayed on the graphical user interface.
The terminal device can be used to display a graphical user interface, the graphical user interface includes a game screen of at least part of a virtual scene and object operation controls, and the virtual scene includes a virtual object. For example, as shown in fig. 3, a graphical user interface 100 can be generated by rendering on a touch display screen of the terminal 1000 by running a game application, where the graphical user interface 100 includes a game screen of a virtual scene and object operation controls 20. The virtual scene on the graphical user interface 100 includes a virtual object 10. There may be a plurality of object operation controls 20, which may include, for example, a skill control 21 (associated with skill mode 1), a skill control 22 (associated with skill mode 2), a skill control 23 (associated with skill mode 3), a skill control 24 (associated with skill mode 4), and a movement control 25 (for moving the virtual object). The user can control the virtual object 10 to trigger the associated skill mode in the virtual scene by clicking a single skill control; for example, when the user clicks the skill control 22 corresponding to skill mode 2, the virtual object 10 can be controlled to trigger skill mode 2 in the virtual scene.
The virtual object refers to a virtual character in the game, such as a virtual character, a virtual animal, a virtual article, and the like.
The object operation control refers to a control for controlling the virtual object, for example, a movement control (such as a joystick control) for controlling the movement of the virtual object, and a skill control (such as an attack control) for controlling the virtual object to release skills.
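By way of illustration only, the following TypeScript sketch models the interface elements described above (the graphical user interface, the virtual object, and the object operation controls). All type names, identifiers, and the concrete layout values are assumptions made for this example and are not taken from the embodiments.

// Minimal sketch of the interface elements described above; all names and the
// concrete layout are assumptions made for illustration, not the patent's code.
type ControlKind = "movement" | "skill";

interface ObjectOperationControl {
  id: string;
  kind: ControlKind;                 // movement control (e.g. joystick) or skill control
  region: { x: number; y: number; width: number; height: number }; // screen area occupied
  zIndex: number;                    // display hierarchy: larger values are drawn on top
}

interface VirtualObject {
  id: string;
  position: { x: number; y: number };
}

interface GraphicalUserInterface {
  gameScreenZIndex: number;          // display hierarchy of the game screen
  virtualObject: VirtualObject;      // the player-controlled virtual object 10
  controls: ObjectOperationControl[]; // object operation controls 20
}

// Layout roughly matching fig. 3: four skill controls and one movement control.
const gui: GraphicalUserInterface = {
  gameScreenZIndex: 0,
  virtualObject: { id: "virtual-object-10", position: { x: 0, y: 0 } },
  controls: [
    { id: "skill-21", kind: "skill",    region: { x: 600, y: 300, width: 60, height: 60 }, zIndex: 1 },
    { id: "skill-22", kind: "skill",    region: { x: 670, y: 300, width: 60, height: 60 }, zIndex: 1 },
    { id: "skill-23", kind: "skill",    region: { x: 600, y: 370, width: 60, height: 60 }, zIndex: 1 },
    { id: "skill-24", kind: "skill",    region: { x: 670, y: 370, width: 60, height: 60 }, zIndex: 1 },
    { id: "move-25",  kind: "movement", region: { x: 40,  y: 320, width: 100, height: 100 }, zIndex: 1 },
  ],
};
console.log(gui.controls.length); // 5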
The virtual keyboard can be used for inputting characters, or can be used for converting voice input into characters.
The keyboard call-out operation refers to an operation for calling out the virtual keyboard. The manner of calling out the virtual keyboard can be preset according to actual service scene requirements, and is not specifically limited in the embodiments of the present application. For example, it may be preset that double-clicking any position of the graphical user interface calls out the virtual keyboard; the user can then call out the virtual keyboard by double-clicking any position of the graphical user interface through the touch display screen of the terminal 1000, and the keyboard call-out operation is specifically "double-click any position of the graphical user interface". For another example, it may be preset that continuously pressing the power key of the terminal 1000 three times during the running of the game calls out the virtual keyboard; the user can then call out the virtual keyboard by continuously pressing the power key of the terminal 1000 three times, and the keyboard call-out operation is specifically "continuously press the power key of the terminal 1000 three times".
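The following is a minimal TypeScript sketch of recognizing one of the preset keyboard call-out operations described above (double-clicking any position of the graphical user interface). The class name, the tap interval threshold, and the timing interface are assumptions of this example.

// Illustrative sketch: detecting a preset keyboard call-out operation
// ("double-tap anywhere on the graphical user interface") and showing the keyboard.
// All names and thresholds here are assumptions, not the patent's implementation.
const DOUBLE_TAP_WINDOW_MS = 300; // assumed maximum interval between the two taps

class KeyboardCallout {
  private lastTapTime = 0;
  keyboardVisible = false;

  // Called for every tap on the graphical user interface.
  onTap(nowMs: number): void {
    if (nowMs - this.lastTapTime <= DOUBLE_TAP_WINDOW_MS) {
      this.keyboardVisible = true;   // step 201: display the virtual keyboard
      this.lastTapTime = 0;
    } else {
      this.lastTapTime = nowMs;
    }
  }
}

// Usage: two taps 200 ms apart call out the keyboard.
const callout = new KeyboardCallout();
callout.onTap(1000);
callout.onTap(1200);
console.log(callout.keyboardVisible); // true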
The manner of displaying the virtual keyboard in step 201 is various, and exemplarily includes:
(1) In some embodiments, the virtual keyboard may directly cover the game screen and the object operation control, so as to prevent the called-out keyboard from being blocked by the game screen or the object operation control and to ensure that information can be input normally after the virtual keyboard is called out. In this case, step 201 may specifically include: in response to a keyboard call-out operation, displaying the virtual keyboard above the display hierarchy of the game screen and above the display hierarchy of the object operation control, so as to display the virtual keyboard on the graphical user interface.
Taking the case in which the preset manner of calling out the virtual keyboard is "double-clicking any position of the graphical user interface" as an example, as shown in fig. 3 and fig. 4, the graphical user interface before the virtual keyboard is called out is as shown in fig. 3. When the player double-clicks any position of the graphical user interface through the touch display screen of the terminal 1000, the terminal 1000 responds to the keyboard call-out operation "double-click any position of the graphical user interface" and displays the virtual keyboard above the display hierarchy of the game screen and above the display hierarchy of the object operation control, so that the virtual keyboard is displayed over the game screen and the object operation control, as shown in fig. 4.
(2) As shown in fig. 4, since the display hierarchy of the virtual keyboard is above the display hierarchy of the game screen, the part of the game screen displayed under the virtual keyboard in the graphical user interface is covered by the virtual keyboard; as a result, the player cannot directly touch that part of the game screen, and cannot control the virtual object while the virtual keyboard is called out for text input. In some embodiments, in order to prevent the game screen from being covered by the virtual keyboard and thereby affecting the player's observation of the game situation, the virtual keyboard may be displayed with a certain transparency. In this case, step 201 may specifically include: in response to a keyboard call-out operation, displaying the virtual keyboard on the graphical user interface with a preset first transparency, so that the game screen below the display hierarchy of the virtual keyboard can be seen through it.
The first transparency is the display transparency of the virtual keyboard and is used for showing game pictures with display levels below the virtual keyboard; it can be understood that the first transparency is greater than 0 to effectively show the game picture with the display level below the virtual keyboard. The specific value of the first transparency may be set according to the actual service scene requirement, and the specific value of the first transparency in the embodiment of the present application is not limited.
Taking the first transparency of 50% and the preset manner of calling out the virtual keyboard being "double-clicking any position of the graphical user interface" as an example, as shown in fig. 3 and fig. 5, the graphical user interface before the virtual keyboard is called out is as shown in fig. 3. First, the player may double-click any position of the graphical user interface through the touch display screen of the terminal 1000; then, the terminal 1000 responds to the keyboard call-out operation "double-click any position of the graphical user interface" and displays the virtual keyboard above the display hierarchy of the game screen with a transparency of 50%, as shown in fig. 5. Although the virtual keyboard covers the game screen, the game screen below the display hierarchy of the virtual keyboard can still be seen because the displayed virtual keyboard has 50% transparency. The player can therefore observe the current game situation through the virtual keyboard while inputting text, which prevents the player from failing to react in time because of text input and prevents the game process from being affected by text input, thereby improving the game experience to a certain extent.
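The following TypeScript sketch illustrates, under assumed layer and field names, the two display manners of step 201 described above: placing the virtual keyboard above the display hierarchies of the game screen and the object operation control, and optionally giving it a preset first transparency so that the game screen below it remains visible.

// Sketch of step 201 under the two display manners described above (all names assumed).
interface Layer { name: string; zIndex: number; alpha: number } // alpha 1 = fully opaque

interface Ui {
  gameScreen: Layer;
  controls: Layer;
  keyboard: Layer;
}

const FIRST_TRANSPARENCY = 0.5; // preset first transparency (50% in the example above)

function showKeyboard(ui: Ui, transparent: boolean): void {
  // Manner (1): place the keyboard above both the game screen and the controls.
  ui.keyboard.zIndex = Math.max(ui.gameScreen.zIndex, ui.controls.zIndex) + 1;
  // Manner (2): additionally give the keyboard a first transparency greater than 0
  // so the game screen below its display hierarchy remains visible.
  ui.keyboard.alpha = transparent ? 1 - FIRST_TRANSPARENCY : 1;
}

const ui: Ui = {
  gameScreen: { name: "game screen", zIndex: 0, alpha: 1 },
  controls:   { name: "object operation controls", zIndex: 1, alpha: 1 },
  keyboard:   { name: "virtual keyboard", zIndex: -1, alpha: 1 },
};
showKeyboard(ui, true);
console.log(ui.keyboard); // { name: 'virtual keyboard', zIndex: 2, alpha: 0.5 }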
202. Displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard.
The specific manner of the specified operation may be set according to actual service scene requirements, and is not limited in the embodiments of the present application. For example, the specified operation may be a long press on the virtual keyboard for a preset duration (e.g., a long press on the virtual keyboard for 1 second), or clicking the virtual keyboard several times in succession and holding (e.g., clicking the virtual keyboard twice in succession and holding), and so on.
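A minimal TypeScript sketch of detecting such a specified operation (here, pressing the virtual keyboard for a preset 2 seconds) might look as follows; the class and method names are assumptions of this example.

// Sketch of recognizing the specified operation as a long press of a preset
// duration on the virtual keyboard (2 seconds in the examples below); names assumed.
const LONG_PRESS_MS = 2000;

class SpecifiedOperationDetector {
  private pressStart: number | null = null;

  onPressDown(onKeyboard: boolean, nowMs: number): void {
    this.pressStart = onKeyboard ? nowMs : null; // only presses on the keyboard count
  }

  // Returns true once the press has lasted long enough to trigger step 202.
  isTriggered(nowMs: number): boolean {
    return this.pressStart !== null && nowMs - this.pressStart >= LONG_PRESS_MS;
  }

  onPressUp(): void {
    this.pressStart = null;
  }
}

const detector = new SpecifiedOperationDetector();
detector.onPressDown(true, 0);
console.log(detector.isTriggered(2000)); // true: display the controls above the keyboard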
There are various ways to display the object operation control above the display hierarchy of the virtual keyboard in step 202. The following describes how this can be done by taking the display manners of case one and case two, with the specified operation being "pressing the virtual keyboard for 2 seconds", as examples.
Case one: the display hierarchies of the game screen and the object operation control are placed above the display hierarchy of the virtual keyboard, so that both the object operation control and the game screen are displayed above the virtual keyboard.
Referring to fig. 6, in this case step 202 may specifically include: when an operation of pressing the virtual keyboard for 2 seconds is detected, displaying the game screen above the display hierarchy of the virtual keyboard and displaying the object operation control above the display hierarchy of the virtual keyboard. Since the display hierarchies of the object operation control and the game screen in the graphical user interface are placed above the virtual keyboard, the player can control the virtual object through the object operation control with the keyboard called out, and the called-out virtual keyboard is prevented from covering the game screen, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency.
Case two: the object operation control is placed above the display hierarchy of the virtual keyboard without placing the game screen above the display hierarchy of the virtual keyboard, so that only the object operation control is displayed above the virtual keyboard.
The following describes how the object operation control is displayed above the display hierarchy of the virtual keyboard in case two, taking as examples displaying all object operation controls in the graphical user interface above the display hierarchy of the virtual keyboard, and displaying only part of the object operation controls in the graphical user interface above the display hierarchy of the virtual keyboard.
<1> The object operation controls displayed above the display hierarchy of the virtual keyboard are all of the object operation controls in the graphical user interface.
Referring to fig. 3 and fig. 7, at this time, step 202 may specifically include: when an operation of pressing for 2 seconds on the virtual keyboard is detected, all object operation controls in the graphical user interface are displayed above the display level of the virtual keyboard. The display levels of all the object operation controls in the graphical user interface are arranged above the virtual keyboard, so that a player can control the virtual object through the object operation controls under the condition of calling out the keyboard, the operation complexity and the operation time of character control in the text input process are reduced, and the game operation efficiency is improved.
For example, as shown in fig. 3, 4 skill controls and 1 movement control (e.g., a joystick control) are included in the graphical user interface, and the 4 skill controls and 1 movement control included in the graphical user interface are each overlaid under the virtual keyboard after the virtual keyboard is called out at step 201. When a player presses a virtual keyboard of a graphical user interface for 2 seconds through a touch display screen of the terminal 1000, 4 skill controls and 1 movement control are displayed on the display level of the virtual keyboard, as shown in fig. 7.
<2> The object operation controls displayed above the display hierarchy of the virtual keyboard are part of the object operation controls in the graphical user interface.
In this case, step 202 may specifically include: when the operation of pressing for 2 seconds on the virtual keyboard is detected, a specified object operation control in the graphical user interface is displayed above the display level of the virtual keyboard. The display hierarchy of part of the object operation controls in the graphical user interface is arranged above the virtual keyboard, so that a player can control the virtual object through the displayed part of the object operation controls under the condition of calling out the keyboard, the operation complexity and the operation time of character control in the text input process are reduced, and the game operation efficiency is improved.
The displayed specified object operation control may specifically be an object operation control covered under the virtual keyboard after the virtual keyboard is called in step 201, or may also be an object operation control of a specified category, or may also be an object operation control at a specified region (the specified region is specifically a region associated with a display hot zone in the graphical user interface, such as a movement control region).
<2.1> Taking the specified object operation controls as the object operation controls covered by the virtual keyboard as an example, assume that the graphical user interface includes 5 skill controls (corresponding to skill 1 to skill 5, respectively) and 1 movement control (such as a joystick control), and that after the virtual keyboard is called out in step 201, 4 of the skill controls (corresponding to skill 1 to skill 4) and the 1 movement control are covered under the virtual keyboard. When the player presses the virtual keyboard of the graphical user interface for 2 seconds through the touch display screen of the terminal 1000, the 4 skill controls (corresponding to skill 1 to skill 4) and the 1 movement control covered by the virtual keyboard are displayed above the display hierarchy of the virtual keyboard.
<2.2> Taking the specified object operation controls as object operation controls of a specified category as an example, assume that the graphical user interface includes 4 skill controls (skill control 1 to skill control 4) and 1 movement control (such as a joystick control), where the control categories of the skill controls include, for example, an attack category and an avoidance category, and that the 4 skill controls and the 1 movement control are each covered under the virtual keyboard after the virtual keyboard is called out in step 201. Assuming that the specified category is the avoidance category, when the player presses the virtual keyboard of the graphical user interface for 2 seconds through the touch display screen of the terminal 1000, the skill controls of the specified category in the graphical user interface (for example, skill control 1 and skill control 2) will be displayed above the display hierarchy of the virtual keyboard, while the controls of other categories, such as skill control 3 and skill control 4, are not placed above the display hierarchy of the virtual keyboard, as shown in fig. 9.
<2.3> Taking the specified object operation control as an object operation control at a specified area as an example, step 202 may specifically include: in response to a specified operation acting on a specified area of the virtual keyboard (such as pressing the screen area where the movement control is located for 2 seconds), displaying the object operation control (such as the movement control) above the display hierarchy of the virtual keyboard. Assume that the graphical user interface includes 4 skill controls (skill control 1 to skill control 4) and 1 movement control (such as a joystick control), and that the 4 skill controls and the 1 movement control are each covered under the virtual keyboard after the virtual keyboard is called out in step 201. When the player presses the virtual keyboard of the graphical user interface for 2 seconds (for example, at the position corresponding to the movement control) through the touch display screen of the terminal 1000, the object operation control covered by the virtual keyboard at the player's long-press area (namely, the movement control) will be displayed above the display hierarchy of the virtual keyboard, as shown in fig. 8.
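The following TypeScript sketch summarizes, under assumed names, the selection logic of case two described above: depending on the configuration, all object operation controls, only the controls covered by the virtual keyboard, only the controls of a specified category, or only the control at the pressed area are placed above the display hierarchy of the virtual keyboard.

// Sketch of case two: choosing which object operation controls to raise above the
// virtual keyboard when the specified operation is detected. Names and the Rect
// helper are assumptions for this example.
interface Rect { x: number; y: number; width: number; height: number }
interface Control { id: string; category: string; region: Rect; zIndex: number }

function overlaps(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

type Selection =
  | { mode: "all" }                          // case <1>: every control
  | { mode: "covered"; keyboardRect: Rect }  // case <2.1>: controls under the keyboard
  | { mode: "category"; category: string }   // case <2.2>: controls of a specified category
  | { mode: "region"; pressedRect: Rect };   // case <2.3>: control at the pressed area

function raiseControls(controls: Control[], keyboardZ: number, sel: Selection): Control[] {
  const picked = controls.filter(c =>
    sel.mode === "all" ? true :
    sel.mode === "covered" ? overlaps(c.region, sel.keyboardRect) :
    sel.mode === "category" ? c.category === sel.category :
    overlaps(c.region, sel.pressedRect));
  // Place the selected controls above the keyboard's display hierarchy.
  picked.forEach(c => { c.zIndex = keyboardZ + 1; });
  return picked;
}

// Usage: raise only the control under the player's long-press point (case <2.3>).
const raisedExample: Control[] = [
  { id: "move-25",  category: "movement", region: { x: 40,  y: 320, width: 100, height: 100 }, zIndex: 1 },
  { id: "skill-21", category: "attack",   region: { x: 600, y: 300, width: 60,  height: 60  }, zIndex: 1 },
];
const raised = raiseControls(raisedExample, 5, { mode: "region", pressedRect: { x: 60, y: 340, width: 1, height: 1 } });
console.log(raised.map(c => c.id)); // [ 'move-25' ]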
Further, in case two, in order to make it easier for the player to observe the game situation, when the object operation control is displayed above the display hierarchy of the virtual keyboard, the display transparency of the virtual keyboard may be appropriately increased, so that the player can better observe the game situation while controlling the virtual object. In this case, in step 202, when the object operation control is displayed above the display hierarchy of the virtual keyboard, the virtual keyboard is displayed with a preset second transparency so as to show the game screen below the display hierarchy of the virtual keyboard.
The second transparency is the display transparency of the virtual keyboard at this time and is used for showing the game screen; it can be understood that the second transparency is greater than 0 so that the game screen below the display hierarchy of the virtual keyboard can be effectively shown. The specific value of the second transparency may be set according to actual service scene requirements, and is not limited in the embodiments of the present application.
The first transparency and the second transparency are distinguished as follows: the first transparency is the display transparency of the virtual keyboard when the display hierarchy of the virtual keyboard is above the display hierarchy of the game screen and also above the display hierarchy of the object operation control; the second transparency is the display transparency of the virtual keyboard when the display hierarchy of the virtual keyboard is above the display hierarchy of the game screen but below the display hierarchy of the object operation control.
Further, if the virtual keyboard is displayed over the game screen of the graphical user interface with the first transparency in step 201, in order to help the player observe the game situation and control the character more accurately, the second transparency may be controlled to be greater than the first transparency.
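A small TypeScript sketch of the transparency switch described above, with assumed example values for the first transparency and the second transparency:

// Sketch of adjusting the keyboard's transparency when the controls are raised above
// it (second transparency greater than the first); the concrete values are assumptions.
const FIRST_TRANSPARENCY = 0.5;   // used in step 201, keyboard above everything
const SECOND_TRANSPARENCY = 0.75; // used in step 202, controls above the keyboard

function keyboardAlpha(controlsAboveKeyboard: boolean): number {
  // alpha = 1 - transparency; a higher transparency makes the game screen easier to see.
  return controlsAboveKeyboard ? 1 - SECOND_TRANSPARENCY : 1 - FIRST_TRANSPARENCY;
}

console.log(keyboardAlpha(false)); // 0.5 while the keyboard is used for text input
console.log(keyboardAlpha(true));  // 0.25 (more transparent) while controlling the character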
203. In response to a touch operation on the object operation control displayed above the virtual keyboard, the virtual object is controlled to perform an action corresponding to the object operation control.
The touch operation of the object operation control displayed on the virtual keyboard may be specifically pressing, long pressing, dragging, sliding, and the like.
The following describes how the virtual object in the virtual scene is controlled to perform the action corresponding to the object operation control, taking the touch-operated object operation control as a movement control and as a skill control, respectively.
(1) The touch-operated object operation control is a movement control.
At this time, in step 203, the virtual object is controlled to move in the virtual scene in response to a touch operation on the object operation control displayed above the virtual keyboard in a hierarchical manner.
For example, as shown in fig. 6, 7, 8 or 9, when a player triggers a specified operation on a virtual keyboard through a touch display screen of the terminal 1000, a movement control will be displayed above a display hierarchy of the virtual keyboard; when a touch operation of a player for a movement control displayed on the virtual keyboard is received, a virtual object in the virtual scene is controlled to make a movement action (such as forward, backward, leftward or rightward movement). Therefore, the virtual object can be controlled to move in the virtual scene, so that the player can control the virtual object under the condition of calling out the keyboard, the operation complexity and the operation time of character control in the text input process are reduced, and the game operation efficiency is improved; the problem that the virtual object cannot be controlled after the keyboard is called so that timely response cannot be made to the game situation is solved.
(2) The touch-operated object operation control is a skill control.
At this time, in step 203, in response to a touch operation on the object operation control displayed above the virtual keyboard in a hierarchical manner, the virtual object is controlled to release the skill corresponding to the skill control.
For example, as shown in fig. 6, fig. 7 or fig. 9, when the player triggers the specified operation on the virtual keyboard through the touch display screen of the terminal 1000, a skill control will be displayed above the display hierarchy of the virtual keyboard; when a touch operation of the player on a skill control (such as skill control 1) displayed above the virtual keyboard is received, the virtual object in the virtual scene is controlled to release the skill corresponding to the touch-operated skill control (such as skill mode 1 corresponding to skill control 1). In this way, the virtual object can be controlled to release skills in the virtual scene, so that the player can control the virtual object with the keyboard called out, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency; this also solves the problem that the game situation cannot be responded to in time because the virtual object cannot be controlled after the keyboard is called out.
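The following TypeScript sketch illustrates step 203 under assumed names: a touch on a raised movement control moves the virtual object, while a touch on a raised skill control releases the associated skill.

// Sketch of step 203: routing a touch on a raised control to the corresponding action
// (move the virtual object, or release the associated skill). All names are assumptions.
interface ControlledObject { x: number; y: number; releasedSkills: string[] }

type RaisedControl =
  | { kind: "movement" }
  | { kind: "skill"; skillId: string };

function onControlTouched(obj: ControlledObject, control: RaisedControl, dx = 0, dy = 0): void {
  if (control.kind === "movement") {
    obj.x += dx;                              // movement control: move the virtual object
    obj.y += dy;
  } else {
    obj.releasedSkills.push(control.skillId); // skill control: release the associated skill
  }
}

const hero: ControlledObject = { x: 0, y: 0, releasedSkills: [] };
onControlTouched(hero, { kind: "movement" }, 1, 0);            // drag the joystick to the right
onControlTouched(hero, { kind: "skill", skillId: "skill-1" }); // tap skill control 1
console.log(hero); // { x: 1, y: 0, releasedSkills: [ 'skill-1' ] }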
Further, in order to avoid operation conflict between the object operation control and the virtual keyboard, when the object operation control is displayed above the display level of the virtual keyboard, the virtual keyboard can be controlled to enter a locking state.
Further, in order to ensure that the switching between the character input and the character control can be performed quickly and effectively after the virtual keyboard is called out, under the condition that the object operation control is displayed on the display level of the virtual keyboard, when the touch operation of the object operation control on the virtual keyboard is finished, the object operation control on the display level of the virtual keyboard is hidden, so that the input function of the virtual keyboard can be recovered quickly, and the character input efficiency is improved. Meanwhile, if the virtual keyboard is in the locked state, the virtual keyboard can be controlled to exit the locked state when the touch operation of the object operation control on the virtual keyboard in the display level is finished, so that the input function of the virtual keyboard can be recovered.
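A minimal TypeScript sketch of the locking and restoring flow described above, with assumed names; the keyboard is locked while the object operation controls are raised and is unlocked, with the controls hidden, as soon as the touch operation ends:

// Sketch of the lock/unlock flow described above: while the controls are raised,
// the keyboard is locked so touches are not interpreted as key input; when the touch
// on a raised control ends, the controls are hidden and input resumes. Names assumed.
class KeyboardController {
  controlsRaised = false;
  locked = false;

  onSpecifiedOperation(): void {     // step 202
    this.controlsRaised = true;
    this.locked = true;              // avoid operation conflicts with key input
  }

  onControlTouchEnd(): void {        // end of the touch operation in step 203
    this.controlsRaised = false;     // hide the controls above the keyboard
    this.locked = false;             // restore the keyboard's input function
  }

  canAcceptTextInput(): boolean {
    return !this.locked;
  }
}

const kb = new KeyboardController();
kb.onSpecifiedOperation();
console.log(kb.canAcceptTextInput()); // false while controlling the virtual object
kb.onControlTouchEnd();
console.log(kb.canAcceptTextInput()); // true: quickly back to the text input state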
For example, as shown in fig. 3, fig. 5, and fig. 7, take as an example that the preset manner of calling out the virtual keyboard is "double-clicking any position of the graphical user interface", the specified operation is "long-pressing the virtual keyboard for 1 second", and all object operation controls in the graphical user interface are displayed above the display hierarchy of the virtual keyboard. When the player double-clicks any position of the graphical user interface shown in fig. 3 through the touch display screen of the terminal 1000, the terminal 1000 displays the virtual keyboard above the display hierarchy of the game screen in response to the keyboard call-out operation "double-click any position of the graphical user interface", as shown in fig. 5. When the player long-presses with the left hand for 1 second on the virtual keyboard shown in fig. 5 (for example, at the movement control), all of the object operation controls in the graphical user interface will be displayed above the display hierarchy of the virtual keyboard, as shown in fig. 7. When a touch operation of the player on an object operation control (such as skill control 1) displayed above the virtual keyboard as shown in fig. 7 is detected, the virtual object is controlled, in response to the touch operation, to release the skill corresponding to skill control 1 in the virtual scene. When the player releases the left hand (that is, the touch operation on the object operation control above the virtual keyboard ends), the display state shown in fig. 5 is restored, and the virtual keyboard is again displayed over the object operation control and the game screen, thereby ensuring fast switching back to the text input state.
Further, when the object operation control is not displayed on the display hierarchy of the virtual keyboard, input information for sending to other terminal devices may be obtained in response to an input operation for the virtual keyboard. For example, when the virtual keyboard is called in step 201, an input operation of the player for the virtual keyboard may be received, and input information for transmission to the other terminal device may be obtained. For another example, when the touch operation of the object operation control displayed on the virtual keyboard is finished and the virtual keyboard exits from the locked state and recovers to the normal input function, the input information for sending to the other terminal device may be obtained in response to the input operation of the virtual keyboard.
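By way of illustration, the following TypeScript sketch shows how input information for sending to other terminal devices could be collected only while the object operation controls are not raised above the virtual keyboard; the send callback and all names are assumptions of this example.

// Sketch of obtaining input information for sending to other terminal devices while
// the controls are not raised above the keyboard. The send() callback stands in for
// whatever chat channel the game uses and is an assumption of this example.
class ChatInput {
  private buffer = "";

  constructor(private send: (text: string) => void,
              private controlsRaised: () => boolean) {}

  onKeyTapped(ch: string): void {
    if (this.controlsRaised()) return; // keyboard locked, the touch goes to the controls
    this.buffer += ch;
  }

  onConfirm(): void {
    if (this.buffer.length > 0) {
      this.send(this.buffer);          // deliver the typed text to the other players
      this.buffer = "";
    }
  }
}

// Usage: type "gg" and send it while no control is raised.
const input = new ChatInput(text => console.log("send:", text), () => false);
input.onKeyTapped("g");
input.onKeyTapped("g");
input.onConfirm(); // send: gg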
As can be seen from the above, in this embodiment, a virtual keyboard is displayed on the graphical user interface in response to a keyboard call-out operation; the object operation control is displayed above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and, in response to a touch operation on the object operation control displayed above the virtual keyboard, the virtual object is controlled to perform the action corresponding to the object operation control. Because the object operation control can be displayed above the display hierarchy of the virtual keyboard after the virtual keyboard is called out, the object operation control originally covered by the virtual keyboard can be operated by touch, so that the virtual object can still be controlled while the called-out keyboard is used for text input, and the player can quickly switch back to the conversation input state after controlling the virtual object, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency. Therefore, on the one hand, the requirement that a player send custom text for interaction is met; on the other hand, interruption of game participation during text input is reduced, and the game experience is improved to a certain extent.
In order to better implement the above method, the embodiment of the present application further provides a game control apparatus, where the game control apparatus may be specifically integrated in an electronic device, for example, a computer device, and the computer device may be a terminal, a server, or the like.
The terminal can be a mobile phone, a tablet computer, an intelligent Bluetooth device, a notebook computer, a personal computer and other devices; the server may be a single server or a server cluster composed of a plurality of servers.
For example, in this embodiment, the method of the embodiment of the present application is described in detail by taking the case in which the game control apparatus is specifically integrated in a smartphone as an example.
For example, as shown in fig. 10, for the game control apparatus, a terminal device displays a graphical user interface, the graphical user interface includes a game screen of at least part of a virtual scene and an object operation control, and the virtual scene includes a virtual object. The game control apparatus may include:
a first display unit 1001 configured to display a virtual keyboard on the graphical user interface in response to a keyboard bring-up operation;
a second display unit 1002 configured to display the object operation control above a display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
a control unit 1003, configured to control the virtual object to perform an action corresponding to the object operation control in response to a touch operation on the object operation control displayed hierarchically above the virtual keyboard.
In some embodiments, the first display unit 1001 is specifically configured to:
when the object operation control is not displayed above the display level of the virtual keyboard, displaying the virtual keyboard on the graphical user interface with a preset first transparency.
In some embodiments, the first display unit 1001 is specifically configured to:
and when the object operation control is displayed on the display level of the virtual keyboard, displaying the virtual keyboard with a preset second transparency so as to show the game picture below the level of the virtual keyboard.
In some embodiments, the game control apparatus further includes an input unit (not shown in the figures), and the input unit is specifically configured to:
and when the object operation control is not displayed on the display level of the virtual keyboard, responding to the input operation aiming at the virtual keyboard, and acquiring input information for sending to other terminal equipment.
In some embodiments, the second display unit 1002 is specifically configured to:
and controlling the virtual keyboard to enter a locked state when the object operation control is displayed above the display level of the virtual keyboard.
In some embodiments, the second display unit 1002 is specifically configured to:
and when the object operation control is displayed on the display level of the virtual keyboard, responding to the end of the touch operation, and hiding the object operation control on the display level of the virtual keyboard.
In some embodiments, the second display unit 1002 is specifically configured to:
and responding to the end of the touch operation, and controlling the virtual keyboard to exit the locking state.
In some embodiments, the specified operation is a long press operation.
In some embodiments, the object operation control comprises a movement control; the control unit 1003 is specifically configured to:
controlling the virtual object to move in the virtual scene in response to a touch operation with respect to the object operation control displayed hierarchically above the virtual keyboard.
In some embodiments, the object manipulation controls comprise skill controls; the control unit 1003 is specifically configured to:
and in response to the touch operation of the object operation control displayed above the virtual keyboard in a hierarchical manner, controlling the virtual object to release the skill corresponding to the skill control.
In some embodiments, the second display unit 1002 is specifically configured to:
displaying the object operation control above the display level of the virtual keyboard in response to a specified operation acting on a specified area on the virtual keyboard; wherein the designated area is associated with a display hotspot of the object operation control in the graphical user interface.
As can be seen from the above, the game control apparatus of this embodiment can display a virtual keyboard on the graphical user interface through the first display unit 1001 in response to a keyboard call-out operation; display, through the second display unit 1002, the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and control, through the control unit 1003, the virtual object to perform the action corresponding to the object operation control in response to a touch operation on the object operation control displayed above the virtual keyboard.
Therefore, the game control apparatus provided in the embodiment of the present application can bring the following technical effects: because the object operation control can be displayed above the display hierarchy of the virtual keyboard after the virtual keyboard is called out, the object operation control originally covered by the virtual keyboard can be operated by touch, so that the virtual object can still be controlled while the called-out keyboard is used for text input, and the player can quickly switch back to the conversation input state after controlling the virtual object, which reduces the operation complexity and operation time of controlling a character during text input and improves game operation efficiency. Therefore, on the one hand, the requirement that a player send custom text for interaction is met; on the other hand, interruption of game participation during text input is reduced, and the game experience is improved to a certain extent.
Correspondingly, the embodiment of the present application further provides an electronic device, where the electronic device may be a terminal, and the terminal may be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like. As shown in fig. 11, fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 1100 includes a processor 1101 having one or more processing cores, memory 1102 having one or more computer-readable storage media, and a computer program stored on the memory 1102 and operable on the processor. The processor 1101 is electrically connected to the memory 1102. Those skilled in the art will appreciate that the electronic device structures shown in the figures do not constitute limitations on the electronic device, and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The processor 1101 is a control center of the electronic device 1100, connects various parts of the entire electronic device 1100 using various interfaces and lines, performs various functions of the electronic device 1100 and processes data by running or loading software programs and/or modules stored in the memory 1102, and calling data stored in the memory 1102, thereby performing overall monitoring of the electronic device 1100.
In this embodiment, the processor 1101 in the electronic device 1100 loads instructions corresponding to processes of one or more application programs into the memory 1102, and the processor 1101 executes the application programs stored in the memory 1102, so as to implement various functions as follows:
displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
and controlling the virtual object to perform an action corresponding to the object operation control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
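For illustration only, the three functions above can be sketched as a layered user interface in which the game scene, the object operation controls, and the virtual keyboard occupy separate display layers; raising a control simply means giving it a higher layer index than the keyboard. The class and method names below (GameHud, UiLayer, setZIndex, and so on) are assumptions made for this sketch and are not part of the claimed implementation.

```typescript
// Hypothetical layered-UI sketch of the three functions listed above.
// Assumed layer order, bottom to top: game scene, object operation controls,
// virtual keyboard - except while a control is temporarily raised above the keyboard.

type TouchPoint = { x: number; y: number };

interface UiLayer {
  setZIndex(z: number): void;
  setVisible(visible: boolean): void;
  hitTest(p: TouchPoint): boolean;
}

class GameHud {
  private readonly KEYBOARD_Z = 100;
  private readonly CONTROL_NORMAL_Z = 50;   // control sits below the keyboard by default
  private readonly CONTROL_RAISED_Z = 200;  // control raised above the keyboard

  constructor(
    private keyboard: UiLayer,                              // virtual keyboard layer
    private moveControl: UiLayer,                           // an object operation control
    private player: { move(dx: number, dy: number): void }, // the controlled virtual object
  ) {}

  // (1) Display the virtual keyboard in response to a keyboard callout operation.
  onKeyboardCallout(): void {
    this.keyboard.setZIndex(this.KEYBOARD_Z);
    this.keyboard.setVisible(true);
  }

  // (2) Display the object operation control above the keyboard's display hierarchy
  // in response to a specified operation (for example, a long press) on the keyboard.
  onSpecifiedOperationOnKeyboard(): void {
    this.moveControl.setZIndex(this.CONTROL_RAISED_Z);
    this.moveControl.setVisible(true);
  }

  // (3) Control the virtual object from touches on the raised control.
  onTouchMove(p: TouchPoint, dx: number, dy: number): void {
    if (this.moveControl.hitTest(p)) {
      this.player.move(dx, dy);
    }
  }

  // Restore the original layer order once the touch operation ends.
  lowerControlBelowKeyboard(): void {
    this.moveControl.setZIndex(this.CONTROL_NORMAL_Z);
  }
}
```

Because the same control object is only reordered relative to the keyboard layer, no duplicate control needs to be created while the keyboard is shown.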
In some embodiments, when the object operation control is not displayed above the display hierarchy of the virtual keyboard, the displaying a virtual keyboard on the graphical user interface comprises:
and displaying a virtual keyboard on the graphical user interface with a preset first transparency.
In some embodiments, when the object operation control is displayed above the display level of the virtual keyboard, the virtual keyboard is displayed with a preset second transparency so as to show the game picture below the level of the virtual keyboard.
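As a hedged example of the two transparency levels, the switch could be as simple as the following; the numeric values and the setTransparency call are placeholders, since the application only requires that a first and a second preset transparency exist.

```typescript
// Hypothetical transparency handling for the virtual keyboard layer.
const FIRST_TRANSPARENCY = 0.3;   // while only text input is happening
const SECOND_TRANSPARENCY = 0.8;  // while an operation control is raised above the keyboard

function updateKeyboardTransparency(
  keyboard: { setTransparency(t: number): void },
  controlRaisedAboveKeyboard: boolean,
): void {
  // A higher second transparency lets the game picture beneath the keyboard show through.
  keyboard.setTransparency(
    controlRaisedAboveKeyboard ? SECOND_TRANSPARENCY : FIRST_TRANSPARENCY,
  );
}
```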
In some embodiments, when the object operational control is not displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and in response to an input operation on the virtual keyboard, obtaining input information to be sent to other terminal devices.
In some embodiments, when the object operational control is displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and controlling the virtual keyboard to enter a locking state.
In some embodiments, when the object operational control is displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and hiding the object operation control above the display level of the virtual keyboard in response to the end of the touch operation.
In some embodiments, the method further comprises:
and responding to the end of the touch operation, and controlling the virtual keyboard to exit the locking state.
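One possible, purely illustrative reading of the lock state is that the keyboard stops interpreting touches as key input while an operation control is raised above it, and resumes when the touch operation ends; the KeyboardLockController sketch below assumes hypothetical setKeysEnabled and lowerBelowKeyboard helpers.

```typescript
// Hypothetical handling of the keyboard lock state described above.
// "Locked" means key presses are ignored so that touches meant for the
// raised operation control are not interpreted as text input.

class KeyboardLockController {
  private locked = false;

  constructor(
    private keyboard: { setKeysEnabled(enabled: boolean): void },
    private control: { setVisible(v: boolean): void; lowerBelowKeyboard(): void },
  ) {}

  // Entered when the object operation control is raised above the keyboard.
  enterLockState(): void {
    this.locked = true;
    this.keyboard.setKeysEnabled(false);
  }

  // Called when the touch operation on the raised control ends.
  onControlTouchEnd(): void {
    this.control.setVisible(false);      // hide the raised control
    this.control.lowerBelowKeyboard();   // restore the original layer order
    this.locked = false;                 // exit the lock state
    this.keyboard.setKeysEnabled(true);  // text input becomes available again
  }

  isLocked(): boolean {
    return this.locked;
  }
}
```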
In some embodiments, the specified operation is a long press operation.
In some embodiments, the object operation control comprises a movement control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to move in the virtual scene in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
In some embodiments, the object operation control comprises a skill control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to release the skill corresponding to the skill control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
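A minimal sketch of how touches on a raised control might be dispatched to movement or skill release is shown below; the RaisedControl union and the player methods are assumptions made for illustration only.

```typescript
// Hypothetical dispatch of raised operation controls to player actions.
type RaisedControl =
  | { kind: 'move'; dx: number; dy: number }
  | { kind: 'skill'; skillId: string };

function handleRaisedControlTouch(
  player: { move(dx: number, dy: number): void; castSkill(id: string): void },
  control: RaisedControl,
): void {
  switch (control.kind) {
    case 'move':
      player.move(control.dx, control.dy);   // movement control: move in the virtual scene
      break;
    case 'skill':
      player.castSkill(control.skillId);     // skill control: release the corresponding skill
      break;
  }
}
```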
In some embodiments, the displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard comprises:
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on a designated area on the virtual keyboard; wherein the designated area is associated with a display hotspot of the object operation control in the graphical user interface.
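As an illustrative sketch of the designated-area idea, the designated area could simply be the keyboard region that overlaps the control's display hotspot, so that a long press "through" the keyboard at the control's usual position raises that control; the rectangle-based hit test below is an assumption, not the claimed design.

```typescript
// Hypothetical mapping from a control's display hotspot to a designated area
// on the keyboard: the part of the keyboard that overlaps where the control
// is normally displayed in the graphical user interface.

interface Rect { x: number; y: number; width: number; height: number }

function pointInRect(p: { x: number; y: number }, r: Rect): boolean {
  return p.x >= r.x && p.x <= r.x + r.width && p.y >= r.y && p.y <= r.y + r.height;
}

// Returns true when a long press on the keyboard lands inside the designated
// area associated with the control's hotspot, i.e. the control should be raised.
function shouldRaiseControl(
  longPressPoint: { x: number; y: number },
  controlHotspot: Rect,   // where the control is displayed in the GUI
  keyboardRect: Rect,     // where the keyboard is displayed
): boolean {
  return pointInRect(longPressPoint, keyboardRect)
      && pointInRect(longPressPoint, controlHotspot);
}
```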
Therefore, the electronic device 1100 provided by this embodiment can bring the following technical effects: displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation; displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and controlling the virtual object to perform an action corresponding to the object operation control in response to a touch operation on that control. Because the object operation control can be displayed above the display hierarchy of the virtual keyboard after the keyboard is called out, the control originally covered by the keyboard can still be touch-operated. The virtual object can therefore still be controlled while the keyboard is called out for text input, and the player can quickly switch back to the chat input state while controlling the virtual object, which reduces the complexity and time of controlling the game character during text input and improves game operation efficiency. On the one hand, this meets the player's need to send custom text for interaction; on the other hand, it reduces the interruption of game participation during text input and improves the game experience to a certain extent.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Optionally, as shown in fig. 11, the electronic device 1100 further includes: a touch display screen 1103, a radio frequency circuit 1104, an audio circuit 1105, an input unit 1106, and a power supply 1107. The processor 1101 is electrically connected to the touch display screen 1103, the radio frequency circuit 1104, the audio circuit 1105, the input unit 1106, and the power supply 1107, respectively. Those skilled in the art will appreciate that the electronic device structure shown in fig. 11 does not constitute a limitation on the electronic device, which may include more or fewer components than those shown, combine some components, or use a different arrangement of components.
The touch display screen 1103 can be used for displaying a graphical user interface and receiving operation instructions generated by a user acting on the graphical user interface. The touch display screen 1103 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user and the various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect the user's touch operations (for example, operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which trigger the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 1101, and can receive and execute commands sent by the processor 1101. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, the touch operation is transmitted to the processor 1101 to determine the type of the touch event, and the processor 1101 then provides a corresponding visual output on the display panel according to the type of the touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 1103 to implement the input and output functions. However, in some embodiments, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 1103 can also be used as a part of the input unit 1106 to implement the input function.
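A rough, simplified model of the touch pipeline described above is sketched below; the stages and type names are assumptions made for illustration and do not correspond to any actual touch-controller API.

```typescript
// Simplified model of the touch pipeline: detection device -> controller -> processor.

interface RawTouchSignal { rawX: number; rawY: number; pressure: number }
type ClassifiedTouchEvent = { x: number; y: number; kind: 'tap' | 'longPress' | 'drag' };

// Touch controller: converts raw detection signals into touch point coordinates.
function toTouchPoint(signal: RawTouchSignal): { x: number; y: number } {
  return { x: signal.rawX, y: signal.rawY };
}

// Processor side: reacts to the classified event and drives the visual output
// on the display panel (here reduced to simple visual feedback).
function handleTouch(
  event: ClassifiedTouchEvent,
  displayPanel: { highlight(x: number, y: number): void },
): void {
  // For example, a long press on the keyboard area could raise an operation
  // control, while a tap produces ordinary key input.
  displayPanel.highlight(event.x, event.y);
}
```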
The radio frequency circuit 1104 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other electronic devices and to exchange signals with the network device or the other electronic devices.
The audio circuit 1105 may be used to provide an audio interface between the user and the electronic device through speakers and microphones. The audio circuit 1105 may transmit the electrical signal converted from the received audio data to a speaker, which converts it into a sound signal for output; on the other hand, the microphone converts the collected sound signal into an electrical signal, which is received by the audio circuit 1105 and converted into audio data; the audio data is then processed by the processor 1101 and sent to another electronic device via the radio frequency circuit 1104, or output to the memory 1102 for further processing. The audio circuit 1105 may also include an earbud jack to provide communication between peripheral headphones and the electronic device.
The input unit 1106 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
A power supply 1107 is used to power the various components of the electronic device 1100. Optionally, the power supply 1107 may be logically connected to the processor 1101 through a power management system, so that charging, discharging, and power consumption are managed through the power management system. The power supply 1107 may also include one or more DC or AC power sources, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other components.
Although not shown in fig. 11, the electronic device 1100 may further include a camera, a sensor, a Wireless Fidelity (WiFi) module, a Bluetooth module, and the like, which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be completed by instructions, or by instructions controlling the relevant hardware, and the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the methods for game control provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
and controlling the virtual object to perform an action corresponding to the object operation control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
In some embodiments, when the object operation control is not displayed above the display hierarchy of the virtual keyboard, the displaying a virtual keyboard on the graphical user interface comprises:
and displaying a virtual keyboard on the graphical user interface with a preset first transparency.
In some embodiments, when the object operation control is displayed above the display level of the virtual keyboard, the virtual keyboard is displayed with a preset second transparency so as to show the game picture below the level of the virtual keyboard.
In some embodiments, when the object operational control is not displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and in response to an input operation on the virtual keyboard, obtaining input information to be sent to other terminal devices.
In some embodiments, when the object operational control is displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and controlling the virtual keyboard to enter a locking state.
In some embodiments, when the object operational control is displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and hiding the object operation control above the display level of the virtual keyboard in response to the end of the touch operation.
In some embodiments, the method further comprises:
and responding to the end of the touch operation, and controlling the virtual keyboard to exit the locking state.
In some embodiments, the specified operation is a long press operation.
In some embodiments, the object operation control comprises a movement control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to move in the virtual scene in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
In some embodiments, the object operation control comprises a skill control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to release the skill corresponding to the skill control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
In some embodiments, the displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard comprises:
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on a designated area on the virtual keyboard; wherein the designated area is associated with a display hotspot of the object operation control in the graphical user interface.
It can be seen that the computer program can be loaded by a processor to execute the steps in any game control method provided in the embodiments of the present application. Therefore, the computer-readable storage medium of the embodiments of the present application can bring the following technical effects: displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation; displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard; and controlling the virtual object to perform an action corresponding to the object operation control in response to a touch operation on that control. Because the object operation control can be displayed above the display hierarchy of the virtual keyboard after the keyboard is called out, the control originally covered by the keyboard can still be touch-operated. The virtual object can therefore still be controlled while the keyboard is called out for text input, and the player can quickly switch back to the chat input state while controlling the virtual object, which reduces the complexity and time of controlling the game character during text input and improves game operation efficiency. On the one hand, this meets the player's need to send custom text for interaction; on the other hand, it reduces the interruption of game participation during text input and improves the game experience to a certain extent.
For the specific implementation of the above operations, reference may be made to the foregoing embodiments; details are not repeated here.
Wherein, the computer-readable storage medium may include: a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
In the above game control apparatus, computer-readable storage medium, and electronic device embodiments, the descriptions of the respective embodiments have their respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments. It can be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes and possible advantages of the game control apparatus, the computer-readable storage medium, the electronic device and the corresponding units thereof described above may refer to the description of the game control method in the above embodiments, and are not described herein again in detail.
The game control method, apparatus, electronic device, and computer-readable storage medium provided in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the embodiments is only intended to help understand the method of the present application and its core ideas. Meanwhile, those skilled in the art may, according to the ideas of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as a limitation of the present application.

Claims (14)

1. A game control method, characterized in that a graphical user interface is displayed through a terminal device, the graphical user interface comprises a game picture of at least part of a virtual scene and an object operation control, and the virtual scene comprises a virtual object, the method comprising:
displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard, controlling the virtual object to perform an action corresponding to the object operation control.
2. The method of claim 1, wherein, when the object operation control is not displayed above the display hierarchy of the virtual keyboard, the displaying a virtual keyboard on the graphical user interface comprises:
and displaying a virtual keyboard on the graphical user interface with a preset first transparency.
3. The method according to claim 1, wherein when the object operation control is displayed above a display level of the virtual keyboard, the virtual keyboard is displayed with a preset second transparency so as to show the game screen below the level of the virtual keyboard.
4. The method of claim 1, wherein when the object operational control is not displayed above the display hierarchy of the virtual keyboard, the method further comprises:
and in response to an input operation on the virtual keyboard, obtaining input information to be sent to other terminal devices.
5. The method of claim 1, wherein when the object operational control is displayed above a display hierarchy of the virtual keyboard, the method further comprises:
and controlling the virtual keyboard to enter a locking state.
6. The method of claim 1, wherein when the object operational control is displayed above a display hierarchy of the virtual keyboard, the method further comprises:
and hiding the object operation control above the display level of the virtual keyboard in response to the end of the touch operation.
7. The method of claim 6, further comprising:
and responding to the end of the touch operation, and controlling the virtual keyboard to exit the locking state.
8. The method of claim 1, wherein the specified operation is a long press operation.
9. The method of claim 1, wherein the object operation control comprises a movement control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to move in the virtual scene in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
10. The method of claim 1, wherein the object operation control comprises a skill control;
the controlling the virtual object to perform an action corresponding to the object operation control in response to the touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard comprises:
controlling the virtual object to release the skill corresponding to the skill control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
11. The method of claim 1, wherein displaying the object operation control above a display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard comprises:
displaying the object operation control above the display hierarchy of the virtual keyboard in response to a specified operation acting on a designated area on the virtual keyboard; wherein the designated area is associated with a display hotspot of the object operation control in the graphical user interface.
12. A game control apparatus, characterized in that a graphical user interface is displayed through a terminal device, the graphical user interface comprises a game picture of at least part of a virtual scene and an object operation control, and the virtual scene comprises a virtual object, the apparatus comprising:
a first display unit for displaying a virtual keyboard on the graphical user interface in response to a keyboard callout operation;
a second display unit configured to display the object operation control above a display hierarchy of the virtual keyboard in response to a specified operation acting on the virtual keyboard;
a control unit, configured to control the virtual object to perform an action corresponding to the object operation control in response to a touch operation on the object operation control displayed above the display hierarchy of the virtual keyboard.
13. An electronic device comprising a processor and a memory, the memory storing a plurality of instructions; the processor loads instructions from the memory to perform the steps of the method according to any one of claims 1 to 11.
14. A computer readable storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor to perform the steps of the method according to any one of claims 1 to 11.
CN202211352501.7A 2022-10-31 2022-10-31 Game control method and device, electronic equipment and readable storage medium Pending CN115645919A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211352501.7A CN115645919A (en) 2022-10-31 2022-10-31 Game control method and device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211352501.7A CN115645919A (en) 2022-10-31 2022-10-31 Game control method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN115645919A true CN115645919A (en) 2023-01-31

Family

ID=84996289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211352501.7A Pending CN115645919A (en) 2022-10-31 2022-10-31 Game control method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN115645919A (en)

Similar Documents

Publication Publication Date Title
CN111760274A (en) Skill control method and device, storage medium and computer equipment
CN113101652A (en) Information display method and device, computer equipment and storage medium
CN113082688A (en) Method and device for controlling virtual character in game, storage medium and equipment
CN113101650A (en) Game scene switching method and device, computer equipment and storage medium
CN112870718A (en) Prop using method and device, storage medium and computer equipment
CN113398590A (en) Sound processing method, sound processing device, computer equipment and storage medium
CN113426124A (en) Display control method and device in game, storage medium and computer equipment
CN115040873A (en) Game grouping processing method and device, computer equipment and storage medium
CN113332721B (en) Game control method, game control device, computer equipment and storage medium
CN113332719A (en) Virtual article marking method, device, terminal and storage medium
CN115501581A (en) Game control method and device, computer equipment and storage medium
CN113413600B (en) Information processing method, information processing device, computer equipment and storage medium
CN115382201A (en) Game control method and device, computer equipment and storage medium
CN115193043A (en) Game information sending method and device, computer equipment and storage medium
CN112799754B (en) Information processing method, information processing device, storage medium and computer equipment
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
CN113867873A (en) Page display method and device, computer equipment and storage medium
CN113426115A (en) Game role display method and device and terminal
CN115645919A (en) Game control method and device, electronic equipment and readable storage medium
CN112245914A (en) Visual angle adjusting method and device, storage medium and computer equipment
CN115212567A (en) Information processing method, information processing device, computer equipment and computer readable storage medium
CN115382221A (en) Method and device for transmitting interactive information, electronic equipment and readable storage medium
CN117919694A (en) Game control method, game control device, computer equipment and storage medium
CN115193062A (en) Game control method, device, storage medium and computer equipment
CN115430145A (en) Target position interaction method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination