CN115212568A - Control method and device for game guidance, electronic equipment and readable storage medium - Google Patents

Info

Publication number
CN115212568A
CN115212568A
Authority
CN
China
Prior art keywords
interface layer
game
currently
component
simulated
Prior art date
Legal status
Pending
Application number
CN202210880351.0A
Other languages
Chinese (zh)
Inventor
赵丹
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210880351.0A priority Critical patent/CN115212568A/en
Publication of CN115212568A publication Critical patent/CN115212568A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/53Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game for prompting the player, e.g. by displaying a game menu
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/30Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device
    • A63F2300/308Details of the user interface

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the application discloses a control method and device for game guidance, an electronic device, and a computer-readable storage medium. The method displays a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer above the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game; in response to a touch operation on the mask interface layer, the touched position on the mask interface layer is determined; and if the touched position is located in the simulated click area, a click event pre-associated with the simulated click area is triggered to complete the operation guidance of the currently guided component. The embodiment of the application can avoid the problem of the guided component being blocked by other pages in the UI interface layer, and can reduce the running energy consumption of the game while ensuring that the guided component can be effectively triggered.

Description

Control method and device for game guidance, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of game guidance technologies, and in particular, to a method and an apparatus for controlling game guidance, an electronic device, and a computer-readable storage medium.
Background
In the field of gaming, novice guidance is an important step in helping a player quickly get started and settle into a game. In the prior art, in order to enhance the guiding effect, a guiding special effect and a guiding component are used together to complete the novice guidance: the novice guidance proceeds by clicking the guiding component, while the guiding special effect provides a prominent, eye-catching prompt for the operation of the guided game task.
However, because the guiding component is placed in the game UI (User Interface) interface layer, it is easily blocked by other page content in the UI interface layer, which prevents the novice guidance from being completed and, in the case of strong (mandatory) guidance, can even prevent the game from continuing.
Disclosure of Invention
The embodiment of the application provides a control method and device for game guidance, an electronic device, and a computer-readable storage medium, which can avoid the problem of a guiding component being blocked by other pages in the UI interface layer and reduce the running energy consumption of the game while ensuring that the guiding component can be effectively triggered.
In a first aspect, an embodiment of the present application provides a control method for game guidance, including:
displaying a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
responding to the touch operation of the mask interface layer, and determining the touched position of the mask interface layer;
and if the touched position is located in the simulated clicking area, triggering a clicking event pre-associated with the simulated clicking area to complete the operation guidance of the current guided component.
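The three claimed steps can be illustrated with a minimal, hypothetical Python sketch (the names `SimulatedClickRegion` and `on_mask_layer_touched`, and the rectangle representation, are illustrative assumptions, not part of the disclosure): the mask interface layer is hit-tested against the simulated click area, and the pre-associated click event fires only when the touched position falls inside it.

```python
from dataclasses import dataclass

@dataclass
class SimulatedClickRegion:
    # Axis-aligned rectangle on the mask interface layer, in screen coordinates.
    x: float
    y: float
    width: float
    height: float

    def contains(self, px: float, py: float) -> bool:
        # Hit test: is the touched position inside the simulated click area?
        return (self.x <= px <= self.x + self.width and
                self.y <= py <= self.y + self.height)

def on_mask_layer_touched(region, touch_x, touch_y, click_event):
    """Steps 2-3 of the method: determine the touched position on the mask
    interface layer and, if it falls inside the simulated click area, trigger
    the pre-associated click event to complete the operation guidance."""
    if region.contains(touch_x, touch_y):
        click_event()   # completes the operation guidance of the guided component
        return True
    return False        # touch outside the area: no guidance is triggered
```

Because the event is attached to the mask layer's region rather than to the real component, it fires even when the underlying component is occluded.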
In a second aspect, an embodiment of the present application further provides a game guidance control device, including:
the game system comprises a display unit, a display unit and a control unit, wherein the display unit is used for displaying a target graphical user interface of a currently running game, the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
the response unit is used for responding to the touch operation on the mask interface layer and determining the touched position of the mask interface layer;
and the control unit is used for triggering a click event which is pre-associated with the simulated click area if the touched position is located in the simulated click area so as to complete the operation guidance of the currently guided component.
In some embodiments, the game guidance control device further includes a processing unit, and before the target graphical user interface of the currently running game is displayed, the processing unit is specifically configured to:
generating the mask interface layer on the upper layer of the interaction interface layer;
determining bounding box data of the currently guided component on the mask interface layer based on an interface position of the currently guided component on the interaction interface layer;
generating a rectangular object region in the mask interface layer based on the bounding box data as the simulated click area of the currently guided component, wherein a click event for triggering completion of the operation guidance of the currently guided component is added to the rectangular object region.
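The bounding-box derivation and region generation described above might be sketched as follows; the dictionary-based layer and region structures are illustrative assumptions, and the sketch assumes the interaction and mask layers share one screen coordinate system so the box carries over unchanged.

```python
def bounding_box_on_mask_layer(component_pos, component_size):
    """Derive bounding-box data for the mask interface layer from the guided
    component's interface position and size on the interaction interface layer.
    With a shared screen coordinate system the box is copied through unchanged."""
    x, y = component_pos
    w, h = component_size
    return {"x": x, "y": y, "width": w, "height": h}

def create_simulated_click_region(mask_layer, bbox, on_guidance_complete):
    """Generate a rectangular object region in the mask layer from the
    bounding-box data, with the click event pre-associated with it."""
    region = {"bbox": bbox, "click_event": on_guidance_complete}
    mask_layer.setdefault("regions", []).append(region)
    return region
```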
In some embodiments, the mask interface layer is further provided with a guiding special effect of the currently guided component, and the processing unit is specifically configured to:
acquiring a display position of the guiding special effect on the mask interface layer based on the bounding box data;
generating the guiding special effect of the currently guided component on the mask interface layer according to the display position.
In some embodiments, the control unit is specifically configured to:
if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area in a list-event dispatching mode to complete the operation guidance of the currently guided component;
or if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area in the dispatching mode of an ordinary click event to complete the operation guidance of the currently guided component.
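The two dispatch alternatives can be illustrated as follows (an assumed sketch; a real engine would route this through its own event system): queueing the click event on an event list that is drained later, versus invoking it immediately like an ordinary click.

```python
class ClickEventDispatcher:
    """Two interchangeable ways of dispatching the pre-associated click event."""

    def __init__(self):
        self.event_list = []

    def dispatch_via_event_list(self, click_event):
        # List-event mode: the event is queued and dispatched later,
        # e.g. when the frame loop drains the list.
        self.event_list.append(click_event)

    def flush(self):
        # Drain the event list, firing each queued click event in order.
        while self.event_list:
            self.event_list.pop(0)()

    @staticmethod
    def dispatch_as_ordinary_click(click_event):
        # Ordinary-click mode: the event fires immediately.
        click_event()
```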
In some embodiments, the control unit is specifically configured to:
detecting whether an automatic guide function of the current running game is started;
and if the automatic guiding function is started, automatically dispatching the click event pre-associated with the simulated click area according to a preset interval duration so as to complete the operation guiding of the currently guided component.
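The automatic-guidance behavior might look like the following sketch, in which the loop bound `steps` and the injectable `sleep` are illustrative conveniences not taken from the disclosure:

```python
import time

def auto_dispatch_guidance(auto_guide_enabled, interval_seconds, click_event,
                           steps, sleep=time.sleep):
    """If the automatic guidance function is switched on, dispatch the
    pre-associated click event once per preset interval duration. `steps`
    bounds the loop for illustration; `sleep` is injectable so the loop can
    be driven in tests without real delays."""
    if not auto_guide_enabled:
        return 0                  # guidance is left to the player's own touches
    dispatched = 0
    for _ in range(steps):
        sleep(interval_seconds)   # wait out the preset interval duration
        click_event()             # auto-dispatch the click, advancing guidance
        dispatched += 1
    return dispatched
```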
In some embodiments, before displaying the target graphical user interface of the currently running game, the display unit is specifically configured to:
detecting a guidance type of a currently guided component of the currently running game;
detecting whether the currently guided component or the guiding special effect of the currently guided component is blocked;
in some embodiments, the display unit is specifically configured to:
if the guidance type is strong guidance and the currently guided component is blocked, displaying the target graphical user interface;
or if the guidance type is strong guidance and the guiding special effect of the currently guided component is blocked, displaying the target graphical user interface.
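The display condition in these embodiments combines the guidance type with an occlusion check; a minimal sketch follows, with string-valued guidance types assumed for illustration:

```python
def should_show_mask_layer(guidance_type, component_occluded, effect_occluded):
    """The target graphical user interface (with the mask interface layer) is
    shown only when the guidance is strong AND either the guided component or
    its guiding special effect is blocked by other page content."""
    return guidance_type == "strong" and (component_occluded or effect_occluded)
```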
In some embodiments, before displaying the target graphical user interface of the currently running game, the display unit is specifically configured to:
detecting a guidance type of a currently guided component of the currently running game;
detecting a clicked position on the interaction interface layer where the currently guided component is located;
in some embodiments, the display unit is specifically configured to:
and if the guidance type is strong guidance and the clicked position is a position other than the currently guided component, displaying the target graphical user interface.
In some embodiments, the game guidance control device further comprises a hiding unit, the hiding unit is specifically configured to:
detecting a guidance type of a currently guided component of the currently running game;
and if the guidance type is weak guidance, hiding the mask interface layer when a hide-layer operation on the mask interface layer is detected.
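The weak-guidance hiding behavior can be sketched as follows (the dictionary-based layer and the string guidance type are assumptions for illustration):

```python
def handle_hide_layer_operation(guidance_type, mask_layer):
    """For weak guidance, a hide-layer operation on the mask interface layer
    hides it; for strong guidance the mask layer stays up so the mandatory
    guidance step cannot be skipped."""
    if guidance_type == "weak":
        mask_layer["visible"] = False   # skippable guidance: dismiss the mask
        return True
    return False                        # strong guidance: mask remains shown
```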
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory storing a plurality of instructions and a processor that loads the instructions from the memory to execute the steps of any game guidance control method provided by the embodiments of the present application.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium, where multiple instructions are stored, and the instructions are suitable for being loaded by a processor to perform steps in any one of the game guidance control methods provided in the embodiments of the present application.
In the embodiment of the application, in the first aspect, the target graphical user interface including the mask interface layer is displayed; because the mask interface layer is provided with the simulated click area of the currently guided component of the currently running game, the currently guided component that is actually blocked in the interaction interface layer can be highlighted, the player can find and trigger its operation guidance item, and the operation guidance of the blocked currently guided component can be effectively completed.
In the second aspect, because a click event for triggering completion of the operation guidance of the currently guided component is added to the simulated click area in advance, the click event pre-associated with the simulated click area is triggered when the touched position falls within the simulated click area. This delivers the component's click event indirectly, so the guidance operation can be completed without directly touching the actual component; the operation guidance of the currently guided component therefore no longer depends on the actual component, which avoids game interruption caused by a blocked component failing to be clicked and improves game fluency.
In the third aspect, dispatching the click event of the simulated click area simulates the click event of the real component in the interaction interface layer, so no penetration event passing through to the interaction interface layer needs to be processed. This avoids the energy consumption such penetration events would incur, and reduces the running energy consumption of the game while ensuring that the guided component can be effectively triggered.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 is a schematic diagram of a control system for game guidance provided by an embodiment of the present application;
FIG. 2 is a flowchart illustrating an embodiment of a method for controlling game guidance according to an embodiment of the present disclosure;
FIG. 3 is an illustrative schematic diagram of a guided component provided in embodiments of the present application;
FIG. 4 is a schematic diagram of a scenario of a target graphical user interface provided in an embodiment of the present application;
FIG. 5 is a process diagram of the operation guidance of a guided component in an embodiment of the present application;
FIG. 6 is a schematic structural diagram of a game guidance control device provided in an embodiment of the present application;
fig. 7 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of them. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application. Meanwhile, in the description of the embodiments of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance. Thus, features defined as "first" or "second" may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present application, "a plurality" means two or more unless specifically defined otherwise.
The embodiment of the application provides a control method and device for game guide, electronic equipment and a computer-readable storage medium. Specifically, the control method for game guidance according to the embodiment of the present application may be executed by an electronic device, where the electronic device may be a terminal or a server. The terminal can be a terminal device such as a smart phone, a tablet Computer, a notebook Computer, a touch screen, a game machine, a Personal Computer (PC), a Personal Digital Assistant (PDA), and the like, and the terminal can also include a client, which can be a game application client, a browser client carrying a game program, or an instant messaging client, and the like. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, cloud functions, cloud storage, network service, cloud communication, middleware service, domain name service, security service, CDN, and a big data and artificial intelligence platform.
For example, when the control method of the game guide is operated on the terminal, the terminal device stores a game application program and is used for presenting a virtual scene in a game screen. The terminal device is used for interacting with a user through a graphical user interface, for example, downloading and installing a game application program through the terminal device and running the game application program. The manner in which the terminal device provides the graphical user interface to the user may include a variety of ways, for example, the graphical user interface may be rendered for display on a display screen of the terminal device or presented by holographic projection. For example, the terminal device may include a touch display screen for presenting a graphical user interface including a game screen and receiving operation instructions generated by a user acting on the graphical user interface, and a processor for executing the game, generating the graphical user interface, responding to the operation instructions, and controlling display of the graphical user interface on the touch display screen.
For example, when the control method of the game guidance is executed on the server, the game may be a cloud game. Cloud gaming refers to a gaming regime based on cloud computing. In the running mode of the cloud game, the running main body of the game application program and the game picture presenting main body are separated, and the storage and the running of the control method of the game guide are finished on the cloud game server. The game screen presentation is performed at a cloud game client, which is mainly used for receiving and sending game data and presenting the game screen, for example, the cloud game client may be a display device with a data transmission function near a user side, such as a mobile terminal, a television, a computer, a palm computer, a personal digital assistant, and the like, but a terminal device for performing game data processing is a cloud game server at the cloud end. When a game is played, a user operates the cloud game client to send an operation instruction to the cloud game server, the cloud game server runs the game according to the operation instruction, data such as game pictures and the like are coded and compressed, the data are returned to the cloud game client through a network, and finally the data are decoded through the cloud game client and the game pictures are output.
Referring to fig. 1, fig. 1 is a schematic view of a game guidance control system according to an embodiment of the present disclosure. The system may include at least one terminal 1000, at least one server 2000, at least one database 3000, and a network 4000. The terminal 1000 held by the user can be connected to servers of different games through the network 4000. Terminal 1000 can be any device having computing hardware capable of supporting and executing a software product corresponding to a game. Additionally, terminal 1000 can have one or more multi-touch sensitive screens for sensing and obtaining user input through touch or slide operations performed at multiple points of one or more touch sensitive display screens. In addition, when the system includes a plurality of terminals 1000, a plurality of servers 2000, and a plurality of networks 4000, different terminals 1000 may be connected to each other through different networks 4000 and through different servers 2000. The network 4000 may be a wireless network or a wired network, such as a Wireless Local Area Network (WLAN), a Local Area Network (LAN), a cellular network, a 2G network, a 3G network, a 4G network, a 5G network, and so on. In addition, different terminals 1000 may be connected to other terminals or a server using their own bluetooth network or hotspot network. For example, a plurality of users may be online through different terminals 1000 to be connected and synchronized with each other through a suitable network to support multiplayer games. In addition, the system may include a plurality of databases 3000, the plurality of databases 3000 being coupled to different servers 2000, and information about game environments may be continuously stored in the databases 3000 while different users play a multiplayer game online.
The control method of the game guide provided by the embodiment of the application can be executed by a terminal or a server. The embodiment of the present application is described by taking an example in which a control method of game guidance is executed by a terminal. The terminal comprises a touch display screen and a processor, wherein the touch display screen is used for presenting a graphical user interface and receiving an operation instruction generated by a user acting on the graphical user interface. When a user operates the graphical user interface through the touch display screen, the graphical user interface can control the local content of the terminal through responding to the received operation instruction, and can also control the content of the opposite-end server through responding to the received operation instruction. For example, the operating instructions generated by the user acting on the graphical user interface include instructions for launching the game application, and the processor is configured to launch the game application after receiving the instructions provided by the user to launch the game application. Further, the processor is configured to render and draw a graphical user interface associated with the game on the touch display screen. A touch display screen is a multi-touch sensitive screen capable of sensing a touch or slide operation performed at a plurality of points on the screen at the same time. The user uses a finger to perform touch operation on the graphical user interface, and when the graphical user interface detects the touch operation, different virtual objects in the graphical user interface of the game are controlled to perform actions corresponding to the touch operation. For example, the game may be any one of a leisure game, an action game, a role-playing game, a strategy game, a sports game, a game of chance, and the like. 
Wherein the game may include a virtual scene of the game drawn on a graphical user interface. In addition, one or more virtual objects, such as virtual characters, controlled by the user (or player) may be included in the virtual scene of the game. Optionally, the virtual scene of the game also includes one or more elements, such as skills, points, character health, energy, etc., to provide assistance to the player, provide virtual services, increase points related to player performance, etc. In addition, the graphical user interface may also present one or more indicators to provide instructional information to the player. For example, a game may include a player-controlled virtual object and one or more other virtual objects (such as enemy characters). In one embodiment, one or more other virtual objects are controlled by other players of the game. For example, one or more other virtual objects may be computer controlled, such as a robot using Artificial Intelligence (AI) algorithms, to implement a human-machine fight mode. For example, the virtual objects possess various skills or capabilities that the game player uses to achieve the goal. For example, the virtual object possesses one or more weapons, props, tools, etc. that can be used to eliminate other objects from the game. Such skills or capabilities may be activated by a player of the game using one of a plurality of preset touch operations with a touch display screen of the terminal. The processor may be configured to present a corresponding game screen in response to an operation instruction generated by a touch operation of a user.
The following detailed description is made with reference to the accompanying drawings, respectively. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments. Although a logical order is shown in the flowcharts, in some cases, the steps shown or described may be performed in an order different than that shown in the figures.
As shown in fig. 2, a specific flow of the control method for game guidance may include steps 201 to 203, where:
201. displaying the target graphical user interface of the current running game.
The target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game.
The currently running game may be a battle game, a card game, etc., and the specific type of the currently running game is not limited herein.
Wherein the target graphical user interface is a graphical user interface comprising an interaction interface layer and a mask interface layer of a currently running game.
The interactive interface layer refers to a rendering layer where a normal game UI interface is located. Under normal conditions, the interactive interface layer contains various interface contents such as chat pages, interactive controls and the like. The interactive control in the interactive interface layer may be specifically in the form of a picture, a button, or the like.
The guided component refers to an interactive control for guiding a player to operate, for example, as shown in fig. 3, the guided component may specifically be a control of a "buddy" graphic, a control of a "pet" graphic, a control of a "character" graphic, or a control of a "main city" graphic in a normal game UI interface.
Wherein, the currently directed component refers to an interactive control which needs to be operated by the player currently.
The mask interface layer refers to a rendering layer with a display level above the interaction interface layer.
In order to solve the problem that the guide component in the interactive interface layer is blocked by other page contents, in this embodiment, a mask interface layer is added on the interactive interface layer, a simulated click region of the currently guided component of the target graphical user interface is set on the mask interface layer, and the simulated click region is clicked to simulate an actual click effect of the currently guided component in the interactive interface layer. Therefore, the problem that operation guidance cannot be effectively finished due to the fact that the guiding component on the interaction interface layer is shielded by other page content on the interaction interface layer is solved.
Taking a terminal with a touch display screen as an example, in step 201 the terminal may render a target graphical user interface of the currently running game on the touch display screen by executing a game application or a game applet, where the target graphical user interface includes an interaction interface layer and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component (e.g., "pet") of the target graphical user interface. A click event is added to the simulated click area in advance and is used to trigger completion of the operation guidance of the currently guided component corresponding to the simulated click area. At this time, a touch operation by the user on the touch display screen triggers the click event added on the mask interface layer, rather than directly triggering the click event added on the interaction interface layer.
It will be appreciated that the click event of the simulated click area added on the mask interface layer and the click event of the currently guided component added on the interaction interface layer are both used to trigger completion of the operation guidance of the currently guided component. The difference is that the former is triggered by touching the simulated click area, while the latter is triggered by touching the actual currently guided component on the interaction interface layer.
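The two parallel click paths above can be sketched in TypeScript (the language of the Egret/H5 games this embodiment targets). All class and member names here are illustrative assumptions, not the patent's actual code; the point is that both the simulated click area on the mask layer and the real component on the interaction layer converge on one guidance-completion handler.

```typescript
// Hypothetical sketch: both click paths trigger the same guidance completion.
type Rect = { x: number; y: number; width: number; height: number };

class GuidanceController {
  completed = false;
  // Both the mask-layer region and the real component would call this.
  completeGuidance(): void {
    this.completed = true;
  }
}

class SimulatedClickRegion {
  constructor(private rect: Rect, private controller: GuidanceController) {}

  contains(px: number, py: number): boolean {
    const { x, y, width, height } = this.rect;
    return px >= x && px <= x + width && py >= y && py <= y + height;
  }

  // Click event pre-associated with the region: touching it completes the
  // guidance without touching the actual component on the interaction layer.
  onTouch(px: number, py: number): boolean {
    if (!this.contains(px, py)) return false;
    this.controller.completeGuidance();
    return true;
  }
}
```

A touch that misses the region is simply ignored here; routing misses back to the interaction layer is a separate design choice not shown in this sketch.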
The purpose of displaying the target graphical user interface including the mask interface layer in step 201 is to solve the problem that operation guidance cannot be effectively completed because the guided component on the interaction interface layer is occluded by other page content on that layer. Thus, in some embodiments, display of the mask interface layer may be triggered when the currently guided component is occluded by other page content on the interaction interface layer; when it is not occluded, only the interaction interface layer may be displayed without triggering display of the mask interface layer. In other embodiments, display of the mask interface layer may also be triggered when the guidance type of the currently guided component is strong guidance; when the guidance type is weak guidance, only the interaction interface layer may be displayed without triggering display of the mask interface layer.
As can be seen, the manner of displaying the target graphical user interface in step 201 varies with its trigger condition, and exemplarily includes:
(1) When the currently guided component is occluded by other page content on the interaction interface layer and its guidance type is strong guidance, display of the target graphical user interface containing the mask interface layer is triggered. In this case, step 201 may further include steps A1 to A2, where:
A1, detecting the guidance type of the currently guided component of the currently running game.
The guidance type is classification information indicating whether the currently guided component is a strong guidance.
Illustratively, the guidance type of each guided component in the currently running game is stored in the preset database in association with each guided component, and the guidance type of the currently guided component can be directly queried from the preset database in step A1.
A2, detecting whether the currently guided component or its guiding special effect is occluded.
The guiding special effect may be a finger, a halo, or the like added to the currently guided component to indicate its operation position.
Before step 201, a graphical user interface including the interaction interface layer but not the mask interface layer is displayed on the touch display screen while the game runs. In this embodiment, the current game interface refers to this graphical user interface, displayed before step 201, which includes the interaction interface layer but does not include the mask interface layer.
In step A2, a screenshot of the current game interface may be obtained according to the interface position of the currently guided component in the interaction interface layer and then analyzed to identify whether the currently guided component exists in the current game interface and whether it is complete, from which it can be determined whether the component is occluded. Similarly, a screenshot at the interface position of the guiding special effect of the currently guided component may be obtained and analyzed to identify whether the guiding special effect exists and is complete, from which it can be determined whether the guiding special effect is occluded.
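The embodiment detects occlusion by screenshot analysis. As a lighter, hedged alternative (not the patent's method), occlusion can also be approximated geometrically: the component counts as occluded if any page content rendered above it overlaps its bounds. All names below are illustrative.

```typescript
// Illustrative geometric occlusion check (an approximation, not the patent's
// screenshot-recognition method).
type Rect = { x: number; y: number; width: number; height: number };

function rectsOverlap(a: Rect, b: Rect): boolean {
  return a.x < b.x + b.width && b.x < a.x + a.width &&
         a.y < b.y + b.height && b.y < a.y + a.height;
}

// The component counts as occluded if any overlay drawn above it intersects it.
function isOccluded(component: Rect, overlays: Rect[]): boolean {
  return overlays.some(o => rectsOverlap(component, o));
}
```

The geometric check is cheap but conservative: a transparent overlay would count as occluding, which the screenshot approach in the text would not.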
Correspondingly, step 201 may specifically include: if the guidance type is strong guidance and the currently guided component, or its guiding special effect, is occluded, displaying the target graphical user interface to highlight the operation guidance item of the currently guided component and ensure that the player can find and trigger it, thereby ensuring that the operation guidance of the currently guided component is effectively completed.
If the guidance type of the currently guided component is weak guidance, or the currently guided component is not occluded, or its guiding special effect is not occluded, the current game interface including the interaction interface layer but not the mask interface layer is kept displayed, and display of the target graphical user interface containing the mask interface layer is not triggered; this preserves the intuitiveness of the current game interface and improves the realism of game interaction during operation guidance.
Therefore, in the first aspect, because a click event is added in advance to the simulated click area of the mask interface layer, clicking the simulated click area can simulate the actual click effect of the currently guided component in the interaction interface layer; display of the target graphical user interface including the mask interface layer is triggered when the guidance type of the currently guided component is strong guidance and the guided component associated with the task to be guided is occluded, which solves the problem that operation guidance cannot be effectively completed because the actual currently guided component is occluded by other page content on the interaction interface layer. In the second aspect, when the guidance type of the currently guided component is weak guidance, leaving the operation guidance incomplete does not affect normal play, so the current game interface including the interaction interface layer but not the mask interface layer is kept displayed, ensuring normal game interaction under weak guidance. In the third aspect, when the currently guided component is not occluded, the operation guidance can be completed normally on the component displayed by the interaction interface layer; keeping the current game interface displayed without the mask interface layer optimizes the appearance of the game interface, preserves its intuitiveness, and improves the realism of game interaction during operation guidance.
(2) When the guidance type of the currently guided component is strong guidance and the player clicks the current game interface at a position outside the currently guided component, display of the target graphical user interface containing the mask interface layer is triggered.
Judging whether the currently guided component in the current game interface is occluded requires a relatively large amount of data processing, and whether the player clicks the currently guided component correctly already reflects, to some extent, whether it is occluded. Therefore, to reduce the data processing load, display of the target graphical user interface may be triggered when the guidance type of the currently guided component is strong guidance and the player clicks a position outside the currently guided component in the current game interface. In this case, step 201 may further include the following steps B1 to B2, where:
B1, detecting the guidance type of the currently guided component of the currently running game.
The implementation of step B1 is similar to that of step A1, and is not described herein again for simplicity.
B2, detecting the clicked position on the interaction interface layer where the currently guided component is located.
Illustratively, a touch operation of the player on the touch display screen is first received, and the screen touch position on the touch display screen is determined. Then, the clicked position on the interaction interface layer where the currently guided component is located is determined according to a preset position mapping relation between the interaction interface layer and the touch display screen together with the screen touch position.
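The position mapping of step B2 can be sketched as below. The offset/scale form of the preset mapping relation, and all names, are assumptions for illustration; the patent does not specify the mapping's form.

```typescript
// Illustrative mapping from screen touch position to interaction-layer
// coordinates, followed by a hit test against the currently guided component.
type Point = { x: number; y: number };
type Rect = { x: number; y: number; width: number; height: number };

// Assumed preset mapping: the layer is offset and uniformly scaled on screen.
interface LayerMapping { offsetX: number; offsetY: number; scale: number; }

function screenToLayer(touch: Point, m: LayerMapping): Point {
  return { x: (touch.x - m.offsetX) / m.scale, y: (touch.y - m.offsetY) / m.scale };
}

function clickedOutsideComponent(touch: Point, m: LayerMapping, comp: Rect): boolean {
  const p = screenToLayer(touch, m);
  const inside = p.x >= comp.x && p.x <= comp.x + comp.width &&
                 p.y >= comp.y && p.y <= comp.y + comp.height;
  return !inside;
}
```

When `clickedOutsideComponent` returns true under strong guidance, the display of the mask interface layer would be triggered per case (2).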
Correspondingly, step 201 may specifically include: if the guidance type is strong guidance and the clicked position is outside the currently guided component, displaying the target graphical user interface.
If the guidance type of the currently guided component is weak guidance, or the clicked position is the position of the currently guided component in the interaction interface layer, the current game interface including the interaction interface layer but not the mask interface layer is kept displayed, and display of the target graphical user interface containing the mask interface layer is not triggered.
A click landing outside the currently guided component reflects, to a certain extent, that the component may be occluded. Displaying the target graphical user interface containing the mask interface layer in that case therefore highlights the operation guidance item of the currently guided component so that the player can find and trigger it, ensuring that the operation guidance of the currently guided component is effectively completed.
202. Determining the touched position of the mask interface layer in response to a touch operation on the mask interface layer.
The touched position refers to the position on the mask interface layer that is touched.
Taking an example of generating a target graphical user interface by a terminal executing a game application or a game applet on a touch display screen for rendering, as shown in fig. 4, fig. 4 is a scene schematic diagram of the target graphical user interface provided in this embodiment, a player may touch any position of the target graphical user interface on the touch display screen, and since the mask interface layer is located on the interaction interface layer, the player directly touches and operates the mask interface layer. Therefore, the touched position of the mask interface layer can be determined by detecting the touch operation on the mask interface layer. For example, as shown in fig. 4, if the player clicks a position a on the touch display screen, the touched position of the mask interface layer is the position a; and if the player clicks the position B on the touch display screen, the touched position of the mask interface layer is the position B.
203. If the touched position is located in the simulated click area, triggering the click event pre-associated with the simulated click area to complete the operation guidance of the currently guided component.
To ensure that the touch operation effect of the simulated click area is consistent with the actual touch operation effect of the currently guided component on the interaction interface layer, this embodiment uses the event dispatching function in the Egret engine for event delivery. The real click-event dispatch of the currently guided component is simulated by sending the click event of the simulated click area, so that the operation guidance of the currently guided component can be completed without directly touching the actual component, thereby completing the game's novice guidance operation. Dispatching the click event of the simulated click area through the Egret engine's event dispatching function takes two main forms:
First, the click event of the simulated click area is a list click event type in the Egret engine, and the click event pre-associated with the simulated click area is dispatched in the list-event dispatch mode. In this case, step 203 may specifically include: if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area in the list-event dispatch mode to complete the operation guidance of the currently guided component.
Second, the click event of the simulated click area is a common click event type (i.e., not a list click event type in the Egret engine), and the click event pre-associated with the simulated click area is dispatched in the common-click-event dispatch mode. In this case, step 203 may specifically include: if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area in the common-click-event dispatch mode to complete the operation guidance of the currently guided component.
For example, in the Egret engine, ItemTapEvent is the list click event type and TouchEvent is the common click event type; the list-event dispatch mode dispatches an ItemTapEvent in the Egret engine, and the common-click-event dispatch mode dispatches a TouchEvent in the Egret engine. If the touched position is in the simulated click area and the click event of the simulated click area is of the list click event type, a click on a list item is simulated by dispatching an ItemTapEvent in the Egret engine, thereby delivering the click event of the simulated click area and completing the operation guidance of the currently guided component. If the click event of the simulated click area is of the common click event type (such as the click event of an ordinary component like a button or picture), the click on such a component is simulated by dispatching a TouchEvent in the Egret engine, likewise delivering the click event of the simulated click area and completing the operation guidance of the currently guided component.
Whether the click event pre-associated with the simulated click area is dispatched in the Egret engine's list-event dispatch mode or its common-click-event dispatch mode, the click event can be delivered without directly clicking the component. The operation guidance of the currently guided component is therefore completed without depending on the component itself, which avoids the problem of the game being interrupted by a click failure on an occluded component and improves game fluency.
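The two dispatch modes can be mimicked without the engine. In Egret itself, the list path and the common path are both delivered through the engine's event dispatcher with their respective event types; the dependency-free sketch below uses string event types as stand-ins, and all names are illustrative, not the engine's API.

```typescript
// Dependency-free sketch of the two dispatch paths (string types stand in for
// the engine's ItemTapEvent and TouchEvent).
type Listener = () => void;

class MiniDispatcher {
  private listeners = new Map<string, Listener[]>();

  addEventListener(type: string, fn: Listener): void {
    const arr = this.listeners.get(type) ?? [];
    arr.push(fn);
    this.listeners.set(type, arr);
  }

  // Returns true if at least one listener handled the event.
  dispatchEvent(type: string): boolean {
    const arr = this.listeners.get(type) ?? [];
    arr.forEach(fn => fn());
    return arr.length > 0;
  }
}

const ITEM_TAP = "itemTap";   // stand-in for the list click event type
const TOUCH_TAP = "touchTap"; // stand-in for the common click event type

// Pick the dispatch mode based on whether the guided component is a list item.
function dispatchSimulatedClick(target: MiniDispatcher, isListItem: boolean): boolean {
  return target.dispatchEvent(isListItem ? ITEM_TAP : TOUCH_TAP);
}
```

The key point mirrored from the text: the guided component's existing listener fires from a programmatic dispatch exactly as it would from a real touch.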
To improve the realism of simulating the actual trigger effect of the real component through the simulated click area, in this embodiment the simulated click area set on the mask interface layer of the target graphical user interface matches the actual position of the currently guided component on the interaction interface layer. However, the geometric characteristics of the currently guided component are complex and variable; if they were used directly to determine the simulated click area on the mask interface layer, the amount of point data describing the simulated click area would increase greatly, and the detection response speed for the click event of the simulated click area would be reduced.
In order to reduce the subsequent data processing amount of the simulated click region and improve the response speed of the operation guide, in this embodiment, the bounding box data of the currently guided component in the mask interface layer is calculated to replace the actual geometric representation of the currently guided component, and the simulated click region of the mask interface layer in step 201 can be determined through the following steps C1 to C3:
and C1, generating the mask interface layer on the upper layer of the interaction interface layer.
To solve the occlusion problem caused by display-level disorder, this embodiment adds a mask interface layer above the interaction interface layer. A simulated click area of the currently guided component is then set on the mask interface layer according to the mapped coordinates of the currently guided component on the mask interface layer, and clicking the simulated click area simulates the actual click-event effect of the currently guided component to complete click guidance.
C2, determining bounding box data of the currently guided component on the mask interface layer based on the interface position of the currently guided component on the interaction interface layer.
The bounding box algorithm is an algorithm for solving the optimal bounding space of the discrete point set, and the basic idea is to approximately replace a complex geometric object by a geometric body with a slightly larger volume and simple characteristics.
Wherein the bounding box data refers to geometric representation data of the currently directed component on the mask interface layer derived from a bounding box representation of the currently directed component.
For example, taking the bounding box of the currently directed component as a rectangle as an example, first, the rectangular bounding box of the currently directed component on the interactive interface layer may be calculated based on the interface position of the currently directed component on the interactive interface layer and an AABB bounding box (Axis-aligned bounding box) algorithm. Then, according to the rectangular bounding box of the currently guided component on the interactive interface layer and the position mapping relation between the interactive interface layer and the mask interface layer, the rectangular bounding box of the currently guided component on the interactive interface layer is mapped onto the mask interface layer, and therefore the bounding box data of the currently guided component on the mask interface layer is obtained.
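The computation in step C2 can be sketched as: take the axis-aligned bounding box (AABB) of the component's points on the interaction layer, then map it onto the mask layer. The offset/scale form of the layer-to-layer relation and all names are assumptions for illustration.

```typescript
// Illustrative AABB computation and layer-to-layer mapping for step C2.
type Point = { x: number; y: number };
type Rect = { x: number; y: number; width: number; height: number };

function aabb(points: Point[]): Rect {
  let minX = Infinity, minY = Infinity, maxX = -Infinity, maxY = -Infinity;
  for (const p of points) {
    minX = Math.min(minX, p.x); minY = Math.min(minY, p.y);
    maxX = Math.max(maxX, p.x); maxY = Math.max(maxY, p.y);
  }
  return { x: minX, y: minY, width: maxX - minX, height: maxY - minY };
}

// Assumed position mapping: the mask layer is offset and uniformly scaled
// relative to the interaction layer.
function mapToMaskLayer(r: Rect, offsetX: number, offsetY: number, scale: number): Rect {
  return { x: r.x * scale + offsetX, y: r.y * scale + offsetY,
           width: r.width * scale, height: r.height * scale };
}
```

Replacing the component's exact geometry with its AABB keeps the simulated click area to four numbers, which is the data-reduction rationale the text gives.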
For example, in some embodiments, the bounding box data of the currently guided component mapped on the mask interface layer may also be calculated by the obj.
C3, generating a rectangular object region in the mask interface layer based on the bounding box data, to serve as the simulated click area of the currently guided component.
Wherein the rectangular object area is added with a click event for triggering completion of the operational guidance of the currently directed component.
Illustratively, taking the case where the bounding box of the currently guided component is rectangular, after the bounding box data of the currently guided component on the mask interface layer is determined, a rectangular object region fitted to the bounding box data may first be generated on the mask interface layer directly above the currently guided component. A click event is then added to the rectangular object region so that clicking it triggers completion of the operation guidance of the currently guided component, thereby simulating the real click event of the component. Finally, the rectangular object region with the added click event serves as the simulated click area of the currently guided component.
In this way, with the rectangular object region carrying the click event serving as the simulated click area of the currently guided component, completion of the operation guidance can be triggered directly by clicking the simulated click area, without touching the actual component on the interaction interface layer and without a click having to penetrate through to the currently guided component. This avoids the performance cost of penetration events and improves the running speed of the game to a certain extent.
Further, to enhance the guiding effect, the guiding special effect may also be placed on the mask interface layer, avoiding the problem that game guidance cannot be effectively completed because the guiding special effect is occluded by other page content on the interaction interface layer. In this case, before step 201, the display position of the guiding special effect on the mask interface layer may be obtained based on the bounding box data, and the guiding special effect of the currently guided component is generated on the mask interface layer according to that display position.
In addition, the guided component and the guiding special effect should be kept at the same position to enhance the guiding effect; however, if the currently running game is an H5 game, the guiding special effect can be added only after the guidance hierarchy is loaded (i.e., the currently guided component has finished loading), due to the asynchronous loading of H5 resources. Therefore, to keep the guided component and the guiding special effect at the same position, the display position of the guiding special effect on the mask interface layer is determined from the bounding box data of the currently guided component and the guiding special effect is generated on the mask interface layer, which also allows reading the local coordinates of the currently guided component and the scaling used to calibrate the guiding special effect.
Further, the click event pre-associated with the simulated click area may also be dispatched automatically in some cases, to complete the operation guidance of the currently guided component automatically. For example, when no touch operation on the mask interface layer is detected for a long time, the click event may be dispatched automatically to keep the currently running game going. As another example, testing a game's novice guidance normally requires many testers to perform manual tests that follow the guidance click by click; to reduce the test cost, the click event pre-associated with the simulated click area can be dispatched automatically, realizing automatic click behavior and thus automatic guidance. This effectively saves testers' time, allows click frequencies of different dimensions to be tested and counted, increases test coverage, helps trigger hidden bugs, and improves the robustness of the project.
Taking the automatic guidance in the game novice guidance test as an example, the control method for game guidance may further include the following steps D1 to D2 to realize the automatic click of the simulated click area, wherein:
D1, detecting whether the automatic guidance function of the currently running game is started.
Taking as an example a terminal rendering the target graphical user interface on a touch display screen by executing a game application or game applet, as shown in fig. 5, fig. 5 is a schematic process diagram of the operation guidance of a guided component in an embodiment of the present application. After the target graphical user interface is displayed on the touch display screen, whether the automatic guidance function of the currently running game is started is detected; if it is started, step D2 is entered to realize automatic guidance. Otherwise, if the automatic guidance function is not started, the click event pre-associated with the simulated click area is triggered in the response manner of steps 202 to 203.
D2, if the automatic guidance function is started, automatically dispatching the click event pre-associated with the simulated click area at a preset interval duration, to complete the operation guidance of the currently guided component.
The specific value of the preset interval duration may be set according to the actual test service scenario requirement, and the specific value of the preset interval duration is not limited here.
The manner of automatically dispatching the click event pre-associated with the simulated click area is similar to the event delivery via the event dispatching function in the Egret engine in step 203, and is not repeated here for brevity. The difference is that in step 203 the click event is dispatched only when the touched position of the mask interface layer lies within the simulated click area (i.e., during the player's touch operation), whereas here the click event is dispatched automatically every preset interval duration.
For example, in step D2, after the simulated click area is determined in automatic guidance, the delay interval of a delay function may be set to the preset interval duration, so that the click event pre-associated with the simulated click area is automatically dispatched at the delayed, designated time. On the one hand, setting different delay intervals (i.e., preset interval durations) simulates click behavior at different intervals, adding a time dimension to the simulation of player click frequency. On the other hand, increasing the test click frequency through automatic guidance saves quality-assurance staff repeated testing time, improves test efficiency, and reduces test cost.
For example, taking game novice guidance, as shown in fig. 5, the game server issues a novice-guidance instruction to the terminal currently running the game, which may trigger the novice manager to carry out the novice guidance. The guidance prompt of the currently guided component may then be highlighted in the mask interface layer by displaying the target graphical user interface of the currently running game. At this point, if the automatic guidance function of the currently running game is started, the click event pre-associated with the simulated click area is dispatched automatically, via the Egret engine's list-event dispatch mode or common-click-event dispatch mode, and the operation guidance of the currently guided component is completed in response; if the automatic guidance function is not started, the click event is dispatched after the operation of clicking the simulated click area is received, completing the operation guidance of the currently guided component. By analogy, after the dispatch of the click event is finished, a new round of novice guidance is entered.
In some cases, for example when the operation guidance has many steps, the player may temporarily not want to complete the guidance, to avoid affecting the current game progress. However, on the one hand, the current guidance cannot be exited without completing all of its steps, so the target graphical user interface including the mask interface layer remains displayed on the screen; on the other hand, since the mask interface layer covers the interaction interface layer in full screen, the game cannot continue until all steps of the guidance are completed, even in the case of weak guidance. To avoid the situation where the game interface without the mask interface layer can only be restored after all guidance steps are completed, the player should be able to exit the target graphical user interface carrying the mask interface layer when temporarily unwilling to complete all the guidance steps, and then continue touch-operating the game in the graphical user interface that has the interaction interface layer but no mask interface layer, so that the current game progress is not affected. Accordingly, the method may further include: detecting the guidance type of the currently guided component of the currently running game; and if the guidance type is weak guidance, hiding the mask interface layer when a hide-layer operation on the mask interface layer is detected.
The hidden layer operation of the mask interface layer may be set according to the requirements of the actual service scenario. For example, while the mask interface layer is displayed, it may be clicking a fixed position multiple times in succession, or sliding within a fixed region over a distance greater than a preset distance; the specific form of the hidden layer operation of the mask interface layer is not limited here.
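The two example hidden layer operations can be sketched as a small detector. The thresholds (tap radius, required tap count, minimum slide distance) are illustrative assumptions for the sketch, not values from the disclosure.

```typescript
const TAP_RADIUS = 20;   // px: how close taps must be to count as the same "fixed position"
const REQUIRED_TAPS = 3; // consecutive taps needed to hide the mask layer
const MIN_SWIPE = 100;   // px: preset slide distance

class HideLayerDetector {
  private tapCount = 0;
  private lastTap: { x: number; y: number } | null = null;

  // Returns true when enough consecutive taps land near a fixed position.
  onTap(x: number, y: number): boolean {
    if (this.lastTap && Math.hypot(x - this.lastTap.x, y - this.lastTap.y) <= TAP_RADIUS) {
      this.tapCount++;
    } else {
      this.tapCount = 1; // restart the count at a new position
    }
    this.lastTap = { x, y };
    return this.tapCount >= REQUIRED_TAPS;
  }

  // Returns true when a slide within the fixed region exceeds the preset distance.
  onSwipe(x0: number, y0: number, x1: number, y1: number): boolean {
    return Math.hypot(x1 - x0, y1 - y0) > MIN_SWIPE;
  }
}
```

Either gesture returning true would cause the mask interface layer to be hidden when the guide type is weak guide.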
When the guide type of the currently guided component is weak guide, the hidden layer operation of the mask interface layer is detected, and the mask interface layer is hidden when that operation is detected. The player can therefore actively quit the guidance and continue the game when temporarily unwilling to complete it, which avoids the problem that, after the mask interface layer is added, a weak guide would have to be completed before the game could be continued.
It can be seen from the above that, in the first aspect, by displaying the target graphical user interface including the mask interface layer, and because the mask interface layer is provided with the simulated click area of the currently guided component of the currently running game, the actual currently guided component that is blocked in the interaction interface layer can be highlighted, so the player can find and trigger the operation guidance item of the currently guided component, and the operation guidance of the blocked component can be completed effectively. In the second aspect, because a click event for triggering completion of the operation guidance of the currently guided component is added to the simulated click area in advance, the click event pre-associated with the simulated click area is triggered when the touched position falls within the simulated click area, and click-event delivery for the component is achieved indirectly. The guidance operation can therefore be completed without directly touching the actual component, so the operation guidance no longer depends on the actual component, which avoids game interruptions caused by clicks on a blocked component failing and improves game fluency. In the third aspect, dispatching the click event of the simulated click area simulates the click event of the real component in the interaction interface layer, so no penetration event passing through the interaction interface layer needs to be processed; the energy consumption caused by penetration events is avoided, and the running energy consumption of the game is reduced while ensuring that the guided component can be triggered effectively.
In order to better implement the method, the embodiment of the present application further provides a game guide control device, which may be specifically integrated in an electronic device, for example, a computer device, where the computer device may be a terminal, a server, or the like.
The terminal may be a device such as a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, or a personal computer; the server may be a single server or a server cluster composed of multiple servers.
For example, in this embodiment, the method of the embodiment of the present application will be described in detail by taking an example in which a control device for game guidance is specifically integrated in a smart phone.
For example, as shown in fig. 6, the game guidance control device may include:
the display unit 601 is configured to display a target graphical user interface of a currently running game, where the target graphical user interface includes an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
a response unit 602, configured to determine a touched position of the mask interface layer in response to a touch operation on the mask interface layer;
the control unit 603 is configured to trigger a click event pre-associated with the simulated click area if the touched position is located in the simulated click area, so as to complete operation guidance of the currently guided component.
In some embodiments, the game guidance control device further includes a processing unit (not shown in the figure), and before the target graphical user interface of the currently running game is displayed, the processing unit is specifically configured to:
generating the mask interface layer on the upper layer of the interaction interface layer;
determining bounding box data of the currently directed component on the mask interface layer based on an interface location of the currently directed component on the interaction interface layer;
generating a rectangular object region in the mask interface layer based on the bounding box data as a simulated click region of the currently directed component, wherein the rectangular object region is added with a click event for triggering completion of operational guidance of the currently directed component.
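The processing unit's steps above can be sketched as follows. The field and function names are illustrative assumptions; the coordinate conversion is trivial here because both layers are assumed to share the screen coordinate system, whereas a real engine would convert between layer-local coordinate spaces.

```typescript
interface Bounds { x: number; y: number; width: number; height: number }

// Derive bounding-box data on the mask layer from the component's interface
// position on the interaction layer. Both layers are assumed to share the
// screen coordinate system, so the conversion is an identity here.
function toMaskLayerBounds(componentBounds: Bounds): Bounds {
  return { ...componentBounds };
}

interface SimulatedRegion extends Bounds {
  onClick: () => void; // click event pre-associated with the region
}

// Generate the rectangular object region on the mask layer and attach the
// click event that triggers completion of the operation guidance.
function createSimulatedClickRegion(
  componentBounds: Bounds,
  onGuideComplete: () => void
): SimulatedRegion {
  const box = toMaskLayerBounds(componentBounds);
  return { ...box, onClick: onGuideComplete };
}
```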
In some embodiments, the mask interface layer is further provided with a guidance special effect of the currently guided component, and the processing unit is specifically configured to:
acquiring the display position of the directing special effect on the mask interface layer based on the bounding box data;
generating a directing special effect for the currently directed component at the mask interface layer according to the display position.
In some embodiments, the control unit 603 is specifically configured to:
if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for list events, so as to complete the operation guidance of the currently guided component;
or, if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for common click events, so as to complete the operation guidance of the currently guided component.
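The two dispatch modes can be mimicked with a minimal event dispatcher. In the Egret engine this would go through the engine's own event classes (an item-tap style event for list components versus an ordinary touch-tap event); the MiniDispatcher below is a self-contained stand-in for illustration only, and the event type names are assumptions.

```typescript
type Listener = (payload: unknown) => void;

// Tiny event dispatcher standing in for the engine's event system.
class MiniDispatcher {
  private listeners = new Map<string, Listener[]>();

  on(type: string, fn: Listener): void {
    const arr = this.listeners.get(type) ?? [];
    arr.push(fn);
    this.listeners.set(type, arr);
  }

  dispatch(type: string, payload?: unknown): void {
    for (const fn of this.listeners.get(type) ?? []) fn(payload);
  }
}

// List mode: the guided component is an item in a list, so the event carries
// the item index; common mode: a plain tap event with no extra data.
function dispatchGuideClick(
  target: MiniDispatcher,
  isListComponent: boolean,
  itemIndex?: number
): void {
  if (isListComponent) {
    target.dispatch("itemTap", { itemIndex });
  } else {
    target.dispatch("touchTap");
  }
}
```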
In some embodiments, the control unit 603 is specifically configured to:
detecting whether an automatic guiding function of the current running game is started or not;
and if the automatic guiding function is started, automatically dispatching the click event pre-associated with the simulated click area at a preset interval duration, so as to complete the operation guidance of the currently guided component.
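The auto-dispatch path can be sketched as below. The scheduler is injected so the logic can be driven deterministically; a real implementation would pass a wrapper around setInterval/clearInterval with the preset interval duration. All names here are illustrative.

```typescript
// A scheduler takes a callback and an interval and returns a cancel function.
type Scheduler = (fn: () => void, ms: number) => () => void;

// Dispatch each remaining guidance step's click event on a fixed interval,
// then cancel the timer once every step has completed.
function autoGuide(steps: Array<() => void>, intervalMs: number, schedule: Scheduler): void {
  let i = 0;
  const cancel = schedule(() => {
    steps[i++]();                     // dispatch the pre-associated click event
    if (i >= steps.length) cancel();  // all guidance steps completed
  }, intervalMs);
}
```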
In some embodiments, before displaying the target graphical user interface of the currently running game, the display unit 601 is specifically configured to:
detecting a type of direction of a currently directed component of the currently running game;
detecting whether the current guided component or the guide special effect of the current guided component is blocked;
in some embodiments, the display unit 601 is specifically configured to:
if the guiding type is strong guiding and the current guided component is blocked, displaying the target graphical user interface;
or if the guide type is strong guide and the guide special effect of the currently guided component is blocked, displaying the target graphical user interface.
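The display condition described by these two branches can be written as a small predicate; the names are illustrative, not from the disclosure.

```typescript
type GuideType = "strong" | "weak";

// Show the masked target GUI only for a strong guide whose component
// (or whose guidance special effect) is blocked.
function shouldShowMaskedInterface(
  guideType: GuideType,
  componentBlocked: boolean,
  effectBlocked: boolean
): boolean {
  return guideType === "strong" && (componentBlocked || effectBlocked);
}
```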
In some embodiments, before displaying the target graphical user interface of the currently running game, the display unit 601 is specifically configured to:
detecting a type of direction of a currently directed component of the currently running game;
detecting a clicked position on an interaction interface layer where the current guided component is located;
in some embodiments, the display unit 601 is specifically configured to:
and if the guide type is strong guide and the clicked position is a position other than that of the currently guided component, displaying the target graphical user interface.
In some embodiments, the game-guided control device further comprises a hiding unit (not shown in the figure), which is specifically configured to:
detecting a type of direction of a currently directed component of the currently running game;
and if the guide type is weak guide, hiding the mask interface layer when the hidden layer operation of the mask interface layer is detected.
As can be seen from the above, in the game guidance control device of this embodiment, the display unit 601 may display a target graphical user interface of a currently running game, where the target graphical user interface includes an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game; the response unit 602 determines, in response to a touch operation on the mask interface layer, the touched position of the mask interface layer; and if the touched position is located in the simulated click area, the control unit 603 triggers the click event pre-associated with the simulated click area to complete the operation guidance of the currently guided component. The game guidance control device provided by the embodiment of the present application can therefore bring the following technical effects. In the first aspect, by displaying the target graphical user interface including the mask interface layer, and since the simulated click area of the currently guided component of the currently running game is arranged on the mask interface layer, the actual currently guided component that is blocked in the interaction interface layer can be highlighted, so the player can find and trigger the operation guidance item of the currently guided component, and the operation guidance of the blocked component can be completed effectively.
In the second aspect, because a click event for triggering completion of the operation guidance of the currently guided component is added to the simulated click area in advance, the click event pre-associated with the simulated click area is triggered when the touched position falls within the simulated click area, and click-event delivery for the component is achieved indirectly. The guidance operation can therefore be completed without directly touching the actual component, so the operation guidance no longer depends on the actual component, which avoids game interruptions caused by clicks on a blocked component failing and improves game fluency. In the third aspect, dispatching the click event of the simulated click area simulates the click event of the real component in the interaction interface layer, so no penetration event passing through the interaction interface layer needs to be processed; the energy consumption caused by penetration events is avoided, and the running energy consumption of the game is reduced while ensuring that the guided component can be triggered effectively.
Correspondingly, the embodiment of the present application further provides an electronic device. The electronic device may be a terminal, such as a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a Personal Computer (PC), or a Personal Digital Assistant (PDA). As shown in fig. 7, fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present application. The electronic device 700 includes a processor 701 having one or more processing cores, a memory 702 having one or more computer-readable storage media, and a computer program stored on the memory 702 and executable on the processor. The processor 701 is electrically connected to the memory 702. Those skilled in the art will appreciate that the electronic device structure shown in the figure does not constitute a limitation of the electronic device, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The processor 701 is a control center of the electronic device 700, connects various parts of the entire electronic device 700 using various interfaces and lines, and performs various functions of the electronic device 700 and processes data by running or loading software programs and/or modules stored in the memory 702 and calling data stored in the memory 702, thereby performing overall monitoring of the electronic device 700.
In this embodiment, the processor 701 in the electronic device 700 loads instructions corresponding to processes of one or more application programs into the memory 702 according to the following steps, and the processor 701 executes the application program stored in the memory 702, so as to implement various functions:
displaying a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
responding to the touch operation of the mask interface layer, and determining the touched position of the mask interface layer;
and if the touched position is located in the simulated clicking area, triggering a clicking event pre-associated with the simulated clicking area to complete the operation guidance of the current guided component.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
generating the mask interface layer on the upper layer of the interaction interface layer;
determining bounding box data of the currently directed component on the mask interface layer based on the interface location of the currently directed component on the interaction interface layer;
generating a rectangular object region in the mask interface layer based on the bounding box data as a simulated click region of the currently directed component, wherein the rectangular object region is added with a click event for triggering completion of operational guidance of the currently directed component.
In some embodiments, the mask interface layer is further provided with a directing effect of the currently directed component, the method further comprising:
acquiring the display position of the directing special effect on the mask interface layer based on the bounding box data;
generating a directing special effect for the currently directed component at the mask interface layer according to the display position.
In some embodiments, the triggering a click event pre-associated with the simulated click region to complete operation guidance of the currently directed component if the touched position is within the simulated click region includes:
if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for list events, so as to complete the operation guidance of the currently guided component;
or, if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for common click events, so as to complete the operation guidance of the currently guided component.
In some embodiments, the method further comprises:
detecting whether an automatic guide function of the current running game is started;
if the automatic guiding function is started, automatically dispatching the click event pre-associated with the simulated click area at a preset interval duration, so as to complete the operation guidance of the currently guided component.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
detecting whether the current guided component or the guide special effect of the current guided component is blocked;
the displaying of the target graphical user interface of the currently running game comprises:
if the guiding type is strong guiding and the current guided component is blocked, displaying the target graphical user interface;
or if the guide type is strong guide and the guide special effect of the currently guided component is blocked, displaying the target graphical user interface.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
detecting a clicked position on an interaction interface layer where the current guided component is located;
the displaying of the target graphical user interface of the currently running game comprises:
and if the guide type is strong guide and the clicked position is a position other than that of the currently guided component, displaying the target graphical user interface.
In some embodiments, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
and if the guide type is weak guide, hiding the mask interface layer when the hidden layer operation of the mask interface layer is detected.
Therefore, the electronic device 700 provided by this embodiment can bring the following technical effects. In the first aspect, by displaying the target graphical user interface including the mask interface layer, and since the simulated click area of the currently guided component of the currently running game is arranged on the mask interface layer, the actual currently guided component that is blocked in the interaction interface layer can be highlighted, so the player can find and trigger the operation guidance item of the currently guided component, and the operation guidance of the blocked component can be completed effectively. In the second aspect, because a click event for triggering completion of the operation guidance of the currently guided component is added to the simulated click area in advance, the click event pre-associated with the simulated click area is triggered when the touched position falls within the simulated click area, and click-event delivery for the component is achieved indirectly. The guidance operation can therefore be completed without directly touching the actual component, so the operation guidance no longer depends on the actual component, which avoids game interruptions caused by clicks on a blocked component failing and improves game fluency. In the third aspect, dispatching the click event of the simulated click area simulates the click event of the real component in the interaction interface layer, so no penetration event passing through the interaction interface layer needs to be processed; the energy consumption caused by penetration events is avoided, and the running energy consumption of the game is reduced while ensuring that the guided component can be triggered effectively.
The above operations can be implemented in the foregoing embodiments, and are not described in detail herein.
Optionally, as shown in fig. 7, the electronic device 700 further includes: a touch display screen 703, a radio frequency circuit 704, an audio circuit 705, an input unit 706, and a power supply 707. The processor 701 is electrically connected to the touch display screen 703, the radio frequency circuit 704, the audio circuit 705, the input unit 706, and the power source 707. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 7 does not constitute a limitation of the electronic device and may include more or fewer components than shown, or some components may be combined, or a different arrangement of components.
The touch display screen 703 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display screen 703 may include a display panel and a touch panel. The display panel may be used to display information input by or provided to the user, as well as various graphical user interfaces of the electronic device, which may be made up of graphics, text, icons, video, and any combination thereof. Optionally, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations of the user on or near it (for example, operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which execute the corresponding programs. Optionally, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends them to the processor 701, and it can also receive and execute commands sent by the processor 701. The touch panel may overlay the display panel; when the touch panel detects a touch operation on or near it, it transmits the operation to the processor 701 to determine the type of the touch event, and the processor 701 then provides a corresponding visual output on the display panel according to the type of the touch event.
In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display screen 703 to realize the input and output functions. In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions. That is, the touch display screen 703 can also serve as a part of the input unit 706 to implement an input function.
The radio frequency circuit 704 may be used to transmit and receive radio frequency signals, so as to establish wireless communication with a network device or other electronic devices and to exchange signals with them.
The audio circuit 705 may be used to provide an audio interface between the user and the electronic device through a speaker and a microphone. The audio circuit 705 may transmit the electrical signal converted from received audio data to the speaker, which converts it into a sound signal for output; conversely, the microphone converts a collected sound signal into an electrical signal, which the audio circuit 705 receives and converts into audio data. The audio data is then output to the processor 701 for processing, after which it may be sent, for example, to another electronic device via the radio frequency circuit 704, or output to the memory 702 for further processing. The audio circuit 705 may also include an earphone jack to provide communication between peripheral earphones and the electronic device.
The input unit 706 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 707 is used to supply power to the various components of the electronic device 700. Optionally, the power source 707 may be logically connected to the processor 701 through a power management system, so as to implement functions of managing charging, discharging, power consumption, and the like through the power management system. The power supply 707 may also include any component of one or more dc or ac power sources, recharging systems, power failure detection circuitry, power converters or inverters, power status indicators, and the like.
Although not shown in fig. 7, the electronic device 700 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described in detail herein.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, and the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, the present application provides a computer-readable storage medium, in which a plurality of computer programs are stored, and the computer programs can be loaded by a processor to execute the steps in any one of the game guidance control methods provided by the embodiments of the present application. For example, the computer program may perform the steps of:
displaying a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
responding to the touch operation of the mask interface layer, and determining the touched position of the mask interface layer;
and if the touched position is located in the simulated clicking area, triggering a clicking event pre-associated with the simulated clicking area to complete the operation guidance of the current guided component.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
generating the mask interface layer on the upper layer of the interaction interface layer;
determining bounding box data of the currently directed component on the mask interface layer based on an interface location of the currently directed component on the interaction interface layer;
generating a rectangular object region at the mask interface layer based on the bounding box data as a simulated click region of the currently directed component, wherein the rectangular object region is augmented with a click event for triggering completion of operational guidance of the currently directed component.
In some embodiments, the mask interface layer is further provided with a directing effect of the currently directed component, the method further comprising:
acquiring the display position of the directing special effect on the mask interface layer based on the bounding box data;
generating a directing special effect for the currently directed component at the mask interface layer according to the display position.
In some embodiments, the triggering a click event pre-associated with the simulated click region to complete operation guidance of the currently directed component if the touched position is within the simulated click region includes:
if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for list events, so as to complete the operation guidance of the currently guided component;
or, if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area through the dispatch mode for common click events, so as to complete the operation guidance of the currently guided component.
In some embodiments, the method further comprises:
detecting whether an automatic guide function of the current running game is started;
and if the automatic guiding function is started, automatically dispatching the click event pre-associated with the simulated click area at a preset interval duration, so as to complete the operation guidance of the currently guided component.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
detecting whether the current guided component or the guide special effect of the current guided component is blocked;
the displaying of the target graphical user interface of the currently running game comprises:
if the guiding type is strong guiding and the current guided component is blocked, displaying the target graphical user interface;
or if the guide type is strong guide and the guide special effect of the currently guided component is blocked, displaying the target graphical user interface.
In some embodiments, before displaying the target graphical user interface of the currently running game, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
detecting a clicked position on an interaction interface layer where the current guided component is located;
the displaying of the target graphical user interface of the currently running game comprises:
and if the guide type is strong guide and the clicked position is a position other than that of the currently guided component, displaying the target graphical user interface.
In some embodiments, the method further comprises:
detecting a type of direction of a currently directed component of the currently running game;
and if the guide type is weak guide, hiding the mask interface layer when the hidden layer operation of the mask interface layer is detected.
It can be seen that the computer program can be loaded by a processor to execute the steps in any of the game guidance control methods provided in the embodiments of the present application, so as to achieve the following technical effects. In the first aspect, by displaying the target graphical user interface including the mask interface layer, and since the simulated click area of the currently guided component of the currently running game is arranged on the mask interface layer, the actual currently guided component that is blocked in the interaction interface layer can be highlighted, so the player can find and trigger the operation guidance item of the currently guided component, and the operation guidance of the blocked component can be completed effectively. In the second aspect, because a click event for triggering completion of the operation guidance of the currently guided component is added to the simulated click area in advance, the click event pre-associated with the simulated click area is triggered when the touched position falls within the simulated click area, and click-event delivery for the component is achieved indirectly. The guidance operation can therefore be completed without directly touching the actual component, so the operation guidance no longer depends on the actual component, which avoids game interruptions caused by clicks on a blocked component failing and improves game fluency.
In the third aspect, because the click event of the real component in the interaction interface layer is simulated by dispatching the click event of the simulated click area, there is no need to process a penetration event that passes through the interaction interface layer. The energy consumption caused by penetration events is thereby avoided, and the running energy consumption of the game is reduced while ensuring that the guided component can be effectively triggered.
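The core mechanism described in these three aspects can be sketched as a mask layer holding a rectangular simulated click area (derived from the guided component's bounding box, per claim 2) with a pre-associated click event (per claim 1). This is a minimal illustrative sketch, not the patented implementation; the names `Rect`, `MaskLayer`, and `on_guide_click` are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rect:
    """Axis-aligned bounding box: top-left corner (x, y) plus width/height."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h

class MaskLayer:
    """Mask interface layer drawn above the interaction interface layer.

    A rectangular simulated click area is derived from the guided
    component's bounding box data, and a click event (here a callback)
    is pre-associated with that area.
    """

    def __init__(self, component_bounds: Rect, on_guide_click: Callable[[], None]):
        self.simulated_click_area = component_bounds  # rect built from bounding box data
        self._click_event = on_guide_click            # pre-associated click event
        self.visible = True

    def on_touch(self, px: float, py: float) -> bool:
        """Handle a touch on the mask layer.

        If the touched position lies inside the simulated click area,
        dispatch the pre-associated click event; the touch never needs to
        penetrate down to the real, possibly blocked component underneath.
        Returns True if the guidance click was dispatched.
        """
        if self.visible and self.simulated_click_area.contains(px, py):
            self._click_event()
            return True
        return False
```

A usage sketch: `MaskLayer(Rect(100, 200, 80, 40), on_guide_click=complete_guidance)` dispatches `complete_guidance` for any touch inside the 80x40 region, which is why no penetration event through the interaction interface layer is required.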
For details of the above operations, reference may be made to the foregoing embodiments, which are not repeated here.
The computer-readable storage medium may include: a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, and the like.
Since the computer program stored in the computer-readable storage medium can execute the steps in any of the game guidance control methods provided in the embodiments of the present application, it can achieve the beneficial effects achievable by any of those methods; for details, refer to the foregoing embodiments, which are not repeated here.
The game guidance control method, apparatus, electronic device, and computer-readable storage medium provided in the embodiments of the present application have been described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, those skilled in the art may, following the idea of the present application, make changes to the specific implementations and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (11)

1. A game guidance control method, comprising:
displaying a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer on the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
in response to a touch operation on the mask interface layer, determining a touched position on the mask interface layer;
and if the touched position is located in the simulated click area, triggering a click event pre-associated with the simulated click area to complete the operation guidance of the currently guided component.
2. The game guidance control method according to claim 1, wherein before the displaying of the target graphical user interface of the currently running game, the method further comprises:
generating the mask interface layer above the interaction interface layer;
determining bounding box data of the currently guided component on the mask interface layer based on an interface position of the currently guided component on the interaction interface layer;
and generating a rectangular object region in the mask interface layer based on the bounding box data as the simulated click area of the currently guided component, wherein a click event for triggering completion of the operation guidance of the currently guided component is added to the rectangular object region.
3. The game guidance control method according to claim 2, wherein the mask interface layer is further provided with a guidance special effect for the currently guided component, and the method further comprises:
acquiring a display position of the guidance special effect on the mask interface layer based on the bounding box data;
and generating the guidance special effect for the currently guided component on the mask interface layer according to the display position.
4. The game guidance control method according to claim 1, wherein if the touched position is located in the simulated click area, triggering a click event pre-associated with the simulated click area to complete the operation guidance of the currently guided component comprises:
if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area by way of list-event dispatching, to complete the operation guidance of the currently guided component;
or if the touched position is located in the simulated click area, dispatching the click event pre-associated with the simulated click area by way of ordinary click-event dispatching, to complete the operation guidance of the currently guided component.
5. The game guidance control method according to claim 1, wherein the method further comprises:
detecting whether an automatic guidance function of the currently running game is enabled;
and if the automatic guidance function is enabled, automatically dispatching the click event pre-associated with the simulated click area at a preset interval, to complete the operation guidance of the currently guided component.
6. The game guidance control method according to claim 1, wherein before the displaying of the target graphical user interface of the currently running game, the method further comprises:
detecting a guidance type of the currently guided component of the currently running game;
detecting whether the currently guided component, or a guidance special effect of the currently guided component, is blocked;
and the displaying of the target graphical user interface of the currently running game comprises:
if the guidance type is a strong guide and the currently guided component is blocked, displaying the target graphical user interface;
or if the guidance type is a strong guide and the guidance special effect of the currently guided component is blocked, displaying the target graphical user interface.
7. The game guidance control method according to claim 1, wherein before the displaying of the target graphical user interface of the currently running game, the method further comprises:
detecting a guidance type of the currently guided component of the currently running game;
detecting a clicked position on the interaction interface layer where the currently guided component is located;
and the displaying of the target graphical user interface of the currently running game comprises:
and if the guidance type is a strong guide and the clicked position is a position other than the currently guided component, displaying the target graphical user interface.
8. The game guidance control method according to any one of claims 1 to 7, wherein the method further comprises:
detecting a guidance type of the currently guided component of the currently running game;
and if the guidance type is a weak guide, hiding the mask interface layer when a layer-hiding operation on the mask interface layer is detected.
9. A game guidance control device, comprising:
a display unit, configured to display a target graphical user interface of a currently running game, wherein the target graphical user interface comprises an interaction interface layer of the currently running game and a mask interface layer above the interaction interface layer, and the mask interface layer is provided with a simulated click area of a currently guided component of the currently running game;
a response unit, configured to determine a touched position on the mask interface layer in response to a touch operation on the mask interface layer;
and a control unit, configured to trigger a click event pre-associated with the simulated click area if the touched position is located in the simulated click area, to complete the operation guidance of the currently guided component.
10. An electronic device, comprising a processor and a memory, wherein the memory stores a plurality of instructions, and the processor loads the instructions from the memory to perform the steps in the game guidance control method according to any one of claims 1 to 8.
11. A computer-readable storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the game guidance control method according to any one of claims 1 to 8.
CN202210880351.0A 2022-07-25 2022-07-25 Control method and device for game guidance, electronic equipment and readable storage medium Pending CN115212568A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210880351.0A CN115212568A (en) 2022-07-25 2022-07-25 Control method and device for game guidance, electronic equipment and readable storage medium


Publications (1)

Publication Number Publication Date
CN115212568A (en)

Family

ID=83614184

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210880351.0A Pending CN115212568A (en) 2022-07-25 2022-07-25 Control method and device for game guidance, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN115212568A (en)

Similar Documents

Publication Publication Date Title
US11833426B2 (en) Virtual object control method and related apparatus
CN113101652A (en) Information display method and device, computer equipment and storage medium
AU2021250929A1 (en) Virtual object control method and apparatus, device, and storage medium
CN111760274A (en) Skill control method and device, storage medium and computer equipment
CN113082712A (en) Control method and device of virtual role, computer equipment and storage medium
CN113398590B (en) Sound processing method, device, computer equipment and storage medium
CN113350793B (en) Interface element setting method and device, electronic equipment and storage medium
CN113485617A (en) Animation display method and device, electronic equipment and storage medium
CN113332716A (en) Virtual article processing method and device, computer equipment and storage medium
CN114189731B (en) Feedback method, device, equipment and storage medium after giving virtual gift
CN112799754B (en) Information processing method, information processing device, storage medium and computer equipment
CN115999153A (en) Virtual character control method and device, storage medium and terminal equipment
CN115212568A (en) Control method and device for game guidance, electronic equipment and readable storage medium
CN114225412A (en) Information processing method, information processing device, computer equipment and storage medium
US20240173626A1 (en) Method and apparatus for interaction in virtual environment
CN115040867A (en) Game card control method and device, computer equipment and storage medium
CN115430145A (en) Target position interaction method and device, electronic equipment and readable storage medium
CN115430151A (en) Game role control method and device, electronic equipment and readable storage medium
CN115193062A (en) Game control method, device, storage medium and computer equipment
CN115068943A (en) Game card control method and device, computer equipment and storage medium
CN115212569A (en) Virtual property prompting method and device, storage medium and electronic equipment
CN116370960A (en) Virtual character selection method, device, electronic equipment and storage medium
CN114191814A (en) Information processing method, information processing device, computer equipment and storage medium
CN116421968A (en) Virtual character control method, device, electronic equipment and storage medium
CN116328315A (en) Virtual model processing method, device, terminal and storage medium based on block chain

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination