CN112587925B - Guide information display method and device, storage medium and computer equipment - Google Patents


Info

Publication number
CN112587925B
Authority
CN
China
Prior art keywords: control, controls, queriable, current game, game interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011639207.5A
Other languages
Chinese (zh)
Other versions
CN112587925A (en)
Inventor
刘海岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202011639207.5A
Publication of CN112587925A
Application granted
Publication of CN112587925B
Legal status: Active
Anticipated expiration


Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 — Controlling the output signals based on the game progress
    • A63F13/53 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F13/533 — Controlling the output signals based on the game progress involving additional visual information provided to the game scene for prompting the player, e.g. by displaying a game menu
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 — Interaction techniques based on GUI based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 — Interaction techniques based on GUI using icons
    • G06F3/0483 — Interaction with page-structured environments, e.g. book metaphor
    • G06F3/0484 — Interaction techniques based on GUI for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 — Selection of displayed objects or displayed text elements
    • G06F3/04845 — Interaction techniques based on GUI for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0487 — Interaction techniques based on GUI using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 — Interaction techniques based on GUI using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a guide information display method and apparatus, a storage medium, and a computer device. The method comprises the following steps: when a guide query instruction is detected, determining a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, wherein different queriable controls correspond to different guide information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control; determining a target control from the queriable controls in response to a touch operation by the user on the queriable controls and/or the guide control on the current game interface; and querying the guide information corresponding to the target control and displaying the queried guide information on the current game interface. In this way, a guide explanation for a single function key can be obtained quickly during gameplay; the approach is highly flexible, requires no text input, and is simple and convenient to operate.

Description

Guide information display method and device, storage medium and computer equipment
Technical Field
The present application relates to the field of game design, and in particular, to a method and apparatus for displaying guidance information, a storage medium, and a computer device.
Background
When a game is produced and released, a player who encounters it for the first time does not know how to play it, which may cause some novice players to abandon the game. To address this, game guidance techniques have been developed that use scripted control to teach players how to control characters and perform the various basic operations in a game.
The novice guidance in existing games mostly adopts a forced mode: for example, when a player enters a certain game scene for the first time, the guide explanation for the key skills of that scene is triggered automatically. Such forced guidance is generally suited to guiding the game as a whole; if a player wants to learn the role of a particular key during play, the relevant guide explanation must be looked up through text input, which makes the operation cumbersome and the flexibility low.
Disclosure of Invention
The application aims to provide a guide information display method and apparatus, a storage medium, and a computer device that can quickly provide a guide explanation for a single function key with high flexibility.
The embodiment of the application provides a method for displaying guide information, which comprises the following steps:
When a guide query instruction is detected, determining a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, wherein different queriable controls correspond to different guide information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control;
determining a target control from the queriable controls in response to a touch operation by the user on the queriable controls and/or the guide control on the current game interface;
querying the guide information corresponding to the target control, and displaying the queried guide information on the current game interface.
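Expressed as code, the three steps above can be sketched as follows. This is a minimal Python illustration; names such as `Control`, `GuideUI`, and `guide_db` are hypothetical, since the embodiment specifies no concrete implementation:

```python
from dataclasses import dataclass

# Minimal sketch of the three-step flow above. All names here (Control,
# GuideUI, guide_db) are hypothetical illustrations, not from the patent.

@dataclass(frozen=True)
class Control:
    id: str

class GuideUI:
    def __init__(self, function_controls, guide_db):
        self.function_controls = function_controls  # controls shown on the current game interface
        self.guide_db = guide_db                    # maps a control id to its guide information
        self.queriable = []

    def on_guide_query(self):
        # Step 1: determine the queriable controls from the displayed functional controls
        self.queriable = [c for c in self.function_controls if c.id in self.guide_db]
        return self.queriable

    def on_touch(self, control):
        # Step 2: a touch on a queriable control selects it as the target control
        return control if control in self.queriable else None

    def show_guide(self, target):
        # Step 3: query the guide information for the target control
        return self.guide_db[target.id]
```

A touch on a control outside the queriable set selects nothing, matching the idea that only queriable controls carry guide information.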
The embodiment of the application also provides a display device of the guiding information, which comprises:
a first determining module, configured to determine a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface when a guide query instruction is detected, wherein different queriable controls correspond to different guide information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control;
a second determining module, configured to determine a target control from the queriable controls in response to a touch operation by the user on the queriable controls and/or the guide control on the current game interface;
and a query display module, configured to query the guide information corresponding to the target control and display the queried guide information on the current game interface.
The touch operation includes a drag operation, and the second determining module is specifically configured to: when any of the queriable controls on the current game interface is dragged to the guide control, the dragged queriable control is used as a target control; or when dragging the guiding control on the current game interface to any queriable control, taking the corresponding queriable control as a target control.
The touch operation includes a clicking operation, and the second determining module is specifically configured to:
and when the click operation occurs to the guide control and any one of the queriable controls and the click operation meets the preset condition, taking the clicked queriable control as a target control.
Wherein the preset condition includes: the click start time of the guide control is earlier than the click start time of the clicked queriable control, and the click end time of the guide control is equal to the click end time of the clicked queriable control.
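As a sketch, this preset condition on the two clicks could be checked like the following hypothetical Python. The end-time tolerance is an added assumption, since two real touch events never end at exactly the same timestamp:

```python
def is_combined_click(guide_click, target_click, end_tolerance=0.05):
    # Preset-condition sketch: the guide control's click starts first and
    # both clicks end together. Clicks are (start, end) time tuples in
    # seconds; the end tolerance is an added assumption.
    g_start, g_end = guide_click
    t_start, t_end = target_click
    return g_start < t_start and abs(g_end - t_end) <= end_tolerance
```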
Wherein the touch operation includes a click operation, and the determining, in response to a touch operation of the user on the current game interface on the queriable control, a target control from the queriable controls includes:
and when any one of the queriable controls receives multiple consecutive click operations, and the number of click operations equals a preset number, taking the clicked queriable control as a target control.
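A consecutive-click counter of the kind described could look like the following sketch (hypothetical Python; the preset count and the maximum interval between clicks are assumed parameters, not specified in the text):

```python
class MultiClickDetector:
    # Counts consecutive clicks on the same control; when the count reaches
    # the preset number (e.g. 2 for a double click), that control becomes
    # the target. The interval threshold is an assumption.
    def __init__(self, preset_count=2, max_interval=0.4):
        self.preset_count = preset_count
        self.max_interval = max_interval
        self.last_control = None
        self.last_time = None
        self.count = 0

    def on_click(self, control, timestamp):
        if control == self.last_control and self.last_time is not None \
                and timestamp - self.last_time <= self.max_interval:
            self.count += 1
        else:
            self.count = 1
        self.last_control, self.last_time = control, timestamp
        if self.count >= self.preset_count:
            self.count = 0
            return control  # this queriable control is the target
        return None
```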
Wherein, the display device of the guiding information further comprises a highlighting module for:
after the first determining module determines a plurality of queriable controls according to the plurality of functional controls displayed on the current game interface, highlighting the queriable controls according to a preset color and a preset contrast brightness.
The first determining module is specifically configured to:
determining a display area of each functional control displayed on the current game interface;
when the display area is larger than a preset threshold value, generating a preset control as a corresponding queriable control;
And when the display area is smaller than or equal to a preset threshold value, the corresponding function control is used as a queriable control.
The embodiment of the application also provides a computer readable storage medium, wherein a plurality of instructions are stored in the storage medium, and the instructions are suitable for being loaded by a processor to execute the method for displaying the guiding information.
The embodiment of the application also provides computer equipment, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the processor realizes the steps in the method for displaying any guide information when executing the computer program.
According to the guide information display method and apparatus, storage medium, and computer device provided by the application, when a guide query instruction is detected, a plurality of queriable controls are determined according to the plurality of functional controls displayed on the current game interface, with different queriable controls corresponding to different guide information. A target control is then determined from the queriable controls in response to the user's touch operation on the queriable controls and/or the guide control on the current game interface, and the guide information corresponding to the target control is queried and displayed. In this way, a guide explanation for a single function key can be obtained quickly during gameplay; the flexibility is high, no text input is needed, and the operation is simple and convenient.
Drawings
The technical solution and other advantageous effects of the present application will be made apparent by the following detailed description of the specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic view of a scenario of a method for displaying guidance information according to an embodiment of the present application;
fig. 2 is a flow chart of a method for displaying guidance information according to an embodiment of the present application;
FIG. 3 is a schematic illustration of a game interface before the guide function is triggered according to an embodiment of the present application;
fig. 4 is another flow chart of a method for displaying guidance information according to an embodiment of the present application;
FIG. 5 is a schematic illustration of a game interface before and after the guide function is triggered according to an embodiment of the present application;
FIG. 6 is a schematic illustration showing a drag operation performed by a user on a game interface according to an embodiment of the present application;
FIG. 7 is a schematic illustration of a pressing operation performed by a user on a game interface according to an embodiment of the present application;
FIG. 8 is a schematic illustration showing a clicking operation performed by a user on a game interface according to an embodiment of the present application;
Fig. 9 is a schematic structural diagram of a display device for guiding information according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. The described embodiments are only some, not all, of the embodiments of the application. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the application.
The embodiment of the application provides a method and a device for displaying guide information, a storage medium and computer equipment.
Referring to fig. 1, fig. 1 is a schematic view of a scenario of the guide information display method according to an embodiment of the present application. The method may be applied to a computer device, which may specifically be a terminal or a server; the terminal may include, but is not limited to, a smart phone, a tablet computer, or a game console, and the server may be a game server.
When a guide query instruction is detected, the computer device may determine a plurality of queriable controls according to the plurality of functional controls displayed on the current game interface, wherein different queriable controls correspond to different guide information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control; determine a target control from the queriable controls in response to a touch operation by the user on the queriable controls and/or the guide control on the current game interface; and query the guide information corresponding to the target control and display the queried guide information on the current game interface.
The guide query instruction is usually generated by a manual trigger, which may be touch, voice, gesture, and the like. The guide control may be used to trigger generation of the guide query instruction, for example when the user long-presses or clicks the guide control. Functional controls may be used to trigger common functions and character skills in a game; clicking different functional controls triggers different game operations. The guide information is mainly used to provide usage descriptions of basic functions and character skills in the game; it can be displayed as text and/or graphics, may even include a demonstration animation of the operation, and can be provided by the game server backend or carried in the data package downloaded with the game application. All or some of the functional controls can be used directly as queriable controls, new controls can be generated from all or some of the functional controls to serve as queriable controls, or the remaining functional controls and the new controls can together serve as the queriable controls. Touch operations may include dragging, pressing, clicking, and the like.
For example, referring to fig. 1, the computer device may be a mobile terminal, and the game interface may include a minimap, a self-healing skill, an acceleration skill, an enhancement skill, a dialog box, and other functional controls, with a "?" icon representing the guide control. After the guide query instruction is triggered and the queriable controls are determined, all of the queriable controls are highlighted by changing their display color and/or brightness (in fig. 1, icons drawn with bold outlines represent highlighted queriable controls, while icons drawn with thin outlines represent functional controls that do not contain guide information) to prompt the user to perform a drag operation. When the user drags the guide control onto the "enhance" skill button, or drags the "enhance" skill button onto the guide control, the "enhance" skill button is taken as the target control, and the relevant guide information corresponding to it is obtained from the server or the local game data package and displayed on the game interface.
As shown in fig. 2, fig. 2 is a flow chart of a method for displaying guide information according to an embodiment of the present application, where the method for displaying guide information is applied to a computer device, and a specific flow may be as follows:
S101, when a guiding inquiry instruction is detected, determining a plurality of inquireable controls according to a plurality of functional controls displayed on a current game interface, wherein different inquireable controls correspond to different guiding information, the inquireable controls comprise at least one functional control, and the current game interface also comprises a guiding control.
The guide query instruction is usually generated by a manual trigger, which may be touch, voice, gesture, and the like. The guide control may be used to trigger generation of the guide query instruction, for example when the user long-presses, hard-presses, drags, double-clicks, or clicks the guide control. That is, before the guide query instruction is detected, the display method may further include: generating a guide query instruction in response to a touch operation by the user on the guide control displayed on the current game interface. For example, referring to fig. 3, the "?" icon represents the guide control; when the user clicks it, generation of the guide query instruction may be triggered. As another example, dragging the guide control (i.e., a sliding operation whose touch start point is on the guide control) may trigger generation of the guide query instruction. If the drag operation continues and the guide control is dragged directly onto a target control, the relevant guide information of that target control can be queried and displayed immediately, so that triggering the guide query instruction and then querying and displaying the guide information of the target control are completed in a single gesture.
In general, the game interface may include functional controls representing basic functions, such as a small map a, a dialog box, a character role display column B, and the like, and may also include functional controls representing specific character skills, such as self-healing skills, accelerating skills, strengthening skills, and the like, and specifically referring to fig. 3, a user clicking different functional controls may trigger different game operations. The guiding information is mainly used for providing the use description of some basic functions and character skills in the game, can be displayed in the form of characters and/or graphics, and can even provide the demonstration animation of the use operation, and can be provided by a game server background or carried in a data packet when the game application is downloaded.
Referring to fig. 4, the step S101 may specifically include:
s1011, determining a display area of each functional control displayed on the current game interface;
s1012, when the area of the display area is larger than a preset threshold value, generating a preset control as a corresponding queriable control; and when the area of the display area is smaller than or equal to a preset threshold value, the corresponding function control is used as a queriable control.
The preset threshold may be set manually in advance. Each functional control may be a single control; for example, a single character-skill button such as the self-healing, acceleration, or enhancement skill in fig. 3 can serve as one functional control. A functional control may also be composed of several small controls. For the character role display column in fig. 3, for example, each side in a battle may involve several character roles, so the column contains several character role display buttons, and clicking different buttons shows different character introductions; all the character role display buttons of one side together form a single functional control. For such a functional control the display area is usually large, so a preset control is generated, with the center point of the display area as the reference point, to serve as the queriable control (i.e., a new functional control is generated). The guide information corresponding to the preset control is the guide information the system has set for the underlying functional control. The shape and size of the preset control can be set manually, for example circular, square, or five-pointed-star shaped. Referring to fig. 5, the icons drawn with bold outlines represent queriable controls; it is easy to see that the two functional controls, the minimap A and the character role display column B, are not used directly as queriable controls, but each has a corresponding new control generated for it.
Of course, a new control need not be generated; the minimap A or the character role display column B may be used directly, as a whole, as the queriable control. As long as the touch operation acts on any part of the minimap A, or on one of the small controls in the character role display column B, the display of the guide information for the minimap A or the character role display column B is triggered.
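Steps S1011 and S1012 could be sketched as follows (hypothetical Python; the threshold value, the 40×40 preset-control size, and the `UIControl` type are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class UIControl:
    name: str
    x: float
    y: float
    w: float
    h: float

def build_queriable_controls(function_controls, area_threshold=10000.0):
    # S1011/S1012 sketch: a small control is used directly as a queriable
    # control; for a large control (e.g. a minimap or character column) a
    # preset control is generated at the center of its display area.
    queriable = []
    for ctrl in function_controls:
        if ctrl.w * ctrl.h > area_threshold:
            cx, cy = ctrl.x + ctrl.w / 2, ctrl.y + ctrl.h / 2
            queriable.append(UIControl(ctrl.name + "_query", cx - 20, cy - 20, 40, 40))
        else:
            queriable.append(ctrl)
    return queriable
```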
It should be noted that, after determining the queriable controls, to remind the user of which controls can be selected, the queriable controls and the remaining controls on the game interface may be displayed differently in a highlighted manner, for example, after step S101 described above, the method may further include:
And highlighting the plurality of queriable controls according to a preset color and/or preset contrast brightness.
The preset color and the preset contrast brightness can be set manually in advance. Specifically, the brightness of the game interface and of the queriable controls can be adjusted according to the preset contrast brightness so that the queriable controls stand out from the game interface, and the edge outline of each queriable control can further be set to a preset color, such as yellow, to make the queriable controls even more prominent.
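As an illustration, the highlighting step might compute a dimmed interface brightness and an outline color per queriable control (hypothetical Python; the yellow color, the contrast ratio, and the name-keyed result are all assumptions):

```python
def highlight_styles(queriable_names, interface_brightness=1.0,
                     preset_color=(255, 215, 0), preset_contrast=1.5):
    # Highlighting sketch: the rest of the interface is dimmed relative to
    # the queriable controls, and each queriable control's edge outline is
    # set to a preset color (yellow here). All concrete values are assumptions.
    dimmed_brightness = interface_brightness / preset_contrast
    outline_colors = {name: preset_color for name in queriable_names}
    return dimmed_brightness, outline_colors
```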
S102, responding to touch operation of a user on the current game interface on the inquireable control and/or the guide control, and determining a target control from the inquireable control.
The touch operation may include dragging, pressing, clicking, and so on. The user may touch only the queriable control or only the guide control; for example, the guide control may be dragged onto the queriable control whose guide explanation is wanted, the queriable control may be dragged onto the guide control, or the queriable control may be double-clicked, hard-pressed, or long-pressed. In other embodiments, the user may touch both the queriable control and the guide control, for example by pressing the guide control and the queriable control that needs a guide explanation at the same time, and so on.
In some embodiments, when the touch operation includes a drag operation, only the queriable control or the guide control may be dragged, and the determining, in response to the touch operation of the user on the current game interface on the queriable control or the guide control, the target control from the queriable control may specifically include:
When any of the queriable controls on the current game interface is dragged to the guide control, the dragged queriable control is used as a target control; or alternatively
And when the guiding control on the current game interface is dragged to any of the queriable controls, the corresponding queriable control is used as a target control.
The user may drag the guide control or the queriable control separately so that the two partially overlap, completely overlap, or come very close (for example, the minimum distance between them falls within a set range), whereupon the corresponding queriable control is taken as the target control. For example, referring to fig. 6, the icons drawn with bold outlines represent queriable controls, and the guide control appears as a "?" icon. When the "?" icon is dragged onto the "enhance" icon, or the "enhance" icon is dragged onto the "?" icon, the "enhance" icon can be taken as the target control. It should be noted that during the drag operation the guide control or queriable control does not necessarily move visually; the query and display of the target control's guide explanation is triggered as long as a sliding operation of the touch point from the guide control to the queriable control, or from the queriable control to the guide control, is detected. Having the guide control or queriable control follow the drag is merely a visual aid for the user; other visual references are also possible in practice, such as displaying a drag trail between the guide control and the queriable control. Of course, the user may also drag both the guide control and the queriable control so that the positional condition between them is satisfied, for example dragging the "?" icon to a certain position and then dragging the "enhance" icon to that same position.
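The overlap-or-proximity test for the drag operation could be sketched as follows (hypothetical Python; rectangles are `(x, y, w, h)` tuples and the distance threshold is an assumed value):

```python
def is_drag_match(dragged_rect, target_rect, min_distance=8.0):
    # Drag-test sketch: the drag selects the target when the two control
    # rectangles overlap or their edges come within a set range of each other.
    ax, ay, aw, ah = dragged_rect
    bx, by, bw, bh = target_rect
    dx = max(bx - (ax + aw), ax - (bx + bw), 0.0)  # horizontal gap (0 if overlapping)
    dy = max(by - (ay + ah), ay - (by + bh), 0.0)  # vertical gap (0 if overlapping)
    return (dx * dx + dy * dy) ** 0.5 <= min_distance
```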
In other embodiments, when the touch operation includes a pressing operation, both the queriable control and the guide control may be pressed. In this case, the step of "determining, in response to a touch operation of the queriable controls and the guide control by the user on the current game interface, a target control from the queriable controls" may specifically include:
When a pressing operation is performed on the guide control and any one of the queriable controls, and the pressing operation satisfies a preset condition, the pressed queriable control is used as the target control.
Specifically, the pressing operations on the guide control and the queriable control may occur in sequence or simultaneously; that is, the preset condition may include: the press start time of the guide control is earlier than the press start time of the queriable control being pressed, and the press end times of the two are equal. Alternatively, the preset condition may include: the press start time of the queriable control is earlier than the press start time of the guide control, and the press end times of the two are equal. Alternatively, the preset condition may include: the pressing operations on the guide control and the queriable control occur simultaneously. The guide control may be continuously pressed by one finger while a plurality of queriable controls are pressed in turn by another finger, so that the guidance information of the plurality of queriable controls can be queried and displayed in succession.
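The three alternative preset conditions above can be sketched as a single timing check. This is a minimal sketch assuming timestamps in seconds; in practice "equal" end times and "simultaneous" starts would be compared within a small tolerance, here an assumed 0.05 s.

```python
def press_satisfies(guide_start: float, guide_end: float,
                    q_start: float, q_end: float, tol: float = 0.05) -> bool:
    """Check the alternative preset conditions for the two-finger press:
    guide pressed first, queriable pressed first, or both pressed together."""
    ends_together = abs(guide_end - q_end) <= tol
    guide_first = guide_start < q_start and ends_together
    queriable_first = q_start < guide_start and ends_together
    simultaneous = abs(guide_start - q_start) <= tol
    return guide_first or queriable_first or simultaneous
```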
For example, referring to fig. 7, the user may press and hold the "?" icon and then press the "enhance" icon with another finger, or press the "enhance" icon first and then the "?" icon, or press the "?" icon and the "enhance" icon at the same time; the two may be released simultaneously or in sequence, so that the "enhance" icon is finally used as the target control.
In other embodiments, when the touch operation includes a clicking operation, only the queriable control for which a guidance description is needed may be clicked. In this case, the step of "determining, in response to a touch operation of the queriable controls by the user on the current game interface, a target control from the queriable controls" includes:
When any one of the queriable controls receives a plurality of consecutive clicking operations, and the number of clicking operations equals a preset number, the clicked queriable control is used as the target control.
The preset number may be manually set, for example to two or three, and consecutive clicking means that the time interval between adjacent clicking operations is within a small duration, such as 0.5 s. For example, referring to fig. 8, when the user wants the "enhance" icon to be the target control, the "enhance" icon may be double-clicked.
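A minimal sketch of the consecutive-click counting described above: a gap longer than the interval resets the sequence, and the control becomes the target once the count reaches the preset number. The class name and parameter defaults are illustrative assumptions.

```python
class ClickCounter:
    """Counts consecutive clicks on one control; a gap longer than
    `interval` seconds (0.5 s as in the example) resets the sequence."""

    def __init__(self, preset_count: int = 2, interval: float = 0.5):
        self.preset_count = preset_count
        self.interval = interval
        self.last_time = None
        self.count = 0

    def click(self, t: float) -> bool:
        # Start a new sequence on the first click or after too long a gap
        if self.last_time is None or t - self.last_time > self.interval:
            self.count = 1
        else:
            self.count += 1
        self.last_time = t
        return self.count == self.preset_count  # True: use as target control
```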
In another embodiment, a query state of the guidance information may be entered in response to a touch operation acting on the guide control, after which the user may query and display the guidance information of each of a plurality of queriable controls through single clicks.
S103, inquiring the guide information corresponding to the target control, and displaying the inquired guide information on the current game interface.
The guidance information corresponding to the target control can be queried from the game server backend or from a local application data package, and then displayed to the user in the form of a pop-up window, which satisfies the user's guidance needs without affecting the normal running of the game. For example, in figs. 6-8, when the target control is the "enhance" icon, the content displayed in the pop-up window may be the guidance information related to the "enhance" skill.
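The server-or-local lookup can be sketched as a simple fallback: try the game-server backend first and fall back to the locally bundled data when offline. The function name, the callable server lookup, and the dict-based local package are assumptions for illustration.

```python
def query_guide_info(control_id: str, server_lookup, local_package: dict) -> str:
    """Query guidance text for the target control: try the game-server
    backend first, then fall back to the local application data package."""
    try:
        info = server_lookup(control_id)
        if info:
            return info
    except ConnectionError:
        pass  # offline or unreachable: use the local data package
    return local_package.get(control_id, "No guidance available")
```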
As can be seen from the foregoing, in the method for displaying guidance information provided in this embodiment, when a guidance query instruction is detected, a plurality of queriable controls are determined according to the function controls displayed on the current game interface, different queriable controls correspond to different guidance information, the queriable controls include at least one of the function controls, the current game interface further includes a guidance control, then, in response to a touch operation of a user on the queriable controls on the current game interface, a target control is determined from the queriable controls, then, the guidance information corresponding to the target control is queried, and the queried guidance information is displayed on the game interface, so that guidance interpretation of a single function button can be rapidly implemented in the game process, flexibility is high, text input is not required, and operation is simple.
On the basis of the above embodiments, this embodiment will be further described from the perspective of a display device of guidance information, referring to fig. 9, fig. 9 specifically describes a display device of guidance information provided by an embodiment of the present application, which may include: a first determination module 10, a second determination module 20, and a query display module 30, wherein:
(1) First determination module 10
The first determining module 10 is configured to determine, when a guiding query instruction is detected, a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, where different queriable controls correspond to different guiding information, the queriable controls include at least one functional control, and the current game interface further includes a guiding control.
The guide query instruction is usually generated by manual triggering, and the triggering mode can be touch control, voice control, gesture, and the like. The guide control may be used to trigger generation of the guide query instruction, for example when the user long-presses, re-presses, drags, double-clicks, or clicks the guide control; that is, before detecting the guide query instruction, the display method may further include: generating a guide query instruction in response to a touch operation by the user on the guide control displayed on the current game interface. For example, referring to fig. 3, the "?" icon represents a guide control; when the user clicks it, generation of the guide query instruction may be triggered.
In general, the game interface may include functional controls representing basic functions, such as a small map a, a dialog box, a character role display column B, and the like, and may also include functional controls representing specific character skills, such as self-healing skills, accelerating skills, strengthening skills, and the like, and specifically referring to fig. 3, a user clicking different functional controls may trigger different game operations. The guiding information is mainly used for providing the use description of some basic functions and character skills in the game, can be displayed in the form of characters and/or graphics, and can even provide the demonstration animation of the use operation, and can be provided by a game server background or carried in a data packet when the game application is downloaded.
All the functional controls can be directly used as queriable controls, or new controls can be generated according to part of the functional controls, and the remaining functional controls and the new controls are used as queriable controls, at this time, the first determining module 10 is specifically configured to:
determining a display area of each functional control displayed on the current game interface;
when the area of the display area is larger than a preset threshold value, generating a preset control as a corresponding inquireable control;
and when the area of the display area is smaller than or equal to a preset threshold value, the corresponding function control is used as a queriable control.
The preset threshold may be set manually in advance. Each functional control may be a single control; for example, a button similar to the self-healing skill, acceleration skill, or strengthening skill in fig. 3, i.e., a single character skill, may be used as one functional control. A functional control may also be formed by combining a plurality of small controls. For example, for the character display column in fig. 3, since each party in a battle game may involve a plurality of character roles, the character role display column may include a plurality of character role display buttons; the user clicks different character role display buttons to obtain different character introduction contents, and all the character role display buttons belonging to one battle party together form one functional control. For such a functional control, the area of its display area is usually larger; in this case, a preset control may be generated with the center point of the display area as a reference point and used as the queriable control (i.e., a new functional control is generated), where the guidance information corresponding to the preset control is the guidance information set by the system for the corresponding functional control. The shape and size of the preset control may be set manually, for example circular, square, or pentagram-shaped. For example, referring to fig. 5, the icons drawn with bold outlines represent queriable controls; it is easy to see that the two functional controls, namely the small map A and the character display column B, are not directly used as queriable controls, but corresponding new controls are generated instead.
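The area-threshold logic above can be sketched as follows. Controls are assumed to be `(id, x, y, w, h)` tuples; the 32x32 size of the generated preset control and the `"_preset"` id suffix are arbitrary illustrative choices, not values from the embodiment.

```python
def build_queriable_controls(controls, area_threshold: float):
    """For each functional control, either use it directly as a queriable
    control (small area) or generate a preset control centered on it
    (area above the preset threshold)."""
    result = []
    for cid, x, y, w, h in controls:
        if w * h > area_threshold:
            # Large control (e.g. a minimap or character display column):
            # generate a small preset control at its center point
            cx, cy = x + w / 2, y + h / 2
            result.append((cid + "_preset", cx - 16, cy - 16, 32, 32))
        else:
            result.append((cid, x, y, w, h))
    return result
```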
It should be noted that, after determining the queriable controls, to remind the user of which controls can be selected, the queriable controls and the remaining controls on the game interface may be displayed differently by highlighting, for example, referring to fig. 10, the display apparatus further includes a highlighting module 40 for:
After the first determining module 10 determines a plurality of queriable controls according to the plurality of functional controls displayed on the current game interface, the plurality of queriable controls are highlighted according to a preset color and/or a preset contrast brightness.
The preset color and the preset contrast brightness can be set manually in advance, specifically, the brightness of the game interface and the queriable control can be adjusted according to the preset contrast brightness, so that the queriable control is highlighted relative to the game interface, and the edge contour color of the queriable control can be further adjusted to be a preset color, such as yellow, so that the queriable control is more highlighted.
(2) The second determination module 20
A second determining module 20, configured to determine a target control from the queriable controls in response to a touch operation of the queriable controls and/or the guiding controls on the current game interface by a user.
The touch operation may include dragging, pressing, clicking, and the like. The user may touch only the queriable control or only the guide control; for example, the guide control may be dragged to the queriable control for which a guidance description is needed, or that queriable control may be dragged to the guide control, or it may be double-clicked, re-pressed, or long-pressed. In other embodiments, the user may touch both the queriable control and the guide control, for example pressing the guide control and the queriable control for which a guidance description is needed at the same time, and so on.
In some embodiments, referring to fig. 5, when the touch operation includes a drag operation, only the queriable control or the guide control may be dragged, where the second determining module 20 is specifically configured to:
When any of the queriable controls on the current game interface is dragged to the guide control, the dragged queriable control is used as the target control; or
when the guide control on the current game interface is dragged to any of the queriable controls, that queriable control is used as the target control.
The user may drag the guide control or the queriable control alone so that the positions of the two partially overlap, completely overlap, or come very close (for example, the minimum distance between the two is within a set range), so that the corresponding queriable control is used as the target control. For example, referring to fig. 6, the icons drawn with bold outlines in fig. 6 represent queriable controls (e.g., the "enhance" icon), and the guide control appears as a "?" icon; when the "?" icon is dragged to the "enhance" icon, or the "enhance" icon is dragged to the "?" icon, the "enhance" icon may be used as the target control. Of course, the user may also drag both the guide control and the queriable control so that the positions of the two satisfy the condition, for example, after the "?" icon is dragged to a certain position, the "enhance" icon is also dragged to that position.
In other embodiments, when the touch operation includes a pressing operation, both the queriable control and the guide control may be pressed. In this case, the second determining module 20 is specifically configured to: when a pressing operation is performed on the guide control and any one of the queriable controls, and the pressing operation satisfies a preset condition, use the pressed queriable control as the target control.
Specifically, the pressing operations on the guide control and the queriable control may occur in sequence or simultaneously; that is, the preset condition may include: the press start time of the guide control is earlier than the press start time of the queriable control being pressed, and the press end times of the two are equal. Alternatively, the preset condition may include: the press start time of the queriable control is earlier than the press start time of the guide control, and the press end times of the two are equal. Alternatively, the preset condition may include: the pressing operations on the guide control and the queriable control occur simultaneously.
For example, referring to fig. 7, the user may press and hold the "?" icon and then press the "enhance" icon with another finger, or press the "enhance" icon first and then the "?" icon, or press the "?" icon and the "enhance" icon at the same time; the two may be released simultaneously or in sequence, so that the "enhance" icon is finally used as the target control.
In other embodiments, when the touch operation includes a clicking operation, only the queriable control for which a guidance description is needed may be clicked. In this case, the second determining module 20 is specifically configured to:
When any one of the queriable controls receives a plurality of consecutive clicking operations, and the number of clicking operations equals a preset number, the clicked queriable control is used as the target control.
The preset number may be manually set, for example to two or three, and consecutive clicking means that the time interval between adjacent clicking operations is within a small duration, such as 0.5 s. For example, referring to fig. 8, when the user wants the "enhance" icon to be the target control, the "enhance" icon may be double-clicked.
(3) Query display module 30
And the query display module 30 is used for querying the guiding information corresponding to the target control and displaying the queried guiding information on the current game interface.
The guidance information corresponding to the target control can be queried from the game server backend or from a local application data package, and then displayed to the user in the form of a pop-up window, which satisfies the user's guidance needs without affecting the normal running of the game. For example, in fig. 6 or fig. 8, when the target control is the "enhance" icon, the content displayed in the pop-up window may be the guidance information related to the "enhance" skill.
In the implementation, each unit may be implemented as an independent entity, or may be implemented as the same entity or several entities in any combination, and the implementation of each unit may be referred to the foregoing method embodiment, which is not described herein again.
As can be seen from the foregoing, in the guiding information display device provided in this embodiment, when a guiding query instruction is detected, the first determining module 10 determines a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, where different queriable controls correspond to different guiding information, the queriable controls include at least one of the functional controls, the current game interface further includes a guiding control, and then, the second determining module 20 determines a target control from the queriable controls in response to a touch operation of a user on the current game interface on the queriable controls, and the querying display module 30 queries the guiding information corresponding to the target control and displays the queried guiding information on the game interface, so that guiding interpretation of a single functional key can be quickly realized in a game process, flexibility is high, text input is not required, and operation is simple.
Correspondingly, an embodiment of the present application also provides a computer device, which may be a terminal or a server; the terminal may be a smart phone, a tablet computer, a notebook computer, a touch screen, a game console, a personal computer, a Personal Digital Assistant (PDA), or other device. Fig. 10 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 400 includes a processor 401 having one or more processing cores, a memory 402 having one or more computer-readable storage media, and a computer program stored on the memory 402 and executable on the processor. The processor 401 is electrically connected to the memory 402. It will be appreciated by those skilled in the art that the computer device structure shown in the figure does not limit the computer device, which may include more or fewer components than shown, combine certain components, or use a different arrangement of components.
Processor 401 is a control center of computer device 400 and connects the various portions of the entire computer device 400 using various interfaces and lines to perform various functions of computer device 400 and process data by running or loading software programs and/or modules stored in memory 402 and invoking data stored in memory 402, thereby performing overall monitoring of computer device 400.
In the embodiment of the present application, the processor 401 in the computer device 400 loads the instructions corresponding to the processes of one or more application programs into the memory 402 according to the following steps, and the processor 401 executes the application programs stored in the memory 402, so as to implement various functions:
When a guide query instruction is detected, determining a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, where different queriable controls correspond to different guidance information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control;
Determining a target control from the queriable controls in response to a touch operation of the user on the current game interface to the queriable controls and/or the guide controls;
inquiring the guiding information corresponding to the target control, and displaying the inquired guiding information on the current game interface.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
Optionally, as shown in fig. 10, the computer device 400 further includes: a touch display 403, a radio frequency circuit 404, an audio circuit 405, an input unit 406, and a power supply 407. The processor 401 is electrically connected to the touch display 403, the radio frequency circuit 404, the audio circuit 405, the input unit 406, and the power supply 407, respectively. Those skilled in the art will appreciate that the computer device structure shown in FIG. 10 is not limiting of the computer device and may include more or fewer components than shown, or may be combined with certain components, or a different arrangement of components.
The touch display 403 may be used to display a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface. The touch display 403 may include a display panel and a touch panel. The display panel may be used to display information entered by or provided to the user, as well as the various graphical user interfaces of the computer device, which may be composed of graphics, text, icons, video, and any combination thereof. Alternatively, the display panel may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. The touch panel may be used to collect touch operations by the user on or near it (such as operations performed by the user on or near the touch panel using a finger, a stylus, or any other suitable object or accessory) and generate corresponding operation instructions, which in turn execute corresponding programs. Alternatively, the touch panel may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 401, and can receive and execute commands sent from the processor 401. The touch panel may overlay the display panel; upon detecting a touch operation on or near it, the touch panel transmits the operation to the processor 401 to determine the type of touch event, and the processor 401 then provides a corresponding visual output on the display panel according to the type of touch event. In the embodiment of the present application, the touch panel and the display panel may be integrated into the touch display 403 to implement the input and output functions.
In some embodiments, however, the touch panel and the display panel may be implemented as two separate components to perform the input and output functions; that is, the touch display 403 may also implement an input function as part of the input unit 406.
In the embodiment of the present application, the processor 401 executes the game application program to generate a picture of the virtual three-dimensional scene on the touch display screen 403, where the picture includes a graphical user interface (UI interface), and the graphical user interface includes a second spatial orientation indicator, where a spatial orientation identifier corresponding to the target object is displayed on the second spatial orientation indicator, and the spatial orientation identifier is used to indicate an orientation where the target object is located.
The touch display 403 may be used to present a screen of a virtual three-dimensional scene, and a graphical user interface and receive operation instructions generated by a user acting on the graphical user interface.
The radio frequency circuitry 404 may be used to transceive radio frequency signals to establish wireless communications with a network device or other computer device via wireless communications.
The audio circuit 405 may be used to provide an audio interface between the user and the computer device through a speaker and a microphone. On one hand, the audio circuit 405 may convert received audio data into an electrical signal and transmit it to the speaker, which converts it into a sound signal for output; on the other hand, the microphone converts collected sound signals into electrical signals, which are received by the audio circuit 405 and converted into audio data; the audio data are then processed by the processor 401 and sent via the radio frequency circuit 404 to, for example, another computer device, or output to the memory 402 for further processing. The audio circuit 405 may also include an earbud jack to provide communication between a peripheral earbud and the computer device.
The input unit 406 may be used to receive input numbers, character information, or user characteristic information (e.g., fingerprint, iris, facial information, etc.), and to generate keyboard, mouse, joystick, optical, or trackball signal inputs related to user settings and function control.
The power supply 407 is used to power the various components of the computer device 400. Alternatively, the power supply 407 may be logically connected to the processor 401 through a power management system, so as to implement functions of managing charging, discharging, and power consumption management through the power management system. The power supply 407 may also include one or more of any of a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
Although not shown in fig. 10, the computer device 400 may further include a camera, a sensor, a wireless fidelity module, a bluetooth module, etc., which are not described herein.
In the foregoing embodiments, the descriptions of the embodiments are emphasized, and for parts of one embodiment that are not described in detail, reference may be made to related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that all or a portion of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling associated hardware, which may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, an embodiment of the present application provides a computer readable storage medium in which a plurality of computer programs are stored, the computer programs being capable of being loaded by a processor to perform steps in any of the methods for displaying guidance information provided by the embodiments of the present application. For example, the computer program may perform the steps of:
When a guide query instruction is detected, determining a plurality of queriable controls according to a plurality of functional controls displayed on a current game interface, where different queriable controls correspond to different guidance information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control;
Determining a target control from the queriable controls in response to a touch operation of the user on the current game interface to the queriable controls and/or the guide controls;
inquiring the guiding information corresponding to the target control, and displaying the inquired guiding information on the current game interface.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
The storage medium may include: Read-Only Memory (ROM), Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The steps in any guiding information display method provided by the embodiment of the present application can be executed by the computer program stored in the storage medium, so that the beneficial effects of any guiding information display method provided by the embodiment of the present application can be achieved, which are detailed in the previous embodiments and are not described herein.
The foregoing has described in detail the method, apparatus, storage medium, and computer device for displaying guidance information provided by the embodiments of the present application. Specific examples have been used herein to illustrate the principles and implementations of the present application, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, those skilled in the art may make changes to the specific implementations and application scope according to the ideas of the present application; in summary, the contents of this description should not be construed as limiting the present application.

Claims (10)

1. A method for displaying guidance information, comprising:
In the game process, when a guide query instruction is detected, a plurality of queriable controls are determined on a current game interface according to a plurality of functional controls displayed on the current game interface, the queriable controls and other controls on the current game interface are displayed in a distinguishing manner on the current game interface, different queriable controls correspond to different guidance information, the queriable controls comprise at least one functional control, and the current game interface further comprises a guide control;
Determining a target control from the queriable controls in response to touch operation of a user on the current game interface on the queriable controls and/or the guiding controls;
inquiring the guiding information corresponding to the target control, and displaying the inquired guiding information on the current game interface.
2. The method of claim 1, wherein the touch operation includes a drag operation, and wherein the determining a target control from among the queriable controls in response to a touch operation of the queriable control or the guide control by a user on the current game interface includes:
when any of the queriable controls on the current game interface is dragged to the guide control, the dragged queriable control is used as a target control; or alternatively
and when the guide control on the current game interface is dragged to any of the queriable controls, the corresponding queriable control is used as the target control.
3. The method of displaying guidance information according to claim 1, wherein the touch operation includes a press operation, and the determining a target control from among the queriable controls in response to a touch operation of the queriable controls and the guidance control by a user on the current game interface includes:
when a pressing operation is performed on the guide control and any one of the queriable controls, and the pressing operation satisfies a preset condition, using the pressed queriable control as the target control.
4. A method of displaying guidance information according to claim 3, wherein the preset conditions include: the guide control has a press start time earlier than a press start time of the queriable control being pressed and a press end time equal to a press end time of the queriable control being pressed.
5. The method of claim 1, wherein the touch operation includes a click operation, and wherein determining a target control from among the queriable controls in response to a touch operation performed by a user on the queriable controls on the current game interface includes:
when any one of the queriable controls receives a plurality of consecutive click operations, and the number of click operations equals a preset number, taking the clicked queriable control as the target control.
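The consecutive-click trigger of claim 5 can be checked against a list of click timestamps. The claim does not define "consecutive", so the per-click interval below is an assumption; all names and the default values are illustrative:

```python
def is_target_by_clicks(click_times, preset_count=2, max_interval=0.4):
    """Return True if a control received exactly `preset_count` clicks,
    each within `max_interval` seconds of the previous one.

    click_times: sorted timestamps (seconds) of clicks on one control.
    """
    if len(click_times) != preset_count:
        return False
    # Every adjacent pair of clicks must be close enough to count as
    # one consecutive multi-click gesture.
    return all(later - earlier <= max_interval
               for earlier, later in zip(click_times, click_times[1:]))
```

With `preset_count=2` this is an ordinary double-click detector; a game could raise the count to reserve single clicks for the control's normal function.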
6. The method of any one of claims 1-5, further comprising, after determining the plurality of queriable controls from the plurality of function controls displayed on the current game interface:
highlighting the plurality of queriable controls in a preset color and/or at a preset contrast brightness.
7. The method for displaying guidance information according to any one of claims 1-5, wherein determining a plurality of queriable controls according to a plurality of function controls displayed on the current game interface includes:
determining the display area of each function control displayed on the current game interface;
when the display area is larger than a preset threshold, generating a preset control as the corresponding queriable control; and
when the display area is smaller than or equal to the preset threshold, using the function control itself as the corresponding queriable control.
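The area-threshold branching of claim 7 could be sketched as below. The dictionary shapes, the `"preset"`/`"direct"` labels, and the function name are assumptions for illustration only; the claim specifies just the threshold comparison and the two outcomes:

```python
def build_queriable_controls(function_controls, area_threshold):
    """Decide, per claim 7, how each function control's queriable
    control is obtained.

    function_controls: list of dicts like {"name": str, "size": (w, h)}.
    Controls larger than the threshold get a generated preset control;
    smaller ones serve as queriable controls directly.
    """
    queriable = []
    for ctrl in function_controls:
        w, h = ctrl["size"]
        if w * h > area_threshold:
            # Large control: generate a separate preset control for it.
            queriable.append({"type": "preset", "for": ctrl["name"]})
        else:
            # Small control: reuse the function control itself.
            queriable.append({"type": "direct", "for": ctrl["name"]})
    return queriable
```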
8. A device for displaying guidance information, comprising:
a first determining module, configured to determine, when a guidance query instruction is detected during a game, a plurality of queriable controls according to a plurality of function controls displayed on a current game interface, wherein the queriable controls are displayed on the current game interface in a manner distinguishing them from other controls, different queriable controls correspond to different guidance information, each queriable control includes at least one function control, and the current game interface further includes a guide control;
a second determining module, configured to determine a target control from the queriable controls in response to a touch operation performed by a user on the queriable controls and/or the guide control on the current game interface; and
a query display module, configured to query the guidance information corresponding to the target control and display the queried guidance information on the current game interface.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program adapted to be loaded by a processor to perform the steps in the method for displaying guidance information according to any one of claims 1-7.
10. A computer device, characterized in that the computer device comprises a memory and a processor, a computer program being stored in the memory, and the processor performing the steps in the method for displaying guidance information according to any one of claims 1-7 by calling the computer program stored in the memory.
CN202011639207.5A 2020-12-31 2020-12-31 Guide information display method and device, storage medium and computer equipment Active CN112587925B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011639207.5A CN112587925B (en) 2020-12-31 2020-12-31 Guide information display method and device, storage medium and computer equipment


Publications (2)

Publication Number Publication Date
CN112587925A CN112587925A (en) 2021-04-02
CN112587925B true CN112587925B (en) 2024-07-26

Family

ID=75206781

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011639207.5A Active CN112587925B (en) 2020-12-31 2020-12-31 Guide information display method and device, storage medium and computer equipment

Country Status (1)

Country Link
CN (1) CN112587925B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113332718B (en) * 2021-06-10 2024-02-20 网易(杭州)网络有限公司 Interactive element query method and device, electronic equipment and storage medium
CN113934343B (en) * 2021-09-30 2024-07-02 北京五八信息技术有限公司 Information processing method and device
CN113893540B (en) * 2021-09-30 2023-08-25 腾讯科技(深圳)有限公司 Information prompting method and device, storage medium and electronic equipment
CN114816612A (en) * 2022-02-24 2022-07-29 北京高德云信科技有限公司 Display method, device, electronic equipment and computer program product
CN114548055A (en) * 2022-02-28 2022-05-27 长沙朗源电子科技有限公司 Description document editing method, description document display method, terminal and computer-readable storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102945146A (en) * 2012-10-29 2013-02-27 海信集团有限公司 Display method and system for help information of projector equipment
CN106575196A (en) * 2014-07-31 2017-04-19 三星电子株式会社 Electronic device and method for displaying user interface thereof

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007047989A (en) * 2005-08-09 2007-02-22 Mitsubishi Electric Corp Guidance information provision device
CN101751257B (en) * 2009-11-19 2016-08-24 华为终端有限公司 The method and apparatus of image user interface help information display
CN106897081A (en) * 2015-12-18 2017-06-27 中兴通讯股份有限公司 The bootstrap technique and device of application, terminal
CN107729023A (en) * 2016-12-19 2018-02-23 西安艾润物联网技术服务有限责任公司 The method and device of guiding operation application program
CN107301052A (en) * 2017-06-30 2017-10-27 厦门美图移动科技有限公司 The display methods and mobile terminal of a kind of help information
CN111580911A (en) * 2020-05-09 2020-08-25 惠州Tcl移动通信有限公司 Operation prompting method and device for terminal, storage medium and terminal



Similar Documents

Publication Publication Date Title
CN112587925B (en) Guide information display method and device, storage medium and computer equipment
CN109062479B (en) Split screen application switching method and device, storage medium and electronic equipment
CN113082712B (en) Virtual character control method, device, computer equipment and storage medium
WO2021104271A1 (en) Control method, stylus, and electronic assembly
CN112870718B (en) Prop using method, prop using device, storage medium and computer equipment
CN109032732B (en) Notification display method and device, storage medium and electronic equipment
CN108446156B (en) Application program control method and terminal
EP3799040A1 (en) Speech recognition control method and apparatus, electronic device and readable storage medium
CN113332719B (en) Virtual article marking method, device, terminal and storage medium
CN112843719A (en) Skill processing method, skill processing device, storage medium and computer equipment
US10908868B2 (en) Data processing method and mobile device
CN113126875B (en) Virtual gift interaction method and device, computer equipment and storage medium
CN113050863A (en) Page switching method and device, storage medium and electronic equipment
CN109104640B (en) Virtual gift presenting method and device and storage equipment
TW202144984A (en) Equipment control method and device, storage medium and electronic equipment
WO2024045528A1 (en) Game control method and apparatus, and computer device and storage medium
WO2023246166A1 (en) Method and apparatus for adjusting video progress, and computer device and storage medium
CN113332718B (en) Interactive element query method and device, electronic equipment and storage medium
CN113426115B (en) Game role display method, device and terminal
CN112799754B (en) Information processing method, information processing device, storage medium and computer equipment
CN115382221A (en) Method and device for transmitting interactive information, electronic equipment and readable storage medium
CN112783386A (en) Page jump method, device, storage medium and computer equipment
CN113467661A (en) Task synchronization method, device, equipment and readable storage medium
CN111026562A (en) Message sending method and electronic equipment
CN113521725B (en) Pattern effect display method, storage medium and computer device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant