CN113117345A - Control display method, device, equipment and storage medium - Google Patents


Info

Publication number
CN113117345A
Authority
CN
China
Prior art keywords
control group
function
target
control
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110442408.4A
Other languages
Chinese (zh)
Inventor
于蒙
夏润恺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perfect World Chongqing Interactive Technology Co ltd
Perfect World Co Ltd
Original Assignee
Perfect World Chongqing Interactive Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Perfect World Chongqing Interactive Technology Co., Ltd.
Priority to CN202110442408.4A
Publication of CN113117345A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80: Special adaptations for executing a specific game genre or game mode
    • A63F 13/822: Strategy games; Role-playing games
    • A63F 13/50: Controlling the output signals based on the game progress
    • A63F 13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00: Arrangements for program control, e.g. control units
    • G06F 9/06: Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44: Arrangements for executing specific programs
    • G06F 9/451: Execution arrangements for user interfaces
    • A63F 2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/30: Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/308: Details of the user interface
    • A63F 2300/80: Features specially adapted for executing a specific type of game
    • A63F 2300/807: Role playing or strategy games

Abstract

Embodiments of the present application provide a control display method, apparatus, device, and storage medium. In the method, a target control group in a first state is displayed in a graphical interface; in response to a first instruction for the target control group, a target function satisfying the first instruction is determined from a plurality of functions configured for the target control group; and the target control group is switched from the first state to a second state for realizing the target function, where the control layouts of the target control group differ between states. The method consolidates the layout of function controls, avoids a cluttered control layout, improves the layout rationality of the human-computer interaction interface, greatly reduces the difficulty of triggering the various functions, raises their utilization rate, and improves human-computer interaction efficiency.

Description

Control display method, device, equipment and storage medium
Technical Field
The present application relates to the field of computer technologies, and in particular, to a method, an apparatus, a device, and a storage medium for displaying a control.
Background
Games are designed with various non-combat functions that enrich gameplay. For example, a player may interact with other players or NPCs through an interaction function to obtain additional game resources and task prompts, thereby advancing the game.
Currently, games provide each non-combat function with a corresponding trigger mode, such as a function button. In a mobile game, for example, the non-combat function buttons are numerous and mostly reside in the game interface. To preserve operating space for the combat functions, these non-combat function buttons are usually placed at the edge of the operation hot area of the game interface, which makes the interface layout cluttered, scatters the player's operations, and hurts usability. Furthermore, to keep these buttons from occupying too much interface space, they are typically small and unlikely to attract the player's attention, resulting in a low frequency of use of the non-combat functions.
Therefore, a new solution is desired to overcome at least one of the problems of the prior art.
Disclosure of Invention
Aspects of the present application provide a control display method, apparatus, device, and storage medium, which are used to consolidate the layout of function controls and improve human-computer interaction efficiency.
The embodiment of the application provides a control display method, which comprises the following steps:
displaying the target control group in the first state in the graphical interface;
in response to a first instruction for the target control group, determining a target function satisfying the first instruction from a plurality of functions configured for the target control group;
and switching the target control group from a first state to a second state for realizing the target function, wherein the control layouts of the target control group in different states are different.
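The three steps above can be sketched as a small state machine. The following Python sketch is illustrative only; all class, method, and variable names are assumptions, not taken from the application.

```python
# Hypothetical sketch of the claimed method: a control group that switches
# its display state when a configured function is selected.
class TargetControlGroup:
    def __init__(self, configured_functions):
        # configured_functions: function name -> control layout used in the
        # second state that realizes that function
        self.configured_functions = configured_functions
        self.state = "first"                       # first state shows triggers
        self.layout = list(configured_functions)   # layout of trigger controls

    def handle_first_instruction(self, instruction):
        # Step 2: determine the target function satisfying the instruction.
        if instruction not in self.configured_functions:
            raise ValueError(f"no configured function satisfies {instruction!r}")
        # Step 3: switch to the second state; the control layout changes.
        self.state = "second"
        self.layout = self.configured_functions[instruction]
        return instruction

group = TargetControlGroup({
    "expression": ["smile", "wave"],
    "flight": ["land", "speed"],
})
target = group.handle_first_instruction("flight")
```

Note that the layout stored per function is what makes "the control layouts of the target control group in different states are different" hold in the sketch.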
Further optionally, the target control group in the first state includes at least one function trigger control, and the at least one function trigger control corresponds one-to-one to at least one function.
Based on this, in response to a first instruction for the target control group, determining a target function satisfying the first instruction from among a plurality of functions configured for the target control group, includes:
in response to a selection instruction for a target function trigger control, taking the function corresponding to the target function trigger control as the target function, where the target function trigger control is any one of the at least one function trigger control.
Wherein, further optionally, the at least one function trigger control comprises at least one of: an interactive-action trigger control, an expression trigger control, and a flight trigger control.
Further optionally, switching the target control from the first state to a second state for implementing the target function includes:
determining a target function control group for executing a target function from at least one function control group associated with the target control group;
calling the target function control group to load the target function control group into a function control area configured for the target function control group in the second state;
and displaying the target control group in the second state in the graphical interface.
Wherein, at least one function control group related to the target control group comprises: at least one of an interactive control group, an expression control group and a flight control group.
Optionally, if the interactive control group and/or the expression control group are loaded in the function control area, displaying the target control group in the second state in the graphical interface, including:
and in the function control area, displaying a plurality of function controls in the interactive action control group and/or the expression control group by adopting a multilayer wheel structure, wherein each layer of wheel structure comprises at least one function control.
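A multilayer wheel structure of this kind can be sketched by distributing the function controls over concentric rings around the wheel centre. The function below is a minimal illustration; `per_ring` and `radius_step` are assumed parameters, not values from the application.

```python
import math

def multilayer_wheel_layout(controls, per_ring=8, radius_step=60):
    """Place function controls on concentric rings (a multilayer wheel).

    Returns control name -> (x, y) offset from the wheel centre. Each ring
    holds at most per_ring controls, spaced at even angles.
    """
    positions = {}
    for i, name in enumerate(controls):
        ring, slot = divmod(i, per_ring)        # which ring / slot on that ring
        angle = 2 * math.pi * slot / per_ring   # even angular spacing
        radius = radius_step * (ring + 1)
        positions[name] = (round(radius * math.cos(angle)),
                           round(radius * math.sin(angle)))
    return positions

layout = multilayer_wheel_layout([f"emote{i}" for i in range(10)])
```

With ten controls and eight slots per ring, the last two controls spill onto a second, larger ring, which matches the description that each wheel layer contains at least one function control.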
Optionally, the interactive-action control group includes a single-person interactive-action control group and a multi-person interactive-action control group.
The method further comprises: controlling the wheel control in the target control group to switch the type of interactive-action control group loaded in the function control area.
Further optionally, if the function trigger control is a flight trigger control and the target function is a flight function, the method further includes: switching the virtual character in the graphical interface to a flight state.
Determining a target function control group for executing a target function from at least one function control group associated with the target control group, including: and determining a flight function control group for executing the flight function from at least one function control group associated with the target control group, wherein the flight function control group comprises a landing control for triggering a landing operation and a state operation control for controlling the flight state.
Furthermore, invoking the target function control group to load it into the function control area configured for it in the second state includes: calling the flight function control group to switch the flight trigger control in the target control group into a landing control, and switching the wheel control in the target control group into a state operation control, thereby obtaining the target control group in the second state.
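The control swap described above, in which two controls of the first state are replaced in place to form the flight (second) state, can be sketched as follows; the dictionary keys and control names are illustrative assumptions.

```python
# Hypothetical sketch of loading the flight function control group: the
# flight trigger becomes a landing control and the wheel control becomes
# a state operation control, yielding the second-state control group.
def load_flight_controls(control_group):
    second_state = dict(control_group)   # copy; first state stays intact
    second_state["flight_trigger"] = "landing_control"
    second_state["wheel"] = "state_operation_control"
    return second_state

first_state = {"flight_trigger": "flight_trigger", "wheel": "wheel_control"}
second_state = load_flight_controls(first_state)
```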
Further optionally, the method further comprises: judging whether the target control group meets a state switching condition or not; and if the target control group meets the state switching condition, switching the target control group from a third state for executing the fighting function to the first state.
Wherein, further optionally, the state switching condition includes: the interval between the moment at which a second instruction for the target control group was last received and the current moment exceeds a set threshold; or the second instruction for the target control group is an instruction to stop combat.
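The two switching conditions above can be sketched as a single predicate; the `threshold` value and the `"stop_combat"` instruction name are assumptions.

```python
def meets_switch_condition(last_instruction_time, now, instruction,
                           threshold=5.0):
    """Check the two disclosed conditions for switching the target control
    group from the third (combat) state back to the first state."""
    # Condition 1: no second instruction received for longer than the threshold.
    idle_too_long = (now - last_instruction_time) > threshold
    # Condition 2: the second instruction explicitly stops combat.
    combat_stopped = (instruction == "stop_combat")
    return idle_too_long or combat_stopped
```

Either condition alone is sufficient, matching the "or" in the disclosure.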
The embodiment of the application further provides a control display device, and the device is loaded with a graphical interface. The device includes:
the display module is used for displaying the target control group in the first state or the second state in the graphical interface;
the receiving and sending module is used for receiving a first instruction aiming at the target control group;
the processing module is configured to, in response to the first instruction, determine a target function satisfying the first instruction from the plurality of functions configured for the target control group, and to trigger the display module to switch the target control group from the first state to a second state for realizing the target function, where the control layouts of the target control group in different states are different.
An embodiment of the present application further provides an electronic device, including: a memory and a processor; the memory is to store one or more computer instructions; the processor is to execute the one or more computer instructions to: the steps in the method provided by the embodiments of the present application are performed.
Embodiments of the present application further provide a computer-readable storage medium storing a computer program, where the computer program can implement the steps in the method provided in the embodiments of the present application when executed.
In the technical solution provided in the embodiment of the present application, a target control group in a first state is displayed in a graphical interface, and the target control group is an operation object provided to a player in the graphical interface, for example, a non-combat function button. Based on this, in response to a first instruction for a target control group, a target function satisfying the first instruction is determined from a plurality of functions configured for the target control group. And further, the target control group is switched from the first state to a second state for realizing the target function, so that the player can trigger the target function through the target control group in the second state.
In this scheme, at least one function is configured in the target control group, and the target control group is switched from the first state to the second state for realizing the target function, so that the target function can be executed in the graphical interface. This consolidates the layout of the function controls, avoids the poor usability caused by a scattered control layout, improves the layout rationality of the human-computer interaction interface, and improves human-computer interaction efficiency. Moreover, controls for realizing different functions can be displayed in different states of the target control group, which greatly reduces the difficulty of triggering the various functions, raises their utilization rate, and further improves human-computer interaction efficiency.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a flowchart illustrating a control display method according to an exemplary embodiment of the present application;
fig. 2 to 7 are schematic views of a game interface provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a control display apparatus according to an exemplary embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In a game, various non-combat functions are designed, so that game playing is increased through the non-combat functions. For example, the player interacts with other players or the NPC through an interaction function, obtains more game resources (such as equipment, props, skills and the like), and task prompts so as to promote game progress.
At present, a game interface provides trigger modes corresponding to the non-combat functions, such as operation entries and function buttons for the various non-combat functions. The non-combat functions take many forms, for example: flight, chat, trading, gift giving, teleportation, and the like.
Taking a mobile phone game as an example, to preserve operating space for the combat functions, the non-combat function buttons are each placed independently at the edge of the operation hot area of the game interface, which scatters the interface layout and the player's operations and hurts usability. As the number of non-combat functions grows, operating them becomes even harder.
Moreover, these non-combat function buttons reside in the game interface; therefore, to avoid occupying too much interface space, they are generally small and unlikely to attract the player's attention, resulting in a low frequency of use of the non-combat functions.
Therefore, a new solution is desired to overcome at least one of the problems of the prior art.
In view of at least one of the above technical problems, in some embodiments of the present application, a solution is provided, and the technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
The control display scheme provided by the embodiment of the application can be executed by an electronic device, and the electronic device can be a server. The server may be a physical server including an independent host, or may also be a virtual server carried by a host cluster, or may also be a cloud server. The electronic device may also be a terminal device such as a tablet computer, PC, notebook computer, etc. Of course, the control display scheme may also be executed by the server in cooperation with the terminal device, or by cooperation with a plurality of electronic devices, which is not limited in the present application.
The control display scheme provided by the embodiment of the application is suitable for various graphical interfaces. Taking the game interface as an example, the control display scheme can be used for integrating the layout of the function controls in the game interface so as to improve various problems caused by the scattered layout of the function controls in the game interface, such as high control operation difficulty and low control utilization rate.
Of course, besides game interfaces, the control display scheme is also applicable to other interfaces, such as game development application interfaces, document editing application interfaces, navigation interfaces, advertisement interfaces and the like in the mobile terminal.
For example, because the graphical interface of a mobile terminal is small, many advanced functions of a game development application are often scattered along the edge of the operation hot area. In this case, the control display scheme consolidates the advanced-function controls in the game development application interface: by switching the layout of the control group, the controls better fit the usage habits of mobile users, the layout rationality of the human-computer interaction interface improves, the difficulty of triggering and operating the advanced functions drops, and their utilization rate rises.
In fact, an advanced function of a game development application can be understood as an auxiliary function. Such advanced functions are usually not basic functions of the application, but applying them in a specific scene can greatly improve development efficiency; examples include custom material editing, map planning, and level planning. In practice, the trigger controls for advanced functions in other graphical interfaces can be handled similarly, for example word counting, spell checking, and document review in a document editing application.
For another example, the map switching, nearby search, and personalized settings functions of navigation software are often scattered along the edge of the operation hot area. In this case, the control display scheme consolidates the function controls in the navigation interface: by switching the layout of the control group, the controls better fit the usage habits of mobile users, the layout rationality of the human-computer interaction interface improves, the difficulty of triggering the functions drops, and their utilization rate rises. For example, suppose a target control group in some state includes a map-switching trigger control; selecting that control switches the target control group from its current state to a map-switching state, in which a variety of available map options (such as an earthquake map option, a view map option, a convenience store map option, a gas station map option, etc.) are displayed.
Whatever graphical interface it is applied to, the core idea of the control display scheme provided by the embodiments of the present application is as follows: display a target control group in a first state in the graphical interface, where the target control group is an operation object provided to the user, such as a non-combat function button in a game interface. On this basis, in response to a first instruction for the target control group, a target function satisfying the first instruction is determined from a plurality of functions configured for the target control group. The target control group is then switched from the first state to a second state for realizing the target function, so that the user can trigger the target function through the target control group in the second state. In this scheme, at least one function is configured in the target control group, and switching from the first state to the second state allows the target function to be executed in the graphical interface; this consolidates the layout of the function controls, avoids the poor usability caused by a scattered layout, improves the layout rationality of the human-computer interaction interface, and improves human-computer interaction efficiency. In addition, the scheme can display the controls for realizing different functions in different states of the target control group, so that the interface space occupied by the function controls is used more reasonably, the difficulty of triggering the various functions drops sharply, and their utilization rate rises.
The following describes an implementation process of the control presentation scheme in conjunction with the following embodiments.
An embodiment of the present application provides a control display method, and fig. 1 is a flowchart illustrating the control display method according to an exemplary embodiment of the present application. As shown in fig. 1, the method includes:
101. displaying the target control group in the first state in the graphical interface;
102. in response to a first instruction for the target control group, determining a target function satisfying the first instruction from a plurality of functions configured for the target control group;
103. and switching the target control group from the first state to a second state for realizing the target function.
In the embodiments of the present application, the target control group in the graphical interface is composed of a plurality of controls, including function controls and non-function controls. Non-function controls include, but are not limited to: a virtual character attribute bar, a health bar, a wheel control with a selection function, an indicator bar with arrows, and the like. In some embodiments, a function control can trigger a corresponding execution program, so that the program implements the function corresponding to the control.
For example, an interactive-action control can trigger an interactive-action program, so that the virtual character in the graphical interface performs the corresponding interactive action. Interactive-action controls correspond one-to-one to interactive actions; to distinguish them, each can be displayed in the graphical interface with an identifier indicating its action, such as a thumbnail or the action's name. Similarly, there are a plurality of expression controls, each of which can trigger the virtual character to send the corresponding emoticon in the graphical interface; to distinguish different expression controls, each can likewise be displayed with an identifier such as a thumbnail or the emoticon package's name. Optionally, the user can import custom emoticons to enrich the forms of interaction available in the graphical interface.
For example, a flight control can trigger a flight control program for the virtual character, so that the character enters a flight state, which is displayed in the graphical interface. In some examples, flight controls include, but are not limited to: flight direction controls (e.g., up, down, forward, backward), a landing control, and a speed control. Through these controls the flight state can be set, e.g., by setting parameters in the flight control program to control the character's landing point, flight direction, and speed.
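Setting flight parameters through such controls might look like the following sketch; the parameter names and control names are assumptions, not terms from the application.

```python
from dataclasses import dataclass

@dataclass
class FlightState:
    direction: str = "forward"     # up / down / forward / backward
    speed: float = 1.0
    landing_point: tuple = (0, 0)

def apply_flight_control(state, control, value):
    # Map a pressed flight control to a parameter of the flight program.
    if control == "direction":
        state.direction = value
    elif control == "speed":
        state.speed = value
    elif control == "land":
        state.landing_point = value
    else:
        raise ValueError(f"unknown flight control: {control}")
    return state

state = apply_flight_control(FlightState(), "speed", 2.5)
```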
In the embodiment of the application, in order to prevent a plurality of function controls from occupying too much interface space, the function control groups are respectively integrated into different states of the target control group. In fact, different states of the target control group may be understood as display forms for implementing different functions in the target control group.
Based on the above design, in the embodiment of the present application, the control layouts of the target control group in different states are different. Optionally, the control layout difference of the target control group in different states can be embodied in at least one of the following aspects: control type, control number and control layout structure.
Taking a game interface as an example, assume that a game provides a plurality of functions, mainly divided into combat functions and non-combat functions, and that each function is implemented by its own function control. On this assumption, the combat function controls and the non-combat function controls are configured to different states of the target control group according to control type, and the control layout structure of each state is determined from the number of controls configured to that state and a preset layout rule, so that the target control group in each state is loaded based on that state's layout structure.
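The configuration step above might be sketched as follows, with an assumed layout rule that picks a structure from the control count; the type labels and layout names are illustrative.

```python
# Group function controls into control-group states by type, then choose
# a layout structure per state from the number of controls it holds.
def build_state_layouts(controls):
    """controls: list of (control name, 'combat' | 'non-combat') pairs."""
    states = {}
    for name, control_type in controls:
        states.setdefault(control_type, []).append(name)
    return {
        control_type: {
            "controls": members,
            # assumed preset rule: many controls need a multilayer wheel
            "layout": "multilayer_wheel" if len(members) > 4 else "single_ring",
        }
        for control_type, members in states.items()
    }

layouts = build_state_layouts([
    ("attack", "combat"), ("defend", "combat"),
    ("chat", "non-combat"), ("trade", "non-combat"),
    ("gift", "non-combat"), ("flight", "non-combat"),
    ("emote", "non-combat"),
])
```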
In the embodiment of the present application, according to different functions that can be realized by different states, a relationship between states in which the target control group is located may be a master-slave relationship or a parallel relationship, and the present application is not limited.
For example, in a game interface, the target control group has at least one of the following states: a first state showing the types of non-combat functions that can be performed, a second state for executing a target function (the non-combat function selected in the first state), and a third state for executing a combat function. The first and second states are in a master-slave relationship; the first and third states are in a parallel relationship.
In the example above, on the one hand, the function trigger controls provided in the first state are used to select one of the non-combat functions and trigger the target control group to enter the second state for executing the selected function (i.e., the target function); the number of second states matches the number of function trigger controls in the first state. That is, the first state is the master state for the non-combat functions and manages the function trigger controls that lead into the respective second states. For example, if the user selects the function trigger control corresponding to non-combat function A through the target control group in the first state, the target control group jumps to the second state corresponding to function A, in which function A is executed.
On the other hand, the first state, as the master state of the non-combat functions, is parallel to the third state for realizing combat functions. Two states in a parallel relationship are mutually exclusive, i.e., the target control group cannot exhibit both at once. In practice, after the user finishes combat or actively exits combat, the target control group switches from the third state to the first state to provide the non-combat functions to the user.
It should be noted that in the embodiments herein, "first", "second", and "third" are used only to distinguish different states of the target control group and do not limit the order in which the states are presented.
The following describes, with reference to specific examples, an execution process of each step in the control presentation method shown in fig. 1:
first, in 101, a target control group in a first state is displayed in a graphical interface.
Taking the game interface as an example, the target control group may consist of a wheel control and the function controls managed by it. In display form, if there are multiple function controls, they may evenly surround the wheel control, semi-surround it, or surround it in a multilayer wheel structure; the present application is not limited in this respect.
The wheel control may be implemented as a single wheel control or as multiple wheel controls. Taking dual wheel controls as an example, the two wheel controls are deployed on the left and right sides of the game interface respectively, so that the left hand and the right hand can each operate one conveniently. In an alternative embodiment, the target control group consists of the right wheel control of the dual wheel controls and the function controls managed by that right wheel control.
Of course, depending on the actual setup of the game interface, the two wheel controls may also adopt other deployment forms, and the present application is not limited. For example, the two wheel controls may be deployed at the upper and lower ends of the same side (right or left) of the game interface, so that a player may operate them with the dominant hand alone. Optionally, the distance between the two wheel controls may be set according to the size of the game interface, set according to the player's operating habits to improve the control experience, or dynamically adjusted according to scene changes in the game interface so as not to block the game scene.
In the above or following embodiments, the function controls include function trigger controls for triggering the client (or server) to execute the target function. Continuing the above embodiments, assume the right wheel control of the dual wheel controls manages multiple function trigger controls. Under this scheme, a particular function trigger control can be selected by operating the right wheel control, so that the virtual character in the game is triggered, through the selected control, to execute the corresponding skill or interactive behavior. In this case, the right wheel control may also be referred to as a skill launching wheel or a skill triggering wheel; the present application is not limited in this respect.
In practical applications, optionally, the target control group in the first state includes at least one function trigger control. Further, the at least one function trigger control includes at least one of: an interactive action trigger control, an expression trigger control, and a flight trigger control.
Further, assuming the device running the game displays the game interface on a touch screen, the target control group may be displayed on the user's dominant-hand side of the game interface, for example in the lower right or lower left corner, to suit the user's operating habits. Alternatively, the position of the target control group in the game interface may be configured dynamically based on how the user holds the terminal device. Specifically, assuming the terminal device includes a touch screen, the position in the touch screen where the user operates most frequently is obtained, and the target control group is placed at that position. In this way, the target control group better fits the user's habits, improving its convenience.
Of course, the position of the target control group in the game interface may also be configured dynamically in view of factors such as scene changes during the game and scaling of the game interface (for example, the game interface shrinking after the phone screen is split), so as to prevent the target control group from blocking scene elements such as the main scenery and virtual characters.
Further, in 102, in response to a first instruction for the target control group, a target function satisfying the first instruction is determined from among a plurality of functions configured for the target control group.
The first instruction may be a touch instruction issued by the user through a touch screen, a selection instruction fed back by an input device such as a mouse, keyboard, voice acquisition device, wearable device, or image acquisition device, or another type of instruction; the present application is not limited. In practical applications, the touch instruction may be one or a combination of a single click, double click, long press, press-and-release, and slide (in any direction). The touch instruction may likewise be single-point or multi-point. For example, a multi-point touch instruction may be implemented as a combined-trajectory slide, where the combined trajectory includes, but is not limited to, a custom trajectory or a trajectory pre-specified in the game. Whatever its form, the instruction essentially expresses the target function the user intends to trigger through the target control group.
Still taking the game interface as an example, assume the game interface in the first state shows selectable non-combat functions, including an interactive action function, an expression display function, and a flight function. Assume the target control group in the first state comprises a wheel control and the function trigger controls, managed by the wheel control, corresponding to these non-combat functions, namely an interactive action trigger control, an expression trigger control, and a flight trigger control. Assume further that the device running the game displays the game interface on a touch screen, and that the first instruction is a touch instruction issued by the user through the touch screen.
Based on the above assumptions, in some embodiments, the user may slide the wheel control so that it points to the function trigger control the user intends to select. If the wheel control rests there for a duration reaching a time threshold, the function trigger control pointed to is determined to be the target function trigger control (i.e., a first instruction selecting that control is received). Accordingly, the target function corresponding to the target function trigger control is determined from the plurality of non-combat functions configured for the target control group (i.e., in response to the first instruction for the target control group, a target function satisfying the first instruction is determined from the plurality of functions configured for the target control group). For example, from the non-combat functions configured for the target control group, the interactive action function corresponding to the currently pointed-to interactive action trigger control is determined.
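The dwell-based selection described above can be sketched as follows. The function name and the 0.5-second threshold are assumptions for illustration; the disclosure only requires that the dwell time reach some time threshold:

```python
def select_by_dwell(pointed_control, dwell_seconds, threshold_seconds=0.5):
    """Return the trigger control the wheel pointer rests on once the dwell
    time reaches the threshold, else None (0.5 s is an assumed threshold)."""
    if pointed_control is not None and dwell_seconds >= threshold_seconds:
        # This constitutes the first instruction selecting the target
        # function trigger control.
        return pointed_control
    return None
```

A caller would poll this as the pointer moves: a short hover returns `None`, while a sufficiently long dwell returns the pointed-to control as the target function trigger control.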
In practical applications, the wheel control may be operated by click, double click, or drag (in any direction). Optionally, a wake-up mode may be set for each function trigger control, so that different function trigger controls are selected through different operation modes of the wheel control. For example, assume the target control group is the right wheel control of the dual wheel controls, and assume an operation gesture a corresponding to function trigger control a is configured in the game. Then, within the touch area of the right wheel control, if the touch trajectory issued by the user is detected to match operation gesture a, the function trigger control the user intends to select is determined to be function trigger control a.
In other embodiments, the user may also directly touch the intended function trigger control to complete the input of the first instruction, for example by clicking on it, or by invoking a particular function trigger control directly with a preset gesture.
In an alternative example, the game interface is shown in FIG. 2, where a target control group in the first state is disposed at the lower left of the game interface. The target control group comprises a wheel control and function trigger controls a, b, and c corresponding to non-combat functions a, b, and c managed by the wheel control. In FIG. 2, sliding the wheel control points its arrow at the function trigger control of any non-combat function, completing the selection of that control. Alternatively, in FIG. 2, a function trigger control may be selected by clicking it directly.
Of course, besides being input by the user, the first instruction may also be generated by the client in a simulated manner after the client detects that a predetermined condition is met (for example, entering a game instance or a certain type of scene).
Still taking the game interface assumed above as an example, when the client detects that the virtual character is located on a cliff, the client generates, in a simulated manner, a first instruction selecting the flight trigger control. The flight trigger control indicated by the first instruction is then taken as the target function trigger control, and the corresponding flight function is determined from the plurality of non-combat functions configured for the target control group.
Further, after the target function satisfying the first instruction is determined, in 103, the target control group is switched from the first state to a second state for implementing the target function.
Still taking the game interface as an example, assume the game interface in the first state shows selectable non-combat functions, including an interactive action function, an expression display function, and a flight function; that the target control group in the first state comprises a wheel control and the function trigger controls, managed by the wheel control, corresponding to these non-combat functions; and that the target function is non-combat function a.
Based on these assumptions, after non-combat function a corresponding to the target function trigger control is determined, the control layout of the target control group in the second state is determined, and the target control group is triggered to switch to the second state corresponding to non-combat function a; that is, the control layout of the second state is displayed in the game interface, so that non-combat function a is executed in the second state.
Additionally, to enhance the user's gaming experience, in the above or following embodiments, the switch of the target control group in 103 from the first state to the second state for implementing the target function may be imperceptible to the player; that is, the switch may be triggered automatically by the client in response to the first instruction.
By the control display method shown in fig. 1, the layout of the function controls can be integrated, improving the layout rationality of the human-computer interaction interface. The method can also reduce the difficulty of triggering the various functions and increase their utilization, thereby improving human-computer interaction efficiency.
In some exemplary embodiments, assume the target control group in the first state includes at least one function trigger control, and that the at least one function trigger control corresponds one-to-one with at least one function.
Based on the above assumption, in 102, determining, in response to the first instruction for the target control group, a target function satisfying the first instruction from the plurality of functions configured for the target control group may be implemented as: in response to a selection instruction for a target function trigger control, taking the function corresponding to that trigger control as the target function.
The target function trigger control is any one of the at least one function trigger control. In practical applications, the at least one function trigger control includes, but is not limited to, at least one of: an interactive action trigger control, an expression trigger control, and a flight trigger control.
In this embodiment, the selection instruction for the target function trigger control is one implementation of the first instruction described above. The selection instruction may be a touch instruction issued by the user through a touch screen (including but not limited to the types of touch instructions described above), a selection instruction fed back by an input device such as a mouse, keyboard, microphone, image acquisition device, or wearable device, or another type of instruction; the present application is not limited.
In the above or following embodiments, optionally, prior to 102, the target control group is configured with a plurality of functions available to the user. In some exemplary embodiments, one optional way of configuring the target control group is as follows: the function controls are divided by function type to obtain a plurality of function control groups, and these groups are then configured to different states of the target control group, creating the target control group. Specifically, the function control groups are each arranged with a different control layout structure according to the number of controls and/or the application scene, yielding a control layout for each group; these control layouts serve as the control layouts of the target control group in its different states, thereby integrating the function control groups.
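The configuration procedure above (divide the function controls by type, then assign each group to one state of the target control group) might be sketched as follows; the dictionary shapes and key names are assumptions for illustration:

```python
from collections import defaultdict

def build_target_control_group(function_controls):
    """Divide controls by function type; each resulting group becomes the
    control layout of one state of the target control group (a sketch)."""
    groups = defaultdict(list)
    for control in function_controls:
        groups[control["type"]].append(control["name"])
    # One state of the target control group per function type.
    return {f"state_{ftype}": names for ftype, names in groups.items()}
```

Feeding in a mixed list of controls yields one state per function type, each holding that type's control group.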
In another way of configuring the target control group, the function controls may be classified based on the user's operation preferences to obtain function control groups corresponding to different preferences, and these groups are then configured to different states of the target control group, establishing a correspondence between operation preference and target control group state. In this way, different user preferences can be accommodated by switching the state of the target control group, improving the user experience.
For example, the attack skill controls among the combat function controls are ranked based on the user's operation preferences in different types of scenes. Grouping the attack skill controls by preference, from high to low, in each scene type yields a plurality of attack skill control groups corresponding to the different scene types. These groups are then configured to different states of the target control group, establishing the correspondence between operation preference and state and creating the target control group.
With the target control group created in this example, the current scene type is acquired and the corresponding target attack skill control group is determined. If it is detected that the user enters combat mode, the target control group is switched from its current state to the state that displays the target attack skill control group in the game interface.
In yet another way of configuring the target control group, the function controls may be divided based on the game progression to obtain a plurality of function control groups, which are then configured to different states of the target control group; the target control group is created according to the correspondence between each stage of the game (or level, player rank, player online duration, and the like) and a state of the target control group. For example, upon detecting that the user is about to enter a flight level, the target control group is switched from its current state to the state corresponding to the flight level, in which state the target control group is configured with the flight function controls.
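Both the scene-based and the stage-based configurations above reduce to a lookup from a game context key to the state whose control group should be shown. The mapping contents below are assumed examples, not disclosed values:

```python
def state_for_context(context, context_to_state, default="first"):
    """Map a scene type or game stage to the target control group state
    whose control layout should be shown (mapping contents are assumed)."""
    return context_to_state.get(context, default)

# Assumed example mapping: scene types / game stages to states.
CONTEXT_STATES = {
    "combat_scene": "attack_skill_group_state",
    "flight_level": "flight_control_group_state",
}
```

An unknown context falls back to the first (main) state, which is consistent with the first state being the default non-combat state.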
In some exemplary embodiments, switching the target control group from the first state to the second state for implementing the target function in 103 may be implemented as:
determining, from at least one function control group associated with the target control group, a target function control group for executing the target function; invoking the target function control group so as to load it into the function control area configured for it in the second state; and displaying the target control group in the second state in the graphical interface.
In these steps, a function control area is introduced into the target control group, so that function controls can be selected directly within that area without jumping to a lower-level menu. Secondary menu interaction is thus effectively avoided, the interactive operation flow is greatly simplified, and the player's sense of immersion is improved.
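As a hedged sketch of the two steps above (pick the target function control group, then load it into the function control area of the second state), the following might apply; the function name and dictionary shapes are assumptions:

```python
def switch_to_second_state(target_function, function_control_groups):
    """Pick the group that executes the target function and load it into
    the function control area configured for the second state (a sketch)."""
    if target_function not in function_control_groups:
        raise KeyError(f"no control group implements {target_function!r}")
    return {
        "state": "second",
        "function": target_function,
        # Loading the group makes its controls directly selectable,
        # avoiding a secondary menu.
        "function_control_area": function_control_groups[target_function],
    }
```

For instance, with an expression group and a flight group associated with the target control group, selecting the expression function loads only the expression controls into the area.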
In a game scenario, different function control groups may execute different non-combat functions. Optionally, the at least one function control group associated with the target control group includes, but is not limited to, at least one of: an interactive action control group, an expression control group, and a flight control group.
To avoid accidental touches of the function controls, if a function control group comprises multiple function controls, the controls may be displayed spaced apart. For example, in the function control area, multiple function controls may be deployed in a multi-layer wheel structure.
An expression control is used to trigger the virtual character in the game interface to show the corresponding expression. In one example, the expression control group includes multiple expression controls available for the user to select, such as happy, angry, shy, and sad. In the game interface, thumbnails or explanatory text may serve as the identifiers of the expression controls, and these identifiers are displayed in the function control area according to a preset display style.
In an alternative embodiment, assume the function control area is loaded with the expression control group. Based on this, in the above steps, displaying the target control group in the second state in the game interface may be implemented as: displaying, in the function control area, the multiple function controls (i.e., expression controls) of the expression control group in a multi-layer wheel structure, where each layer of the wheel structure comprises at least one expression control.
In practical applications, because the expression controls are numerous, the preset display style is a multi-layer wheel structure, that is, a style that surrounds the wheel control in multiple layers, making the layout of the expression controls more reasonable, as shown in fig. 3. In fig. 3, each expression control bears a corresponding text identifier, and the expression controls are arranged in a multi-layer wheel structure in the expression control display area (i.e., the function control area); selecting one of them triggers the virtual character in the game interface to show the corresponding expression. For example, in fig. 3, if the expression control identified as "smile" is selected, the virtual character can be controlled to show a smile. Optionally, the character may show the expression via an expression bubble, corresponding to the selected expression control, displayed above its head; or the expression may be shown on the character's face, where the face itself can be obtained through face-customization settings.
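The multi-layer wheel arrangement above can be sketched by filling the inner layer first and spilling the remainder outward. The ring capacities below are arbitrary illustrative numbers, not values from the disclosure:

```python
def layer_controls(controls, ring_capacities=(4, 8, 12)):
    """Distribute controls across concentric wheel layers, inner layer first.

    Ring capacities are illustrative; each layer of the wheel structure
    holds at least one control, as described above.
    """
    rings, remaining = [], list(controls)
    for capacity in ring_capacities:
        if not remaining:
            break
        rings.append(remaining[:capacity])
        remaining = remaining[capacity:]
    if remaining:
        raise ValueError("too many controls for the configured layers")
    return rings
```

Eight expression controls with capacities `(4, 8)` would occupy two layers: four on the inner ring, four on the outer.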
Optionally, in fig. 3, the target control group in the second state further includes a title bar control indicating the current function (i.e., "bubble expression") and a "cancel interaction" control for exiting the expression sending function.
Of course, to prevent the function control area holding the expression control group from occupying too much interface space, the number of controls in the group may be limited. Optionally, a threshold is set on how many expression controls a single page of the function control area can display; if the expression control group exceeds this threshold, its controls are sorted by usage frequency and assigned to different pages of the function control area. Further, if the target control group also contains a wheel control, sliding the wheel control switches the currently displayed page of expression controls; alternatively, a page may be chosen via a page switching control.
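The paging rule above (sort by usage frequency, then split into pages once the per-page threshold is exceeded) can be sketched as follows; the function name and data shapes are illustrative assumptions:

```python
def paginate_by_frequency(controls, usage_counts, page_size):
    """Sort expression controls by descending usage frequency, then split
    them into pages of the function control area (page_size is the assumed
    per-page display threshold)."""
    ordered = sorted(controls, key=lambda c: usage_counts.get(c, 0), reverse=True)
    return [ordered[i:i + page_size] for i in range(0, len(ordered), page_size)]
```

The most used expression controls land on the first page, which is the page shown before the wheel control or page switching control is used.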
In practical applications, the expression controls may be preconfigured by the game developer, set according to user data (for example, age, gender, or the type of the chosen virtual character), or generated from custom expressions uploaded by the user; the embodiments of the present application are not limited in this respect.
In an optional embodiment of the above steps, assume the function control area is loaded with the interactive action control group. Based on this, displaying the target control group in the second state in the graphical interface may also be implemented as:

displaying, in the function control area, the multiple function controls (i.e., interactive action controls) of the interactive action control group in a multi-layer wheel structure, where each layer of the wheel structure comprises at least one interactive action control.
Specifically, the display of the interactive action controls is similar to that of the expression controls introduced above; the similarities are not repeated here. The main difference is that the interactive action control group comprises a single-person interactive action control group and a multi-person interactive action control group.
In practical applications, the single-person and multi-person interactive action control groups may be displayed on the same page of the function control area, or assigned to different pages, in which case pages are selected via the wheel control or a page switching control. In an optional example, assuming the interactive action control group comprises both a single-person group and a multi-person group, the wheel control in the target control group may be operated to switch which type of interactive action control group is loaded in the function control area.
For example, to make the layout of the single-person interactive action control group more reasonable, the preset display style is a multi-layer wheel structure, that is, a style surrounding the wheel control in multiple layers, as shown in fig. 4. In fig. 4, each single-person interactive action control bears a corresponding text identifier, and the controls are arranged in a multi-layer wheel structure in the single-person interactive action control display area (i.e., the function control area). After the target control group switches to the second state, this display area is expanded; upon receiving the user's selection instruction for any single-person interactive action control, the virtual character in the game interface is triggered to perform the corresponding interactive action.
For example, in FIG. 4, if the single-person interactive action control identified as "send love heart" is selected, the virtual character can be controlled to make the send-love action.
Optionally, in fig. 4, the target control group in the second state further includes a title bar control indicating the current function (i.e., "single-person interaction") and a "cancel interaction" control for exiting the interactive action function. After exiting, the target control group may switch back to the first state or switch to the third state.
As another example, to make the layout of the multi-person interactive action control group more reasonable, the preset display style is likewise a multi-layer wheel structure, that is, a style surrounding the wheel control in multiple layers, as shown in fig. 5. In fig. 5, each multi-person interactive action control bears a corresponding text identifier, and the controls are arranged in a multi-layer wheel structure in the multi-person interactive action control display area (i.e., the function control area). After the target control group switches to the second state, this display area is expanded; upon receiving the user's selection instruction for any multi-person interactive action control, the virtual character in the game interface is triggered to perform the corresponding interactive action. In practice, an interaction object for the virtual character may also be selected: for example, the other virtual character nearest to it may be determined from its position, or, in response to the user's selection instruction for another virtual character, that character is determined to be the interaction object.
For example, in fig. 5, if the multi-person interactive action control identified as "handshake" is selected, the virtual character can be controlled to perform that action with the nearest other virtual character.
Similarly, in fig. 5, the target control group in the second state further includes a title bar control indicating the current function (i.e., "multi-person interaction") and a "cancel interaction" control for exiting the interactive action function. After exiting, the target control group may likewise switch back to the first state or switch to the third state.
In the above or following embodiments, if the target control group includes multiple function controls, their respective usage frequencies in the game scene may also be determined, and the controls arranged at unequal intervals accordingly, improving the usability of the target control group. The higher a control's usage frequency, the larger its spacing from the surrounding controls, which helps avoid accidental touches and improves the control experience. Controls with lower usage frequency may be partially overlaid on one another to reduce the interface space occupied by the target control group. Optionally, a usage frequency threshold determines whether controls need to be overlaid: for example, if two function controls both fall below the threshold, they are partially overlaid. Alternatively, the function controls may be sorted by usage frequency, and their arrangement determined from the sorting result and a preset order.
In practical applications, the arrangement of the function controls may be initiated by the client after it detects the player's usage frequency of each control, or adjusted by the player in a custom manner.
Taking the expression control group shown in fig. 6 as an example, if the group contains multiple expression controls, the player's usage frequency of each control in the current game scene may be counted and the controls arranged at unequal intervals accordingly, with more frequently used controls spaced further from their neighbours. In fig. 6, the crying, like, and smiling controls are used more frequently than the others and are given larger spacing from the surrounding expression controls; the shy, funny, and flower controls are used less frequently, are spaced more tightly, and partially overlap one another.
In other embodiments, the interface space occupied by each function control's icon may be determined from its usage frequency in the game scene: the higher the frequency, the larger the icon, making the control easier to select and preventing accidental touches.
In fig. 6, the crying, like, and smiling controls are used more frequently than the others, so their icons occupy larger interface space; the shy, funny, and flower controls are used less frequently, so their icons occupy smaller interface space.
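Both frequency-driven rules above (wider spacing and larger icons for more-used controls) amount to scaling layout parameters with each control's share of total usage. The base values below are arbitrary illustrative units, not disclosed parameters:

```python
def layout_weights(usage_counts, base_gap=8.0, base_icon=32.0):
    """Scale each control's spacing and icon size with its usage share.

    base_gap and base_icon are illustrative units; the point is only that
    more frequently used controls get wider spacing and larger icons.
    """
    total = sum(usage_counts.values()) or 1
    layout = {}
    for control, count in usage_counts.items():
        share = count / total
        layout[control] = {
            "gap": base_gap * (1 + share),        # spacing to neighbours
            "icon_size": base_icon * (1 + share),  # icon footprint
        }
    return layout
```

With counts like those in the fig. 6 example, the smiling control ends up with both a wider gap and a bigger icon than the shy control.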
In some exemplary embodiments, continuing with the game interface assumed above, if the function trigger control is a flight trigger control and the target function is the flight function, the state of the virtual character in the graphical interface may also be switched to a flight state.
In the foregoing step 103, determining, from at least one function control group associated with the target control group, a target function control group for executing the target function may be implemented as:
determining a flight function control group for executing the flight function from the at least one function control group associated with the target control group, wherein the flight function control group comprises a landing control for triggering a landing operation and a state operation control for controlling the flight state.
Furthermore, in the above step, invoking the target function control group to load it into the function control area configured for it in the second state may be implemented as:
calling the flight function control group to switch the flight trigger control in the target control group into the landing control, and switching the wheel control in the target control group into the state operation control, so as to obtain the target control group in the second state.
In practical application, it is assumed that the flight function control group is loaded in the function control area. Assuming that the target control group in the first state is as shown in fig. 2 and that the function trigger control c is a flight trigger control, then based on these assumptions, the target control group in the second state is as shown in fig. 7 in the graphical interface. In fig. 7, the virtual character has its wings deployed to indicate that it is in flight. In fig. 7, after the switch to the second state, the wheel controls in the target control group have been switched to state operation controls for controlling the flight direction of the virtual character (i.e., a first arrow control for controlling the virtual character to fly upward and a second arrow control for controlling the virtual character to fly downward), and the function trigger control c (i.e., the flight trigger control) in the target control group has been switched to a landing control for controlling the virtual character to land and stop flying.
In the above or following embodiments, it is assumed that dual wheel controls of the graphical interface are deployed on the left and right sides of the game interface, respectively, and that each wheel control may govern one aspect of the virtual character's flight. For example, the left wheel control deployed in the left interface space may be used to control the horizontal movement of the virtual character during flight (e.g., forward, backward, left, and right), and the right wheel control deployed in the right interface space may be used to control the longitudinal movement (e.g., up and down), which makes it easier for the player to steer the virtual character while the flight function is active.
In practical applications, the angular control that the wheel controls exert over the virtual character in flight is not limited to the angles above; the controllable angular range may also be 360 degrees in the horizontal or longitudinal direction, or any other angle.
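A minimal sketch (function name and vector conventions are assumptions, not taken from the disclosure) of how the dual-wheel inputs described above could be combined into a single flight direction, with the left wheel supplying the horizontal components and the right wheel the vertical component:

```python
def flight_direction(left_wheel, right_wheel):
    """Combine the two wheel inputs into a 3D flight direction.

    left_wheel:  (x, z) horizontal input from the left wheel control
                 (forward/backward/left/right), each nominally in [-1, 1].
    right_wheel: vertical input from the right wheel control (up/down).
    Returns a clamped (x, y, z) direction tuple for the virtual character.
    """
    x, z = left_wheel
    y = right_wheel
    clamp = lambda v: max(-1.0, min(1.0, v))
    return (clamp(x), clamp(y), clamp(z))

# Push the left wheel forward-right and the right wheel up:
assert flight_direction((0.5, 1.0), 1.0) == (0.5, 1.0, 1.0)
```

Splitting the horizontal and longitudinal axes between the two wheels in this way is what lets each hand control an independent degree of freedom.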
Further, if it is detected that the virtual character has reached the set landing point (for example, the area where the next checkpoint is located), or a selection instruction for the landing control is detected, the virtual character needs to land; in this case, the virtual character is switched to the flight-stopped state, and the target control group is switched from the second state shown in fig. 7 back to the first state shown in fig. 2.
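The flight state transitions just described can be sketched as a small state machine (the class and attribute names are assumptions; the disclosure does not specify an implementation): entering flight swaps the trigger/wheel controls for the landing/state-operation controls, and landing restores the first state.

```python
class TargetControlGroup:
    """Toy model of the target control group's flight state switching."""

    def __init__(self):
        self.state = "first"                       # first state: flight not active
        self.controls = ["flight_trigger", "wheel"]

    def enter_flight(self):
        # Switch to the second state: flight trigger -> landing control,
        # wheel control -> state operation (flight direction) controls.
        self.state = "second"
        self.controls = ["landing", "state_operation"]

    def land(self, reached_landing_point=False, landing_selected=False):
        # Land when the set landing point is reached or the landing
        # control is selected, restoring the first state.
        if self.state == "second" and (reached_landing_point or landing_selected):
            self.state = "first"
            self.controls = ["flight_trigger", "wheel"]

group = TargetControlGroup()
group.enter_flight()
assert group.controls == ["landing", "state_operation"]
group.land(reached_landing_point=True)
assert group.state == "first" and "wheel" in group.controls
```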
In some exemplary embodiments, the present application includes other state switching processes for the target control group in addition to the above-described switching process between the first state and the second state.
Specifically, it is still assumed that the graphical interface is a game interface. On this basis, it is determined whether the target control group satisfies a state switching condition; if so, the target control group is switched, in the game interface, from a third state for executing a fighting function to the first state for triggering a non-fighting function.
Optionally, the state switching condition here includes but is not limited to one of the following:
switching conditions are as follows: and the interval between the time of receiving the second instruction aiming at the target control group for the last time and the current time exceeds a set threshold value. This case illustrates that the time interval between the execution time of the last fighting function and the current time is too long, so as to facilitate timely obtaining the non-fighting function, and guide the user to enter a leisure state (i.e., a non-fighting state), it can be determined that the state switching condition for switching to the first state is satisfied.
And switching conditions are as follows: the second instruction for the target control group is an operation instruction for stopping the battle. In this case, the player intends or needs to leave the battle mode, and in order to switch the user to the leisure (i.e., non-battle) state in time, it is determined that the state switching condition for switching to the first state is satisfied.
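As an illustrative sketch only (the function name and threshold value are assumptions), the two switching conditions above reduce to a simple predicate:

```python
def should_switch_to_first_state(now, last_second_instruction_time,
                                 threshold_seconds, instruction=None):
    """Return True when either state switching condition holds.

    Condition 1: the second instruction was last received more than
    `threshold_seconds` ago.
    Condition 2: the second instruction is a stop-battle instruction.
    """
    idle_too_long = (now - last_second_instruction_time) > threshold_seconds
    stop_battle = instruction == "stop_battle"
    return idle_too_long or stop_battle

# 45 s of inactivity against a 30 s threshold triggers condition 1:
assert should_switch_to_first_state(now=100.0,
                                    last_second_instruction_time=55.0,
                                    threshold_seconds=30.0)
# An explicit stop-battle instruction triggers condition 2 immediately:
assert should_switch_to_first_state(now=100.0,
                                    last_second_instruction_time=95.0,
                                    threshold_seconds=30.0,
                                    instruction="stop_battle")
```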
In practical application, taking the dual wheel controls in the game interface as an example, it is assumed that the dual wheel controls are deployed on the left and right sides of the game interface, respectively. Assume that the player's dominant hand is the right hand. On this basis, the target control group may be composed of the right wheel control of the dual wheel controls and the function controls managed by that wheel control, so that the release of fighting skills or non-fighting skills is triggered by operating the right wheel control in its different states.
Optionally, the right wheel control in the third state may be implemented as a fighting-skill trigger wheel for triggering the release of fighting skills, and the right wheel control in the first state and/or the second state may be implemented as a non-fighting-skill trigger wheel for triggering the release of non-fighting skills. The fighting-skill trigger wheel and the non-fighting-skill trigger wheel are mutually exclusive; in brief, only one of the two wheels can be displayed in the game interface at any given time.
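The mutual exclusion just described can be sketched in one line (names are assumptions): which right-wheel variant is visible is a pure function of the target control group's state, so at most one wheel is ever displayed.

```python
def visible_wheel(state):
    """Return which right-wheel variant is shown for a given state.

    The fighting-skill wheel and non-fighting-skill wheel are mutually
    exclusive: exactly one is displayed at a time.
    """
    return "fighting_skill_wheel" if state == "third" else "non_fighting_skill_wheel"

assert visible_wheel("third") == "fighting_skill_wheel"
assert visible_wheel("first") == "non_fighting_skill_wheel"
```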
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subjects of steps 101 to 103 may be device a; for another example, the execution subject of step 101 may be device a, and the execution subjects of steps 102 and 103 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel. It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
Fig. 8 is a schematic structural diagram of a control display apparatus according to an embodiment of the present application. As shown in fig. 8, the control display apparatus includes a display module 81, a transceiver module 82 and a processing module 83, wherein:
The display module 81 is used for displaying the target control group in the first state or the second state in the graphical interface;
a transceiver module 82, configured to receive a first instruction for a target control group;
a processing module 83, configured to, in response to a first instruction for the target control group, determine a target function satisfying the first instruction from a plurality of functions configured for the target control group, and trigger the display module to switch the target control group from the first state to a second state for realizing the target function, wherein the control layouts of the target control group in different states are different.
Optionally, the target control group in the first state includes at least one function trigger control, and the at least one function trigger control corresponds one-to-one to at least one function.
When determining, in response to the first instruction for the target control group, the target function satisfying the first instruction from the plurality of functions configured for the target control group, the processing module 83 is specifically configured to: in response to a selection instruction for a target function trigger control, take the function corresponding to the target function trigger control as the target function.
The target function trigger control is any one of at least one function trigger control.
Wherein, optionally, the at least one function trigger control comprises at least one of an interactive action trigger control, an expression trigger control and a flight trigger control.
Optionally, when switching the target control group from the first state to the second state for implementing the target function, the processing module 83 is specifically configured to:
determining a target function control group for executing a target function from at least one function control group associated with the target control group; calling the target function control group to load the target function control group into a function control area configured for the target function control group in the second state; and displaying the target control group in the second state in the graphical interface.
Optionally, the at least one function control group associated with the target control group includes: at least one of an interactive control group, an expression control group and a flight control group.
Optionally, if the function control area is loaded with an interactive control group and/or an expression control group, the processing module 83 is specifically configured to, when a target control group in the second state is displayed in the graphical interface:
and in the function control area, displaying a plurality of function controls in the interactive action control group and/or the expression control group by adopting a multilayer wheel structure, wherein each layer of wheel structure comprises at least one function control.
The interactive action control group comprises a single-person interactive action control group and a multi-person interactive action control group.
The processing module 83 is further configured to: control the wheel control in the target control group to switch the type of interactive action control group loaded in the function control area.
Optionally, if the function trigger control is a flight trigger control and the target function is a flight function, the processing module 83 is further configured to: switch the virtual character in the graphical interface to the flight state.
Further, when determining the target function control group for executing the target function from the at least one function control group associated with the target control group, the processing module 83 is specifically configured to:
determine a flight function control group for executing the flight function from the at least one function control group associated with the target control group, wherein the flight function control group comprises a landing control for triggering a landing operation and a state operation control for controlling the flight state.
Furthermore, the processing module 83, when invoking the target function control group to load the target function control group into the function control area configured for the target function control group in the second state, is specifically configured to:
call the flight function control group to switch the flight trigger control in the target control group into the landing control, and switch the wheel control in the target control group into the state operation control, so as to obtain the target control group in the second state.
Wherein, if the graphical interface is a game interface, the processing module 83 is further configured to: determine whether the target control group satisfies a state switching condition; and if the target control group satisfies the state switching condition, switch the target control group, in the game interface, from a third state for executing a fighting function to the first state for selecting a non-fighting function.
Wherein, optionally, the state switching condition includes: the interval between the moment of receiving the second instruction aiming at the target control group for the last time and the current moment exceeds a set threshold value; or the second instruction for the target control group is an operation instruction for stopping the battle.
Optionally, the processing module 83 is further configured to: if the target control group includes a plurality of function controls, determine the respective use frequencies of the plurality of function controls in the game scene, and arrange the plurality of function controls at unequal intervals according to those use frequencies, wherein the higher the use frequency of a function control, the larger the interval between that function control and the surrounding function controls.
In this embodiment, at least one function is configured in the target control group, and the target control group is switched from the first state to the second state for implementing the target function, so that the target function is executed in the graphical interface, and therefore, the layout of the function controls can be integrated, the problem of poor usability caused by the scattered layout of the function controls is avoided, the layout rationality of the human-computer interaction interface is improved, and the human-computer interaction efficiency is improved. In this embodiment, the controls for implementing different functions can be displayed in different states of the target control group, so that the triggering difficulty of various functions is greatly reduced, and the utilization rate of various functions is improved.
Fig. 9 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application, and as shown in fig. 9, the electronic device includes: memory 901, processor 902, communications component 903, and display component 904.
The memory 901 is used for storing computer programs and may be configured to store other various data to support operations on the electronic device. Examples of such data include instructions for any application or method operating on the electronic device, contact data, phonebook data, messages, pictures, videos, and so forth.
The memory 901 may be implemented by any type or combination of volatile and non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
A processor 902, coupled to the memory 901, executes the computer program in the memory 901 to: display, via the display component 904, the target control group in the first state or the second state in the graphical interface; in response to a first instruction for the target control group, determine a target function satisfying the first instruction from a plurality of functions configured for the target control group; and trigger the display component 904 to switch the target control group from the first state to a second state for realizing the target function, wherein the control layouts of the target control group in different states are different.
Further optionally, the target control group in the first state includes at least one function trigger control, and the at least one function trigger control corresponds one-to-one to at least one function.
When determining, in response to the first instruction for the target control group, the target function satisfying the first instruction from the plurality of functions configured for the target control group, the processor 902 is specifically configured to: in response to a selection instruction for a target function trigger control, take the function corresponding to the target function trigger control as the target function.
The target function trigger control is any one of at least one function trigger control.
Wherein, optionally, the at least one function trigger control comprises at least one of an interactive action trigger control, an expression trigger control and a flight trigger control.
Further optionally, when switching the target control group from the first state to the second state for implementing the target function, the processor 902 is specifically configured to:
determining a target function control group for executing a target function from at least one function control group associated with the target control group; calling the target function control group to load the target function control group into a function control area configured for the target function control group in the second state; and displaying the target control group in the second state in the graphical interface.
Optionally, the at least one function control group associated with the target control group includes: at least one of an interactive control group, an expression control group and a flight control group.
Further optionally, if the function control area is loaded with an interactive control group and/or an expression control group, the processor 902, when displaying the target control group in the second state in the graphical interface, is specifically configured to:
and in the function control area, displaying a plurality of function controls in the interactive action control group and/or the expression control group by adopting a multilayer wheel structure, wherein each layer of wheel structure comprises at least one function control.
The interactive action control group comprises a single-person interactive action control group and a multi-person interactive action control group.
Wherein, further optionally, the processor 902 is further configured to: control the wheel control in the target control group to switch the type of interactive action control group loaded in the function control area.
If the function trigger control is a flight trigger control and the target function is a flight function, the processor 902 is further configured to: switch the virtual character in the graphical interface to the flight state.
Further, when determining the target function control group for executing the target function from the at least one function control group associated with the target control group, the processor 902 is specifically configured to:
determine a flight function control group for executing the flight function from the at least one function control group associated with the target control group, wherein the flight function control group comprises a landing control for triggering a landing operation and a state operation control for controlling the flight state.
Further, the processor 902, when invoking the target function control group to load the target function control group into the function control area configured for the target function control group in the second state, is specifically configured to:
call the flight function control group to switch the flight trigger control in the target control group into the landing control, and switch the wheel control in the target control group into the state operation control, so as to obtain the target control group in the second state.
Wherein, if the graphical interface is a game interface, the processor 902 is further configured to: determine whether the target control group satisfies a state switching condition; and if the target control group satisfies the state switching condition, switch the target control group, in the game interface, from a third state for executing a fighting function to the first state for selecting a non-fighting function.
Wherein, optionally, the state switching condition includes: the interval between the moment of receiving the second instruction aiming at the target control group for the last time and the current moment exceeds a set threshold value; or the second instruction for the target control group is an operation instruction for stopping the battle.
Optionally, the processor 902 is further configured to: if the target control group includes a plurality of function controls, determine the respective use frequencies of the plurality of function controls in the game scene, and arrange the plurality of function controls at unequal intervals according to those use frequencies, wherein the higher the use frequency of a function control, the larger the interval between that function control and the surrounding function controls.
Further, as shown in fig. 9, the electronic device further includes: a power component 905, an audio component 906, and the like. Only some of the components are schematically shown in fig. 9; this does not mean that the electronic device includes only the components shown.
Wherein the communication component 903 is configured to facilitate wired or wireless communication between the device in which the communication component is located and other devices. The device in which the communication component is located may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
The display component 904 includes a screen, which may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The power component 905 provides power to the various components of the device in which it is located. The power component may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which it is located.
In this embodiment, at least one function is configured in the target control group, and the target control group is switched from the first state to the second state for implementing the target function, so that the target function is executed in the graphical interface, and therefore, the layout of the function controls can be integrated, the problem of poor usability caused by the scattered layout of the function controls is avoided, the layout rationality of the human-computer interaction interface is improved, and the human-computer interaction efficiency is improved. In this embodiment, the controls for implementing different functions can be displayed in different states of the target control group, so that the triggering difficulty of various functions is greatly reduced, and the utilization rate of various functions is improved.
Accordingly, the present application further provides a computer-readable storage medium storing a computer program, where the computer program is capable of implementing the steps that can be executed by the electronic device in the foregoing method embodiments when executed.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include a volatile memory in a computer-readable medium, such as random access memory (RAM), and/or a non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transitory media) such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (13)

1. A control display method is characterized by comprising the following steps:
displaying the target control group in the first state in the graphical interface;
in response to a first instruction for the target control group, determining a target function that satisfies the first instruction from a plurality of functions configured for the target control group;
and switching the target control group from the first state to a second state for realizing the target function, wherein the control layouts of the target control group in different states are different.
2. The method of claim 1, wherein the target control group in the first state comprises at least one functionality-triggering control, the at least one functionality-triggering control corresponding one-to-one to the at least one functionality;
the responding to a first instruction of the target control group, determining a target function meeting the first instruction from a plurality of functions configured for the target control group, and comprises the following steps:
responding to a selection instruction of a target function trigger control, and taking a function corresponding to the target function trigger control as the target function;
the target function trigger control is any one of the at least one function trigger control.
3. The method of claim 2, wherein the at least one function trigger control comprises: at least one of an interactive action trigger control, an expression trigger control and a flight trigger control.
4. The method of claim 1, wherein switching the target control group from the first state to a second state for implementing the target function comprises:
determining a target function control group used for executing the target function from at least one function control group associated with the target control group;
calling the target function control group to load the target function control group into a function control area configured for the target control group in the second state;
displaying the target control group in the second state in the graphical interface.
5. The method of claim 4, wherein the at least one function control group associated with the target control group comprises at least one of: an interactive action control group, an expression control group, and a flight control group.
6. The method according to claim 5, wherein, if the function control area is loaded with an interactive action control group and/or an expression control group, the displaying, in the graphical interface, the target control group in the second state comprises:
displaying, in the function control area, a plurality of function controls in the interactive action control group and/or the expression control group in a multilayer wheel structure, wherein each layer of the wheel structure comprises at least one function control.
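One way to realize a multilayer wheel structure is to place controls on concentric rings, each ring spreading its controls at equal angles. The radii, per-layer capacity, and control names below are assumptions for illustration, not the claimed layout.

```python
import math


def wheel_layout(controls, per_layer=4, base_radius=60.0, step=40.0):
    """Assign each control a (layer, x, y) position on a multilayer wheel."""
    positions = {}
    for i, name in enumerate(controls):
        layer, slot = divmod(i, per_layer)
        # Number of controls actually placed on this layer.
        count = min(per_layer, len(controls) - layer * per_layer)
        angle = 2 * math.pi * slot / count
        radius = base_radius + layer * step
        positions[name] = (layer,
                           round(radius * math.cos(angle), 1),
                           round(radius * math.sin(angle), 1))
    return positions


# Five controls: four fill the inner layer, the fifth starts the next layer out.
pos = wheel_layout(["hug", "wave", "bow", "clap", "smile"])
print(pos["smile"])  # (1, 100.0, 0.0)
```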
7. The method of claim 6, wherein the interactive action control group comprises a single interactive action control group and a multiple interactive action control group;
the method further comprises:
controlling a wheel disc control in the target control group to switch the type of interactive action control group loaded in the function control area.
8. The method of claim 4, wherein, if the function trigger control is a flight trigger control and the target function is a flight function, the method further comprises:
switching a virtual character in the graphical interface to a flight state;
the determining a target function control group for executing the target function from at least one function control group associated with the target control group comprises:
determining a flight function control group for executing the flight function from the at least one function control group associated with the target control group, wherein the flight function control group comprises a landing control for triggering a landing operation and a state operation control for controlling the flight state;
the invoking the target function control group to load the target function control group into the function control area configured for the target control group in the second state comprises:
invoking the flight function control group to switch the flight trigger control in the target control group to the landing control and to switch the wheel disc control in the target control group to the state operation control, so as to obtain the target control group in the second state.
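The substitution described in claim 8 — the flight trigger control becoming the landing control and the wheel disc control becoming the state operation control — amounts to an in-place swap over the group's layout. Control names here are illustrative assumptions.

```python
# Hypothetical sketch: entering the flight function rewrites two slots of the
# target control group's layout while leaving every other control in place.

FLIGHT_SWAP = {
    "flight_trigger": "landing_control",
    "wheel_disc": "state_operation_control",
}


def enter_flight_state(layout):
    # Produce the second-state layout; the original layout is left untouched.
    return [FLIGHT_SWAP.get(control, control) for control in layout]


layout = ["fight_button", "flight_trigger", "wheel_disc"]
print(enter_flight_state(layout))
# ['fight_button', 'landing_control', 'state_operation_control']
```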
9. The method of claim 1, wherein, if the graphical interface is a game interface, the method further comprises:
determining whether the target control group satisfies a state switching condition;
if the target control group satisfies the state switching condition, switching the target control group, in the game interface, from a third state for executing a fighting function to the first state for selecting a non-fighting function.
10. The method of claim 9, wherein the state switching condition comprises:
an interval between a time at which a second instruction for the target control group was last received and a current time exceeds a set threshold; or
the second instruction for the target control group is an operation instruction for stopping fighting.
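The two state switching conditions of claim 10 — an idle timeout since the last second instruction, or an explicit stop-fighting instruction — can be checked together. The threshold value, clock representation, and function name below are illustrative assumptions.

```python
def should_switch_state(last_instruction_time, now,
                        threshold=5.0, stop_fighting=False):
    """True if the control group should leave the fighting (third) state."""
    # Either an explicit stop-fighting instruction was received, or the
    # interval since the last second instruction exceeds the set threshold.
    return stop_fighting or (now - last_instruction_time) > threshold


print(should_switch_state(0.0, 10.0))                       # True  (idle 10s > 5s)
print(should_switch_state(8.0, 10.0))                       # False (idle only 2s)
print(should_switch_state(10.0, 10.0, stop_fighting=True))  # True
```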
11. A control display apparatus, wherein a graphical interface is loaded on the apparatus, the apparatus comprising:
a display module, configured to display a target control group in a first state or a second state in the graphical interface;
a transceiver module, configured to receive a first instruction for the target control group; and
a processing module, configured to determine, in response to the first instruction, a target function that satisfies the first instruction from a plurality of functions configured for the target control group, and to trigger the display module to switch the target control group from the first state to the second state for realizing the target function, wherein the control layouts of the target control group in different states are different.
12. An electronic device, comprising: a memory and a processor;
the memory is configured to store one or more computer instructions;
the processor is configured to execute the one or more computer instructions to perform the steps of the method of any one of claims 1-10.
13. A computer-readable storage medium storing a computer program, wherein the computer program, when executed, performs the steps of the method of any one of claims 1-10.
CN202110442408.4A 2021-04-23 2021-04-23 Control display method, device, equipment and storage medium Pending CN113117345A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110442408.4A CN113117345A (en) 2021-04-23 2021-04-23 Control display method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113117345A true CN113117345A (en) 2021-07-16

Family

ID=76779436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110442408.4A Pending CN113117345A (en) 2021-04-23 2021-04-23 Control display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113117345A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102736906A (en) * 2011-04-14 2012-10-17 上海三旗通信科技股份有限公司 Method for implementing switching and management of multiple operating interfaces on personal terminal
CN110389701A (en) * 2018-04-23 2019-10-29 北京易轻软件有限公司 Multiple solutions user interaction approach and device applied to enterprise management informatization system
CN111714874A (en) * 2020-06-18 2020-09-29 网易(杭州)网络有限公司 Control state switching method and device and electronic equipment
CN111766989A (en) * 2020-07-02 2020-10-13 网易(杭州)网络有限公司 Interface switching method and device
US20200401298A1 (en) * 2019-06-21 2020-12-24 Beijing Xiaomi Mobile Software Co., Ltd. Always-on display applications and apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113975804A (en) * 2021-10-29 2022-01-28 腾讯科技(深圳)有限公司 Virtual control display method, device, equipment, storage medium and product
CN113975804B (en) * 2021-10-29 2024-04-19 腾讯科技(深圳)有限公司 Virtual control display method, device, equipment, storage medium and product
CN114579229A (en) * 2022-02-14 2022-06-03 众安科技(国际)集团有限公司 Information presentation method and device

Similar Documents

Publication Publication Date Title
US11833426B2 (en) Virtual object control method and related apparatus
CN108404408B (en) Information processing method, information processing apparatus, storage medium, and electronic device
JP7138804B2 (en) INTERFACE DISPLAY METHOD AND DEVICE, TERMINAL AND COMPUTER PROGRAM
US10792562B2 (en) Information processing method, terminal, and computer storage medium
US10552183B2 (en) Tailoring user interface presentations based on user state
US20210247968A1 (en) Method and apparatus for performing visual programming
CN113117345A (en) Control display method, device, equipment and storage medium
US11379104B2 (en) Sharing user interface customization across applications
US20130268876A1 (en) Method and apparatus for controlling menus in media device
US20230014732A1 (en) Application startup and archiving
CN112306351B (en) Virtual key position adjusting method, device, equipment and storage medium
CN112044067A (en) Interface display method, device, equipment and storage medium
US20230241499A1 (en) Position adjustment method and apparatus for operation control, terminal, and storage medium
CN107294835A (en) Document sending method and device in a kind of instant messaging
KR20220058841A (en) Method and apparatus, device, storage medium and program product for adjusting the position of a virtual button
US20230350554A1 (en) Position marking method, apparatus, and device in virtual scene, storage medium, and program product
CN113268182A (en) Application icon management method and electronic equipment
US20230256335A1 (en) Display method and apparatus for real-time battle information, terminal, and storage medium
CN112911052A (en) Information sharing method and device
CN112169319A (en) Application program starting method, device, equipment and storage medium
US20150293888A1 (en) Expandable Application Representation, Milestones, and Storylines
KR101806922B1 (en) Method and apparatus for producing a virtual reality content
CN112527175A (en) Animation implementation method, device, equipment and storage medium
CN115113773B (en) Information processing method, information processing device, computer readable storage medium, and electronic device
CN112221123B (en) Virtual object switching method and device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination