CN114995930A - Display control method and apparatus for a control, and electronic device - Google Patents

Display control method and apparatus for a control, and electronic device

Info

Publication number
CN114995930A
Authority
CN
China
Prior art keywords
control
touch
display
target
area
Prior art date
Legal status
Pending
Application number
CN202210405745.0A
Other languages
Chinese (zh)
Inventor
陈宗民
Current Assignee
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210405745.0A
Publication of CN114995930A

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886 - Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a display control method and apparatus for a control, and an electronic device. The method includes: in response to a preset operation, displaying operation indication information in a graphical user interface, the operation indication information being used to instruct the user to perform touch operations on a moving or deforming target object; in response to the touch operations on the target object, acquiring operation parameters of the touch operations; and determining display control parameters of a target control based on the operation parameters of the touch operations, and controlling display of the target control based on the display control parameters. With this method, the user does not need to adjust the control layout personally: after performing only simple touch operations, the user obtains a control layout that matches his or her touch operation habits, which gives greater operating comfort and improves the application experience.

Description

Display control method and apparatus for a control, and electronic device
Technical Field
The present invention relates to the technical field of interface interaction, and in particular to a display control method and apparatus for a control, and an electronic device.
Background
In games and other applications, a user issues instructions by operating controls, and the layout of the controls in the interface affects how comfortable that operation is. In the related art, some applications provide a default control layout to which the user adapts through repeated use, but this adaptation process easily produces a poor user experience and drives users away. Applications with higher operation demands may let the user customize the control layout, but many users find it hard to really understand their own operation habits, so even a customized layout may still fail to meet the user's requirements for operating comfort.
Disclosure of Invention
In view of this, the present invention provides a display control method and apparatus for a control, and an electronic device, so as to obtain a control layout that matches the user's touch operation habits, provide greater operating comfort, and improve the user's application experience.
In a first aspect, an embodiment of the present invention provides a display control method for a control, where a graphical user interface is provided by a terminal device. The method includes: in response to a preset operation, displaying operation indication information in the graphical user interface, the operation indication information being used to instruct the user to perform touch operations on a moving or deforming target object; in response to the touch operations on the target object, acquiring operation parameters of the touch operations; and determining display control parameters of a target control based on the operation parameters of the touch operations, and controlling display of the target control based on the display control parameters.
The step of displaying the operation indication information in the graphical user interface in response to the preset operation includes: in response to the preset operation, displaying a moving first object in the graphical user interface, the first object being used to instruct the user to perform a touch operation at the position of the first object; and displaying a second object whose area gradually grows in the graphical user interface, the second object being used to instruct the user to perform touch operations within the display area of the second object.
The step of displaying the second object whose area gradually grows in the graphical user interface includes: displaying the second object at a default display-area size at a specified position in the graphical user interface, where the specified position is a preset position or a touch position of a touch operation on the first object; and, in response to touch operations on the second object, controlling the display area of the second object to grow gradually according to the touch frequency of the touch operations on the second object.
The step of determining the display control parameters of the target control based on the operation parameters of the touch operations includes: determining a control display area in the graphical user interface based on the operation parameters of the touch operations; and determining the display control parameters of the target control from the control display area.
The step of determining a control display area in the graphical user interface based on the operation parameters of the touch operations includes: determining a maximum touch area in the graphical user interface from the touch positions and touch times, among the operation parameters, of the touch operations on the moving first object; establishing a normal-distribution relation between each position area within the maximum touch area and its touch frequency, based on the touch positions of the touch operations within the maximum touch area and the touch frequency of those positions; and determining control display areas from the maximum touch area based on the normal-distribution relation, where there are a plurality of control display areas and each control display area has a different touch frequency.
The control display area includes a plurality of control display areas, and the step of determining the display control parameters of the target control from the control display areas includes: determining a target display area from the control display areas based on a control attribute of the target control, the control attribute indicating the trigger frequency of the target control; obtaining a third object from the target objects based on the operation parameters of the touch operations, the third object being the object in the target display area with the highest touch frequency among touch operations on the same object; and determining the display position and display size of the target control based on the touch positions of the touch operations on the third object and the touch frequency of those positions.
The step of determining the display position and display size of the target control based on the touch positions of the touch operations on the third object and the touch frequency of those positions includes: obtaining, from the touch positions of the touch operations on the third object, a first touch position with the highest touch frequency and a second touch position farthest from the first touch position; and determining the display position and display size of the target control based on the first touch position and the second touch position.
After the step of determining the display position and display size of the target control, the method further includes: if there are a plurality of target controls, adjusting the display positions of the target controls based on a specified distance between adjacent target controls, to obtain the final display positions of the target controls.
An object control is preset in the graphical user interface and is displayed in a first area of the graphical user interface; the step of displaying the operation indication information in the graphical user interface in response to the preset operation includes: in response to a trigger operation on the object control, displaying the target object in a second area of the graphical user interface, where the second area is the area of the graphical user interface other than the first area.
In a second aspect, an embodiment of the present invention provides a display control apparatus for a control, where a graphical user interface is provided by a terminal device. The apparatus includes: an information display module, configured to display operation indication information in the graphical user interface in response to a preset operation, the operation indication information being used to instruct the user to perform touch operations on a moving or deforming target object; a parameter acquisition module, configured to acquire operation parameters of the touch operations in response to the touch operations on the target object; and a control display module, configured to determine display control parameters of a target control based on the operation parameters of the touch operations and control display of the target control based on the display control parameters.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a processor and a memory, where the memory stores machine executable instructions that can be executed by the processor, and the processor executes the machine executable instructions to implement the display control method for the control.
In a fourth aspect, embodiments of the present invention provide a machine-readable storage medium storing machine-executable instructions, which when invoked and executed by a processor, cause the processor to implement the display control method of the control.
The embodiment of the invention has the following beneficial effects:
In the display control method and apparatus for a control and the electronic device, operation indication information is displayed in the graphical user interface in response to a preset operation; the operation indication information instructs the user to perform touch operations on a moving or deforming target object; in response to the touch operations on the target object, operation parameters of the touch operations are acquired; and display control parameters of a target control are determined based on those operation parameters, and display of the target control is controlled based on the display control parameters. In this method, the moving or deforming target object guides the user to perform touch operations, from which the current user's operation parameters are obtained; the display control parameters of the target control are then determined from those parameters to produce the control layout. The user does not need to adjust the control layout personally: after performing only simple touch operations, the user obtains a control layout that matches his or her touch operation habits, which gives greater operating comfort and improves the application experience.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a display control method for a control according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an object control and a target object according to an embodiment of the present invention;
FIG. 3 is a diagram illustrating a first object according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a second object according to an embodiment of the present invention;
fig. 5 is an exemplary diagram of normal distribution corresponding to a normal distribution frequency density formula according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a display area of a plurality of controls according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a plurality of control display areas and control display positions according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a display control apparatus for a control according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
An application may adopt a default control layout that the user simply uses and gradually becomes accustomed to. Using a default layout has several disadvantages: the user must suppress intuitive operations through trial and error and correct conditioned reflexes through memorization and deliberate thought; the user must invest a large amount of time learning and adapting, so the early experience is poor; users learn at different rates, and those who struggle with the operations may eventually be lost; the user adapts to the application one-way, so the whole learning process is inefficient; and because users differ in hand structure, hand size, operation habits and so on, the preset layout always differs from a given user's ideal layout for objective reasons that cannot be eliminated by the user's subjective effort. For example, a user with a small palm can never become fully comfortable on a large screen no matter how much he or she practices.
For applications with higher operation demands, the user may be allowed to customize the control layout on top of the default layout. Customizing the control layout has the following disadvantages: the user can only work out a layout matching his or her intuitive operation habits through continuous trial and error; the user again has to invest a large amount of time exploring, so the early experience is poor; while learning and summarizing operation habits, it is hard to tell which pieces of experience are valid feedback and which controls need adjusting; even after valid feedback is obtained, it is hard to judge how to adjust a control based on it, so the adjusted size and layout of the control may still be uncomfortable; the user adapts to the game one-way at a large time cost, the early experience is poor, and the whole exploration process is inefficient; and a game with relatively complex functions has many controls, making the customization workload large.
To address this, the display control method and apparatus for a control and the electronic device provided by the embodiments of the present invention can be applied to all kinds of applications with interactive operation requirements, and are particularly suitable for applications with high operation demands, such as games.
In one embodiment of the present invention, the display control method for a control may run on a local touch terminal device or on a server. When the method runs on a server, it may be implemented and executed based on a cloud interaction system, which includes the server and a client device.
In an optional embodiment, various cloud applications, such as cloud games, may run on the cloud interaction system. Taking a cloud game as an example, a cloud game is a game mode based on cloud computing. In this mode the entity that runs the game program is separated from the entity that presents the game screen: the storage and execution of the display control method for a control are completed on the cloud game server, while the client device is used to send and receive data and to present the game screen. The client device may be, for example, a display device with a data transmission function close to the user, such as a mobile terminal, a television, a computer, or a handheld computer, while the cloud game server in the cloud performs the information processing. During play, the player operates the client device to send operation instructions to the cloud game server; the server runs the game according to those instructions, encodes and compresses the game screen and other data, and returns them to the client device over the network; the client device finally decodes the data and outputs the game screen.
In an optional implementation, taking a game as an example, the local touch terminal device stores the game program and presents the game screen. The local touch terminal device interacts with the player through a graphical user interface; that is, the game program is downloaded, installed, and run on an ordinary electronic device. The local touch terminal device may provide the graphical user interface to the player in various ways: for example, it may be rendered on the display screen of the terminal or provided to the player through holographic projection. For instance, the local touch terminal device may include a display screen for presenting the graphical user interface, including the game screen, and a processor for running the game, generating the graphical user interface, and controlling its display on the display screen.
In a possible implementation, an embodiment of the present invention provides a display control method for a control in which a graphical user interface is provided through a touch terminal device, where the touch terminal device may be the aforementioned local touch terminal device or the aforementioned client device in the cloud interaction system. Depending on the type of application that is launched, the graphical user interface may display interface content such as a game scene or a communication and interaction window.
To facilitate understanding of this embodiment, the display control method for a control disclosed in this embodiment is first described in detail. As shown in fig. 1, the method provides a graphical user interface through a terminal device, where the terminal device may be the local terminal device or the client device described above, and includes the following steps:
step S102, responding to preset operation, and displaying operation instruction information in a graphical user interface; wherein the operation instruction information is used for: instructing a user to perform touch operation on a moving or deforming target object;
the preset operation is used for triggering and displaying the operation indication information, and the preset operation may specifically be triggering a specified control. The operation instruction information may be in the form of characters, marks, images, animations, and the like. In the embodiment, the user is guided to perform the touch operation through the target object, and the display control parameter of the control is determined based on the operation parameter of the touch operation. For this purpose, the operation instruction information is used to prompt the user to perform a touch operation on the moving or deforming target object.
To make the interaction more engaging, the target object can be a virtual object such as a red envelope, a piece of fruit, a flying disc, or a mole (as in a whack-a-mole game). After the preset operation is triggered, a moving target object is displayed in the graphical user interface; its moving path may be random or preset, for example moving from the lower-left corner to the upper-right corner of the graphical user interface. A deforming target object is displayed in the graphical user interface by gradually changing its shape during display, for example its display area gradually growing, or its shape gradually changing from a square to a circle. The deformation of the target object may be preset or random.
In actual implementation, there may be multiple target objects so as to guide the user to perform touch operations multiple times, and the moving target object and the deforming target object may be displayed simultaneously, alternately, or one after another. The touch operation may specifically be a tap, a long press, or the like.
Step S104: in response to the touch operations on the target object, acquiring operation parameters of the touch operations.
It should be noted that steps S102 and S104 may be executed simultaneously or repeated in a loop. That is, target objects may keep appearing in the graphical user interface over a period of time; while a target object is shown, the user performs a touch operation on it, and the operation parameters of that touch operation are acquired.
The touch operation on the target object may specifically be a single tap or a series of consecutive taps on the target object. The operation parameters of the touch operation may include the trigger position and trigger time of the touch operation. A reference position may be preset as the origin of a coordinate system, and the position coordinates of each trigger position are determined in that coordinate system. Timing may start when each target object appears in the graphical user interface and stop when the object is triggered, giving the trigger time.
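As an informal illustration of how such operation parameters might be recorded, the following Python sketch stores, for each touched target object, the touch position relative to a chosen coordinate origin together with the reaction time (the interval between the object appearing and being touched). The class and field names are assumptions made for this sketch and do not appear in the patent.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TouchSample:
    x: float              # touch position relative to the chosen origin
    y: float
    reaction_time: float  # seconds between the object appearing and the touch

@dataclass
class TouchRecorder:
    origin: tuple = (0.0, 0.0)                   # e.g. lower-right corner for right-hand use
    samples: list = field(default_factory=list)

    def on_object_shown(self) -> None:
        # timing starts when a target object appears in the interface
        self._shown_at = time.monotonic()

    def on_touch(self, screen_x: float, screen_y: float) -> None:
        # convert the raw screen coordinate into the local coordinate system
        # and store it together with the reaction time
        ox, oy = self.origin
        self.samples.append(TouchSample(
            x=screen_x - ox,
            y=screen_y - oy,
            reaction_time=time.monotonic() - self._shown_at,
        ))
```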
Step S106: determining display control parameters of the target control based on the operation parameters of the touch operations, and controlling display of the target control based on the display control parameters.
The display control parameters of the target control may include the display position and display size of the target control, among others. A game program usually has many controls in the graphical user interface. In this case, the operable area of the graphical user interface may be divided into a plurality of interaction areas according to the operation parameters of the touch operations, for example a high-frequency interaction area, a low-frequency interaction area, and a maximum interaction area. A frequently used control, such as an attack control, is placed in the high-frequency interaction area; a less frequently used control, such as a prone control, is placed in the low-frequency interaction area; and a rarely used control, such as a backpack control, is placed in the maximum interaction area. After the interaction area of the target control is determined, the specific display position and display size of the target control are determined based on the operation parameters of the touch operations.
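A minimal sketch of the assignment described above, mapping a control's usage-frequency attribute to an interaction area; the attribute and area names are illustrative assumptions, not terms defined by the patent.

```python
# Hypothetical usage-frequency attributes and interaction-area names.
CONTROL_ATTRIBUTES = {
    "attack":   "high_frequency",   # triggered very often
    "prone":    "low_frequency",    # triggered occasionally
    "backpack": "rarely_used",      # triggered very rarely
}

AREA_FOR_ATTRIBUTE = {
    "high_frequency": "high_frequency_interaction_area",
    "low_frequency":  "low_frequency_interaction_area",
    "rarely_used":    "maximum_interaction_area",
}

def target_area(control_name: str) -> str:
    """Return the interaction area a control should be placed in."""
    return AREA_FOR_ATTRIBUTE[CONTROL_ATTRIBUTES[control_name]]

print(target_area("attack"))    # high_frequency_interaction_area
print(target_area("backpack"))  # maximum_interaction_area
```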
In this display control method for a control, operation indication information is displayed in the graphical user interface in response to a preset operation; the operation indication information instructs the user to perform touch operations on a moving or deforming target object; in response to the touch operations on the target object, operation parameters of the touch operations are acquired; and display control parameters of the target control are determined based on those operation parameters, and display of the target control is controlled based on the display control parameters. In this method, the moving or deforming target object guides the user to perform touch operations, from which the current user's operation parameters are obtained; the display control parameters of the target control are then determined from those parameters to produce the control layout. The user does not need to adjust the control layout personally: after performing only simple touch operations, the user obtains a control layout that matches his or her touch operation habits, which gives greater operating comfort and improves the application experience.
For ease of understanding, a specific manner of displaying the operation indication information is provided below.
Fig. 2 shows an example. An object control is preset in the graphical user interface and is displayed in a first area of the graphical user interface. In response to a trigger operation on the object control, the target object is displayed in a second area of the graphical user interface, where the second area is the area of the graphical user interface other than the first area.
In actual implementation, the graphical user interface may be roughly divided into a plurality of regions, such as the first area and the second area described above. For a landscape rectangular touch screen, the first area may be the left area, which suits operation with the user's left hand, and the second area may be the right area, which suits operation with the user's right hand. In this embodiment the object control is placed in the left area, which serves as the first area; when the object control is triggered, the target object is displayed in the right area, i.e. the second area.
The purpose of this arrangement is to lay out controls in only part of the graphical user interface, for example only in its right area. To prevent accidental operations by the user's left hand from affecting the final layout result, the user's left hand is occupied by the object control, which improves the accuracy of the control layout.
It can be understood that when the control layout needs to be performed for the first area, the object control is displayed in the second area and the target object is displayed in the first area. For example, the object control is displayed in the right area of the interface to occupy the user's right hand and ensure the accuracy of the control layout in the left area.
In one implementation, the operation indication information in the foregoing embodiment may be the target object itself. Specifically, in response to a preset operation, a moving first object is displayed in the graphical user interface, the first object being used to instruct the user to perform a touch operation at the position of the first object; and a second object whose area gradually grows is displayed in the graphical user interface, the second object being used to instruct the user to perform touch operations within the display area of the second object.
The moving path of the first object may be determined randomly or preset; fig. 3 illustrates an example in which the moving path of the first object is a curved path from the lower left towards the upper right. The broken line in fig. 3 indicates the path, and the first objects drawn with broken lines are positions the first object has already passed. While the first object moves along the path, the user performs a touch operation on it, for example tapping it. If the first object is touched while moving, it may disappear at the touch position, or it may continue moving along the path until it leaves the interface. After the current first object disappears from the interface, another first object may appear from any position in the interface and again start moving randomly.
Generally, once the first object moves outside the area the user can operate, the user stops performing touch operations on it. The randomly moving target object therefore guides the user to perform touch operations at the corresponding positions in the interface, from which the maximum touch range the current user can operate is obtained. Based on this, in a specific implementation, wide-ranging moving paths may be preset, for example from the lower-left area to the upper-right area of the interface and from the upper-left area to the lower-right area, to guide the user to perform touch operations farther away and thus obtain the maximum touch range the current user can operate, as sketched below.
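A minimal sketch of such a wide-ranging probe path, assuming a simple linear sweep from the lower-left to the upper-right of the operable area; the actual path in the embodiment may be curved or random.

```python
def probe_path(width: float, height: float, steps: int = 50):
    """Waypoints for a first object that sweeps from the lower-left corner of
    the operable area towards the upper-right corner, leading the user to
    touch positions across the whole reachable range."""
    return [(width * i / (steps - 1), height * i / (steps - 1)) for i in range(steps)]

# e.g. a 1920x1080 interface: the object passes through 50 evenly spaced points
waypoints = probe_path(1920, 1080)
```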
As for the display of the second object: the second object is displayed at a specified position in the graphical user interface at a default display-area size, where the specified position is a preset position or a touch position of a touch operation on the first object; and, in response to touch operations on the second object, the display area of the second object is controlled to grow gradually according to the touch frequency of the touch operations on the second object.
The preset position may be a commonly used display position of some control, or a touch position the user used when performing touch operations on the first object. Note that a touch position used for the first object can be understood as a position that is relatively convenient for the user to touch. The touch operations on the first object may cover a plurality of touch positions; in actual implementation, some of these touch positions may be selected as the specified position according to the attributes of the control, how frequently each position was touched, and so on.
Fig. 4 shows an example of the second object. In actual implementation, the higher the touch frequency of the second object, the more convenient its position is for the user to operate; in that case the display area of the second object grows faster, guiding the user to keep performing touch operations so that the area most convenient for the user to operate is identified, at the display position of the second object, as quickly as possible.
The second object is displayed at the specified position mainly to determine the display position of the control more precisely. While the second object is being touched, its display area gradually grows. A touch whose position falls within the display area of the second object can be regarded as a touch operation on the second object, so as the display area grows, the touchable range of the second object expands, and the touch positions may gradually shift with the user's operation habits; that is, the user tends to trigger the second object at a more comfortable and efficient position. Therefore, the second object with a gradually growing display area makes it possible to accurately find the touch positions that are more comfortable and efficient for the user.
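The growth behaviour of the second object could be sketched as follows; the linear growth rule is an assumption made for this illustration, since the embodiment only requires that the display area grow with the touch frequency.

```python
class GrowingTarget:
    """Second object whose display area grows as it is touched more often.

    The linear growth rule (radius proportional to 1 + growth * touch_count)
    is an illustrative assumption; the embodiment only states that the area
    grows faster when the touch frequency is higher.
    """

    def __init__(self, x: float, y: float, base_radius: float = 20.0, growth: float = 0.15):
        self.x, self.y = x, y            # the "specified position"
        self.base_radius = base_radius   # default display-area size
        self.growth = growth
        self.touch_count = 0

    @property
    def radius(self) -> float:
        return self.base_radius * (1.0 + self.growth * self.touch_count)

    def try_touch(self, tx: float, ty: float) -> bool:
        # a touch counts only if it lands inside the current display area
        if (tx - self.x) ** 2 + (ty - self.y) ** 2 <= self.radius ** 2:
            self.touch_count += 1
            return True
        return False
```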
While the user performs touch operations on the first object and the second object, the operation parameters of each touch operation are obtained; these may include the touch position and touch time of the touch operation. A control display area is determined in the graphical user interface based on the operation parameters of the touch operations, and the display control parameters of the target control are determined from the control display area. Specifically, the operation parameters may be aggregated to obtain the touch frequency, touch speed, and so on of each touch position. If the target control is also used frequently, a touch position with a high touch frequency can be used as its display position. The display control parameters of the target control may include its display position, display size, shape, and other parameters.
In a specific implementation, a maximum touch area is determined in the graphical user interface from the touch positions and touch times, among the operation parameters, of the touch operations on the moving first object; a normal-distribution relation between each position area within the maximum touch area and its touch frequency is established based on the touch positions of the touch operations within the maximum touch area and the touch frequency of those positions; and control display areas are determined from the maximum touch area based on the normal-distribution relation, where there are a plurality of control display areas and each has a different touch frequency.
The first object can move randomly across the graphical user interface in all directions, guiding the user to perform touch operations at all positions of the interface. In general, once a position is beyond the area the user's finger can operate, the user no longer performs touch operations there. Since different users have different hand shapes and operation habits, their operable areas differ, and therefore the user's maximum touch area can be obtained from the touch positions on the first object.
Suppose the coordinates of the touch position of the $i$-th touch operation are denoted $P_i(x_i, y_i)$, and $P_0(x_0, y_0)$ is the origin of coordinates, which may specifically be taken at the position of the user's palm; for the user's right hand, for example, the lower-right corner of the interface may be used as the origin. The touch distance of the $i$-th touch operation from the origin is

$r_i = \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2}$

and the average touch speed is

$\bar{v} = \frac{1}{n} \sum_{i=1}^{n} \frac{r_i}{t_i}$

where $n$ is the total number of touch operations and $t_i$ is the difference between the time at which the target object of the $i$-th touch operation appears in the graphical user interface and the trigger time of the $i$-th touch operation, i.e. the reaction time. The slowest reaction time $t_{max}$ and the maximum touch distance $r_{max}$ are obtained from the operation parameters of the touch operations, and the maximum touch area is obtained from them (the exact expression is given only as an image in the source text). The maximum touch area may also be called the maximum touch range; $r_{t_{max}}$ denotes the touch distance corresponding to the slowest reaction time, and $t_{r_{max}}$ denotes the reaction time corresponding to the maximum touch distance.
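The recoverable quantities above can be computed as in the following sketch; the function name and the sample format are assumptions, and the way the patent combines $r_{max}$ and $t_{max}$ into an area is not reproduced here because its formula is only available as an image.

```python
import math

def touch_statistics(samples, origin=(0.0, 0.0)):
    """From recorded (x, y, reaction_time) samples, compute the per-touch
    distance from the origin, the average touch speed, the slowest reaction
    time and the largest touch distance used to bound the maximum touch area."""
    ox, oy = origin
    distances = [math.hypot(x - ox, y - oy) for x, y, _ in samples]
    times = [t for _, _, t in samples]
    avg_speed = sum(r / t for r, t in zip(distances, times)) / len(samples)
    return {
        "r_max": max(distances),   # maximum touch distance
        "t_max": max(times),       # slowest reaction time
        "avg_speed": avg_speed,
    }

stats = touch_statistics([(120, 80, 0.4), (300, 220, 0.6), (90, 40, 0.3)])
```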
Further, within the maximum touch area, the touch frequency of each touch position is counted, and a normal-distribution frequency density is established:

$f(x) = \frac{1}{\sigma\sqrt{2\pi}} \, e^{-\frac{(x-\mu)^2}{2\sigma^2}}$

with

$\mu = \int_{\Omega} x \, dP = \sum_{i} p_i x_i, \qquad \sigma^2 = \sum_{i} p_i (x_i - \mu)^2$

where $n$ is the total number of touch operations, $\mu$ is the mean of the normal distribution, $x_i$ is the touch position of the $i$-th touch operation, and $p_i$ is the touch frequency of the touch position $x_i$.
For ease of understanding, fig. 5 illustrates the normal distribution corresponding to the above frequency density formula, showing the normal-distribution relation between each position area within the maximum touch area and its touch frequency. In one approach, the maximum touch area may be divided into three control display areas: the region between μ - σ and μ + σ serves as the high-frequency interaction area; the regions between μ - 2σ and μ - σ and between μ + σ and μ + 2σ serve as the low-frequency interaction area; and the regions between μ - 3σ and μ - 2σ and between μ + 2σ and μ + 3σ serve as the farthest interaction area.
Fig. 6 illustrates an example of multiple control display areas: the high-frequency interaction area can be used to display controls that are interacted with frequently; the low-frequency interaction area can display controls that are interacted with less frequently; and the maximum interaction area displays controls that are interacted with very rarely, i.e. controls that are not commonly used. Note that the maximum touch area may be divided into more control display areas, for example four or five, and is not limited to the three control display areas described above.
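A possible reading of this partition, sketched below, fits a one-dimensional normal distribution to the touch samples (here reduced to their distance from the origin) and returns the one-, two- and three-sigma bands; treating the distance as the single variable is an assumption made for this sketch.

```python
import math

def partition_touch_area(touch_values):
    """Fit a normal distribution to one-dimensional touch values (e.g. the
    distance of each touch from the origin) and partition the maximum touch
    area into the one-, two- and three-sigma bands corresponding to the
    high-frequency, low-frequency and farthest interaction areas."""
    n = len(touch_values)
    mu = sum(touch_values) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in touch_values) / n)
    return {
        "high_frequency": (mu - sigma, mu + sigma),
        "low_frequency": [(mu - 2 * sigma, mu - sigma), (mu + sigma, mu + 2 * sigma)],
        "farthest": [(mu - 3 * sigma, mu - 2 * sigma), (mu + 2 * sigma, mu + 3 * sigma)],
    }

bands = partition_touch_area([110, 130, 125, 150, 170, 160, 140, 135])
```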
In a game scene, a large number of controls need to be displayed in the graphical user interface; if they were all displayed in the high-frequency interaction area, they would both obscure the screen and be inconvenient to operate. In this case, the target controls need to be classified according to their control attributes so that they are displayed in different control display areas. Specifically, a target display area is determined from the control display areas based on the control attribute of the target control, the control attribute indicating the trigger frequency of the target control; a third object is obtained from the target objects based on the operation parameters of the touch operations, the third object being the object in the target display area with the highest touch frequency among touch operations on the same object; and the display position and display size of the target control are determined based on the touch positions of the touch operations on the third object and the touch frequency of those positions.
The control attribute of the target control can be determined from its function. For example, the attack control triggers attack operations, so its control attribute is a high-frequency operation control and its target display area can be the high-frequency interaction area mentioned above; the prone control triggers the prone operation, so its attribute is a low-frequency operation control and its target display area can be the low-frequency interaction area; and the backpack control triggers opening the backpack, so its control attribute is a rarely used operation control and its target display area can be the maximum interaction area.
Since several controls may be displayed in the same target display area, the display position of the target control needs to be further determined within the target display area. Among the touch operations on the first object and the second object, there are a plurality of target objects located in the target display area, and the one with the highest touch frequency is selected from them as the third object. It can be understood that if a user triggers a target object frequently, the position of that target object is presumably convenient for the user to operate, i.e. the user is more comfortable when triggering it. Therefore, the display position and display size of the target control can be further determined from the third object.
In a specific implementation, from the touch positions of the touch operations on the third object, a first touch position with the highest touch frequency and a second touch position farthest from the first touch position are obtained, and the display position and display size of the target control are determined based on the first touch position and the second touch position.
The coordinate point of the first touch position is $P_h(x_h, y_h)$ and that of the second touch position is $P_{max}(x_{max}, y_{max})$. A spline curve is calculated from the first touch position and the second touch position (its full expression is given only as an image in the source text); it is built from basis functions $b_{i,n}$, where $n$ denotes the spline degree and, for $n = 0$,

$b_{i,0}(x) = \begin{cases} 1, & x_i \le x < x_{i+1} \\ 0, & \text{otherwise} \end{cases}$

i.e. $b$ equals 1 between the interpolation points $x_i$ and $x_{i+1}$ and zero elsewhere.
In the foregoing manner, an outer contour of the touch range is obtained within the target display area by interpolating between the first touch position and the second touch position; this outer contour can be understood as the boundary of the hot area corresponding to the target control. The hot area here can be understood as a further refinement of the target display area, i.e. the region of the target display area that the player operates most often. In actual implementation, the hot area may be taken as the display position of the target control.
When determining the display size of the target control, taking a circular control as an example, the display size is its radius $r_i$ (the expression for $r_i$ is given only as an image in the source text). From the control radius $r_i$, the control's display area is then $S_i = \pi r_i^2$.
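Since the spline and radius formulas are only available as images in the source text, the following sketch substitutes a simplified rule: the control is centred on the most frequently touched position and its radius reaches the farthest touch position, with the area then following $S_i = \pi r_i^2$. This is an illustrative stand-in, not the patent's exact computation.

```python
import math

def control_geometry(first_touch, second_touch):
    """Derive a circular control centred on the most frequently touched
    position (first_touch), with a radius reaching the touch position
    farthest from it (second_touch); the area follows S_i = pi * r_i^2."""
    (xh, yh), (xm, ym) = first_touch, second_touch
    radius = math.hypot(xm - xh, ym - yh)
    return {"center": (xh, yh), "radius": radius, "area": math.pi * radius ** 2}

geom = control_geometry((150, 120), (180, 160))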
Further, if there are a plurality of target controls and their display positions are close to one another, additional condition parameters need to be preset for ease of operation, for example a minimum distance between adjacent controls; the obtained display positions of the target controls are then adjusted based on this minimum distance to obtain their final display positions. Specifically, if there are a plurality of target controls, their display positions are adjusted based on a specified distance between adjacent target controls to obtain the final display positions. The adjusted distance between adjacent target controls is usually no smaller than the specified distance, so that the user does not trigger several controls by mistake.
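One way to realize this spacing constraint is a simple iterative relaxation that pushes overlapping circular controls apart until every pair is at least the specified distance apart; the procedure below is an assumption made for illustration, since the patent only states the constraint, not the adjustment algorithm.

```python
import math

def enforce_min_spacing(centers, radii, min_gap, iterations=50):
    """Iteratively push apart circular controls whose edge-to-edge distance
    is below min_gap, moving both controls along the line joining their
    centres, until all pairs satisfy the spacing constraint."""
    centers = [list(c) for c in centers]
    for _ in range(iterations):
        moved = False
        for i in range(len(centers)):
            for j in range(i + 1, len(centers)):
                dx = centers[j][0] - centers[i][0]
                dy = centers[j][1] - centers[i][1]
                dist = math.hypot(dx, dy)
                if dist < 1e-9:
                    dx, dist = 1.0, 1.0   # coincident centres: pick a direction
                needed = radii[i] + radii[j] + min_gap
                if dist < needed:
                    push = (needed - dist) / 2.0   # move each control half the deficit
                    ux, uy = dx / dist, dy / dist
                    centers[i][0] -= ux * push; centers[i][1] -= uy * push
                    centers[j][0] += ux * push; centers[j][1] += uy * push
                    moved = True
        if not moved:
            break
    return [tuple(c) for c in centers]

adjusted = enforce_min_spacing([(100, 100), (110, 105)], [30, 25], min_gap=10)
```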
Fig. 7 shows an example: a plurality of target display areas can be obtained in the above manner for the left area and the right area of the graphical user interface. For example, a high-frequency interaction area 1, a low-frequency interaction area 1, and a maximum interaction area 1 are calculated in the left area, and a high-frequency interaction area 2, a low-frequency interaction area 2, and a maximum interaction area 2 are calculated in the right area.
In actual implementation, the interaction described above can be carried out separately for the left area and the right area to obtain the target display areas in each. A certain safety margin can be kept between the target display areas of the left and right areas so that they do not overlap; this prevents controls belonging to the left area from being displayed in the right area, or controls belonging to the right area from being displayed in the left area, and makes the control layout more reasonable.
Assuming four target controls, namely control A, control B, control C, and control E, need to be displayed in the high-frequency interaction area 2 of the right area, the display position and display size of each target control can be obtained in the manner described above, and their final display positions are obtained based on the specified distance between adjacent controls. Similarly, in the high-frequency interaction area 1 on the left side, the display position and display size of the joystick control can be obtained in the same way.
This embodiment thus provides a quick, efficient, and painless interaction that is based on the player's actual experience and that indirectly provides adjustment guidance or directly performs the adjustment, so that the interface layout better matches each player's intuition and operation habits and brings a smoother, more natural, more realistic, and more efficient experience.
Through the interaction described in this embodiment: the memory burden of adapting to the interface is reduced, and the interface quickly adapts to the player's intuitive operations, reducing trial and error; the adaptation changes from one-way adaptation by the player to two-way adaptation between the player and the game; the learning effort is lowered, the influence of the player's learning ability on getting started is reduced, the time to get started is shortened, and player churn is reduced; compared with traditional adjustment methods, the layout adjusted by this embodiment is closer to the player's ideal layout; the player does not need to recognize his or her own operation habits, and a uniform interface-customization efficiency is guaranteed; with a unified, automatic customization procedure, different players no longer differ in their ability to adjust the layout; through quantitative analysis of the operations, inconvenient or erroneous operations can be judged as valid feedback; controls can be adjusted according to valid experience feedback, ensuring a good fit after adjustment; this way of adjusting the layout is fast and has a short cycle, eliminating the poor experience of the adjustment process; the adaptation process requires essentially no manual adjustment by the player and no customization workload; and whereas the usual optimization method tweaks an existing layout according to hot-zone test results, the scheme of this embodiment directly uses the hot-zone results in reverse to generate the preset layout.
Corresponding to the above method embodiment, referring to the schematic structural diagram of the display control apparatus for a control shown in fig. 8, a graphical user interface is provided through a terminal device, and the apparatus includes:
an information display module 80, configured to display operation indication information in the graphical user interface in response to a preset operation, the operation indication information being used to instruct the user to perform touch operations on a moving or deforming target object;
a parameter acquisition module 82, configured to acquire operation parameters of the touch operations in response to the touch operations on the target object; and
a control display module 84, configured to determine display control parameters of a target control based on the operation parameters of the touch operations and to control display of the target control based on the display control parameters.
The display control apparatus for a control displays operation indication information in the graphical user interface in response to a preset operation; the operation indication information instructs the user to perform touch operations on a moving or deforming target object; in response to the touch operations on the target object, operation parameters of the touch operations are acquired; and display control parameters of the target control are determined based on those operation parameters, and display of the target control is controlled based on the display control parameters. In this apparatus, the moving or deforming target object guides the user to perform touch operations, from which the current user's operation parameters are obtained; the display control parameters of the target control are then determined from those parameters to produce the control layout. The user does not need to adjust the control layout personally: after performing only simple touch operations, the user obtains a control layout that matches his or her touch operation habits, which gives greater operating comfort and improves the application experience.
The information display module is further configured to: in response to the preset operation, display a moving first object in the graphical user interface, the first object being used to instruct the user to perform a touch operation at the position of the first object; and display a second object whose area gradually grows in the graphical user interface, the second object being used to instruct the user to perform touch operations within the display area of the second object.
The information display module is further configured to: display the second object at a default display-area size at a specified position in the graphical user interface, where the specified position is a preset position or a touch position of a touch operation on the first object; and, in response to touch operations on the second object, control the display area of the second object to grow gradually according to the touch frequency of the touch operations on the second object.
The control display module is further configured to: determine a control display area in the graphical user interface based on the operation parameters of the touch operations; and determine the display control parameters of the target control from the control display area.
The control display module is further configured to: determine a maximum touch area in the graphical user interface from the touch positions and touch times, among the operation parameters, of the touch operations on the moving first object; establish a normal-distribution relation between each position area within the maximum touch area and its touch frequency, based on the touch positions of the touch operations within the maximum touch area and the touch frequency of those positions; and determine control display areas from the maximum touch area based on the normal-distribution relation, where there are a plurality of control display areas and each has a different touch frequency.
The control display area includes a plurality of control display areas, and the control display module is further configured to: determine a target display area from the control display areas based on a control attribute of the target control, the control attribute indicating the trigger frequency of the target control; obtain a third object from the target objects based on the operation parameters of the touch operations, the third object being the object in the target display area with the highest touch frequency among touch operations on the same object; and determine the display position and display size of the target control based on the touch positions of the touch operations on the third object and the touch frequency of those positions.
The control display module is further configured to: acquire, from the touch positions of the touch operations for the third object, a first touch position with the highest touch frequency and a second touch position farthest from the first touch position; and determine the display position and display size of the target control based on the first touch position and the second touch position.
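A direct reading of this step is sketched below: the first touch position (the most frequently touched one) anchors the control, and the distance to the farthest touch position bounds its size. Treating that distance itself as the display size is an assumption made for illustration.

```python
import math
from collections import Counter
from typing import List, Tuple

Point = Tuple[float, float]

def control_position_and_size(third_object_touches: List[Point]) -> Tuple[Point, float]:
    """Return (display_position, display_size) for the target control.

    display_position: the first touch position, i.e. the position touched most often;
    display_size: here taken as the distance to the second (farthest) touch position.
    """
    counts = Counter(third_object_touches)
    first_pos, _ = counts.most_common(1)[0]
    second_pos = max(third_object_touches, key=lambda p: math.dist(p, first_pos))
    size = math.dist(first_pos, second_pos)
    return first_pos, size
```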
The above apparatus further comprises a position adjustment module configured to: if there are a plurality of target controls, adjust the display positions of the target controls based on a specified distance between adjacent target controls, to obtain the final display positions of the target controls.
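The embodiment does not prescribe a particular adjustment algorithm; the sketch below uses a simple pairwise relaxation that nudges controls apart until every pair respects the specified distance, purely as an illustration.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def enforce_spacing(positions: List[Point], min_distance: float,
                    iterations: int = 20) -> List[Point]:
    """Nudge target controls apart until each pair is at least min_distance
    apart, or the iteration budget is exhausted."""
    pts = [list(p) for p in positions]
    for _ in range(iterations):
        moved = False
        for i in range(len(pts)):
            for j in range(i + 1, len(pts)):
                dx = pts[j][0] - pts[i][0]
                dy = pts[j][1] - pts[i][1]
                dist = math.hypot(dx, dy) or 1e-6  # guard against coincident points
                if dist < min_distance:
                    # Push both controls away from each other by half the overlap.
                    push = (min_distance - dist) / 2.0
                    ux, uy = dx / dist, dy / dist
                    pts[i][0] -= ux * push; pts[i][1] -= uy * push
                    pts[j][0] += ux * push; pts[j][1] += uy * push
                    moved = True
        if not moved:
            break
    return [(x, y) for x, y in pts]
```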
An object control is preset in the graphical user interface and displayed in a first area of the graphical user interface; the information display module is further configured to: in response to a trigger operation on the object control, display the target object in a second area of the graphical user interface, where the second area is an area of the graphical user interface other than the first area.
The display control method and device for a control shorten and simplify the traditional process of optimizing and adjusting the interface control layout: the original one-way adaptation is turned into a two-way adaptation, the game directly generates an interface layout adapted to the player's intuitive operation, and the amount of repeated trial and error required of the player is greatly reduced. The whole process is carried out inside the game and can be completed invisibly when combined with the novice guidance, which to some extent removes the negative experience of adjusting the interface layout. The adjusted layout better matches the player's operating habits, the player does not need to learn the positions and sizes of the controls, the memory burden of adapting to the game interface is reduced, a new game can be picked up more quickly, and user stickiness is improved.
The embodiment also provides an electronic device, which comprises a processor and a memory, wherein the memory stores machine executable instructions capable of being executed by the processor, and the processor executes the machine executable instructions to realize the display control method of the control. The electronic device may be a server or a terminal device.
Referring to fig. 9, the electronic device includes a processor 100 and a memory 101, where the memory 101 stores machine executable instructions capable of being executed by the processor 100, and the processor 100 executes the machine executable instructions to implement the display control method of the control.
Further, the electronic device shown in fig. 9 further includes a bus 102 and a communication interface 103, and the processor 100, the communication interface 103, and the memory 101 are connected through the bus 102.
The memory 101 may include a high-speed random access memory (RAM) and may also include a non-volatile memory, such as at least one disk memory. The communication connection between this system network element and at least one other network element is realized through at least one communication interface 103 (which may be wired or wireless), using the Internet, a wide area network, a local area network, a metropolitan area network, or the like. The bus 102 may be an ISA bus, a PCI bus, an EISA bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one double-headed arrow is shown in FIG. 9, but this does not mean that there is only one bus or one type of bus.
The processor 100 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be completed by integrated logic circuits of hardware in the processor 100 or by instructions in the form of software. The processor 100 may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), and the like; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The methods, steps, and logic blocks disclosed in the embodiments of the present invention may be implemented or executed by such a processor. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be directly embodied as being executed by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as a RAM, a flash memory, a ROM, a PROM or EPROM, or a register. The storage medium is located in the memory 101, and the processor 100 reads the information in the memory 101 and, in combination with its hardware, performs the steps of the method of the foregoing embodiment, such as:
responding to preset operation, and displaying operation indication information in a graphical user interface; wherein the operation instruction information is used for: instructing a user to perform touch operation on a moving or deforming target object; responding to the touch operation aiming at the target object, and acquiring operation parameters of the touch operation; and determining display control parameters of the target control based on the operation parameters of the touch operation, and controlling the target control to display based on the display control parameters.
The method guides the user to perform touch operations through the moving or deforming target object, so that the operation parameters of the current user are obtained and the display control parameters of the target control are determined from them, thereby generating the control layout. The user does not need to adjust the control layout manually: after a few simple touch operations, a control layout that matches the user's touch operation habits is obtained, which makes operation more comfortable and improves the user's application experience.
The step of displaying the operation instruction information in the graphical user interface in response to the preset operation includes: responding to a preset operation, and displaying the moving first object in the graphical user interface; the first object is used for indicating a user to execute touch operation at the position of the first object; displaying a second object with a gradually-enlarged area in the graphical user interface; the second object is used for indicating a user to execute touch operation in a display area of the second object.
The step of displaying the second object with the gradually enlarged area in the graphical user interface includes: displaying a second object of a default display area size at a specified location in the graphical user interface; wherein the specified location includes: a preset position or a touch position of the touch operation for the first object; and responding to the touch operation for the second object, and controlling the display area of the second object to be gradually enlarged according to the touch frequency of the touch operation for the second object.
In this way, the user is guided to continue performing touch operations, so that an area convenient for the user to operate can be determined at the display position of the second object as quickly as possible.
The step of determining the display control parameter of the target control based on the operation parameter of the touch operation includes: determining a control display area in the graphical user interface based on the operation parameters of the touch operation; and determining display control parameters of the target control from the control display area.
The step of determining a control display area in the graphical user interface based on the operation parameters of the touch operation includes: determining a maximum touch area in the graphical user interface according to the touch position and the touch time, in the operation parameters of the touch operation, of the touch operation for the moving first object; establishing a normal distribution relation between each position area in the maximum touch area and the touch frequency, based on the touch position of the touch operation and the touch frequency of the touch position in the maximum touch area; determining a control display area from the maximum touch area based on the normal distribution relation; the control display area comprises a plurality of control display areas; the touch frequency of each control display area is different.
The control display area comprises a plurality of control display areas; the step of determining the display control parameters of the target control from the control display area includes: determining a target display area from the control display area based on the control attribute of the target control; the control attribute is used for indicating the trigger frequency of the target control; acquiring a third object from the target object based on the operation parameters of the touch operation; the third object is an object with the highest touch frequency aiming at the touch operation of the same object in the target display area; and determining the display position and the display size of the target control based on the touch position of the touch operation aiming at the third object and the touch frequency of the touch position.
The step of determining the display position and the display size of the target control based on the touch position of the touch operation for the third object and the touch frequency of the touch position includes: acquiring a first touch position with highest touch frequency and a second touch position which is farthest from the first touch position in touch positions of touch operation aiming at a third object; and determining the display position and the display size of the target control based on the first touch position and the second touch position.
After the step of determining the display position and the display size of the target control based on the touch position of the touch operation on the third object and the touch frequency of the touch position, the method further includes: if the target controls comprise a plurality of target controls, the display positions of the target controls are adjusted based on the specified distance between the adjacent target controls, and the final display positions of the target controls are obtained.
Presetting an object control in the graphical user interface, wherein the object control is displayed in a first area in the graphical user interface; responding to a preset operation, and displaying operation indication information in a graphical user interface, wherein the step comprises the following steps: responding to the trigger operation aiming at the object control, and displaying the target object in a second area in the graphical user interface; wherein the second region is: an area of the graphical user interface other than the first area.
In the method, the object control constrains the user to operate with one hand, which can improve the accuracy of the control layout and prevent mis-operations made with both hands from affecting the final layout result.
The present embodiments also provide a machine-readable storage medium storing machine-executable instructions that, when invoked and executed by a processor, cause the processor to implement the above display control method for a control, for example:
responding to preset operation, and displaying operation indication information in a graphical user interface; wherein the operation instruction information is used for: instructing a user to perform touch operation on a moving or deforming target object; responding to the touch operation aiming at the target object, and acquiring operation parameters of the touch operation; and determining display control parameters of the target control based on the operation parameters of the touch operation, and controlling the target control to display based on the display control parameters.
The method guides the user to perform touch operations through the moving or deforming target object, so that the operation parameters of the current user are obtained and the display control parameters of the target control are determined from them, thereby generating the control layout. The user does not need to adjust the control layout manually: after a few simple touch operations, a control layout that matches the user's touch operation habits is obtained, which makes operation more comfortable and improves the user's application experience.
The step of displaying the operation instruction information in the graphical user interface in response to the preset operation includes: responding to a preset operation, and displaying the moving first object in a graphical user interface; the first object is used for indicating a user to execute touch operation at the position of the first object; displaying a second object with a gradually-enlarged area in the graphical user interface; the second object is used for indicating a user to execute touch operation in a display area of the second object.
The step of displaying the second object with the gradually enlarged area in the graphical user interface includes: displaying a second object of a default display area size at a specified location in the graphical user interface; wherein, the designated position includes: a preset position or a touch position for touch operation of the first object; and responding to the touch operation aiming at the second object, and controlling the display area of the second object to be gradually increased according to the touch frequency of the touch operation aiming at the second object.
In this way, the user is guided to continue performing touch operations, so that an area convenient for the user to operate can be determined at the display position of the second object as quickly as possible.
The step of determining the display control parameter of the target control based on the operation parameter of the touch operation includes: determining a control display area in the graphical user interface based on the operation parameters of the touch operation; and determining display control parameters of the target control from the control display area.
The step of determining a control display area in the graphical user interface based on the operation parameters of the touch operation includes: determining a maximum touch area in the graphical user interface aiming at the touch position and the touch time of the moving first object in the operation parameters of the touch operation; based on the touch position of touch operation and the touch frequency of the touch position in the maximum touch area, establishing a normal distribution relation between each position area in the maximum touch area and the touch frequency; determining a control display area from the maximum touch area based on the normal distribution relation; the control display area comprises a plurality of control display areas; the touch frequency of each control display area is different.
The control display area comprises a plurality of control display areas; the step of determining the display control parameters of the target control from the control display area includes: determining a target display area from the control display area based on the control attribute of the target control; the control attribute is used for indicating the trigger frequency of the target control; acquiring a third object from the target object based on the operation parameters of the touch operation; the third object is an object with the highest touch frequency aiming at the touch operation of the same object in the target display area; and determining the display position and the display size of the target control based on the touch position of the touch operation aiming at the third object and the touch frequency of the touch position.
The step of determining the display position and the display size of the target control based on the touch position of the touch operation for the third object and the touch frequency of the touch position includes: acquiring a first touch position with highest touch frequency and a second touch position which is farthest from the first touch position in touch positions of touch operation aiming at a third object; and determining the display position and the display size of the target control based on the first touch position and the second touch position.
After the step of determining the display position and the display size of the target control based on the touch position of the touch operation for the third object and the touch frequency of the touch position, the method further includes: if the target controls comprise a plurality of target controls, the display positions of the target controls are adjusted based on the specified distance between the adjacent target controls, and the final display positions of the target controls are obtained.
An object control is preset in the graphical user interface and is displayed in a first area of the graphical user interface; responding to a preset operation, and displaying operation indication information in a graphical user interface, wherein the step comprises the following steps: responding to the trigger operation aiming at the object control, and displaying the target object in a second area in the graphical user interface; wherein the second region is: an area of the graphical user interface other than the first area.
In the method, the object control constrains the user to operate with one hand, which can improve the accuracy of the control layout and prevent mis-operations made with both hands from affecting the final layout result.
The computer program product of the display control method and apparatus for a control, the electronic device, and the storage medium provided by the embodiments of the present invention includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to execute the method described in the foregoing method embodiments, and for specific implementation reference may be made to the method embodiments, which are not repeated here.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In addition, in the description of the embodiments of the present invention, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly, for example, as fixedly connected, detachably connected, or integrally connected; as mechanically or electrically connected; or as directly connected, indirectly connected through an intermediate medium, or internally communicating between two elements. The specific meanings of the above terms in the present invention can be understood by those skilled in the art according to the specific circumstances.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
Finally, it should be noted that the foregoing embodiments are merely specific implementations used to illustrate, not limit, the technical solutions of the present invention, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that any person familiar with the technical field can still modify the technical solutions described in the foregoing embodiments, or easily conceive of changes, or make equivalent substitutions for some of the technical features, within the technical scope disclosed by the present invention; such modifications, changes, or substitutions do not cause the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention and shall all be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A display control method of a control is characterized in that a graphical user interface is provided through terminal equipment; the method comprises the following steps:
responding to preset operation, and displaying operation indication information in the graphical user interface; wherein the operation indication information is used for: instructing a user to perform touch operation on a moving or deforming target object;
responding to the touch operation aiming at the target object, and acquiring operation parameters of the touch operation;
and determining display control parameters of a target control based on the operation parameters of the touch operation, and controlling the target control to display based on the display control parameters.
2. The method of claim 1, wherein the step of displaying operation indication information in the graphical user interface in response to a preset operation comprises:
responding to a preset operation, and displaying a moving first object in the graphical user interface; the first object is used for indicating a user to perform touch operation at the position of the first object;
displaying a second object whose area becomes gradually larger in the graphical user interface; the second object is used for indicating a user to execute touch operation in a display area of the second object.
3. The method of claim 2, wherein the step of displaying the second object with the gradually increasing area in the graphical user interface comprises:
displaying a second object of a default display area size at a specified location in the graphical user interface; wherein the designated location comprises: a preset position or a touch position for touch operation of the first object;
and responding to the touch operation aiming at the second object, and controlling the display area of the second object to be gradually enlarged according to the touch frequency of the touch operation aiming at the second object.
4. The method of claim 1, wherein the step of determining display control parameters of a target control based on the operation parameters of the touch operation comprises:
determining a control display area in the graphical user interface based on the operation parameters of the touch operation;
and determining display control parameters of the target control from the control display area.
5. The method of claim 4, wherein the step of determining a control display area in the graphical user interface based on the operating parameters of the touch operation comprises:
determining a maximum touch area in the graphical user interface aiming at the touch position and the touch time of the moving first object in the operation parameters of the touch operation;
establishing a normal distribution relation between each position area in the maximum touch area and the touch frequency based on the touch position of the touch operation and the touch frequency of the touch position in the maximum touch area;
determining a control display area from the maximum touch area based on the normal distribution relation; wherein the control display area comprises a plurality of control display areas; and the touch frequency of each control display area is different.
6. The method of claim 4, wherein the control display area comprises a plurality of control display areas; the step of determining the display control parameters of the target control from the control display area comprises:
determining a target display area from the control display area based on the control attribute of the target control; wherein the control attribute is used for indicating the trigger frequency of the target control;
acquiring a third object from a target object based on the operation parameters of the touch operation; the third object is an object with the highest touch frequency of touch operation aiming at the same object in the target display area;
determining a display position and a display size of the target control based on a touch position of the touch operation for the third object and a touch frequency of the touch position.
7. The method of claim 6, wherein the step of determining the display position and the display size of the target control based on the touch position of the touch operation on the third object and the touch frequency of the touch position comprises:
acquiring a first touch position with highest touch frequency and a second touch position which is farthest from the first touch position in the touch positions of the touch operation aiming at the third object;
and determining the display position and the display size of the target control based on the first touch position and the second touch position.
8. The method according to claim 6, wherein after the step of determining the display position and the display size of the target control based on the touch position of the touch operation on the third object and the touch frequency of the touch position, the method further comprises:
if the target controls comprise a plurality of target controls, the display positions of the target controls are adjusted based on the specified distance between the adjacent target controls, and the final display positions of the target controls are obtained.
9. The method according to claim 1, wherein an object control is preset in the graphical user interface, and the object control is displayed in a first area in the graphical user interface; the step of displaying operation instruction information in the graphical user interface in response to a preset operation includes:
displaying the target object in a second area in the graphical user interface in response to a triggering operation for the object control; wherein the second region is: a region of the graphical user interface other than the first region.
10. A display control device of a control is characterized in that a graphical user interface is provided through terminal equipment; the device comprises:
the information display module is used for responding to preset operation and displaying operation indication information in the graphical user interface; wherein the operation indication information is used for: instructing a user to perform touch operation on a moving or deforming target object;
the parameter acquisition module is used for responding to the touch operation aiming at the target object and acquiring the operation parameters of the touch operation;
and the control display module is used for determining the display control parameters of the target control based on the operation parameters of the touch operation and controlling the target control to display based on the display control parameters.
11. An electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor, the processor executing the machine executable instructions to implement the display control method of the control of any one of claims 1-9.
12. A machine-readable storage medium having stored thereon machine-executable instructions which, when invoked and executed by a processor, cause the processor to implement the display control method of the control of any of claims 1-9.
CN202210405745.0A 2022-04-18 2022-04-18 Control display control method and device and electronic equipment Pending CN114995930A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210405745.0A CN114995930A (en) 2022-04-18 2022-04-18 Control display control method and device and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210405745.0A CN114995930A (en) 2022-04-18 2022-04-18 Control display control method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN114995930A true CN114995930A (en) 2022-09-02

Family

ID=83023478

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210405745.0A Pending CN114995930A (en) 2022-04-18 2022-04-18 Control display control method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN114995930A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115220851A (en) * 2022-09-09 2022-10-21 荣耀终端有限公司 Operation guide method, electronic device and readable storage medium

Similar Documents

Publication Publication Date Title
CN107648847B (en) Information processing method and device, storage medium and electronic equipment
CN107617213B (en) Information processing method and device, storage medium, electronic equipment
US20220161136A1 (en) Information Processing Method and Apparatus, Mobile Terminal, and Storage Medium
US11684858B2 (en) Supplemental casting control with direction and magnitude
CN107433036B (en) Method and device for selecting objects in game
TWI616802B (en) Touch display device, touch display method and unmanned aerial vehicle
US8469810B2 (en) Storage medium having game program stored thereon and game apparatus
JP7447299B2 (en) Adaptive display method and device for virtual scenes, electronic equipment, and computer program
US11042730B2 (en) Method, apparatus and device for determining an object, and storage medium for the same
CN111701226A (en) Control method, device and equipment for control in graphical user interface and storage medium
US20240058696A1 (en) Program, game control method, and information processing apparatus
CN114995930A (en) Control display control method and device and electronic equipment
CN112807692A (en) Information control method and device in game and terminal equipment
US20170090744A1 (en) Virtual reality headset device with front touch screen
CN115920395A (en) Interactive control method and device in game and electronic equipment
CN105302310B (en) A kind of gesture identifying device, system and method
CN112802162B (en) Face adjusting method and device for virtual character, electronic equipment and storage medium
WO2023246310A1 (en) Object selecting method and apparatus, and electronic device
WO2023246172A9 (en) Display control method and apparatus for skill indicator, and electronic device
CN113332712B (en) Game scene picture moving method and device and electronic equipment
CN110665216A (en) Method and device for controlling aiming direction in game, electronic equipment and storage medium
EP3373305A1 (en) Training device usable for rehabilitation and computer program for training device usable for rehabilitation
JP6446149B1 (en) Program, processing apparatus, and processing method
CN113426099B (en) Display control method and device in game
CN115957503A (en) Interactive control method and device for object movement and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination