CN115543064A - Interface display control method, interface display control device and storage medium - Google Patents

Interface display control method, interface display control device and storage medium

Info

Publication number
CN115543064A
CN115543064A (application CN202110721483.4A)
Authority
CN
China
Prior art keywords
interface display
interface
user
terminal
executed
Prior art date
Legal status
Pending
Application number
CN202110721483.4A
Other languages
Chinese (zh)
Inventor
邹佳亮
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202110721483.4A priority Critical patent/CN115543064A/en
Publication of CN115543064A publication Critical patent/CN115543064A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to an output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements, or combined input and output arrangements, for interaction between user and computer
    • G06F 3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/04883 — GUI interaction using a touch screen or digitiser for inputting data by handwriting, e.g. gestures or text
    • G06F 3/0481 — GUI interaction based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. desktop elements like windows or icons, or a cursor's changing behaviour or appearance
    • G06F 3/0482 — Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 — GUI techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, or setting a parameter value
    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04M — TELEPHONIC COMMUNICATION
    • H04M 1/72454 — User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to context-related or environment-related conditions

Abstract

The present disclosure relates to an interface display control method, an interface display control apparatus, and a storage medium. The method is applied to a terminal: when a hover gesture operation, in which the user's finger approaches the touch screen, is detected, the operation logic corresponding to that hover gesture is determined and executed on a target area of the user interaction interface currently displayed by the terminal. The operation logic comprises an interface display operation and/or information about an interface display operation to be executed. Because the operation logic corresponding to the hover gesture is presented in the target area of the user interaction interface, the intended operation is carried out without the user changing the holding posture, which improves the experience of operating the terminal with one hand.

Description

Interface display control method, interface display control device and storage medium
Technical Field
The present disclosure relates to the field of terminal touch control, and in particular to an interface display control method, an interface display control apparatus, and a storage medium.
Background
As terminal touch screens grow ever larger, it becomes increasingly difficult for users to operate the terminal with one hand in some scenarios. For example, when a user operates the terminal with one hand while holding a child, the holding posture has to be changed continuously to perform touch operations. This makes the operating experience poor and reduces the user's comfort when using the terminal.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides an interface display control method, an interface display control apparatus, and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided an interface display control method applied to a terminal, the interface display control method including:
in response to detecting a hover gesture operation in which the user's finger approaches the touch screen, determining the operation logic corresponding to the hover gesture operation, where the operation logic comprises an interface display operation and/or information about an interface display operation to be executed; and executing the operation logic on a target area of the user interaction interface currently displayed by the terminal.
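The two-step flow of the first aspect — determine the operation logic for a detected hover gesture, then execute it on the target area — can be sketched as follows. This is an illustrative sketch only; all names (`GESTURE_LOGIC`, `handle_hover_gesture`, the gesture and logic strings) are hypothetical and not part of the patent.

```python
# Hypothetical mapping from a recognized hover gesture to its operation logic,
# as entered in advance by the user (see the enrollment discussion later).
GESTURE_LOGIC = {
    "double_tap": "show_return_icon",
    "circle": "call_out_hidden_options",
}

def handle_hover_gesture(gesture, execute):
    """Determine the operation logic for a detected hover gesture and
    execute it on the target area of the currently displayed interface.

    `execute` stands in for the terminal's display machinery."""
    logic = GESTURE_LOGIC.get(gesture)
    if logic is None:
        return None      # unrecognized gesture: nothing to display
    execute(logic)       # run the logic on the target area
    return logic
```

A caller would pass a display callback; an unrecognized gesture simply leaves the interface unchanged.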
In one embodiment, executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes:
in response to the interface display operation including calling out a function option hidden in the user interaction interface, mapping the hidden function option to a virtual interface display operation icon and displaying that icon in the target area of the user interaction interface currently displayed by the terminal; and/or
in response to the interface display operation to be executed including a hover gesture whose dwell time over the currently displayed user interaction interface exceeds a set dwell time threshold, marking on the currently displayed user interaction interface a set area range that includes the position where the hover gesture is located, where the marked area range moves along with the hover gesture; and/or
in response to the interface display operation to be executed including a next interface display operation that is directly triggered, displaying, in the target area of the currently displayed user interaction interface and before the next interface display operation is executed, information about that next interface display operation in advance; and/or
in response to the interface display operation to be executed including a function option to be operated, displaying, in the target area of the currently displayed user interaction interface, the interface display operation information corresponding to that function option, where the function option to be operated includes a function option displayed in an area whose distance from the position where the hover gesture is detected exceeds a distance threshold.
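The four "and/or" branches above can be read as a dispatch over operation kinds. The sketch below is a hedged illustration of that reading; the branch names, dictionary keys, and return payloads are assumptions introduced here, not terminology from the patent.

```python
def execute_on_target_area(op):
    """Dispatch one operation-logic record to the matching display branch.

    `op` is a hypothetical dict such as {"kind": ..., ...}."""
    kind = op["kind"]
    if kind == "call_hidden_option":
        # Branch 1: map the hidden function option to a virtual icon
        # shown in the target area.
        return {"show_icon_for": op["option"], "area": "target"}
    if kind == "dwell_mark":
        # Branch 2: dwell time exceeded the threshold, so mark a region
        # around the finger that follows the hover gesture.
        return {"mark_region_at": op["finger_pos"], "follows_finger": True}
    if kind == "preview_next":
        # Branch 3: show information about the next display operation
        # before it is actually executed.
        return {"preview": op["next_op"], "area": "target"}
    if kind == "remote_option":
        # Branch 4: bring a far-away function option's information into
        # the target area near the finger.
        return {"show_info_for": op["option"], "area": "target"}
    raise ValueError(f"unknown operation logic: {kind}")
```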
In one embodiment, the target area includes the position area where a hidden function option is located, and/or a set area range that includes the position where the hover gesture operation is detected.
In one embodiment, detecting the hover gesture operation of the user's finger approaching the touch screen includes:
determining that the hover gesture operation is detected in response to detecting the self-capacitance signal generated when the user's finger approaches the touch screen of the terminal.
In one embodiment, the interface display control method further includes:
monitoring for a trigger signal for the operation logic on the user interaction interface; and canceling the operation logic executed on the user interaction interface if no trigger signal for that operation logic is detected within a preset time interval.
According to a second aspect of the embodiments of the present disclosure, there is provided an interface display control apparatus applied to a terminal, the interface display control apparatus including:
a determining unit, configured to determine, in response to detecting a hover gesture operation in which the user's finger approaches the touch screen, the operation logic corresponding to the hover gesture operation, where the operation logic comprises an interface display operation and/or information about an interface display operation to be executed; and a display unit, configured to execute the operation logic on a target area of the user interaction interface currently displayed by the terminal.
In one embodiment, the display unit is configured to:
in response to the interface display operation including calling out a function option hidden in the user interaction interface, mapping the hidden function option to a virtual interface display operation icon and displaying that icon in the target area of the user interaction interface currently displayed by the terminal; and/or
in response to the interface display operation to be executed including a hover gesture whose dwell time over the currently displayed user interaction interface exceeds a set dwell time threshold, marking on the currently displayed user interaction interface a set area range that includes the position where the hover gesture is located, where the marked area range moves along with the hover gesture; and/or
in response to the interface display operation to be executed including a next interface display operation that is directly triggered, displaying, in the target area of the currently displayed user interaction interface and before the next interface display operation is executed, information about that next interface display operation in advance; and/or
in response to the interface display operation to be executed including a function option to be operated, displaying, in the target area of the currently displayed user interaction interface, the interface display operation information corresponding to that function option, where the function option to be operated includes a function option displayed in an area whose distance from the position where the hover gesture is detected exceeds a distance threshold.
In an embodiment, the target area includes the position area where a hidden function option is located, and/or a set area range that includes the position where the hover gesture operation is detected.
In one embodiment, the determining unit is configured to:
determining that the hover gesture operation is detected in response to detecting the self-capacitance signal generated when the user's finger approaches the touch screen of the terminal.
In one embodiment, the interface display control apparatus further includes:
a canceling unit, configured to monitor for a trigger signal for the operation logic on the user interaction interface, and to cancel the operation logic executed on the user interaction interface if no trigger signal for that operation logic is detected within a preset time interval.
According to a third aspect of the embodiments of the present disclosure, there is provided an interface display control apparatus including:
a processor; and a memory for storing processor-executable instructions, where the processor is configured to perform the interface display control method described in the first aspect or any implementation of the first aspect.
According to a fourth aspect of the embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium, wherein instructions of the storage medium, when executed by a processor of a mobile terminal, enable the mobile terminal to perform the interface display control method described in the first aspect or any one of the implementation manners of the first aspect.
The technical solutions provided by the embodiments of the present disclosure may have the following beneficial effects: when the terminal touch screen detects a hover gesture operation of the user's finger, the operation logic corresponding to the hover gesture is determined and executed on a target area of the user interaction interface currently displayed by the terminal. The operation logic comprises an interface display operation and/or information about an interface display operation to be executed. This makes one-handed operation of the terminal convenient and improves the user's experience of operating the terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram illustrating a one-handed operation of a smartphone according to an exemplary embodiment.
Fig. 2 is a flowchart illustrating an interface display control method according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating an interface display control method according to an example embodiment.
FIG. 4 is a diagram illustrating a hover gesture operating to hover a circle in accordance with an exemplary embodiment.
FIG. 5 is a diagram illustrating hidden functionality options brought up into a reading interface, according to an example embodiment.
Fig. 6 is a flowchart illustrating a method of controlling interface display according to an exemplary embodiment.
FIG. 7 is a schematic diagram illustrating a positioning display in accordance with an exemplary embodiment.
Fig. 8 is a flowchart illustrating an interface display control method according to an example embodiment.
FIG. 9 is a schematic diagram illustrating the display of a next page in accordance with an exemplary embodiment.
Fig. 10 is a flowchart illustrating an interface display control method according to an example embodiment.
FIG. 11 is a diagram illustrating a return function option, according to an exemplary embodiment.
Fig. 12 is a block diagram illustrating an interface display control apparatus according to an exemplary embodiment.
FIG. 13 is a block diagram illustrating an apparatus for interface display control in accordance with an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. The following description refers to the accompanying drawings in which the same numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
With the development of science and technology and the deepening of intelligence, touch-based smart terminals have become more and more popular. At the same time, the functions a terminal supports are increasingly driven by user requirements; the touch screen serves as the interactive interface between the user and the terminal, and to improve the user's interactive and visual experience, terminal touch screens have trended toward larger sizes. Taking the smartphone as an example, smartphone touch screens have grown from 2.8 inches to 10 inches and may grow larger still. The convenience brought by the large screen comes at a cost: operating the smartphone often requires the cooperation of both hands.
Further, the terminal referred to in the embodiments of the present disclosure may also be called terminal equipment, user equipment, a mobile station (MS), a mobile terminal (MT), and the like; it is a device that provides voice and/or data connectivity to a user, for example a handheld device or vehicle-mounted device with a wireless connection function. Currently, some examples of terminals are: a smart phone, a Pocket PC (PPC), a palmtop computer, a personal digital assistant (PDA), a notebook computer, a tablet computer, a wearable device, or a vehicle-mounted device. It should be understood that the embodiments of the present disclosure do not limit the specific technologies or device forms adopted by the terminal.
Further, the touch screen in the embodiments of the present disclosure may also be referred to as a touch panel or a touch display screen; it is a display device capable of receiving input signals such as touches. Touch screens can be classified into infrared touch screens, surface acoustic wave touch screens, resistive touch screens, capacitive touch screens, and the like.
In the embodiments of the present disclosure, a smartphone is taken as an example to explain the inconvenience caused by an enlarged touch screen.
Fig. 1 is a schematic diagram illustrating a one-handed operation of a smartphone according to an exemplary embodiment. The smart phone operated in the mode of fig. 1 is very common in real life, but because the size of the screen of the touch screen of the smart phone is large, in the display interfaces of most application programs, the button is located above the screen, such as a return button and an information function option, which brings great challenges to a user to operate the smart phone with one hand in some scenes (when the user sits on a bus or a subway without a seat or holds things with one hand). The user has to change the posture of holding the smartphone with one hand or to use two-handed operation. Such interaction is very unfriendly to the user, reducing the experience of the user in operating the smartphone.
In view of this, an embodiment of the present disclosure provides an interface display control method applied to a terminal: when the terminal touch screen detects a hover gesture operation of the user's finger, the operation logic corresponding to the hover gesture is determined and executed on a target area of the user interaction interface currently displayed by the terminal. The operation logic comprises an interface display operation and/or information about an interface display operation to be executed. By combining the correspondence between the user's hover gestures and operation logic, and comparing the hover-signal features of each gesture, the terminal can display the operation logic in advance or at a convenient position, so the user can trigger the intended operation without changing the holding posture or moving the finger over a large range. This improves operability during one-handed use and makes one-handed operation of the terminal convenient.
Fig. 2 is a flowchart illustrating an interface display control method according to an example embodiment. As shown in fig. 2, the interface display control method includes the following steps.
In step S11, if a hover gesture operation in which a finger of the user approaches the touch screen is detected, an operation logic corresponding to the hover gesture operation is determined.
In the embodiments of the present disclosure, the operation logic includes an interface display operation and/or information about an interface display operation to be executed. The interface display operation includes calling out a function option hidden in the user interaction interface. The interface display operation to be executed includes a hover gesture whose dwell time over the currently displayed user interaction interface exceeds a set dwell time threshold, a next interface display operation that is directly triggered, and/or a function option to be operated; the interface display operation information to be executed includes information corresponding to the operation to be executed. In other words, the interface display operation to be executed covers positioning display, displaying the next operation in advance, and/or displaying function options located near the top of the screen that the user cannot conveniently reach with one hand.
In the embodiments of the present disclosure, the hover gesture operation is identified using the physical signal characteristics of the touch screen. That is, when the user's finger approaches the terminal touch screen and the touch screen detects a hovering self-capacitance signal, the hover gesture operation is determined to be detected. When the finger approaches the touch screen to within a set distance threshold, the touch sensor senses the approach through the self-capacitance signal and captures the self-capacitance signal produced while the finger performs the gesture. The detected hover gesture is then determined from that signal. The distance threshold may be chosen between 10 mm and 20 mm.
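The detection rule above reduces to a threshold check on the finger distance inferred from the self-capacitance signal. The following sketch assumes a hypothetical helper has already converted the sensor reading into a distance in millimetres; the threshold value is picked from the 10–20 mm range stated in the text.

```python
# Assumed threshold, taken from the 10-20 mm range given in the description.
HOVER_THRESHOLD_MM = 15

def is_hover_detected(finger_distance_mm):
    """Return True when the self-capacitance reading indicates a hovering
    finger: above the screen (distance > 0) but within the threshold.

    A distance of 0 is treated as a touch, not a hover."""
    return 0 < finger_distance_mm <= HOVER_THRESHOLD_MM
```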
Further, when the terminal detects a hovering self-capacitance signal, the operation logic the user wants, corresponding to that signal, is displayed in advance on the user interaction interface according to the association entered by the user beforehand. This improves the experience of operating the terminal with one hand and the comfort of using a large-screen terminal.
It should be noted that, in the embodiments of the present disclosure, the correspondence between hover gesture operations and operation logic is predefined: when a hover gesture is detected, its operation logic is displayed in advance on the user interaction interface. This is entirely different in principle from controlling a terminal through mid-air gestures in the related art. Taking mid-air screen capture as an example, the terminal collects the mid-air gesture through an infrared sensor to complete the screen capture. The embodiments of the present disclosure instead use the physical signal characteristics of the touch screen and require no infrared sensor to capture the hover gesture. In addition, the related art uses touch signals to control hardware, such as double-tapping the touch screen to turn the terminal screen on or off, or tapping to control an external ambient light. Double-tapping to switch the screen controls the terminal's hardware (the screen) through a touch signal (the double tap), but the accuracy of such hardware control is poor, which hurts the user's actual experience. To control hardware more reliably, the related art also employs a proximity sensor (P-sensor).
However, regardless of how the hardware is controlled, the principle differs from the interface display control method provided by the embodiments of the present disclosure, which uses the hover gesture operation to display the corresponding operation logic in advance on the user interface (UI). This differs both from capturing gestures with an infrared sensor and from controlling terminal hardware through touch signals.
In step S12, the operation logic is executed on the target area of the user interaction interface currently displayed by the terminal.
In the embodiments of the present disclosure, the target area of the user interaction interface refers to an area convenient for one-handed operation. This includes the area under the finger, or the bottom 2/3 of the user interaction interface. Displaying the desired operation logic in advance in the target area of the currently displayed user interaction interface makes one-handed operation convenient and improves the user's experience of using the terminal. The desired operation logic may be icons of function options on the user interaction interface, or operations such as a magnifier or a color change. In this way, touch operation becomes more intelligent, the user interacts with the terminal more smoothly, and the terminal becomes more enjoyable to use.
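As a sketch of the "bottom 2/3" reading of the target area, the helpers below compute that region and test whether a point falls inside it. The coordinate convention (origin at the top-left, y increasing downward) is an assumption, albeit a common one for screen coordinates; the function names are illustrative.

```python
def bottom_two_thirds(screen_w, screen_h):
    """Rectangle (x0, y0, x1, y1) covering the bottom 2/3 of the screen,
    with the origin at the top-left and y increasing downward."""
    y0 = screen_h / 3  # top edge of the bottom-2/3 region
    return (0, y0, screen_w, screen_h)

def in_target_area(x, y, screen_w, screen_h):
    """True when (x, y) lies inside the one-hand-reachable target area."""
    x0, y0, x1, y1 = bottom_two_thirds(screen_w, screen_h)
    return x0 <= x <= x1 and y0 <= y <= y1
```

On a 1080×1200-pixel interface, for example, points with y below 400 lie outside the reachable region.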
In the embodiments of the present disclosure, hover gesture operations need to be entered in advance, and a correspondence is established between each entered hover gesture and its operation logic. In actual operation, the self-capacitance signal features of the detected hover gesture are compared with the previously entered signal features to determine which operation logic to display in advance on the user interaction interface. The operation logic corresponding to each hover gesture can be set according to the user's personalized needs, which makes operating the terminal more engaging and fits the trend toward personalization and customization.
In the embodiments of the present disclosure, the terminal provides a hover gesture function; when the user enables it, hover gestures can be entered following prompts. Prompts may be voice and/or text, with content such as "please start the hover gesture operation", "please repeat the hover gesture operation", and "hover gesture entry is complete". Entered hover gestures include: a hovering double tap over the touch screen, hovering left-right swipes, a hovering circle, a hovering check mark, and the like. During entry, the user can be prompted to repeat the same hover gesture a set number of times; repeating the gesture allows its signal features to be recorded completely and reduces misrecognition during later use. For example, the "double tap" hover gesture is repeated 3 times continuously during entry; the time interval between the first and second tap is recorded for each repetition, and the intervals across all entries form a time-interval range. During recognition and signal comparison, that range serves as the basis for determining that the user performed a double tap.
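The interval-range idea above can be sketched directly: collect the inter-tap interval from each enrollment repetition, form a min-max range, and later accept a candidate double tap only if its interval falls within it. The tolerance parameter is an assumption added here; the patent only says a range is formed from the recorded intervals.

```python
def build_interval_range(intervals_s, tolerance_s=0.05):
    """Form the accepted inter-tap interval range from the intervals
    recorded across repeated enrollment of the double-tap gesture.
    `tolerance_s` (hypothetical) widens the range slightly."""
    return (min(intervals_s) - tolerance_s, max(intervals_s) + tolerance_s)

def matches_double_tap(interval_s, interval_range):
    """True when a candidate inter-tap interval falls within the
    enrolled range, i.e. the gesture is recognized as a double tap."""
    lo, hi = interval_range
    return lo <= interval_s <= hi
```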
In the embodiment of the present disclosure, default hover gesture operation names are provided by the hover gesture operation function, for example, hover circle drawing, hover check-mark drawing, and the like. The user may also customize the name of a hover gesture operation and enter the corresponding gesture. For example, for a custom hover gesture operation named "skip jump", the entered gesture may consist of the thumb hover-tapping once at a point A and then once at a point B.
For the various hover gesture operations entered in the above examples, the user may customize the operation logic implemented by each one. For example, the double-tap hover gesture operation may correspond to displaying a return icon on the user interaction interface. That is, when the return key of the current application program sits in the upper left corner of the user interaction interface and one-handed operation is inconvenient, a double tap hovering over the touch screen indicates that the user wants to perform the return operation; the return icon is then displayed below the finger, so that the user can tap it to trigger the return.
In the embodiment of the present disclosure, the same hover gesture operation may be specified to implement different functions in different application programs. For example, for the same double-tap hover gesture operation, the corresponding operation logic in a reading program calls out a hidden settings option, while in other programs it may return to the previous level or exit the program. The same hover gesture operation may also be specified to implement the same function across different application programs; for example, the operation logic corresponding to the check-mark hover gesture may be to map the icon displayed in the upper right corner of the user interaction interface to below the finger. In this way, the operation logic corresponding to each hover gesture operation in each commonly used application program (for example, WeChat or a reading interface) can be defined according to the user's requirements.
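A per-application lookup of this kind can be sketched as a small registry. The application names, gesture names, and action strings below are hypothetical placeholders, not identifiers from the patent.

```python
# Hypothetical registry: the same hover gesture may map to different
# operation logic in different applications.
GESTURE_ACTIONS = {
    ("reader",  "double_tap"): "show_hidden_settings",
    ("browser", "double_tap"): "go_back",
    ("reader",  "check_mark"): "show_top_right_icon",
}

def resolve_action(app, gesture, default="go_back"):
    """Look up the operation logic for a gesture in the current app,
    falling back to a shared default for unregistered apps."""
    return GESTURE_ACTIONS.get((app, gesture), default)
```

With this layout, the same `double_tap` gesture resolves to the hidden settings in the reading app but to a back navigation elsewhere, matching the behavior the paragraph describes.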
It should be noted that the terminal performs mirroring on the entered hover gesture operation, ensuring that after entering a gesture with one hand only, the user can still trigger the operation logic when performing the same gesture with the other hand. In addition, the gestures in the embodiments of the present disclosure are hover gestures; in other words, the finger does not contact the screen of the touch screen while the gesture is performed.
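One plausible reading of the mirroring step is a reflection of the enrolled trajectory about the vertical screen axis; the sketch below assumes the gesture is stored as a list of screen coordinates, which the patent does not specify.

```python
def mirror_gesture(points, screen_width):
    """Mirror an enrolled hover trajectory about the vertical screen axis,
    so a gesture enrolled with one hand also matches the other hand."""
    return [(screen_width - x, y) for (x, y) in points]
```

At recognition time, a detected trajectory could then be compared against both the enrolled points and their mirrored counterpart.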
Fig. 3 is a flowchart illustrating an interface display control method according to an exemplary embodiment. As shown in fig. 3, executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes the following steps.
In step S21, if the interface display operation is to invoke a function option hidden in the user interaction interface, the function option hidden in the user interaction interface is mapped to a virtual interface display operation icon.
When the detected hover gesture operation is an interface display operation that includes invoking a function option hidden in the user interaction interface, the hidden function option is mapped to a virtual interface display operation icon, and the icon is displayed in the target area of the user interaction interface currently displayed by the terminal. In some application programs, certain buttons are hidden from display, and the user must call out the hidden function options with a specified operation. Taking a reading interface as an example, in order to simulate a real reading scene, many buttons are hidden; the user needs to first tap a fixed position to call out the button's UI icon, and then move the finger onto the corresponding icon to tap it. The finger movement in this process is cumbersome and greatly degrades the user experience.
Suppose the user enters the hover circle-drawing gesture in advance and sets the operation logic corresponding to it as invoking the function options hidden in the user interaction interface. When the terminal detects that the user's finger draws a circle while hovering, the hidden function options in the user interaction interface are mapped to virtual interface display operation icons, and those icons are displayed in the target area of the user interaction interface.
In step S22, an interface display operation icon corresponding to the hidden function option is displayed in the target area of the user interaction interface currently displayed by the terminal.
In the embodiment of the present disclosure, the target area may be the position area where the hidden function option is located, or may be a set area range containing the position area where the hover gesture operation is detected, that is, an area below the hovering finger that is convenient for finger operation. For example, when the user interaction interface of the terminal is a reading interface, the double-tap hover gesture operation can call out the hidden reading settings interface, and the settings button can be displayed semi-transparently below the finger, so that the user completes a reading-settings operation without moving the finger to the bottom of the terminal. For another example, in the reading interface, hover circle drawing can be defined to offer operation options such as marking or magnifying the circled position. That is, when the hover circling gesture is detected, the mapped virtual key window pops up in an area convenient for one-handed operation.
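Placing the mapped icon "below the hovering finger" amounts to centering a rectangle on the detected hover position while keeping it on screen. The following sketch assumes pixel coordinates and an arbitrary icon size; none of these values come from the patent.

```python
def target_area(finger_pos, screen_size, icon_size=(120, 120)):
    """Compute an icon rectangle centered on the detected hover position,
    clamped so the mapped virtual icon stays fully on screen."""
    (fx, fy), (w, h) = finger_pos, screen_size
    iw, ih = icon_size
    x = min(max(fx - iw // 2, 0), w - iw)  # clamp horizontally
    y = min(max(fy - ih // 2, 0), h - ih)  # clamp vertically
    return (x, y, iw, ih)
```

A hover near a screen edge still yields a fully visible rectangle, which is what lets the popped-up virtual key window remain reachable with one hand.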
Illustratively, FIG. 4 is a schematic diagram illustrating the hover circle-drawing gesture operation according to an exemplary embodiment. Referring to fig. 4, the user holds the terminal with one hand and draws a circle with the thumb hovering. FIG. 5 is a diagram illustrating hidden function options called out in a reading interface, according to an exemplary embodiment. Referring to fig. 5, after the hover circle of fig. 4 is performed, the hidden settings corresponding to it are displayed at the bottom of the user interaction interface according to the preset correspondence; the hidden settings include font adjustment, background color adjustment, and the like.
Fig. 6 is a flowchart illustrating an interface display control method according to an exemplary embodiment. As shown in fig. 6, executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes the following steps.
In step S31, an interface display operation to be performed is determined.
In step S32, if the interface display operation to be executed is a hover gesture operation whose dwell time on the currently displayed user interaction interface exceeds a set dwell-time threshold, a set area range containing the position area where the hover gesture operation is located is marked on the user interaction interface currently displayed by the terminal, and the marked set area range moves along with the hover gesture operation.
In the embodiment of the present disclosure, the dwell time of the hover gesture on the user interaction interface is detected, and when it exceeds the set dwell-time threshold, a positioning operation is performed, that is, the position area corresponding to the hover gesture is marked on the currently displayed user interaction interface. The set area range containing the position area of the hover gesture is the target area, and displaying the interface display operation information corresponding to the interface display operation to be executed includes marking that target area.
In one embodiment, on a reading interface, when the user's finger hovers over the touch screen for longer than the dwell-time threshold, the operation logic marks the area below the finger (with a color mark or an underline mark), and the marked area can move along with the finger. This makes it convenient for the user to locate the content currently being read, supports finger-pointing reading, and improves reading speed. FIG. 7 is a schematic diagram illustrating a positioning display according to an exemplary embodiment. Referring to fig. 7, if the dwell time of the hovering finger on the currently displayed user interaction interface exceeds the set dwell-time threshold of 5 s, a color mark is applied to the set area range containing the position area of the hover gesture on the user interaction interface currently displayed by the terminal, and the marked set area range moves along with the hovering finger.
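The dwell-time check above can be sketched as a small state tracker. This is an illustrative sketch, with the 5-second threshold taken from the figure description and all names assumed.

```python
class DwellMarker:
    """Track how long the finger has been hovering and report when the
    configured dwell-time threshold is exceeded (times in seconds)."""
    def __init__(self, threshold_s=5.0):
        self.threshold = threshold_s
        self.start = None  # time at which the current hover began

    def update(self, now, hovering):
        """Feed one sensor sample; returns True once the marked-area
        positioning operation should be triggered."""
        if not hovering:
            self.start = None  # finger left the hover range; reset timer
            return False
        if self.start is None:
            self.start = now
        return (now - self.start) >= self.threshold
```

Once `update` returns True, the terminal would mark the set area range below the finger and keep re-anchoring it to the latest hover position as the finger moves.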
Fig. 8 is a flowchart illustrating an interface display control method according to an exemplary embodiment. As shown in fig. 8, executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes the following steps.
In step S41, an interface display operation to be performed is determined.
In step S42, if the interface display operation to be executed is a next interface display operation triggered for direct execution, then before the next interface display operation is executed, interface display operation information of the next interface display operation is displayed in advance in the target area of the user interaction interface currently displayed by the terminal.
In the embodiment of the present disclosure, in order to confirm the user's next interface display operation, interface display operation information of the next interface display operation to be executed is displayed in advance in the target area of the user interaction interface currently displayed by the terminal before that operation is executed, and the user's confirmation is awaited, thereby avoiding misoperation. Here, the target area is a set area range containing the position area where the hover gesture operation is detected, that is, an area below the hovering finger that is convenient for finger operation.
For example, in a reading interface a page turn may be triggered by mistake. Suppose the user has established in advance the correspondence between a leftward hover swipe and turning to the next page. When the leftward hover swipe is detected, the "next page" is displayed in advance in the target area of the currently displayed user interaction interface for the user to confirm whether to turn the page. FIG. 9 is a schematic diagram illustrating the leftward hover swipe gesture operation according to an exemplary embodiment. When the user's leftward hover swipe is detected, the "next page" is displayed in advance in the target area of the currently displayed user interaction interface as shown in fig. 9, and the user's confirmation is awaited.
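The preview-then-confirm flow can be sketched as a two-step state machine; the class and the string labels are illustrative assumptions, not names from the patent.

```python
class PendingOperation:
    """Two-step execution: a detected hover gesture first previews the
    operation's info in the target area; it only runs on confirmation."""
    def __init__(self):
        self.pending = None

    def on_gesture(self, op_info):
        """A recognized hover gesture stages the operation and returns the
        preview to show in the target area."""
        self.pending = op_info
        return f"preview:{op_info}"

    def on_confirm(self):
        """User confirmation executes the staged operation and clears it."""
        op, self.pending = self.pending, None
        return f"execute:{op}"
```

If the user never confirms, the staged operation simply stays pending (or is later cancelled by the timeout described below in the disclosure), so an accidental swipe does not turn the page.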
Fig. 10 is a flowchart illustrating an interface display control method according to an exemplary embodiment. As shown in fig. 10, executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes the following steps.
In step S51, an interface display operation to be performed is determined.
In step S52, if the interface display operation to be executed involves a function option to be operated, interface display operation information corresponding to that function option is displayed in the target area of the user interaction interface currently displayed by the terminal.
The function options to be operated include function options displayed in an area whose distance from the position area where the hover gesture operation is detected is greater than a distance threshold. In the embodiment of the present disclosure, the target area is a set area range containing the position area where the hover gesture operation is detected.
In the embodiment of the present disclosure, a function option located in the upper left or upper right corner of the user interaction interface is called out by the hover gesture operation, and the operation information corresponding to that option is displayed at the position of the finger, which is convenient for the user to operate.
Illustratively, the user presets that a double tap hovering over the touch screen indicates a desire to operate the return function option located in the upper left corner of the user interaction interface. FIG. 11 is a diagram illustrating a return function option, according to an exemplary embodiment. As can be seen in fig. 11, when the double-tap hover gesture operation is detected, a return key is displayed at the user's finger position for confirmation by the user. The return UI element that originally belonged in the upper left corner is displayed at the hover position, so the user can easily complete the return operation without changing the holding posture.
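The distance-threshold condition from the passage above reduces to a straight-line distance check between the option's position and the detected hover position. The coordinates and threshold below are assumed values for illustration only.

```python
import math

def should_relocate(option_pos, finger_pos, threshold=600.0):
    """A function option is brought to the finger only when it sits
    farther from the hover position than the distance threshold (pixels)."""
    dx = option_pos[0] - finger_pos[0]
    dy = option_pos[1] - finger_pos[1]
    return math.hypot(dx, dy) > threshold
```

A return key in the far upper-left corner relative to a thumb hovering near the bottom-right would exceed the threshold and be relocated below the finger, while a nearby option would stay where it is.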
In the embodiment of the present disclosure, a trigger signal for the operation logic on the user interaction interface is monitored; if no trigger signal for the operation logic is detected within a preset time interval, the operation logic executed on the user interaction interface is cancelled.
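The timeout cancellation can be sketched as follows; the class, the 3-second window, and the status strings are assumptions for the example, since the patent leaves the preset interval unspecified.

```python
class DisplayedLogic:
    """Cancel the operation logic shown on the interface if no trigger
    signal arrives within the preset time window (times in seconds)."""
    def __init__(self, shown_at, timeout_s=3.0):
        self.shown_at = shown_at
        self.timeout = timeout_s
        self.active = True

    def tick(self, now, triggered=False):
        """Poll once: execute on a trigger signal, cancel on timeout,
        otherwise keep waiting for the user."""
        if triggered:
            return "executed"
        if now - self.shown_at >= self.timeout:
            self.active = False
            return "cancelled"
        return "waiting"
```

This keeps stale icons or previews from lingering on screen when the user changes their mind and never taps the displayed control.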
The interface display control method provided by the embodiments of the present disclosure enhances the one-handed operability of the terminal, and the user can customize hover gestures to complete operation tasks, making the interaction logic more engaging.
Based on the same conception, the embodiment of the disclosure also provides an interface display control device.
It is understood that, in order to implement the above functions, the interface display control device provided in the embodiments of the present disclosure includes hardware structures and/or software modules corresponding to the execution of each function. In combination with the units and algorithm steps of the examples disclosed herein, the embodiments of the present disclosure can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
Fig. 12 is a block diagram illustrating an interface display control apparatus according to an exemplary embodiment. Referring to fig. 12, the interface display control apparatus 100 is applied to a terminal, and includes a determination unit 101 and a display unit 102.
The determining unit 101 is configured to determine, in response to detecting a hover gesture operation in which a finger of a user approaches a touch screen, an operation logic corresponding to the hover gesture operation, where the operation logic includes an interface display operation and/or interface display operation information to be executed.
The display unit 102 is configured to execute an operation logic on a target area of a user interaction interface currently displayed by the terminal.
In an embodiment of the present disclosure, the display unit 102 is configured to:
mapping, in response to the interface display operation including invoking a function option hidden in the user interaction interface, the hidden function option to a virtual interface display operation icon, and displaying the interface display operation icon corresponding to the hidden function option in a target area of the user interaction interface currently displayed by the terminal; and/or,
marking, in response to the interface display operation to be executed including a hover gesture operation whose dwell time on the currently displayed user interaction interface exceeds a set dwell-time threshold, a set area range containing the position area of the hover gesture on the user interaction interface currently displayed by the terminal, the marked set area range moving along with the hover gesture; and/or,
displaying, in response to the interface display operation to be executed including a next interface display operation triggered for direct execution, interface display operation information of the next interface display operation to be executed in advance in a target area of the user interaction interface currently displayed by the terminal, before the next interface display operation is executed; and/or,
displaying, in response to the interface display operation to be executed including a function option to be operated, interface display operation information corresponding to the function option to be operated in a target area of the user interaction interface currently displayed by the terminal, wherein the function option to be operated includes a function option displayed in an area whose distance from the position area where the hover gesture operation is detected is greater than a distance threshold.
In the embodiment of the present disclosure, the target area includes a position area where the hidden function option is located, and/or a set area range including a position area where the hover gesture operation is detected.
In an embodiment of the present disclosure, the determining unit 101 is configured to:
determining that the hover gesture operation is detected in response to detecting the self-capacitance signal generated when the user's finger approaches the touch screen of the terminal.
In the embodiment of the present disclosure, the interface display control apparatus further includes:
the cancelling unit 103 is configured to monitor a trigger signal to the operation logic on the user interaction interface, and cancel the operation logic executed on the user interaction interface if the trigger signal to the operation logic on the user interaction interface is not monitored within a preset time interval.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
FIG. 13 is a block diagram illustrating an apparatus for interface display control in accordance with an exemplary embodiment. For example, the apparatus 200 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 13, the apparatus 200 may include one or more of the following components: a processing component 202, a memory 204, a power component 206, a multimedia component 208, an audio component 210, an input/output (I/O) interface 212, a sensor component 214, and a communication component 216.
The processing component 202 generally controls overall operation of the device 200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 202 may include one or more processors 220 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 202 can include one or more modules that facilitate interaction between the processing component 202 and other components. For example, the processing component 202 can include a multimedia module to facilitate interaction between the multimedia component 208 and the processing component 202.
The memory 204 is configured to store various types of data to support operations at the apparatus 200. Examples of such data include instructions for any application or method operating on the device 200, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 204 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 206 provide power to the various components of device 200. Power components 206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for device 200.
The multimedia component 208 includes a screen that provides an output interface between the device 200 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 208 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 200 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 210 is configured to output and/or input audio signals. For example, audio component 210 includes a Microphone (MIC) configured to receive external audio signals when apparatus 200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 204 or transmitted via the communication component 216. In some embodiments, audio component 210 also includes a speaker for outputting audio signals.
The I/O interface 212 provides an interface between the processing component 202 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 214 includes one or more sensors for providing various aspects of status assessment for the device 200. For example, the sensor assembly 214 may detect an open/closed state of the device 200, the relative positioning of components, such as a display and keypad of the device 200, the sensor assembly 214 may also detect a change in the position of the device 200 or a component of the device 200, the presence or absence of user contact with the device 200, the orientation or acceleration/deceleration of the device 200, and a change in the temperature of the device 200. The sensor assembly 214 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 216 is configured to facilitate wired or wireless communication between the apparatus 200 and other devices. The device 200 may access a wireless network based on a communication standard, such as WiFi, 4G, or 5G, or a combination thereof. In an exemplary embodiment, the communication component 216 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 216 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 200 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as memory 204, comprising instructions executable by processor 220 of device 200 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is understood that "plurality" in this disclosure means two or more, and other terms are analogous. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It will be further understood that the terms "first," "second," and the like are used to describe various information and that such information should not be limited by these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the terms "first," "second," and the like are fully interchangeable. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that, unless otherwise specified, "connected" includes direct connections between the two without the presence of other elements, as well as indirect connections between the two with the presence of other elements.
It is further to be understood that while operations are depicted in the drawings in a particular order, this is not to be understood as requiring that such operations be performed in the particular order shown or in serial order, or that all illustrated operations be performed, to achieve desirable results. In certain environments, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the scope of the appended claims.

Claims (12)

1. An interface display control method is applied to a terminal, and comprises the following steps:
responding to detected suspension gesture operation of a user finger close to a touch screen, and determining operation logic corresponding to the suspension gesture operation, wherein the operation logic comprises interface display operation and/or interface display operation information to be executed;
and executing the operation logic on a target area of a user interaction interface currently displayed by the terminal.
2. The interface display control method according to claim 1, wherein the executing the operation logic on the target area of the user interaction interface currently displayed by the terminal includes:
responding to the interface display operation comprising calling a function option hidden in a user interactive interface, mapping the function option hidden in the user interactive interface into a virtual interface display operation icon, and displaying the interface display operation icon corresponding to the hidden function option in a target area of the user interactive interface currently displayed by the terminal; and/or,
in response to the interface display operation to be executed including a suspension gesture operation of which the stay time on the currently displayed user interaction interface exceeds a set stay time threshold, marking a set area range including a position area where the suspension gesture operation is located on the currently displayed user interaction interface of the terminal, wherein the marked set area range moves along with the suspension gesture operation; and/or,
responding to the interface display operation to be executed, wherein the interface display operation to be executed comprises the next interface display operation directly triggered to be executed, and before the next interface display operation is executed, displaying interface display operation information of the next interface display operation to be executed in advance in a target area of a user interaction interface currently displayed by the terminal; and/or,
and in response to that the interface display operation to be executed comprises function options to be operated, displaying interface display operation information corresponding to the function options to be operated in a target area of a user interaction interface currently displayed by the terminal, wherein the function options to be operated comprise function options displayed in an area range whose distance from the position area where the suspension gesture operation is detected is greater than a distance threshold value.
3. The interface display control method according to any one of claims 1 to 2, wherein the target area includes a position area where a hidden function option is located, and/or a set area range including the position area where the hover gesture operation is detected.
4. The interface display control method according to any one of claims 1 to 2, wherein the detecting a hover gesture operation in which a user's finger approaches the touch screen includes:
and determining that a hover gesture operation is detected in response to detecting a self-capacitance signal indicating that the user's finger is close to the touch screen of the terminal.
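The detection step in claim 4 amounts to classifying a capacitance reading: a signal above a near-field level but below the contact level suggests a finger hovering rather than touching. The levels and function name below are illustrative assumptions; real touch controllers expose this state through their own drivers.

```python
# Hypothetical raw-signal levels for a self-capacitance sensor. A reading
# between HOVER_LEVEL and TOUCH_LEVEL suggests a finger near, but not on,
# the screen; this is a sketch, not a real controller interface.
HOVER_LEVEL = 30
TOUCH_LEVEL = 80

def classify_capacitance(raw_value):
    """Classify a self-capacitance reading as 'none', 'hover', or 'touch'."""
    if raw_value >= TOUCH_LEVEL:
        return "touch"
    if raw_value >= HOVER_LEVEL:
        return "hover"  # finger approaching: hover gesture detected
    return "none"
```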
5. The interface display control method according to any one of claims 1 to 2, further comprising:
monitoring for a trigger signal for the operation logic on the user interaction interface;
and canceling the operation logic executed on the user interaction interface if no trigger signal for the operation logic on the user interaction interface is monitored within a preset time interval.
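The timeout behaviour of claim 5 can be sketched as a watchdog: the operation logic stays alive while triggers keep arriving, and is cancelled once the preset interval elapses without one. Times are passed in explicitly (e.g. from `time.monotonic()`) so the sketch is deterministic; the class and method names are illustrative, not from the patent.

```python
class OperationLogicWatchdog:
    """Cancel executed operation logic if no trigger arrives within a window."""

    def __init__(self, timeout):
        self.timeout = timeout    # preset time interval, in seconds
        self.started_at = None
        self.active = False

    def start(self, now):
        # Operation logic has just been executed on the interface.
        self.started_at = now
        self.active = True

    def on_trigger(self, now):
        # A trigger within the window keeps the operation logic alive.
        if self.active and now - self.started_at <= self.timeout:
            self.started_at = now
            return True
        return False

    def poll(self, now):
        # Returns True exactly when the operation logic should be cancelled.
        if self.active and now - self.started_at > self.timeout:
            self.active = False
            return True
        return False
```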
6. An interface display control device, applied to a terminal, the interface display control device comprising:
a determining unit, configured to determine, in response to detecting a hover gesture operation of a user's finger approaching the touch screen, the operation logic corresponding to the hover gesture operation, wherein the operation logic comprises an interface display operation to be executed and/or interface display operation information;
and a display unit, configured to execute the operation logic in a target area of the user interaction interface currently displayed by the terminal.
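The two-unit device of claim 6 can be pictured as a determining step that maps a hover gesture to operation logic, paired with a display step that executes that logic in the target area. The rule table, class, and method names below are illustrative assumptions, and drawing is replaced by returning a description.

```python
class InterfaceDisplayController:
    """Sketch of the claimed two-unit device (names are hypothetical)."""

    def __init__(self, rules):
        # rules: maps a gesture kind to the operation logic it should trigger.
        self.rules = rules

    def determine(self, gesture_kind):
        # Determining unit: pick the operation logic for this hover gesture.
        return self.rules.get(gesture_kind)

    def display(self, logic, target_area):
        # Display unit: execute the logic in the target area. Here we just
        # return a description instead of rendering anything.
        if logic is None:
            return None
        return f"{logic} @ {target_area}"
```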
7. The interface display control device according to claim 6, wherein the display unit is configured to:
in response to the interface display operation to be executed comprising calling a function option hidden in the user interaction interface, map the hidden function option to a virtual interface display operation icon, and display the interface display operation icon corresponding to the hidden function option in a target area of the user interaction interface currently displayed by the terminal; and/or,
in response to the interface display operation to be executed comprising a hover gesture operation whose dwell time on the currently displayed user interaction interface exceeds a set dwell time threshold, mark, on the user interaction interface currently displayed by the terminal, a set area range including the position area where the hover gesture operation is located, wherein the marked set area range moves with the hover gesture operation; and/or,
in response to the interface display operation to be executed comprising a next interface display operation that is directly triggered for execution, display, before the next interface display operation is executed, interface display operation information of the next interface display operation in advance in a target area of the user interaction interface currently displayed by the terminal; and/or,
and in response to the interface display operation to be executed comprising a function option to be operated, display interface display operation information corresponding to the function option to be operated in a target area of the user interaction interface currently displayed by the terminal, wherein the function option to be operated comprises a function option displayed within an area range whose distance from the position area where the hover gesture operation is detected is greater than a distance threshold.
8. The interface display control device according to any one of claims 6 to 7, wherein the target area includes a position area where a hidden function option is located, and/or a set area range including a position area where the hover gesture operation is detected.
9. The interface display control device according to any one of claims 6 to 7, wherein the determination unit is configured to:
and determine that a hover gesture operation is detected in response to detecting a self-capacitance signal indicating that the user's finger is close to the touch screen of the terminal.
10. The interface display control device according to any one of claims 6 to 7, characterized in that the interface display control device further comprises:
and a cancelling unit, configured to monitor for a trigger signal for the operation logic on the user interaction interface, and to cancel the operation logic executed on the user interaction interface if no trigger signal for the operation logic is monitored within a preset time interval.
11. An interface display control apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the interface display control method of any one of claims 1 to 5.
12. A non-transitory computer-readable storage medium having instructions therein, which when executed by a processor of a mobile terminal, enable the mobile terminal to perform the interface display control method of any one of claims 1 to 5.
CN202110721483.4A 2021-06-28 2021-06-28 Interface display control method, interface display control device and storage medium Pending CN115543064A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110721483.4A CN115543064A (en) 2021-06-28 2021-06-28 Interface display control method, interface display control device and storage medium


Publications (1)

Publication Number Publication Date
CN115543064A true CN115543064A (en) 2022-12-30

Family

ID=84716980

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110721483.4A Pending CN115543064A (en) 2021-06-28 2021-06-28 Interface display control method, interface display control device and storage medium

Country Status (1)

Country Link
CN (1) CN115543064A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117572984A (en) * 2024-01-15 2024-02-20 南京极域信息科技有限公司 Operation window positioning method for large touch screen


Similar Documents

Publication Publication Date Title
EP3099040B1 (en) Button operation processing method in single-hand mode, apparatus and electronic device
EP3098701B1 (en) Method and apparatus for managing terminal application
CN105975166B (en) Application control method and device
CN107124508B (en) Position adjusting method and device of suspension control, terminal and readable storage medium
EP3121701A1 (en) Method and apparatus for single-hand operation on full screen
EP3125531A1 (en) Control method and device for adjusting shooting function
EP3279786A1 (en) Terminal control method and device, and terminal
KR20130102834A (en) Mobile terminal and control method thereof
KR20160087268A (en) Mobile terminal and control method for the same
EP3232301B1 (en) Mobile terminal and virtual key processing method
CN107992257B (en) Screen splitting method and device
EP3136206B1 (en) Method and apparatus for setting threshold
EP3239827B1 (en) Method and apparatus for adjusting playing progress of media file
KR20120001941A (en) Mobile terminal and control method for mobile terminal
US10705729B2 (en) Touch control method and apparatus for function key, and storage medium
CN111610912B (en) Application display method, application display device and storage medium
US20210165670A1 (en) Method, apparatus for adding shortcut plug-in, and intelligent device
JP2017525076A (en) Character identification method, apparatus, program, and recording medium
CN111522498A (en) Touch response method and device and storage medium
EP3035172A1 (en) Method and device for activating operation state of mobile terminal
CN106980409B (en) Input control method and device
CN115543064A (en) Interface display control method, interface display control device and storage medium
EP4318201A1 (en) Virtual keyboard setting method and apparatus, storage medium, and computer program product
EP3995939A1 (en) Method and device for touch operation, and storage medium
CN114296587A (en) Cursor control method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination