CN111314552B - User interface control method and device and storage medium

User interface control method and device and storage medium

Info

Publication number
CN111314552B
CN111314552B (application CN202010091389.0A)
Authority
CN
China
Prior art keywords
edge
touch
terminal
screen
determining
Prior art date
Legal status
Active
Application number
CN202010091389.0A
Other languages
Chinese (zh)
Other versions
CN111314552A (en)
Inventor
张嫄
钟桂林
Current Assignee
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202010091389.0A
Publication of CN111314552A
Application granted
Publication of CN111314552B
Legal status: Active
Anticipated expiration

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The disclosure relates to a user interface control method and device and a storage medium. The user interface control method includes: acquiring the content running in the foreground of the terminal; determining a UI edge corresponding to the terminal foreground content according to historical usage data corresponding to that content; and receiving and responding to touch operations within the UI edge. In this way, when some edges carry many UI controls, the UI can be scaled to different extents according to the user's usage habits, which reduces false touches on UI controls caused by the user touching the edge of the touch screen while holding the terminal and thereby improves the user experience.

Description

User interface control method and device and storage medium
Technical Field
The present disclosure relates to electronic technologies, and in particular, to a method and an apparatus for controlling a user interface, and a storage medium.
Background
With the trend toward large touch display screens on mobile terminals, display screens now take many forms, such as full screens, curved screens, waterfall screens, outward-folding screens and surrounding screens, and they extend into the bezel area of the mobile terminal. Because the display screen usually has a touch function, the fingers or palm of the holding hand inevitably touch the display area when a user holds the mobile terminal in various usage scenarios, which both occludes the displayed content and causes false touches at the edge of the user interface.
Disclosure of Invention
The disclosure provides a control method and device of a user interface and a storage medium.
According to a first aspect of the embodiments of the present disclosure, there is provided a control method of a user interface, including:
acquiring the foreground operation content of the terminal;
determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content;
and receiving and responding to touch operation in the UI edge.
In some embodiments, the method further comprises:
setting an APP white list;
determining whether the terminal foreground operation content is a white list APP;
the determining, according to the historical usage data corresponding to the terminal foreground operating content, the UI edge corresponding to the terminal foreground operating content includes:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
In some embodiments, the determining, according to the historical usage data corresponding to the foreground operating content of the terminal, a UI edge corresponding to the foreground operating content of the terminal further includes:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
In some embodiments, the determining, according to historical usage data corresponding to a foreground operating content of a terminal, a UI edge corresponding to the foreground operating content of the terminal includes:
and counting to obtain the UI edge corresponding to the foreground operation content of the terminal according to the edge touch position data of the foreground operation content of the historical use terminal.
In some embodiments, the method further comprises:
determining a current holding posture of the terminal;
the using the edge touch position data of the terminal foreground operation content according to the history to count and obtain the UI edge corresponding to the terminal foreground operation content comprises the following steps:
and counting to obtain the UI edge corresponding to the foreground operating content of the terminal in the current holding posture according to the edge touch position data of the foreground operating content of the historical use terminal in the current holding posture.
In some embodiments, the using the edge touch position data of the foreground operating content of the terminal in the current holding posture according to the history to obtain, through statistics, a UI edge corresponding to the foreground operating content of the terminal in the current holding posture includes at least one of:
counting to obtain a left holding UI edge of the corresponding terminal foreground operating content under the left vertical screen holding posture according to edge touch data of the terminal foreground operating content under the left vertical screen holding posture used historically aiming at the left vertical screen holding posture, wherein the distance from the left edge of the left holding UI edge to the left vertical edge of the touch screen is larger than the distance from the right edge of the left holding UI edge to the right vertical edge of the touch screen;
counting to obtain a right holding UI edge of the corresponding terminal foreground operating content under the right-hand vertical screen holding posture according to edge touch data of the terminal foreground operating content under the right-hand vertical screen holding posture used historically aiming at the right-hand vertical screen holding posture, wherein the distance from the left edge of the right holding UI edge to the left vertical edge of the touch screen is smaller than the distance from the right edge of the right holding UI edge to the right vertical edge of the touch screen;
and counting to obtain a transverse screen holding UI edge corresponding to the terminal foreground operating content under the transverse screen holding posture according to the historical edge touch data of the terminal foreground operating content under the transverse screen holding posture, wherein the left edge and the right edge of the transverse screen holding UI edge have the same distance with the corresponding transverse edge of the touch screen.
In some embodiments, the determining the current holding posture of the terminal comprises:
acquiring touch data of an edge area of the touch screen, wherein the touch data comprises: at least one of touch points, touch area and touch duration;
and determining the current holding posture of the terminal according to the touch data of the edge area.
In some embodiments, the determining the current holding posture of the terminal according to the touch data of the edge area includes:
when the terminal is in a vertical screen mode, when the edge area of the touch screen has a plurality of touch points and the number of touch points in the left edge area of the touch screen is more than that in the right edge area of the touch screen, determining that the current holding posture is the right-hand vertical screen holding posture;
when the terminal is in a vertical screen mode, when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are less than the touch points in the right edge area of the touch screen, determining that the current touch holding posture is a left-hand vertical screen holding posture;
and under the transverse screen mode of the terminal, when the touch area in the edge area of the touch screen is larger than an area threshold, determining that the current holding posture is the transverse screen holding posture.
In some embodiments, the determining the current holding posture of the terminal according to the touch data of the edge area includes:
determining the touch duration of the edge area;
and determining the current holding gesture according to the touch points and/or the touch area of the edge area with the touch duration being greater than the duration threshold.
According to a second aspect of the embodiments of the present disclosure, there is provided a control apparatus of a user interface, including:
the acquisition module is used for acquiring the foreground operation content of the terminal;
the system comprises a first determining module, a second determining module and a third determining module, wherein the first determining module is used for determining a User Interface (UI) edge corresponding to the foreground operating content of the terminal according to historical use data corresponding to the foreground operating content of the terminal;
and the receiving module is used for receiving and responding to the touch operation in the UI edge.
In some embodiments, the apparatus further comprises:
the setting module is used for setting an application program APP white list;
the second determining module is used for determining whether the terminal foreground operating content is a white list APP;
the first determining module is specifically configured to:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
In some embodiments, the first determining module is further configured to:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
In some embodiments, the first determining module comprises:
and the counting submodule is used for counting to obtain the UI edge corresponding to the foreground operating content of the terminal according to the edge touch position data of the foreground operating content of the historical use terminal.
In some embodiments, the apparatus further comprises:
a third determining module, configured to determine a current holding posture of the terminal;
the statistic submodule is specifically configured to:
and counting to obtain the UI edge corresponding to the foreground operating content of the terminal in the current holding posture according to the edge touch position data of the foreground operating content of the historical use terminal in the current holding posture.
In some embodiments, the statistics submodule includes at least one of:
the first statistical module is used for counting and obtaining a left holding UI edge of the corresponding terminal foreground operating content under the left vertical screen holding posture according to edge touch data of the terminal foreground operating content under the left vertical screen holding posture used historically aiming at the left vertical screen holding posture, wherein the distance from the left edge of the left holding UI edge to the left vertical edge of the touch screen is larger than the distance from the right edge of the left holding UI edge to the right vertical edge of the touch screen;
the second counting submodule is used for counting and obtaining a right holding UI edge of the corresponding terminal foreground operating content under the right-hand vertical screen holding posture according to edge touch data of the terminal foreground operating content under the right-hand vertical screen holding posture used historically aiming at the right-hand vertical screen holding posture, wherein the distance between the left edge of the right holding UI edge and the left vertical edge of the touch screen is smaller than the distance between the right edge of the right holding UI edge and the right vertical edge of the touch screen;
and the third counting submodule is used for counting edge touch data of the terminal foreground operating content in the transverse screen holding posture according to historical use and obtaining a transverse screen holding UI edge corresponding to the terminal foreground operating content in the transverse screen holding posture aiming at the transverse screen holding posture, wherein the left edge and the right edge of the transverse screen holding UI edge have the same distance with the corresponding transverse edge of the touch screen.
In some embodiments, the third determining module comprises:
the obtaining sub-module is configured to obtain touch data of an edge area of the touch screen, where the touch data includes: at least one of touch points, touch area and touch duration;
and the first determining submodule is used for determining the current holding posture of the terminal according to the touch data of the edge area.
In some embodiments, the first determining sub-module includes:
the second determining sub-module is used for determining that the current holding posture is the right-hand vertical screen holding posture when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are more than the touch points in the right edge area of the touch screen in the vertical screen mode of the terminal;
the third determining sub-module is used for determining that the current touch holding posture is a left-hand vertical screen holding posture when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are less than the touch points in the right edge area of the touch screen in the vertical screen mode of the terminal;
and the fourth determining submodule is used for determining that the current holding posture is the horizontal screen holding posture when the touch area in the edge area of the touch screen is larger than the area threshold value in the horizontal screen mode of the terminal.
In some embodiments, the first determining sub-module includes:
a fifth determining submodule, configured to determine a touch duration of the edge area;
and the sixth determining submodule is used for determining the current holding gesture according to the touch points and/or the touch area of the edge area, wherein the touch duration of the edge area is greater than the duration threshold.
According to a third aspect of embodiments of the present disclosure, there is provided a control apparatus of a user interface, the apparatus including at least: a processor and a memory for storing executable instructions operable on the processor, wherein:
the processor is configured to execute the executable instructions, and the executable instructions, when executed, perform the steps of any of the user interface control methods described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the steps in the control method of any one of the user interfaces described above.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects: the corresponding UI display edge is determined from the historical usage of the terminal with different running content. Because the historical usage reflects the user's touch habits on the UI in different application scenarios, as well as where false touches occur, the UI can be scaled to different extents according to those habits when some edges carry many UI controls. This reduces false touches on UI controls caused by the user touching the edge of the touch screen while holding the terminal and thereby improves the user experience. The scheme improves the touch experience of mobile terminals with large-area screens such as full screens, curved screens and surrounding screens, and reduces the poor experience caused by false touches.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart one illustrating a method of controlling a user interface in accordance with an exemplary embodiment;
FIG. 2 is a flowchart II illustrating a method of controlling a user interface in accordance with an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating holding a surround screen handset according to an exemplary embodiment;
FIG. 4 is a schematic diagram illustrating a fixed user interface zoom region in accordance with an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating grip regions determined from different grip gestures according to an exemplary embodiment;
FIG. 6A is a schematic diagram illustrating user interface scaling while held in a right hand according to an example embodiment;
FIG. 6B is a diagram illustrating user interface zooming while held in the left hand, according to an exemplary embodiment;
FIG. 6C is a diagram illustrating user interface zooming while holding in a two-handed grip according to an exemplary embodiment;
FIG. 7 is a schematic diagram illustrating asymmetric zooming of a user interface in accordance with an exemplary embodiment;
FIG. 8 is a block diagram illustrating the structure of a control device of a user interface in accordance with an exemplary embodiment;
fig. 9 is a block diagram illustrating a physical structure of a control apparatus of a user interface according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present invention. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the invention, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating a control method of a user interface according to an exemplary embodiment, which is applied to a terminal, as shown in fig. 1, and includes the steps of:
s101, acquiring the foreground operation content of the terminal;
step S102, determining a UI edge corresponding to the foreground operating content of the terminal according to historical use data corresponding to the foreground operating content of the terminal;
and step S103, receiving and responding to the touch operation in the UI edge.
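A minimal sketch of this three-step flow is given below, assuming hypothetical types UiEdge (insets from the four screen sides) and HistoricalUsageStore; it is an illustration of steps S101 to S103, not a reference implementation.
```kotlin
// Minimal sketch of steps S101-S103; all type and function names are hypothetical.
data class UiEdge(val left: Int, val top: Int, val right: Int, val bottom: Int)

interface HistoricalUsageStore {
    // Returns the UI edge statistically derived for the given foreground content, if any.
    fun edgeFor(content: String): UiEdge?
}

class UiEdgeController(
    private val history: HistoricalUsageStore,
    private val screenW: Int,
    private val screenH: Int
) {
    // S101 + S102: look up the UI edge for the content currently running in the foreground.
    fun edgeForForeground(foregroundContent: String): UiEdge =
        history.edgeFor(foregroundContent) ?: UiEdge(0, 0, 0, 0) // full screen if no history yet

    // S103: accept and respond to a touch only if it falls inside the UI edge.
    fun acceptsTouch(edge: UiEdge, x: Int, y: Int): Boolean =
        x in edge.left..(screenW - edge.right) && y in edge.top..(screenH - edge.bottom)
}
```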
With the development of large touch screens on mobile terminals, the touch display screen gradually extends into the bezel of the mobile terminal. When a user holds the mobile terminal, the fingers and part of the palm easily touch the touch display screen and cause false touches. For this reason, in some embodiments the UI (user interface) may be scaled down inside the bezel to reduce false touches. However, different users have different usage habits: if the amount by which the UI is reduced is fixed, the reduction is excessive for some users and wastes screen display area, while for other users the reduction is too small, so that the palm or fingers still easily touch controls on the display screen, causing false touches and degrading the experience.
Therefore, in the present embodiment, the UI is scaled according to the historical usage data of the terminal operating content, and the size of the user interface is flexibly adjusted. The content currently running in the terminal may be a designated Application (APP), or may be a screen of a non-application such as a desktop or a standby screen. The historical usage data may be a touch operation detected through a touch panel of the mobile terminal, or usage statistics according to UI controls of the application itself, or the like.
By determining the UI edge, the UI is no longer displayed full screen in every use case. The UI is displayed within the determined UI edge, and the touch screen only accepts touch operations within the UI edge, so that when the user uses the mobile terminal, the holding fingers or palm are less likely to fall inside the user interface and falsely trigger a touch control, for example a touch input area or another function key area. At the same time, occlusion of the displayed content by the holding hand is reduced, including the display of videos, pictures, text and the like; for reading applications, for example, occlusion of the reading content can be reduced. Typically, if the mobile terminal is an e-book reader, the method can reduce occlusion of the text content, so that the text is displayed in a suitable user interface area.
When the user interface is reduced, the display in the holding area or in the area outside the user interface can be turned off: for example, for a liquid crystal display screen the backlight of the holding area can be turned off, and for an OLED display screen the gate switches of the OLED pixels in the holding area can be turned off, which saves power to a certain extent. Alternatively, the background color of the current user interface can be shown in the holding area or in the area outside the user interface, so as to reduce the visual impact caused by shrinking the user interface.
According to the technical scheme of the embodiment of the disclosure, the user interface can be flexibly adjusted according to the holding area determined by the user holding the mobile terminal, so that the method and the device are suitable for different use habits of different users, and can be applied to various types of touch display screens.
In some embodiments, the method further comprises:
setting an APP white list;
determining whether the terminal foreground operation content is a white list APP;
the determining, according to the historical usage data corresponding to the terminal foreground operating content, the UI edge corresponding to the terminal foreground operating content includes:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
According to the usage characteristics of different APPs, it can be determined that the user interface (UI) of one APP needs to be scaled while the UIs of other APPs do not and are still displayed at a preset maximum area, for example full screen. For some APPs the usage habits do not involve many edge touch operations; for example, an APP that plays or displays pictures, videos or documents may only need a small number of UI controls, and its UI controls are unlikely to be falsely touched when the user holds the terminal. Such playback and display APPs can therefore be set as white-list APPs whose UI is not scaled. For APPs with many UI controls, or with UI controls at the UI edge, such as text input APPs and dialing APPs, which are more prone to false touches, the UI scaling function can be enabled to redetermine the UI edge.
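A minimal sketch of this white-list branch, reusing the hypothetical UiEdge type from the earlier sketch; the resolver name and the history lookup are assumptions:
```kotlin
// Hypothetical sketch: white-listed APPs keep the maximum UI area, others use historical data.
class WhiteListEdgeResolver(
    private val whiteList: Set<String>,            // package names of APPs exempt from scaling
    private val historyEdge: (String) -> UiEdge?   // statistical edge from historical usage data
) {
    private val maxUiEdge = UiEdge(0, 0, 0, 0)     // zero insets, i.e. full-screen display

    fun resolve(foregroundApp: String): UiEdge =
        if (foregroundApp in whiteList) maxUiEdge      // white-list APP: no scaling
        else historyEdge(foregroundApp) ?: maxUiEdge   // otherwise derive the edge from history
}
```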
In some embodiments, the determining, according to the historical usage data corresponding to the foreground operating content of the terminal, a UI edge corresponding to the foreground operating content of the terminal further includes:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
If the use of an APP is often prone to edge false touches, the UI scaling function described above can be applied to that APP. The historical usage data may be historical touch data of the APP, usage data of each UI control in the APP's UI interface, and so on. By obtaining the APP's historical usage data, the positions where false touches frequently occur are determined, and from these the scaled range of the UI, that is, the UI edge, is determined, so that the UI is displayed inside the UI edge.
In this way, edge false touches are less likely to occur in subsequent use. Meanwhile, the usage data generated by the current use also becomes historical usage data, so the UI edge corresponding to the current terminal foreground content can be continuously adjusted as the terminal continues to be used, the accuracy of the adjustment keeps improving, and false touches at the UI edge are reduced.
In other embodiments, the method for determining the UI edge and adjusting the UI display position may also be applied to the global use of the terminal. And counting to obtain a UI edge through the use data of the terminal in the historical use process, and applying the UI edge to any use scene. In this way, it is also possible to reduce instability, jitter, and the like of UI display due to repeated switching of applications by the terminal. The use habit of the user can be matched under the global use scene, and the generation of false touch is reduced.
In some embodiments, the determining, according to historical usage data corresponding to a foreground operating content of a terminal, a UI edge corresponding to the foreground operating content of the terminal includes:
and counting to obtain the UI edge corresponding to the foreground operation content of the terminal according to the edge touch position data of the foreground operation content of the historical use terminal.
The historical usage data of the terminal is position data of edge touches detected by the touch screen during the historical use of the terminal. The edge touch position data from multiple uses is accumulated, averaged, or processed with a distribution function or the like, and the edge range where false touches easily occur is obtained statistically, from which the UI edge can be determined. Displaying the UI inside this UI edge reduces edge false touches and reduces occlusion of content at the UI edge, improving both the display and the usage experience.
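One plausible aggregation, sketched below under the assumption that each historical false touch is recorded as a screen side plus an intrusion depth, is to average the depths per side (a percentile would work equally well):
```kotlin
// Hypothetical aggregation: average how far historical false touches intrude from each side.
enum class Side { LEFT, TOP, RIGHT, BOTTOM }
data class EdgeTouch(val side: Side, val depthPx: Int)  // intrusion depth measured from that screen side

fun edgeFromHistory(touches: List<EdgeTouch>): UiEdge {
    fun inset(side: Side): Int {
        val depths = touches.filter { it.side == side }.map { it.depthPx }
        return if (depths.isEmpty()) 0 else depths.average().toInt()  // a percentile would also work
    }
    return UiEdge(inset(Side.LEFT), inset(Side.TOP), inset(Side.RIGHT), inset(Side.BOTTOM))
}
```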
In some embodiments, as shown in fig. 2, the method further comprises:
step S201, determining the current holding posture of the terminal;
the using the edge touch position data of the terminal foreground operation content according to the history to count and obtain the UI edge corresponding to the terminal foreground operation content comprises the following steps:
step S202, according to the edge touch position data of the historical use terminal foreground operation content in the current holding posture, counting to obtain the UI edge corresponding to the terminal foreground operation content in the current holding posture.
Since the terminal can be used in different orientations, such as horizontal screen and vertical screen, the hand posture holding the terminal differs between orientations, and the same user may also switch between the left and right hand. The edge regions that are prone to false touches therefore differ between holding postures. Accordingly, different UI edges are determined here for different holding postures, so as to reduce false touches under each holding posture.
Here, in using the terminal, a current holding posture of using the terminal may be determined, and then a UI edge corresponding to the current holding posture is called, and then scaling of the UI is performed in response. Of course, if the holding posture is switched during use, the UI edge may be redetermined according to the switched holding posture, and the UI display range may be adjusted. Therefore, here, the current holding posture may be detected once every certain period of time, or the current holding posture may be continuously detected.
In some embodiments, the using the edge touch position data of the foreground operating content of the terminal in the current holding posture according to the history to obtain, through statistics, a UI edge corresponding to the foreground operating content of the terminal in the current holding posture includes at least one of:
counting to obtain a left holding UI edge of the corresponding terminal foreground operating content under the left vertical screen holding posture according to edge touch data of the terminal foreground operating content under the left vertical screen holding posture used historically aiming at the left vertical screen holding posture, wherein the distance from the left edge of the left holding UI edge to the left vertical edge of the touch screen is larger than the distance from the right edge of the left holding UI edge to the right vertical edge of the touch screen;
counting to obtain a right holding UI edge of the corresponding terminal foreground operating content under the right-hand vertical screen holding posture according to edge touch data of the terminal foreground operating content under the right-hand vertical screen holding posture used historically aiming at the right-hand vertical screen holding posture, wherein the distance from the left edge of the right holding UI edge to the left vertical edge of the touch screen is smaller than the distance from the right edge of the right holding UI edge to the right vertical edge of the touch screen;
and counting to obtain a transverse screen holding UI edge corresponding to the terminal foreground operating content under the transverse screen holding posture according to the historical edge touch data of the terminal foreground operating content under the transverse screen holding posture, wherein the left edge and the right edge of the transverse screen holding UI edge have the same distance with the corresponding transverse edge of the touch screen.
In vertical screen use the terminal may be held by either the left or the right hand, and with a one-handed grip the regions touched on the left and right sides of the screen may be asymmetrical. Therefore, the left-hand vertical screen holding posture and the right-hand vertical screen holding posture are counted separately, and the horizontal screen holding posture is counted separately as well. This adapts to the different postures that occur while the terminal is used, makes switching between them convenient, and improves the user experience in various usage scenarios.
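The asymmetry described above can be expressed as one UI edge per holding posture; in the sketch below the fallback pixel insets and the HoldingPosture names are illustrative assumptions:
```kotlin
// Hypothetical per-posture edges; the pixel insets are purely illustrative defaults.
enum class HoldingPosture { LEFT_HAND_PORTRAIT, RIGHT_HAND_PORTRAIT, LANDSCAPE }

fun postureEdge(posture: HoldingPosture, stats: Map<HoldingPosture, UiEdge>): UiEdge =
    stats[posture] ?: when (posture) {
        // Left-hand grip: the left inset is larger than the right inset.
        HoldingPosture.LEFT_HAND_PORTRAIT  -> UiEdge(left = 48, top = 0, right = 16, bottom = 0)
        // Right-hand grip: the right inset is larger than the left inset.
        HoldingPosture.RIGHT_HAND_PORTRAIT -> UiEdge(left = 16, top = 0, right = 48, bottom = 0)
        // Two-handed horizontal grip: symmetric left and right insets.
        HoldingPosture.LANDSCAPE           -> UiEdge(left = 32, top = 0, right = 32, bottom = 0)
    }
```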
In some embodiments, the determining the current holding posture of the terminal comprises:
acquiring touch data of an edge area of the touch screen, wherein the touch data comprises: at least one of touch points, touch area and touch duration;
and determining the current holding posture of the terminal according to the touch data of the edge area.
When a touch signal located in a preset edge area of the display screen is detected, the touch signal may not be a false touch generated by holding but may be an effective touch operation of a user. However, the touch signal of the erroneous touch operation is often different from the touch signal of the effective touch operation, and in order to identify the erroneous touch generated by the holding, the analysis needs to be performed according to the touch data of the edge area. If the touch data meets the characteristics of holding, the touch operation can be determined to be the false touch operation generated during holding.
In an embodiment, the touch data may include a touch area. The touch area refers to the total area of the touch sensor that receives the touch operation and generates a touch signal within the preset display screen edge area. Because a false touch usually involves several fingers and part of the palm, it produces a large touch area. Therefore, when a touch operation in the preset display screen edge area is detected, the touch area of the touch sensor generating the signal can be measured; when the touch area is larger than a preset area threshold, the touch data is taken as one of the statistical inputs.
In another embodiment, the touch data may include touch points. For example, when the terminal is held there are usually 3 to 4 finger-touch areas on one side of the touch display screen and one on the other side. The determination can therefore be made from the number of touch areas: when the number of touch areas within the preset display screen edge area is greater than a preset number threshold, the touch data is taken as one of the statistical inputs.
In yet another embodiment, the touch data may further include a touch duration. If the touch duration is short, the user is probably performing an intentional touch operation, which is an effective touch rather than a false touch caused by holding. If the touch duration is long, the touch may be generated by the user holding the terminal for a long time; the touch data is then taken as one of the statistical inputs and used as a basis for subsequently determining the UI edge.
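These three kinds of touch data can feed a simple grip filter, sketched below with illustrative thresholds (all names and values are assumptions):
```kotlin
// Hypothetical edge-touch sample and grip filter; the thresholds are illustrative only.
data class EdgeTouchData(
    val pointCount: Int,    // number of distinct touch areas in the edge region
    val totalAreaPx2: Int,  // total sensor area producing a signal, in square pixels
    val durationMs: Long    // how long the contact has persisted
)

fun looksLikeGrip(
    t: EdgeTouchData,
    pointThreshold: Int = 3,
    areaThresholdPx2: Int = 5_000,
    durationThresholdMs: Long = 2_000
): Boolean =
    // Any one of the three cues above marks the sample as grip (false-touch) data.
    t.pointCount >= pointThreshold ||
        t.totalAreaPx2 >= areaThresholdPx2 ||
        t.durationMs >= durationThresholdMs
```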
In some embodiments, the determining the current holding posture of the terminal according to the touch data of the edge area includes:
when the terminal is in a vertical screen mode, when the edge area of the touch screen has a plurality of touch points and the number of touch points in the left edge area of the touch screen is more than that in the right edge area of the touch screen, determining that the current holding posture is the right-hand vertical screen holding posture;
when the terminal is in a vertical screen mode, when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are less than the touch points in the right edge area of the touch screen, determining that the current touch holding posture is a left-hand vertical screen holding posture;
and under the transverse screen mode of the terminal, when the touch area in the edge area of the touch screen is larger than an area threshold, determining that the current holding posture is the transverse screen holding posture.
To determine the current holding posture of the terminal, the current orientation of the terminal, horizontal screen or vertical screen, can be combined with the detected touches. The holding posture corresponding to the terminal's current running content can thus be determined quickly, the corresponding historical usage data retrieved, the UI edge determined, and the UI adjusted to a suitable display range.
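The decision rules of this embodiment translate directly into a classifier; the sketch below assumes per-side touch-point counts in vertical screen mode and a total edge touch area in horizontal screen mode:
```kotlin
// Hypothetical classifier for the rules above; null means no grip posture was recognized.
fun classifyPosture(
    isPortrait: Boolean,
    leftEdgePoints: Int,
    rightEdgePoints: Int,
    edgeTouchAreaPx2: Int,
    areaThresholdPx2: Int = 5_000
): HoldingPosture? = when {
    // Vertical screen, several edge touch points, more on the left edge -> right-hand grip.
    isPortrait && leftEdgePoints + rightEdgePoints > 1 && leftEdgePoints > rightEdgePoints ->
        HoldingPosture.RIGHT_HAND_PORTRAIT
    // Vertical screen, several edge touch points, fewer on the left edge -> left-hand grip.
    isPortrait && leftEdgePoints + rightEdgePoints > 1 && leftEdgePoints < rightEdgePoints ->
        HoldingPosture.LEFT_HAND_PORTRAIT
    // Horizontal screen with a large edge touch area -> horizontal screen grip.
    !isPortrait && edgeTouchAreaPx2 > areaThresholdPx2 ->
        HoldingPosture.LANDSCAPE
    else -> null
}
```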
In some embodiments, the determining the current holding posture of the terminal according to the touch data of the edge area includes:
determining the touch duration of the edge area;
and determining the current holding gesture according to the touch points and/or the touch area of the edge area with the touch duration being greater than the duration threshold.
Here, the recognition of the holding gesture is realized by combining several kinds of touch data in the above embodiments.
In the above embodiments of the present disclosure, it is sometimes necessary to count a plurality of holding operations, and the UI edge obtained by the counting is directly called each time of use. However, this method may not be able to cope with special usage requirements, for example, different users may have different usage habits when using the same mobile terminal, and the statistical grip area may not be well adapted to the usage of all users. As another example, the holding gesture may be different for different usage scenarios: in one-handed text entry, the user may be more accustomed to holding the lower half of the mobile terminal; when the user needs to click the touch control at different positions of the user interface during web browsing, the user may be more accustomed to holding the middle portion of the mobile terminal. Thus, different holding postures may cause a difference in the area of the erroneous touch.
Based on this, in the embodiments of the present disclosure, the false-touch area generated by holding may be determined from the touch signal generated by the current holding operation, and the user interface may be scaled directly according to that false-touch area. However, since the position touched by the hand is not fixed, scaling directly may make the user interface jitter and affect its appearance. The jitter can therefore be reduced by extending the detection period.
The duration of the touch signal generated by the holding operation is detected, and if the duration is greater than a preset duration threshold the user interface is scaled according to the holding area. In addition, the scaling can be performed as a slow, gradual transition, which further reduces jitter, improves the human-computer interaction experience, and lessens the impact on the displayed picture.
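A sketch of the duration gate and the gradual transition, assuming a plain time check and linear interpolation rather than any particular animation framework:
```kotlin
// Hypothetical duration gate plus gradual interpolation toward the target edge.
class GradualEdgeScaler(private val durationThresholdMs: Long = 2_000) {
    private var gripStartMs: Long? = null

    // Returns the target edge only once the grip signal has persisted long enough.
    fun onGripSample(nowMs: Long, target: UiEdge): UiEdge? {
        val start = gripStartMs ?: nowMs.also { gripStartMs = it }
        return if (nowMs - start >= durationThresholdMs) target else null
    }

    fun onGripLost() {
        gripStartMs = null
    }

    // Move the current edge a fraction of the way toward the target on each frame,
    // so the UI shrinks as a slow gradual transition instead of jumping.
    fun step(current: UiEdge, target: UiEdge, fraction: Double = 0.1): UiEdge {
        fun lerp(a: Int, b: Int) = (a + (b - a) * fraction).toInt()
        return UiEdge(
            lerp(current.left, target.left), lerp(current.top, target.top),
            lerp(current.right, target.right), lerp(current.bottom, target.bottom)
        )
    }
}
```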
Embodiments of the present disclosure also provide the following examples:
For display screens that occupy a larger area of the terminal, such as a full screen, a waterfall screen, a surrounding screen, an outward-folding screen or a curved screen, the screen extends into the bezel of the terminal. When the user's hand holds the bezel, a false touch is therefore likely, as shown in fig. 3. For this reason, in some embodiments the UI (User Interface) needs to be narrowed to within the bezel to reduce false touches. However, holding habits differ from person to person: if the reduced range of the UI is fixed, the reduction is excessive for some users and wastes screen area, while for other users false touches still occur. The method in the embodiments of the present disclosure therefore lets the mobile terminal automatically adapt to different reduction ratios.
In the embodiment of the present disclosure, through the cooperation with a TP (Touch Panel), the automatic size adjustment of the UI is realized to adapt to terminal users with different usage habits, so that the display area of the terminal screen is increased while the erroneous Touch is reduced.
The technical scheme of the embodiment of the disclosure comprises the following aspects:
First, the TP is used to identify the user's habitual edge false touches;
during TP operation, if a consistent multi-finger or large-area touch is recognized at the edge, as shown in fig. 4, the position of that edge is recorded and updated to a register for storage.
Secondly, edge palm detection is performed;
this is completed within the scan cycle of the TP chip, which identifies from the signal characteristics of the touch signal whether the touch is a palm touch, thereby recognizing a false touch, that is, determining that the current touch is a false touch generated by holding.
The holding gestures can be distinguished through signal characteristics generated by palm touch, including left-hand holding, right-hand holding, transverse holding and the like. And acquiring a holding area generated by holding touch under different holding postures to determine an area for scaling the UI. In this way, the scaling of the UI can be determined based on the user's usage habits in different gestures.
In order to reduce inaccurate data caused by temporary use of the terminal by other people, the holding area generated by holding of the user can be continuously collected in the process of multiple use of the user, and the holding area of the user for adjusting the UI scaling is determined by performing accumulation, averaging and other processing according to the holding areas held for multiple times. As shown in fig. 5. The solid line 11 is the holding area determined by the average position of the right hand holding of the user; the dotted line 12 is a grip area determined by the minimum position (the area where the wrong touch is caused by the grip is the largest) when the user grips with the right hand. The solid line 21 is the grip area determined by the average position of the user's left hand grip; the dotted line 22 is the minimum position when held by the user's left hand. The solid line 31 is a grip area determined by the average position in the horizontal grip; dotted line 32 is the minimum position for the lateral grip. When the UI needs to be scaled, the scaling may be performed based on the position of the solid line according to the actual situation, or based on the position of the dotted line.
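For one side of the screen, the accumulation into an average-position (solid-line) value and a minimum-position (dotted-line) value of fig. 5 might look like the sketch below; the function and field names are assumptions:
```kotlin
// Hypothetical accumulation of one side's grip insets into an "average position" edge
// (solid lines 11/21/31 in fig. 5) and a "minimum position" edge (dotted lines 12/22/32,
// i.e. the deepest intrusion seen, where the grip caused the largest false-touch area).
data class GripEdges(val averagePx: Int, val minimumPositionPx: Int)

fun accumulateGrips(insetsPx: List<Int>): GripEdges {
    require(insetsPx.isNotEmpty()) { "needs at least one recorded grip" }
    return GripEdges(
        averagePx = insetsPx.average().toInt(),
        minimumPositionPx = insetsPx.maxOrNull()!!
    )
}
```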
For example, when the TP recognizes that the mobile terminal is held in the right hand, scaling is performed with solid line 11 as the UI edge for right-hand holding; when the TP recognizes that the mobile terminal is held in the left hand, scaling is performed with solid line 21 as the UI edge for left-hand holding; when the TP recognizes that the mobile terminal is in horizontal screen mode and held horizontally with both hands, scaling is performed with solid line 31 as the UI edge for horizontal holding; and if no multi-finger grip is detected, or the palm signal is located in the middle area of the display screen, no scaling is performed.
Some applications have higher requirements for responding to touch operations, for example applications whose input interface is a virtual keyboard, game operation interfaces and the like, and such applications can be scaled with this method. By contrast, for applications used for video playback, picture browsing, page browsing and the like, even if the user's palm causes a false touch, the browsing effect is not affected, so UI scaling with this method is not needed. A white list can therefore be set for applications that require UI scaling: when an application on the white list is opened, the UI scaling function is enabled; otherwise the function is disabled and the normal full-screen display is used.
Of course, the function of globally using the UI scaling may be set according to the user's own needs, and no matter what kind of picture is displayed on the terminal, the display interface is scaled according to the detected holding operation, so as to ensure that the whole picture is not blocked by the user's fingers, thereby improving the display effect.
In addition, the method for determining the holding area for adjusting the UI scaling by the user by accumulating, averaging and the like according to the holding areas held for multiple times determines a fixed reduction range according to the historical use condition of the terminal, and the method has the advantages of avoiding errors of TP identification and obtaining a relatively stable scaling effect.
However, the flexibility of this method is low, and when the user of the mobile terminal is a plurality of users, different UI scaling requirements may be brought about due to different usage habits of different users, differences in application scenes, differences in operation methods, and the like, so that different users cannot timely experience a proper UI scaling.
Therefore, in another embodiment, scaling may be performed directly according to the touch signal currently identified by the TP. That is, when scaling is required, the current TP touch signal is read to obtain the scaling range. To reduce the picture jitter that UI scaling would cause when the held palm moves or changes frequently during touch operation, a delay can be added to the detection process. For example, when a touch signal of a holding motion is detected, the held touch range is continuously detected over a preset period, and the UI is scaled only if the change in the holding touch signal over that period is smaller than a preset range. For example, the touch signal is monitored continuously for 10 seconds: if it consistently represents a right-hand holding motion during those 10 seconds, scaling is performed with dotted line 12 as the UI edge according to the current holding range, as shown in fig. 6A; if it represents a left-hand holding motion during those 10 seconds, scaling is performed with dotted line 22 as the UI edge, as shown in fig. 6B; and if it consistently represents a two-handed horizontal holding motion during those 10 seconds, scaling is performed with dotted line 32 as the UI edge, as shown in fig. 6C. In addition, the scaling itself can be carried out gradually and slowly, which reduces the visual impact of the UI scaling and improves the user's visual experience.
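The stability window of this example might be implemented as sketched below; the 10-second default and the sample interface are assumptions:
```kotlin
// Hypothetical 10-second stability window before scaling to the posture's dotted-line edge.
class PostureStabilityWindow(private val windowMs: Long = 10_000) {
    private var candidate: HoldingPosture? = null
    private var sinceMs: Long = 0

    // Feed one posture sample per TP report; returns the posture once it has been
    // reported unchanged for the whole window, otherwise null (do not scale yet).
    fun onSample(nowMs: Long, posture: HoldingPosture?): HoldingPosture? {
        if (posture == null || posture != candidate) {
            candidate = posture
            sinceMs = nowMs
            return null
        }
        return if (nowMs - sinceMs >= windowMs) posture else null
    }
}
```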
Through the technical scheme provided by the embodiment of the disclosure, the holding recognition function of the TP can be applied to the zooming function of the UI, big data learning is carried out according to the use habit of a user, and the error touch prevention range is recognized and counted. Therefore, UI zooming in different ranges can be flexibly carried out according to habits and recognition conditions of different users, and meanwhile, asymmetric UI display as shown in figure 7 can be achieved in the mode, so that a display interface is utilized to the maximum extent, and the relationship between the utilization rate of the display screen and mistaken touch prevention is balanced.
FIG. 8 is a block diagram illustrating a control device of a user interface according to an exemplary embodiment. Referring to fig. 8, the apparatus includes: an obtaining module 810, a first determining module 820 and a receiving module 830.
An obtaining module 810, configured to obtain foreground operating content of a terminal;
a first determining module 820, configured to determine, according to historical usage data corresponding to a terminal foreground operating content, a UI edge corresponding to the terminal foreground operating content;
a receiving module 830, configured to receive and respond to the touch operation in the UI edge.
In some embodiments, the apparatus further comprises:
the setting module is used for setting an application program APP white list;
the second determining module is used for determining whether the terminal foreground operating content is a white list APP;
the first determining module is specifically configured to:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
In some embodiments, the first determining module is further configured to:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
In some embodiments, the first determining module comprises:
and the counting submodule is used for counting to obtain the UI edge corresponding to the foreground operating content of the terminal according to the edge touch position data of the foreground operating content of the historical use terminal.
In some embodiments, the apparatus further comprises:
a third determining module, configured to determine a current holding posture of the terminal;
the statistic submodule is specifically configured to:
and counting to obtain the UI edge corresponding to the foreground operating content of the terminal in the current holding posture according to the edge touch position data of the foreground operating content of the historical use terminal in the current holding posture.
In some embodiments, the statistics submodule includes at least one of:
the first statistical module is used for counting and obtaining a left holding UI edge of the corresponding terminal foreground operating content under the left vertical screen holding posture according to edge touch data of the terminal foreground operating content under the left vertical screen holding posture used historically aiming at the left vertical screen holding posture, wherein the distance from the left edge of the left holding UI edge to the left vertical edge of the touch screen is larger than the distance from the right edge of the left holding UI edge to the right vertical edge of the touch screen;
the second counting submodule is used for counting and obtaining a right holding UI edge of the corresponding terminal foreground operating content under the right-hand vertical screen holding posture according to edge touch data of the terminal foreground operating content under the right-hand vertical screen holding posture used historically aiming at the right-hand vertical screen holding posture, wherein the distance between the left edge of the right holding UI edge and the left vertical edge of the touch screen is smaller than the distance between the right edge of the right holding UI edge and the right vertical edge of the touch screen;
and the third counting submodule is used for counting edge touch data of the terminal foreground operating content in the transverse screen holding posture according to historical use and obtaining a transverse screen holding UI edge corresponding to the terminal foreground operating content in the transverse screen holding posture aiming at the transverse screen holding posture, wherein the left edge and the right edge of the transverse screen holding UI edge have the same distance with the corresponding transverse edge of the touch screen.
In some embodiments, the third determining module comprises:
the obtaining sub-module is configured to obtain touch data of an edge area of the touch screen, where the touch data includes: at least one of touch points, touch area and touch duration;
and the first determining submodule is used for determining the current holding posture of the terminal according to the touch data of the edge area.
In some embodiments, the first determining sub-module includes:
the second determining sub-module is used for determining that the current holding posture is the right-hand vertical screen holding posture when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are more than the touch points in the right edge area of the touch screen in the vertical screen mode of the terminal;
the third determining sub-module is used for determining that the current touch holding posture is a left-hand vertical screen holding posture when the edge area of the touch screen has a plurality of touch points and the touch points in the left edge area of the touch screen are less than the touch points in the right edge area of the touch screen in the vertical screen mode of the terminal;
and the fourth determining submodule is used for determining that the current holding posture is the horizontal screen holding posture when the touch area in the edge area of the touch screen is larger than the area threshold value in the horizontal screen mode of the terminal.
In some embodiments, the first determining sub-module includes:
a fifth determining submodule, configured to determine a touch duration of the edge area;
and the sixth determining submodule is used for determining the current holding gesture according to the touch points and/or the touch area of the edge area, wherein the touch duration of the edge area is greater than the duration threshold.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
Fig. 9 is a block diagram illustrating an entity structure of a control apparatus 900 for a user interface according to an exemplary embodiment. For example, the apparatus 900 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and so forth.
Referring to fig. 9, apparatus 900 may include one or more of the following components: processing component 901, memory 902, power component 903, multimedia component 904, audio component 905, input/output (I/O) interface 906, sensor component 907, and communications component 908.
The processing component 901 generally controls overall operation of the device 900, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 901 may include one or more processors 910 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 901 may also include one or more modules that facilitate interaction between the processing component 901 and other components. For example, the processing component 901 may include a multimedia module to facilitate interaction between the multimedia component 904 and the processing component 901.
The memory 902 is configured to store various types of data to support operations at the apparatus 900. Examples of such data include instructions for any application or method operating on the device 900, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 902 may be implemented by any type or combination of volatile or non-volatile storage devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 903 provides power to the various components of the device 900. The power supply component 903 may include: a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 900.
The multimedia component 904 includes a screen that provides an output interface between the device 900 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or swipe action, but also detect the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 904 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the device 900 is in an operating mode, such as a shooting mode or a video mode. Each front camera and/or rear camera may be a fixed optical lens system or have focusing and optical zoom capability.
Audio component 905 is configured to output and/or input audio signals. For example, audio component 905 includes a Microphone (MIC) configured to receive external audio signals when apparatus 900 is in an operating mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may further be stored in the memory 902 or transmitted via the communication component 908. In some embodiments, audio component 905 also includes a speaker for outputting audio signals.
I/O interface 906 provides an interface between processing component 901 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
Sensor component 907 includes one or more sensors for providing status assessments of various aspects of the device 900. For example, sensor component 907 may detect an open/closed state of the device 900 and the relative positioning of components, such as the display and keypad of the device 900; it may also detect a change in position of the device 900 or a component of the device 900, the presence or absence of user contact with the device 900, the orientation or acceleration/deceleration of the device 900, and a change in temperature of the device 900. Sensor component 907 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. Sensor component 907 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, sensor component 907 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 908 is configured to facilitate communications between the apparatus 900 and other devices in a wired or wireless manner. The apparatus 900 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 908 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 908 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In an exemplary embodiment, the apparatus 900 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer readable storage medium comprising instructions, such as the memory 902 comprising instructions, executable by the processor 910 of the apparatus 900 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer readable storage medium having instructions therein, which when executed by a processor of a mobile terminal, enable the mobile terminal to perform any of the methods provided in the above embodiments.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.
It will be understood that the invention is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (20)

1. A method for controlling a user interface, comprising:
acquiring the foreground operation content of the terminal;
determining a User Interface (UI) edge corresponding to the terminal foreground operating content according to historical usage data corresponding to the terminal foreground operating content; wherein the historical usage data comprises: edge touch position data of the touch screen generated during historical use of the terminal foreground operating content in the current holding posture;
and receiving and responding to touch operation in the UI edge.
2. The method of claim 1, further comprising:
setting an application program APP white list;
determining whether the terminal foreground operation content is a white list APP;
the determining, according to the historical usage data corresponding to the terminal foreground operating content, the UI edge corresponding to the terminal foreground operating content includes:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
3. The method according to claim 2, wherein the determining, according to the historical usage data corresponding to the terminal foreground operating content, the UI edge corresponding to the terminal foreground operating content further comprises:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
4. The method according to claim 1, wherein the determining, according to the historical usage data corresponding to the terminal foreground operating content, the UI edge corresponding to the terminal foreground operating content comprises:
and obtaining, through statistics, the UI edge corresponding to the terminal foreground operating content according to the edge touch position data of the historically used terminal foreground operating content.
5. The method of claim 4, further comprising:
determining a current holding posture of the terminal;
the obtaining, through statistics, of the UI edge corresponding to the terminal foreground operating content according to the edge touch position data of the historically used terminal foreground operating content comprises:
and obtaining, through statistics, the UI edge corresponding to the terminal foreground operating content in the current holding posture according to the edge touch position data of the historically used terminal foreground operating content in the current holding posture.
6. The method according to claim 5, wherein the obtaining, through statistics, of the UI edge corresponding to the terminal foreground operating content in the current holding posture according to the edge touch position data of the historically used terminal foreground operating content in the current holding posture comprises at least one of:
for the left-hand vertical screen holding posture, obtaining through statistics a left-holding UI edge corresponding to the terminal foreground operating content in the left-hand vertical screen holding posture according to historical edge touch data of the terminal foreground operating content in the left-hand vertical screen holding posture, wherein the distance from the left edge of the left-holding UI edge to the left vertical edge of the touch screen is greater than the distance from the right edge of the left-holding UI edge to the right vertical edge of the touch screen;
for the right-hand vertical screen holding posture, obtaining through statistics a right-holding UI edge corresponding to the terminal foreground operating content in the right-hand vertical screen holding posture according to historical edge touch data of the terminal foreground operating content in the right-hand vertical screen holding posture, wherein the distance from the left edge of the right-holding UI edge to the left vertical edge of the touch screen is smaller than the distance from the right edge of the right-holding UI edge to the right vertical edge of the touch screen;
and for the transverse screen holding posture, obtaining through statistics a transverse-screen-holding UI edge corresponding to the terminal foreground operating content in the transverse screen holding posture according to historical edge touch data of the terminal foreground operating content in the transverse screen holding posture, wherein the left edge and the right edge of the transverse-screen-holding UI edge are at the same distance from the corresponding transverse edges of the touch screen.
7. The method according to claim 5, wherein the determining the current holding posture of the terminal comprises:
acquiring touch data of an edge area of the touch screen, wherein the touch data comprises: at least one of touch points, touch area and touch duration;
and determining the current holding posture of the terminal according to the touch data of the edge area.
8. The method according to claim 7, wherein the determining the current holding posture of the terminal according to the touch data of the edge area comprises:
when the terminal is in the vertical screen mode, if the edge area of the touch screen has a plurality of touch points and there are more touch points in the left edge area of the touch screen than in the right edge area of the touch screen, determining that the current holding posture is the right-hand vertical screen holding posture;
when the terminal is in the vertical screen mode, if the edge area of the touch screen has a plurality of touch points and there are fewer touch points in the left edge area of the touch screen than in the right edge area of the touch screen, determining that the current holding posture is the left-hand vertical screen holding posture;
and when the terminal is in the transverse screen mode, if the touch area in the edge area of the touch screen is larger than an area threshold, determining that the current holding posture is the transverse screen holding posture.
9. The method according to claim 7, wherein the determining the current holding posture of the terminal according to the touch data of the edge area comprises:
determining the touch duration of the edge area;
and determining the current holding posture according to the touch points and/or the touch area of the edge area whose touch duration is greater than the duration threshold.
10. A control device for a user interface, comprising:
the acquisition module is used for acquiring the foreground operation content of the terminal;
the first determining module is used for determining a User Interface (UI) edge corresponding to the terminal foreground operating content according to historical usage data corresponding to the terminal foreground operating content; wherein the historical usage data comprises: edge touch position data of the touch screen generated during historical use of the terminal foreground operating content in the current holding posture;
and the receiving module is used for receiving and responding to the touch operation in the UI edge.
11. The apparatus of claim 10, further comprising:
the setting module is used for setting an application program APP white list;
the second determining module is used for determining whether the terminal foreground operating content is a white list APP;
the first determining module is specifically configured to:
and if the obtained terminal foreground operation content is a white list APP, taking the maximum UI value as the UI edge of the APP.
12. The apparatus of claim 11, wherein the first determining module is further configured to:
and if the acquired terminal foreground operating content is not the white list APP, determining a UI edge corresponding to the terminal foreground operating content according to historical use data corresponding to the terminal foreground operating content.
13. The apparatus of claim 10, wherein the first determining module comprises:
and the statistics submodule is used for obtaining, through statistics, the UI edge corresponding to the terminal foreground operating content according to the edge touch position data of the historically used terminal foreground operating content.
14. The apparatus of claim 13, further comprising:
a third determining module, configured to determine a current holding posture of the terminal;
the statistics submodule is specifically configured to:
obtain, through statistics, the UI edge corresponding to the terminal foreground operating content in the current holding posture according to the edge touch position data of the historically used terminal foreground operating content in the current holding posture.
15. The apparatus of claim 14, wherein the statistics submodule comprises at least one of:
the first statistics submodule is used for, for the left-hand vertical screen holding posture, obtaining through statistics a left-holding UI edge corresponding to the terminal foreground operating content in the left-hand vertical screen holding posture according to historical edge touch data of the terminal foreground operating content in the left-hand vertical screen holding posture, wherein the distance from the left edge of the left-holding UI edge to the left vertical edge of the touch screen is greater than the distance from the right edge of the left-holding UI edge to the right vertical edge of the touch screen;
the second statistics submodule is used for, for the right-hand vertical screen holding posture, obtaining through statistics a right-holding UI edge corresponding to the terminal foreground operating content in the right-hand vertical screen holding posture according to historical edge touch data of the terminal foreground operating content in the right-hand vertical screen holding posture, wherein the distance from the left edge of the right-holding UI edge to the left vertical edge of the touch screen is smaller than the distance from the right edge of the right-holding UI edge to the right vertical edge of the touch screen;
and the third statistics submodule is used for, for the transverse screen holding posture, obtaining through statistics a transverse-screen-holding UI edge corresponding to the terminal foreground operating content in the transverse screen holding posture according to historical edge touch data of the terminal foreground operating content in the transverse screen holding posture, wherein the left edge and the right edge of the transverse-screen-holding UI edge are at the same distance from the corresponding transverse edges of the touch screen.
16. The apparatus of claim 14, wherein the third determining module comprises:
the obtaining sub-module is configured to obtain touch data of an edge area of the touch screen, where the touch data includes: at least one of touch points, touch area and touch duration;
and the first determining submodule is used for determining the current holding posture of the terminal according to the touch data of the edge area.
17. The apparatus of claim 16, wherein the first determining submodule comprises:
the second determining submodule is used for determining that the current holding posture is the right-hand vertical screen holding posture when, in the vertical screen mode of the terminal, the edge area of the touch screen has a plurality of touch points and there are more touch points in the left edge area of the touch screen than in the right edge area of the touch screen;
the third determining submodule is used for determining that the current holding posture is the left-hand vertical screen holding posture when, in the vertical screen mode of the terminal, the edge area of the touch screen has a plurality of touch points and there are fewer touch points in the left edge area of the touch screen than in the right edge area of the touch screen;
and the fourth determining submodule is used for determining that the current holding posture is the horizontal screen holding posture when, in the horizontal screen mode of the terminal, the touch area in the edge area of the touch screen is larger than an area threshold.
18. The apparatus of claim 16, wherein the first determining submodule comprises:
a fifth determining submodule, configured to determine a touch duration of the edge area;
and the sixth determining submodule is used for determining the current holding posture according to the touch points and/or the touch area of the edge area whose touch duration is greater than the duration threshold.
19. A control device for a user interface, characterized in that it comprises at least: a processor and a memory for storing executable instructions operable on the processor, wherein:
the processor is configured to execute the executable instructions to perform the steps of the method for controlling a user interface according to any one of claims 1 to 9.
20. A non-transitory computer-readable storage medium having stored thereon computer-executable instructions that, when executed by a processor, perform steps in a method of controlling a user interface according to any one of claims 1 to 9.
CN202010091389.0A 2020-02-13 2020-02-13 User interface control method and device and storage medium Active CN111314552B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010091389.0A CN111314552B (en) 2020-02-13 2020-02-13 User interface control method and device and storage medium

Publications (2)

Publication Number Publication Date
CN111314552A CN111314552A (en) 2020-06-19
CN111314552B true CN111314552B (en) 2022-02-08

Family

ID=71148371

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010091389.0A Active CN111314552B (en) 2020-02-13 2020-02-13 User interface control method and device and storage medium

Country Status (1)

Country Link
CN (1) CN111314552B (en)

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101795321B (en) * 2010-01-27 2012-07-11 华为终端有限公司 Application display method and mobile communication terminal based on user habit
JP5922480B2 (en) * 2012-04-25 2016-05-24 京セラ株式会社 Portable device having display function, program, and control method of portable device having display function
CN103823610B (en) * 2014-02-10 2018-07-03 联想(北京)有限公司 A kind of electronic equipment and its information processing method
CN106527919A (en) * 2016-09-27 2017-03-22 北京小米移动软件有限公司 Method and device for adjusting screen display
CN106775404A (en) * 2016-12-16 2017-05-31 广东欧珀移动通信有限公司 A kind of false-touch prevention method of display interface, device and mobile terminal
CN106855785A (en) * 2016-12-16 2017-06-16 广东欧珀移动通信有限公司 Method, device and the mobile terminal of screen false-touch prevention are realized during a kind of hands grasping
CN106686232B (en) * 2016-12-27 2020-03-31 努比亚技术有限公司 Control interface optimization method and mobile terminal
CN106873895A (en) * 2016-12-30 2017-06-20 努比亚技术有限公司 The implementation method and device of a kind of edge gesture operation
CN107422977A (en) * 2017-07-31 2017-12-01 北京小米移动软件有限公司 Trigger action processing method, device and computer-readable recording medium
CN107508974B (en) * 2017-08-10 2020-03-17 努比亚技术有限公司 Interface display method, mobile terminal and computer readable storage medium
CN107678591A (en) * 2017-09-30 2018-02-09 联想(北京)有限公司 A kind of mode control method and terminal
CN109857316A (en) * 2017-11-30 2019-06-07 佳能株式会社 The adjustment based on prediction at tangible interface
CN110764681A (en) * 2018-07-25 2020-02-07 优视科技有限公司 Application interface display method and device, client device and electronic device
CN109663353B (en) * 2018-12-28 2023-08-11 努比亚技术有限公司 Game operation method, mobile terminal and computer readable storage medium
CN109814797A (en) * 2019-01-16 2019-05-28 努比亚技术有限公司 Touch-control control method and mobile terminal, computer readable storage medium

Also Published As

Publication number Publication date
CN111314552A (en) 2020-06-19

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant