CN111147660A - Control operation method and electronic equipment

Control operation method and electronic equipment

Info

Publication number
CN111147660A
CN111147660A (application CN201911230192.4A)
Authority
CN
China
Prior art keywords
sliding, interface, operation control, sliding process, displaying
Prior art date
Legal status
Granted
Application number
CN201911230192.4A
Other languages
Chinese (zh)
Other versions
CN111147660B (en)
Inventor
丁祎
张婷
曾国毅
付琪
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201911230192.4A
Publication of CN111147660A
Priority to PCT/CN2020/133858 (published as WO2021110133A1)
Application granted
Publication of CN111147660B
Legal status: Active

Classifications

    • H04M 1/72403: User interfaces specially adapted for cordless or mobile telephones, with means for local support of applications that increase the functionality
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0485: Scrolling or panning
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/451: Execution arrangements for user interfaces
    • H04M 1/72448: User interfaces specially adapted for cordless or mobile telephones, with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72469: User interfaces specially adapted for cordless or mobile telephones, for operating the device by selecting functions from two or more displayed items, e.g. menus or icons

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application provides a control operation method and an electronic device, relating to the field of terminal technologies. The method enables a user to operate controls in a display interface with one hand without changing the content displayed in the interface, improving the user's one-handed operation experience. The method includes: displaying a first interface of an application, where a first operation control is located at the top of the first interface; receiving a sliding operation input by a user on the first interface; when it is detected that the sliding operation satisfies a first sliding process, displaying the first operation control in an activated state, where the first sliding process is a process in which a touch body slides a first preset distance in a first direction after contacting the first interface; and when the sliding operation ends, displaying a second interface in response, where the second interface is the interface to be displayed after the first operation control is clicked.

Description

Control operation method and electronic equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a control operation method and an electronic device.
Background
With the development of mobile internet technology, large-screen terminals have gradually gained user acceptance and become a trend. Taking a mobile phone as an example of the terminal: the larger the display screen of the mobile phone and the higher its screen-to-body ratio (i.e., the ratio of the display screen area to the area of the phone's front panel), the more content the phone can display and the better the display effect the user obtains.
Therefore, most terminal vendors tend to equip handsets with larger display screens and higher screen-to-body ratios. However, as the display screen and screen-to-body ratio grow, one-handed operation on the display screen becomes more difficult. In this regard, some mobile phone manufacturers have added a one-handed operation mode to the phone. When the mobile phone enters the one-handed operation mode, as shown in FIG. 1A, the phone may shrink the originally displayed desktop 101 and sink the reduced desktop 101 into the display area reachable with one hand, for example, the lower right corner of the display screen. In addition, while zooming out the desktop 101, the phone may also delete some of its display content; for example, the phone may delete the status bar in the desktop 101. In this way, the user can operate, with the right hand in the lower-right region, the various controls in the display interface with one hand, particularly the controls originally located at the top of the display interface.
However, after the display content in the display interface is reduced or deleted, the amount of information the phone presents to the user decreases and the display effect degrades, reducing the user experience.
Disclosure of Invention
The present application provides a control operation method and an electronic device, which enable a user to operate controls in a display interface with one hand without changing the display content in the interface, improving the user's one-handed operation experience.
To achieve the above objective, the following technical solutions are adopted in this application:
In a first aspect, the present application provides a control operation method applicable to an electronic device with a touch screen. The method includes: displaying a first interface of an application, where a first operation control is located at the top of the first interface; and receiving a sliding operation input by a user on the first interface. The sliding operation may be divided into one or more sliding processes, from the moment a touch body (e.g., a user's finger) contacts the touch screen until the touch body leaves the touch screen. If it is detected that the sliding operation satisfies a first sliding process (namely, a process in which the touch body slides a first preset distance in a first direction after contacting the first interface), the electronic device may display the first operation control in an activated state. When the sliding operation ends (that is, the touch body leaves the touch screen), the electronic device may display a second interface in response, where the second interface is the interface to be displayed after the operation control currently in the activated state (for example, the first operation control) is clicked.
That is to say, by inputting a sliding operation in the display interface, the user can trigger the electronic device to activate the operation controls in the display interface, so that the electronic device enters a one-handed operation mode. When the end of the sliding operation is detected, indicating that the user wants to click the currently activated operation control, the electronic device displays a new interface by simulating a click on that control. In this way, without changing the display content in the display interface, the electronic device provides an interaction mode in which the user can quickly operate the controls in the display interface with one hand, improving the user's one-handed operation experience while preserving both the display content and the display effect.
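By way of illustration only, the following Java sketch shows how this flow could be recognized on an Android view; the class name, the threshold value, and the use of setPressed() and performClick() to render the activated state and simulate the click are assumptions made for this example, not part of this application.

```java
// Minimal sketch of the first-aspect flow, assuming an Android View-based UI.
// SLIDE_THRESHOLD_PX ("first preset distance") and all names are illustrative assumptions.
import android.view.MotionEvent;
import android.view.View;

public class OneHandGestureHandler implements View.OnTouchListener {
    private static final float SLIDE_THRESHOLD_PX = 200f; // assumed value
    private final View topControl; // the operation control at the top of the first interface
    private float downY;
    private boolean activated;

    public OneHandGestureHandler(View topControl) {
        this.topControl = topControl;
    }

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:  // touch body contacts the first interface
                downY = event.getY();
                activated = false;
                return true;
            case MotionEvent.ACTION_MOVE:  // first sliding process: slide a preset distance downward
                if (!activated && event.getY() - downY > SLIDE_THRESHOLD_PX) {
                    activated = true;
                    topControl.setPressed(true); // display the control in an "activated" state
                }
                return true;
            case MotionEvent.ACTION_UP:    // sliding operation ends: touch body leaves the screen
                if (activated) {
                    topControl.setPressed(false);
                    topControl.performClick(); // simulate clicking the activated control
                }
                return true;
        }
        return false;
    }
}
```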
In one possible implementation manner, after displaying the first operation control in the activated state, the method further includes: detecting that the sliding operation meets a second sliding process, wherein the starting position of the second sliding process is the end position of the first sliding process, and the touch body does not leave the touch screen of the electronic equipment in the first sliding process and the second sliding process; then, in response to the second sliding process, the electronic device may display the second operation control in the first interface in an activated state, and display the first operation control in a deactivated state. That is to say, after the first operation control is activated, the electronic device may switch the activated first operation control to the second operation control in response to a second sliding process in the sliding operation input by the user, until the activated operation control is the operation control that the user wishes to click this time.
In a possible implementation manner, when a projection of a sliding distance in the second sliding process in the second direction is greater than a second preset distance, the second operation control is an operation control closest to the first operation control along the second direction, and the second direction is the same as or different from the first direction. That is, the electronic device may switch the activated operation control along the sliding direction during the second sliding (i.e., the second direction).
In one possible implementation manner, after displaying the second operation control in the first interface in an activated state and displaying the first operation control in a deactivated state, the method further includes: detecting that the sliding operation satisfies a third sliding process, where the sliding direction of the third sliding process is opposite to that of the second sliding process, and the difference between their sliding distances is smaller than a preset value; in response to the third sliding process, the electronic device may display the first operation control in the activated state again and display the second operation control in the deactivated state. That is to say, by inputting a sliding process in the opposite direction, the user can trigger the electronic device to return to the previously activated operation control, making it easy to quickly select the control to be clicked this time.
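Continuing the illustration, here is a sketch of how the activated control could be switched during the second sliding process, under the assumptions that the second direction is horizontal, the top controls are ordered left to right, and the cumulative projection of the slide selects the active one; reversing the slide, as in the third sliding process, then naturally re-activates the previous control.

```java
// Illustrative sketch of switching the activated control; STEP_PX ("second preset
// distance") and the ordering assumption are not specified by this application.
import android.view.View;
import java.util.List;

public class ActivationSwitcher {
    private static final float STEP_PX = 120f; // assumed value
    private final List<View> topControls;      // top-of-interface controls, ordered left to right
    private final int startIndex;              // control activated by the first sliding process
    private int activeIndex;

    public ActivationSwitcher(List<View> topControls, int startIndex) {
        this.topControls = topControls;
        this.startIndex = startIndex;
        this.activeIndex = startIndex;
        topControls.get(activeIndex).setPressed(true);
    }

    /** dx: cumulative x-projection of the slide since the end of the first sliding process. */
    public void onHorizontalSlide(float dx) {
        int next = startIndex + (int) (dx / STEP_PX);
        next = Math.max(0, Math.min(topControls.size() - 1, next));
        if (next != activeIndex) {
            topControls.get(activeIndex).setPressed(false); // deactivate the previous control
            topControls.get(next).setPressed(true);  // activate the nearest control in the slide direction
            activeIndex = next;
        }
    }
}
```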
In a possible implementation manner, the first operation control may be a control with a secondary menu. After the first operation control is displayed in the activated state, the method further includes: detecting the dwell time of the touch body at the end position of the first sliding process; if the dwell time is longer than a preset time, indicating that the user probably wants to operate the currently activated first operation control, the electronic device may display the secondary menu of the first operation control in the first interface. That is to say, after entering the one-handed operation mode, the electronic device may also open the secondary menu of an operation control in response to the sliding process input by the user, so that the user can conveniently select the control to be clicked from the secondary menu.
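One plausible way to implement this dwell check (an assumption; this application does not prescribe a mechanism) is a delayed callback that is cancelled whenever the touch point moves again or the touch body leaves the screen:

```java
// Illustrative dwell detector; the names and the 500 ms value are assumptions.
import android.os.Handler;
import android.os.Looper;

public class DwellDetector {
    private static final long DWELL_MS = 500L; // "preset time" (assumed value)
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable openSecondaryMenu;

    public DwellDetector(Runnable openSecondaryMenu) {
        this.openSecondaryMenu = openSecondaryMenu;
    }

    /** Call when the touch point comes to rest at the end of the first sliding process. */
    public void onTouchStationary() {
        handler.postDelayed(openSecondaryMenu, DWELL_MS);
    }

    /** Call when the touch point moves again or the touch body leaves the screen. */
    public void onTouchMovedOrEnded() {
        handler.removeCallbacks(openSecondaryMenu);
    }
}
```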
In one possible implementation manner, after the displaying the second-level menu of the first operation control in the first interface, the method further includes: detecting that the sliding operation meets a fourth sliding process, wherein the starting position of the fourth sliding process is the end position of the first sliding process, and similarly, in the fourth sliding process, the touch body does not leave the touch screen of the electronic device; then, in response to the fourth sliding process, the electronic device may display a third operational control in the secondary menu as an activated state.
It can be seen that, after the electronic device activates the secondary menu of the first operation control in the first interface, the electronic device may activate the corresponding operation control in the secondary menu according to the sliding process continuously input by the user in the current sliding operation. Therefore, the user can enter the second-level menu of the first interface by performing one-time sliding operation to control the corresponding operation control through one-hand operation, and the interaction process of performing one-hand operation among the multi-level menus is simplified.
In a possible implementation manner, a refresh button is hidden in the first interface, the refresh button is adjacent to the first operation control in a third direction, and the third direction is the same as or different from the first direction; after the first operation control is displayed in the activated state, the method further comprises the following steps: detecting that the sliding operation meets a fifth sliding process, wherein the fifth sliding process is a process of moving a third preset distance from the end position of the first sliding process along a third direction, and similarly, in the fifth sliding process, the touch body does not leave the touch screen of the electronic device; then, in response to the fifth sliding process, the electronic device may display the refresh button in the first interface and display the refresh button in an activated state. That is, if the refresh button is provided in the first interface, the electronic device may also activate the refresh button according to the corresponding sliding process input by the user in the sliding operation.
In a possible implementation manner, after displaying the refresh button in the first interface in an activated state, the method further includes: if it is detected that the touch body leaves the first interface, the electronic device may execute the refresh operation corresponding to clicking the refresh button; if it is detected that the touch body continues to slide, the electronic device may continue to display the refresh button in the activated state. In this way, after entering the one-handed operation mode, the electronic device can not only operate the controls at the top of the display interface but also support the refresh operation of the interface, avoiding a conflict, within the same display interface, between the sliding operation that enters the one-handed operation mode and the refresh operation that the interface already supports.
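A sketch of revealing the hidden refresh button during the fifth sliding process follows; the threshold value and the use of visibility and pressed state to hide and activate the button are assumptions for illustration.

```java
// Illustrative handling of the hidden refresh button ("third preset distance" assumed).
import android.view.View;

public class RefreshRevealer {
    private static final float REVEAL_THRESHOLD_PX = 150f; // assumed value
    private final View refreshButton; // hidden in the first interface

    public RefreshRevealer(View refreshButton) {
        this.refreshButton = refreshButton;
        refreshButton.setVisibility(View.INVISIBLE);
    }

    /** dy: further movement along the third direction from the end of the first sliding process. */
    public void onContinuedSlide(float dy) {
        if (dy > REVEAL_THRESHOLD_PX && refreshButton.getVisibility() != View.VISIBLE) {
            refreshButton.setVisibility(View.VISIBLE); // display the refresh button...
            refreshButton.setPressed(true);            // ...and show it in the activated state
        }
    }

    /** Touch body leaves the first interface while the refresh button is activated. */
    public void onTouchUp() {
        if (refreshButton.getVisibility() == View.VISIBLE) {
            refreshButton.setPressed(false);
            refreshButton.performClick(); // execute the refresh operation
        }
    }
}
```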
In a possible implementation manner, when the starting position of the first sliding process is located in the left half part of the first interface, the first operation control may be an operation control at the leftmost side at the top of the first interface; when the starting position of the first sliding process is located in the right half part of the first interface, the first operation control can be the rightmost operation control at the top of the first interface; or, when the end position of the first sliding process is located in the left half part of the first interface, the first operation control may be the operation control at the leftmost side of the top of the first interface; when the end position of the first sliding process is located in the right half part of the first interface, the first operation control may be the rightmost operation control at the top of the first interface. That is, the electronic device may determine which operation control in the first interface is specifically activated according to the position of the touch point in the first interface during the first sliding process.
Or, the electronic device may further determine which operation control in the first interface is specifically activated according to a holding posture of the user when the user inputs the first sliding process. For example, when the holding posture of the user is a left-hand holding posture, the first operation control may be an operation control at the rightmost side of the top of the first interface; when the holding posture of the user is a right-hand holding posture, the first operation control can be an operation control at the leftmost top of the first interface.
In one possible implementation manner, the displaying, by the electronic device, the first operation control as the activated state includes: the electronic device displays the first operation control in an activated state by changing the shape, size, color or background of the first operation control. Of course, the electronic device may play a prompt tone or generate a vibration to prompt the user while displaying the first operation control in the activated state.
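For illustration, this is one way an implementation might render the activated state and prompt the user with a vibration; the specific background color, scale factor, and haptic constant are assumptions.

```java
// Illustrative rendering of the "activated" state with haptic feedback.
import android.graphics.Color;
import android.view.HapticFeedbackConstants;
import android.view.View;

public final class ActivationStyler {
    private ActivationStyler() {}

    static void showActivated(View control) {
        control.setBackgroundColor(Color.parseColor("#3300FF00")); // change the background (assumed tint)
        control.setScaleX(1.1f); // slightly enlarge the control
        control.setScaleY(1.1f);
        control.performHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY); // vibrate to prompt the user
    }
}
```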
In a possible implementation manner, the end of the sliding operation means that the touch body leaves the touch screen of the electronic device. In this case, displaying the second interface in response to the end of the sliding operation specifically includes: when it is detected that the touch body has left the touch screen, reporting a click event of the first operation control to the application to which the first interface belongs, so that the application, in response to the click event, displays the second interface to be displayed after the first operation control is clicked.
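This application does not specify the reporting mechanism. One plausible sketch synthesizes a down/up event pair on the activated control so that the application handles it as an ordinary click; calling performClick() directly, as in the earlier sketch, would be another option.

```java
// Illustrative helper (not from this application): inject a synthetic click.
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

public final class ClickReporter {
    private ClickReporter() {}

    static void reportClick(View target) {
        long now = SystemClock.uptimeMillis();
        float cx = target.getWidth() / 2f;  // click at the control's center
        float cy = target.getHeight() / 2f;
        MotionEvent down = MotionEvent.obtain(now, now, MotionEvent.ACTION_DOWN, cx, cy, 0);
        MotionEvent up = MotionEvent.obtain(now, now + 50, MotionEvent.ACTION_UP, cx, cy, 0);
        target.dispatchTouchEvent(down); // deliver the synthetic press
        target.dispatchTouchEvent(up);   // deliver the release: the app sees a normal click
        down.recycle();
        up.recycle();
    }
}
```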
In a possible implementation, the first direction may refer to a direction from the top of the first interface to the bottom of the first interface. That is, the user may trigger the electronic device to activate the first operation control on the top of the first interface by entering a pull-down operation in the first interface.
In a second aspect, the present application provides an electronic device with a touch screen, including a display unit, a receiving unit, and a detecting unit, the display unit is configured to: displaying a first interface of an application, wherein a first operation control is arranged at the top of the first interface; the receiving unit is used for: receiving sliding operation input by a user on a first interface; the detection unit is used for: detecting that the sliding operation meets a first sliding process, wherein the first sliding process is a process of sliding a first preset distance along a first direction after a touch body contacts a first interface; the display unit is further configured to: when the sliding operation is detected to meet the first sliding process, displaying a first operation control in an activated state; and when the sliding operation is finished, displaying a second interface, wherein the second interface is an interface to be displayed after the first operation control is clicked.
In a possible implementation manner, the detection unit is further configured to: detecting that the sliding operation meets a second sliding process, wherein the starting position of the second sliding process is the end position of the first sliding process, and the touch body keeps touching on a touch screen of the electronic equipment in the first sliding process and the second sliding process; the display unit is further configured to: and in response to detecting that the sliding operation meets the second sliding process, displaying the second operation control in the first interface as an activated state, and displaying the first operation control as a deactivated state.
In a possible implementation manner, the detection unit is further configured to: detecting that the sliding operation meets a third sliding process, wherein the sliding direction of the third sliding process is opposite to the sliding direction of the second sliding process, and the difference between the sliding distance of the third sliding process and the sliding distance of the second sliding process is smaller than a preset value, wherein in the third sliding process, the touch body keeps touching on the touch screen of the electronic equipment; the display unit is further configured to: and in response to detecting that the sliding operation meets the third sliding process, displaying the first operation control as the activated state again, and displaying the second operation control as the deactivated state.
In a possible implementation manner, the detection unit is further configured to: detecting the staying time of the touch body at the end position of the first sliding process; the display unit is further configured to: and if the retention time is longer than the preset time, displaying a secondary menu of the first operation control in the first interface.
In a possible implementation manner, the detection unit is further configured to: detecting that the sliding operation meets a fourth sliding process, wherein the starting position of the fourth sliding process is the end position of the first sliding process, and in the fourth sliding process, the touch body keeps touching the touch screen of the electronic equipment; the display unit is further configured to: and displaying a third operation control in the secondary menu as an activated state in response to the sliding operation meeting the fourth sliding process.
In a possible implementation manner, a refresh button is hidden in the first interface, the refresh button is adjacent to the first operation control in a third direction, and the third direction is the same as or different from the first direction; in this case, the detection unit is further configured to: detecting that the sliding operation meets a fifth sliding process, wherein the fifth sliding process is a process of moving a third preset distance in a third direction from the end position of the first sliding process, and in the fifth sliding process, the touch body keeps touching on a touch screen of the electronic device; the display unit is further configured to: and in response to detecting that the sliding operation meets the fifth sliding process, displaying the refresh button in the first interface, and displaying the refresh button in an activated state.
In a possible implementation manner, the electronic device further includes an execution unit. If the detection unit detects that the touch body leaves the first interface, the execution unit executes the refresh operation corresponding to clicking the refresh button; if the detection unit detects that the touch body continues to slide, the display unit may continue to display the refresh button in the activated state.
In a possible implementation manner, the display unit may specifically display the first operation control in an activated state by changing a shape, a size, a color, or a background of the first operation control.
In a possible implementation manner, the end of the sliding operation refers to that the touch body leaves the touch screen of the electronic device; when the detection unit detects that the touch body leaves the touch screen of the electronic device, a click event of the first operation control can be reported to the corresponding application, so that the application responds to the click event and indicates the display unit to display a second interface to be displayed after the first operation control is clicked.
In a third aspect, the present application provides an electronic device, including: a touch screen, one or more processors, one or more memories, and one or more computer programs. The touch screen and the memory are coupled, the one or more computer programs are stored in the memory, and when the electronic device runs, the processor executes the one or more computer programs stored in the memory, so that the electronic device performs the control operation method according to any one of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of operating the control according to any of the first aspects.
In a fifth aspect, the present application provides a computer program product, which when run on an electronic device, causes the electronic device to perform the method of operating a control according to any of the first aspect.
It can be understood that the electronic device according to the second and third aspects, the computer-readable storage medium according to the fourth aspect, and the computer program product according to the fifth aspect are all configured to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above; details are not described herein again.
Drawings
FIG. 1A is a schematic diagram of an application scenario of a one-handed operation mode in the prior art;
FIG. 1B is a schematic structural diagram of an electronic device according to an embodiment of this application;
FIG. 2 is a first schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 3 is a schematic architectural diagram of an operating system in an electronic device according to an embodiment of this application;
FIG. 4 is a second schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 5 is a schematic flowchart of a control operation method according to an embodiment of this application;
FIG. 6 is a third schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 7 is a fourth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 8 is a fifth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 9 is a sixth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 10 is a seventh schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 11 is an eighth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 12 is a ninth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 13 is a tenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 14 is an eleventh schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 15 is a twelfth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 16 is a thirteenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 17 is a fourteenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 18 is a fifteenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 19 is a sixteenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 20 is a seventeenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 21 is an eighteenth schematic diagram of an application scenario of a control operation method according to an embodiment of this application;
FIG. 22 is a schematic structural diagram of an electronic device according to an embodiment of this application.
Detailed Description
To facilitate a clear understanding of the following embodiments, a brief description of the related art is first given:
and (4) control: elements presented in a GUI (graphical user interface) can be generally referred to as controls, which can provide a user with certain operations or for displaying certain content.
For example, the control may specifically include a text control, such as a TextView control, an EditText control, or may also include a Button control, such as a Button control, an ImageButton control, or the like, and may also include a picture control, such as an ImageView control, and the like, which is not limited in this embodiment of the present application.
In the embodiment of the application, each control displayed in the display interface can be divided into an operation control and a non-operation control.
The operation control is a control capable of interacting with a user, that is, the control can respond to a touch operation of the user to present a corresponding function. Correspondingly, the non-operation control is a control which cannot respond to the touch operation of the user to realize the related function.
For example, when a control is a Button type control, the electronic device may determine that the control is an operation control; when a control is an ImageView type control, the electronic device can determine that the control is a non-operational control. Or, the electronic device may further identify the control as an operation control or a non-operation control according to parameters such as the size, the position, the layer order, and the like of the control in the display interface, which is not limited in this embodiment of the present application.
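As an illustration of such a classification over an Android view tree, the following sketch uses the clickable flag as a stand-in for the type-based and layout-based checks described above; this is an assumption, since the exact criteria are left open here.

```java
// Illustrative scan for operation controls in a view hierarchy.
import android.view.View;
import android.view.ViewGroup;
import java.util.ArrayList;
import java.util.List;

public final class ControlScanner {
    private ControlScanner() {}

    static List<View> findOperationControls(View root) {
        List<View> result = new ArrayList<>();
        if (root.isClickable()) {
            result.add(root); // responds to touch operations: an operation control
        }
        if (root instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) root;
            for (int i = 0; i < group.getChildCount(); i++) {
                result.addAll(findOperationControls(group.getChildAt(i))); // recurse into children
            }
        }
        return result;
    }
}
```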
The embodiments of the present application will be described in detail below with reference to the accompanying drawings.
The control operation method provided in the embodiments of the present application may be applied to electronic devices such as mobile phones, tablet computers, notebook computers, ultra-mobile personal computers (UMPC), handheld computers, netbooks, personal digital assistants (PDA), wearable electronic devices, and virtual reality devices; the embodiments of the present application impose no limitation on the specific type of electronic device.
For example, fig. 1B shows a schematic structural diagram of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, and the like.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The sensor module 180 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, and the like.
Of course, the electronic device 100 may further include a charging management module, a power management module, a battery, a key, an indicator, and 1 or more SIM card interfaces, which are not limited in this embodiment of the application.
In some embodiments of the present application, as also shown in FIG. 1B, a touch sensor may be included in the sensor module 180 described above. The touch sensor may capture touch events of a user on or near the touch sensor (e.g., user manipulation of a touch sensor surface with a finger, stylus, or any other suitable object) and transmit the captured touch information to another device, such as the processor 110.
For example, the touch sensor can be implemented in various ways, such as resistive, capacitive, infrared, and surface acoustic wave. The touch sensor may be integrated with the display screen 194 as a touch screen of the electronic device 100, or the touch sensor and the display screen 194 may be implemented as two separate components to perform input and output functions of the electronic device 100.
In the embodiments of the present application, a rectangular coordinate system may be preset on a touch screen that includes a touch sensor. For example, as shown in fig. 2, a rectangular coordinate system may be established with the upper left corner of the touch screen as the origin O (0,0); alternatively, a rectangular coordinate system may be established with the geometric center of the touch screen as the origin O (0,0) (not shown in fig. 2). As a touch object slides on the touch screen, the touch sensor in the touch screen may continuously capture a series of touch events generated by the touch object on the touch screen (e.g., the coordinates of each touch point and the touch time) and report this series of touch events to the processor 110. The touch object may be a stylus, a user's finger or knuckle, or an object with a touch function such as a touch glove; this is not limited in the embodiments of the present application. In the following embodiments, the user's finger is taken as the touch object for illustration.
The processor 110 may calculate the sliding track and sliding distance of the user's finger on the touch screen from these touch events. If the processor 110 determines that the sliding distance of the user's finger in the y-axis direction is greater than a preset value, this indicates that the user wishes to enter the one-handed operation mode to operate the operation controls at the top of the display interface. The processor 110 may then instruct the touch screen to mark the operation controls located at the top of the display interface. For example, as also shown in FIG. 2, the touch screen may highlight the operation control located at the top leftmost side of the display interface 201 (i.e., the return button 202), so that the return button 202 is in an activated state.
Then, if the user confirms that the return button 202 should be clicked to perform the return function, the user may lift the finger off the touch screen, ending the sliding operation. After the touch screen detects that the user's finger has left, it may send an event indicating the end of the sliding operation to the processor 110. Since the return button 202 is in the activated state at this time, the processor 110 may perform the operation corresponding to clicking the return button 202; for example, the processor 110 may instruct the touch screen to display the menu one level above the display interface 201 (this higher-level interface is not shown in fig. 2).
It can be seen that, when the display interface 201 is displayed on the touch screen, if a user wishes to enter the one-handed operation mode to control the operation control at the top of the display interface 201, the user may input a preset sliding operation in the display interface 201, and trigger the processor 110 to display the operation control at the top of the display interface 201 in an activated state. Further, in response to the operation of moving away from the user's finger in the sliding operation, the processor 110 may perform a corresponding click operation on the activated operation control, thereby simulating the user's finger to click on the operation control to implement a corresponding function.
In this way, a single sliding operation both enters the one-handed operation mode and performs the click on an operation control at the top of the current display interface. The electronic device does not need to shrink the content in the display interface during the operation, so the user can conveniently and efficiently operate the controls at the top of the display interface, improving the user's one-handed operation experience.
The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 3 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The structure can also be referred to as a self-developed mobile terminal operating system.
1. Application layer
The application layer may include a series of applications.
As shown in fig. 3, the application programs may include Applications (APP) such as call, contact, camera, gallery, calendar, map, navigation, bluetooth, music, video, and short message.
2. Application framework layer
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application programs of the application layer. The application framework layer includes a number of predefined functions.
Illustratively, the application framework layer may include a view system, a notification manager, an activity manager, a window manager, a content provider, an input manager, and the like.
Wherein the view system can be used for constructing a display interface of an application program. Each display interface may be comprised of one or more controls. Generally, a control may include an interface element such as an icon, button, menu, tab, text box, dialog box, status bar, navigation bar, Widget, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay, without user interaction; for example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications that appear in the system's top status bar in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is played, the electronic device vibrates, or an indicator light flashes.
The activity manager may be used to manage the lifecycle of each application. Applications typically run in the operating system in the form of activities, and the activity manager schedules the activity processes of the applications to manage their lifecycles. The window manager is used for managing window programs.
The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and so on.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
3. Android runtime and system library
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part contains the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used for managing the display subsystem and provides the fusion of 2D and 3D layers for multiple application programs. The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, and the like. The three-dimensional graphics processing library is used for realizing three-dimensional graphics drawing, image rendering, composition, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
4. Kernel layer
The kernel layer is the layer between hardware and software. It includes at least a display driver, a camera driver, an audio driver, a sensor driver, and the like, which are not limited in this embodiment of the application.
In some embodiments of the present application, the above-described viewing system may draw each frame of the display interface in response to a request by an application. The view system may record view information of the display interface when drawing the display interface, where the view information may include positions and sizes of various controls in the display interface, and layer relationships among the controls, and the like. Illustratively, the view information may also record the attribute or type of each control. For example, control 1 is a button type control, or control 2 is a textbox type control, etc.
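To make this concrete, the following is a minimal sketch of the kind of per-control record the view system is described as keeping. All class and field names here (ControlInfo, ViewInfoRecorder, and so on) are illustrative assumptions for this document, not actual framework APIs.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical record of one control's view information: position, size,
// layer relationship (z-order) and type, as described above.
class ControlInfo {
    final String id;                     // e.g. "back_button"
    final int left, top, right, bottom;  // position and size in pixels
    final int layer;                     // layer relationship among controls
    final String type;                   // e.g. "button", "textbox"

    ControlInfo(String id, int left, int top, int right, int bottom,
                int layer, String type) {
        this.id = id; this.left = left; this.top = top;
        this.right = right; this.bottom = bottom;
        this.layer = layer; this.type = type;
    }
}

// Hypothetical recorder filled in while each frame of the interface is drawn.
class ViewInfoRecorder {
    private final List<ControlInfo> controls = new ArrayList<>();

    void record(ControlInfo info) { controls.add(info); }

    List<ControlInfo> snapshot() { return new ArrayList<>(controls); }
}
```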
Illustratively, as shown in fig. 4 (a), the WeChat APP in the application layer is running and displaying its display interface 301. When the user's finger slides in the display interface 301 of the WeChat APP, the touch screen can obtain information about a series of touch points of the touch operation, such as the coordinates (x, y) of each touch point. Furthermore, the touch screen can report the original touch event generated by the user's touch operation to the kernel layer. After obtaining the original touch event, the kernel layer may package it into a high-level touch event that can be read by the application framework layer (i.e., the Framework layer), containing the coordinates of the touch point and the type of the touch event, such as an action down event, an action move event, or an action up event, and then send the high-level touch event to the input manager (InputManager) in the application framework layer.
After the input manager acquires the high-level touch event, it can calculate the sliding track and sliding distance of the user's finger in real time according to the coordinates of the touch points and the types of the touch events. For example, when the input manager detects an action down event, it indicates that the user's finger has touched the touch screen; when it detects an action up event, it indicates that the user's finger has left the touch screen. The input manager can then recognize the sliding track and sliding distance of the user's finger on the touch screen from the coordinates of the touch points between adjacent action down and action up events. If it is detected that the sliding distance D of the user's finger along the y-axis (e.g., the positive y direction) reaches a preset value, it indicates that the user wishes to enter the one-handed operation mode to control an operation control at the top of the display interface 301. At this time, the input manager may determine the specific operation control that needs to be activated according to the view information of the display interface 301 recorded in the view system.
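The patent places this distance tracking in the framework's input manager; as a rough sketch, the same check can be expressed with the public app-level touch API. The threshold value D below is an assumed constant for illustration.

```java
import android.view.MotionEvent;
import android.view.View;

// Sketch: derive the sliding distance along the y-axis from the touch
// events and detect when it reaches the preset value D.
class SlideDistanceTracker implements View.OnTouchListener {
    private static final float D = 150f; // preset sliding distance (assumed)
    private float downY;

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:   // finger touches the touch screen
                downY = event.getY();
                return true;
            case MotionEvent.ACTION_MOVE:   // finger slides without lifting
                if (event.getY() - downY >= D) {
                    // sliding distance in the positive y direction reached D:
                    // the one-handed operation mode can be entered here
                }
                return true;
            case MotionEvent.ACTION_UP:     // finger leaves the touch screen
                return true;
            default:
                return false;
        }
    }
}
```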
For example, if a touch point of a user's finger is detected to be located in the left half of the display screen, e.g., the abscissa x of the touch point is located in the left half of the display interface 301 when the user's finger slides in the display interface 301, the input manager may request the view system to query an operation control located at the leftmost position at the top of the display interface 301. The view system may determine that the operation control at the leftmost side on the top of the display interface 301 is the return button 302 according to the view information of the display interface 301, that is, the operation control that needs to be activated at this time is the return button 302.
Accordingly, if it is detected that the touch point of the user's finger is located in the right half of the display screen, e.g., the abscissa x of the touch point is located in the right half of the display interface 301 while the user's finger slides in it, the input manager may request the view system to query the operation control located at the rightmost position at the top of the display interface 301. The view system may determine, according to the view information of the display interface 301, that this operation control is the add button 303, that is, the operation control that needs to be activated at this time is the add button 303.
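Reusing the hypothetical ControlInfo record sketched earlier, the left-half/right-half decision could look like the following; the topRowBottom boundary is an assumed parameter marking what counts as "the top of the interface".

```java
import java.util.List;

// Sketch: pick the leftmost or rightmost operation control in the top row,
// depending on which half of the screen the touch point lies in.
class TopControlSelector {
    static ControlInfo select(List<ControlInfo> controls, float touchX,
                              int screenWidth, int topRowBottom) {
        boolean leftHalf = touchX < screenWidth / 2f;
        ControlInfo best = null;
        for (ControlInfo c : controls) {
            if (c.top >= topRowBottom) continue;  // not at the top of the interface
            if (best == null
                    || (leftHalf && c.left < best.left)       // leftmost control
                    || (!leftHalf && c.right > best.right)) { // rightmost control
                best = c;
            }
        }
        return best;
    }
}
```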
Taking the return button 302 as an example of the operation control to be activated, as shown in (b) of fig. 4, the input manager may invoke the view system to display the return button 302 in an activated state in the display interface 301, so that the user knows the return button 302 has been activated. For example, the view system may show the return button 302 as activated by changing its size, shape, or color.
After the return button 302 is activated, if the user wishes to trigger the function associated with clicking the return button 302, the user may end the sliding operation and lift the finger off the display interface 301. After detecting this operation, the touch screen may report an input event indicating that the user's finger has left the display interface 301 upward layer by layer to the input manager. In response to the input event, the input manager may simulate the operation of the user clicking the currently activated return button 302 and report a corresponding touch event to the running WeChat APP. For example, the coordinates of the touch point in the touch event may be any coordinates on the return button 302, and the type of the touch event may be a single-click event. Then, in response to the touch event, the WeChat APP may call the callback function corresponding to clicking the return button 302 to implement the return function.
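One plausible way to realize such a simulated click with standard Android APIs is to synthesize a down/up event pair at a coordinate inside the activated control and dispatch it into the view hierarchy. This is an illustrative sketch, not necessarily the exact mechanism the patent has in mind.

```java
import android.os.SystemClock;
import android.view.MotionEvent;
import android.view.View;

// Sketch: simulate a single-click on the activated control by injecting a
// synthetic ACTION_DOWN / ACTION_UP pair at a point (x, y) on the control.
class TapInjector {
    static void tap(View root, float x, float y) {
        long downTime = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(
                downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
        MotionEvent up = MotionEvent.obtain(
                downTime, downTime + 50, MotionEvent.ACTION_UP, x, y, 0);
        root.dispatchTouchEvent(down);  // simulated finger press
        root.dispatchTouchEvent(up);    // simulated finger lift completes the click
        down.recycle();
        up.recycle();
    }
}
```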
In some embodiments, after a certain operation control in the display interface 301 is activated, since the user's finger has not left the display interface 301, the user may continue the sliding operation to switch the activated operation control. For example, after the return button 302 is activated, if it is detected that the user's finger continues to slide in the positive x direction and the sliding distance is greater than a preset value, the view system may determine, according to the view information of the display interface 301, that the operation control closest to the return button 302 in the positive x direction (e.g., the search button 303) is the operation control that needs to be activated, and display the search button 303 in an activated state.
That is to say, the user may trigger the electronic device to activate the operation controls in the display interface, and thus enter the one-handed operation mode, by inputting a sliding operation in the display interface. When the sliding operation is detected to have ended, that is, when the user's finger leaves the display interface, it means the user wants to click the activated operation control, and the electronic device can simulate clicking that operation control to implement its corresponding function.
Therefore, without changing the display content of the display interface, the electronic device provides an interaction mode in which the user can quickly control the operation controls in the display interface with one hand, improving the one-handed user experience while preserving the display content and display effect.
Taking a mobile phone as an example of the electronic device 100, the control operation method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings.
As shown in fig. 5, an operation method of a control provided in an embodiment of the present application may include the following steps S501 to S507:
S501, the mobile phone displays a first interface of a first application.
The first interface may be an interface displayed when any APP installed in the mobile phone runs.
For example, as shown in fig. 6, the first interface may be a home page 601 that the mobile phone is displaying when running the WeChat APP. One or more controls can be included in the home page 601. For example, the top of the home page 601 is provided with a return button 602, a title 603, a search button 604, an add button 605, and the like. The return button 602, the search button 604 and the add button 605 are all operation controls that can respond to a user touch operation; and a title 603 is a non-operation control that cannot respond to a user touch operation.
When the touch screen of the mobile phone is large, it becomes difficult for the user to click on the operation controls on the top of the home page 601 with a single hand. In this regard, in the embodiment of the present application, when the user wishes to control the operation control in the first interface (for example, the top page 601) being displayed, a sliding operation may be input into the top page 601, so as to trigger the mobile phone to continue to perform the following steps S502 to S507.
S502, the mobile phone detects the sliding operation input by the user in the first interface.
Still taking the home page 601 of the WeChat APP as the first interface, the mobile phone can detect the sliding operation input by the user in real time through the touch sensor of the touch screen while displaying the home page 601. The sliding operation refers to the whole process in which the user's finger, from the moment it contacts the touch screen until it leaves the touch screen, produces a certain displacement on the screen.
As shown in fig. 7, take the upper left corner of the home page 601 as the origin O(0,0) of a rectangular coordinate system whose x-axis is parallel to the shorter side of the mobile phone and whose y-axis is parallel to the longer side. When detecting that the user's finger touches the touch screen, the mobile phone can acquire the coordinates of the touch point A0 in this coordinate system: A0(x0, y0). The touch point A0 is the starting position of the sliding operation. While the user's finger inputs the sliding operation from the touch point A0 on the home page 601, the mobile phone can acquire the coordinates of a series of touch points such as A1, A2, and A3 in real time. From the collected coordinates of these touch points, the mobile phone can determine the sliding direction, sliding distance, sliding track, and so on of the user's finger in the home page 601.
Of course, while acquiring the coordinates of each touch point of the sliding operation, the mobile phone may also record information such as the touch time of each touch point. From these touch times the mobile phone can determine how long and how fast the user's finger stays at or moves through different positions during the sliding operation, which is not limited in this embodiment of the present application.
S503, in response to a first sliding process of the user's finger in the sliding operation, the mobile phone displays a first operation control in the first interface in an activated state.
In the embodiment of the application, the mobile phone can divide the sliding operation into different sliding processes according to the detected coordinates of the touch points (A0, A1, A2, ...) in the sliding operation. That is to say, one sliding operation input by the user on the touch screen may be divided into successive sliding processes; the user's finger does not leave the touch screen during any of them, and the detection of the finger leaving the touch screen indicates that the sliding operation has ended. The mobile phone can respond to the different sliding processes to implement the user's control of the operation controls in the first interface.
Exemplarily, as shown in fig. 8, the first sliding process of the sliding operation is the process that begins when the user's finger touches the first interface (e.g., the home page 601) and lasts until the user's finger has slid a preset distance D1 in the positive y direction. After the mobile phone acquires the starting position A0(x0, y0) of the sliding operation, it can calculate in real time, from the subsequently acquired touch point coordinates, the sliding distance of the user's finger in the positive y direction with A0(x0, y0) as the starting point. For example, when the mobile phone collects the touch point A1(x1, y1), the sliding distance between the start position A0 and the touch point A1 in the positive y direction can be calculated as y1-y0. When y1-y0 ≧ D1, the mobile phone can determine that the user has completed the first sliding process of the sliding operation in the home page 601, indicating that the user now wishes to enter the one-handed operation mode to control an operation control in the home page 601. At this time, the user's finger is still in contact with the touch screen, that is, the sliding operation in the home page 601 has not yet ended.
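A minimal sketch of this stage segmentation follows, assuming the thresholds D1 and D2 and a simple listener interface; all of these are illustrative choices, not values fixed by the patent.

```java
import java.util.function.BiConsumer;

// Sketch: segment one continuous swipe into successive sliding processes.
// The first process completes after sliding D1 in the positive y direction;
// later processes complete after sliding D2 on either axis. The end position
// of each process becomes the start position of the next one.
class SlideStageSegmenter {
    private static final float D1 = 150f;  // first-stage threshold (assumed)
    private static final float D2 = 100f;  // later-stage threshold (assumed)
    private float originX, originY;
    private int stage = 0;

    void onDown(float x, float y) { originX = x; originY = y; stage = 0; }

    // Called for every reported touch point while the finger stays down.
    void onMove(float x, float y, BiConsumer<Integer, float[]> onStageDone) {
        float dx = x - originX, dy = y - originY;
        boolean done = (stage == 0)
                ? dy >= D1                                    // first process: +y only
                : Math.abs(dx) >= D2 || Math.abs(dy) >= D2;   // later: x or y axis
        if (done) {
            stage++;
            onStageDone.accept(stage, new float[]{dx, dy});
            originX = x;  // this stage's end position becomes
            originY = y;  // the next stage's start position
        }
    }
}
```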
Then, in response to the first sliding process of the sliding operation, the mobile phone may display the first operation control at the top of the home page 601 in an activated state, prompting the user that the one-handed operation mode has been entered and that the currently activated control is the first operation control. For example, the mobile phone may display the first operation control in an activated state by changing its shape, size, or color. Alternatively, the mobile phone may prompt the user that the first operation control is activated through text, vibration, a prompt tone, or the like.
As also shown in fig. 8, after detecting that the user has input the first sliding process of the sliding operation, the mobile phone may activate the leftmost control at the top of the home page 601 by default, i.e., the return button 602, and then deepen the background color of the return button 602 so that it is displayed in an activated state. Of course, the mobile phone may also by default activate the rightmost operation control at the top of the home page 601, or any other operation control in the home page 601, which is not limited in this embodiment of the present application.
Illustratively, after detecting that the user has input the first sliding process of the sliding operation, the mobile phone may call the view system in the application framework layer and query the view information recorded by the view system when drawing the home page 601. The view information records the position, size, attributes, and other information of each control in the home page 601. The mobile phone may then determine, according to the view information of the home page 601, that the operation control located at the top leftmost side of the home page 601 is the return button 602, which is the operation control to be activated this time. In turn, the mobile phone may continue to invoke the view system to redraw the background color of the return button 602, so that the return button 602 is displayed in an activated state in the home page 601.
Or, after detecting that the user has input the first sliding process of the sliding operation, the mobile phone may call the screenshot application to take a screenshot of the home page 601. The mobile phone can then recognize the operation controls in the screenshot through an OCR (optical character recognition) algorithm or an AI recognition algorithm, thereby identifying each operation control and its position in the home page 601. In this way, the mobile phone can determine that the return button 602 located at the top leftmost side of the home page 601 is the operation control to be activated, and thus display the return button 602 in an activated state.
Of course, a person skilled in the art may also set other methods capable of identifying each operation control and the position of the operation control in the home page 601 according to actual experience or actual application scenarios, which is not limited in this embodiment of the present application.
For example, after detecting that the user inputs the first sliding process of the sliding operation, the operating system of the mobile phone may send a request to the running WeChat APP, where the request may be used to query the operation control located at the top leftmost side of the home page 601. The wechat APP may determine, in response to the request, that the operation control located at the leftmost side of the top of the home page 601 is the return button 602, and further, the wechat APP may return the identifier of the return button 602 to the mobile phone operating system, so that the mobile phone operating system may call the view system according to the identifier of the return button 602 to display the return button 602 in an activated state.
In some embodiments, after detecting that the user has input the first sliding process of the sliding operation, the mobile phone may further determine which operation control in the home page 601 to activate according to the abscissa of the touch points in the first sliding process.
For example, the mobile phone may determine from the touch point A0(x0, y0) and the touch point A1(x1, y1) that the user has completed the first sliding process of the sliding operation, where the start position of the first sliding process is A0 and the end position is A1. The mobile phone can then determine where the abscissa x0 of the start position A0 and the abscissa x1 of the end position A1 fall in the home page 601. Taking the length of the home page 601 along the x-axis as w (e.g., in pixels): if the abscissas x0 and x1 are both located in the interval from 0 to w/2, indicating that the user input the first sliding process in the left half of the home page 601, the mobile phone may display the operation control located at the top leftmost side of the home page 601 (i.e., the return button 602) in an activated state. Accordingly, if the abscissas x0 and x1 are both located in the interval from w/2 to w, indicating that the user input the first sliding process in the right half of the home page 601, the mobile phone may display the operation control located at the top rightmost side of the home page 601 (i.e., the add button 605) in an activated state.
In other embodiments, the mobile phone may also determine which operation control in the home page 601 to activate according to the abscissa x0 of the start position A0 or the abscissa x1 of the end position A1 of the first sliding process alone.
For example, when the abscissa x0 of the start position A0 (or the abscissa x1 of the end position A1) is within the interval from 0 to w/2 (i.e., the left half of the home page 601), the mobile phone may display the return button 602, located at the top leftmost side of the home page 601, in an activated state. When it is within the interval from w/2 to w (i.e., the right half of the home page 601), the mobile phone may display the add button 605, located at the top rightmost side of the home page 601, in an activated state.
In other embodiments, the mobile phone may further determine which operation control in the top page 601 is specifically activated according to the holding gesture of the user when inputting the first sliding process.
For example, the mobile phone may detect whether the user's holding gesture is a left-hand or right-hand grip through an acceleration sensor, the touch sensor on the touch screen, or the like. Then, after the mobile phone detects that the user has input the first sliding process of the sliding operation, if the user is holding the phone in the left hand, then to make it convenient for the user to operate the control at the top rightmost side of the home page 601 with one hand, the mobile phone may display the add button 605 at the top rightmost side of the home page 601 in an activated state. Accordingly, if the user is holding the phone in the right hand, then to make it convenient to operate the control at the top leftmost side of the home page 601 with one hand, the mobile phone may display the return button 602 located there in an activated state. Of course, it may instead be configured to activate the return button 602 at the top leftmost side of the home page 601 for a left-hand grip and the add button 605 at the top rightmost side for a right-hand grip, which is not limited in this embodiment of the application.
That is to say, when the first operation control is activated for the first time in the first interface, the mobile phone may select different first operation controls according to different policies. In this way, the user enters the one-handed operation mode with different first operation controls as entry points, so that the operation control that ultimately needs to be clicked in the first interface can be reached by the shortest path.
In addition, as shown in fig. 9, while the user inputs the first sliding process in the home page 601, if the mobile phone detects that the sliding distance of the user's finger in the positive y direction has not yet reached the preset value D1, the mobile phone may prompt the user, through text, animation, or the like, to continue pulling down to activate the top operation control, so as to guide the user to quickly complete the first sliding process and trigger the mobile phone to enter the one-handed operation mode. The mobile phone may also apply a gradual-change or transparency effect to the display of the first operation control (for example, the return button 602) as the sliding distance in the positive y direction increases, finally displaying the return button 602 in an activated state once the user completes the first sliding process.
Or, if the user's finger leaves the touch screen before the sliding distance in the positive y direction reaches the preset value D1, it indicates that the purpose of the user's pull-down operation is not to enter the one-handed operation mode and activate an operation control in the display interface. In that case, the mobile phone may execute the corresponding operation instruction on the home page 601 according to the existing response flow of a pull-down operation, such as a refresh instruction or a page-down instruction.
In some embodiments, the touch screen of the mobile phone may support pull-up or pull-down operations on certain interfaces, so as to slide display content at different positions of the interface into view. As shown in fig. 10 (a), the mobile phone is displaying the first content of the home page 601 of the WeChat APP, and the first content is located at the bottom of the home page 601. At this time, if the mobile phone detects that the user's finger slides from the touch point A0 to the touch point A1 in the y direction of the home page 601 (i.e., the first sliding process of the sliding operation), the mobile phone can query whether the currently displayed first content is located at the top of the home page 601. If the first content is not at the top, the user's intent in inputting the first sliding process may be a pull-down operation to display the content at the top of the home page 601. Then, as shown in fig. 10 (b), in response to the first sliding process, the mobile phone may slide the home page 601 to display the second content at the top of the home page 601. Accordingly, if the currently displayed first content is already located at the top of the home page 601, the mobile phone may enter the one-handed operation mode and activate an operation control in the home page 601 according to the method above. That is to say, after detecting the first sliding process of the sliding operation input by the user, the mobile phone can determine whether an operation control in the home page 601 needs to be activated according to the content currently displayed.
In some embodiments, as also shown in fig. 10 (b), after the mobile phone slides the home page 601 to display the second content at the top, the mobile phone may automatically display the first operation control (e.g., the return button 602) at the top of the home page 601 in an activated state. In this way, a single downward sliding operation can both trigger the mobile phone to slide the displayed content upward and trigger it to enter the one-handed operation mode to activate an operation control in the display interface.
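On standard Android APIs, the "is the content already at the top?" test can be expressed with View.canScrollVertically(); the two branches below are placeholders for the behaviors described above.

```java
import android.view.View;

// Sketch: after the first sliding process of a pull-down, either scroll the
// content or enter the one-handed operation mode, depending on whether
// content above the visible area is still hidden.
class PullDownDispatcher {
    static void onFirstSlideProcess(View contentView) {
        if (contentView.canScrollVertically(-1)) {
            // content above is still hidden: treat the gesture as an
            // ordinary pull-down and let the view scroll to the top
        } else {
            // the displayed content is already at the top: enter the
            // one-handed operation mode and activate the top control
        }
    }
}
```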
Or, in some embodiments, the mobile phone may further determine whether the operation control in the home page 601 needs to be activated according to parameters such as a sliding speed or a touch area in the first sliding process of the user inputting the sliding operation.
For example, when it is detected that the sliding speed of the user in the first sliding process is greater than the speed threshold, the mobile phone may activate a first operation control located at the top of the home page 601; if the sliding speed of the user in the first sliding process is smaller than the speed threshold, the mobile phone can report the corresponding sliding event to the WeChat APP according to the existing flow, and the WeChat APP is triggered to execute a page turning instruction corresponding to the sliding event. For another example, when it is detected that the touch area of the finger of the user in the first sliding process is greater than the area threshold, the mobile phone may activate a first operation control located at the top of the home page 601; if the touch area of the user finger is smaller than the area threshold value in the first sliding process, the mobile phone can report the corresponding sliding event to the WeChat APP according to the existing flow, and the WeChat APP is triggered to execute a page turning instruction corresponding to the sliding event.
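The speed test maps naturally onto the platform VelocityTracker; the threshold below is an assumed value, and a touch-area variant could compare MotionEvent.getTouchMajor() against an area threshold in the same way.

```java
import android.view.MotionEvent;
import android.view.VelocityTracker;

// Sketch: classify the first sliding process by its speed. Fast swipes enter
// the one-handed mode; slow swipes are forwarded to the APP as normal scrolls.
class SwipeSpeedClassifier {
    private static final float SPEED_THRESHOLD = 2000f; // px/s (assumed)
    private VelocityTracker tracker;

    void onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                tracker = VelocityTracker.obtain();
                tracker.addMovement(event);
                break;
            case MotionEvent.ACTION_MOVE:
                tracker.addMovement(event);
                tracker.computeCurrentVelocity(1000);  // pixels per second
                if (tracker.getYVelocity() > SPEED_THRESHOLD) {
                    // fast pull-down: activate the top operation control
                } else {
                    // slow pull-down: report the slide event to the APP
                }
                break;
            case MotionEvent.ACTION_UP:
                tracker.recycle();
                tracker = null;
                break;
        }
    }
}
```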
S504, in response to a second sliding process of the user's finger in the sliding operation, the mobile phone displays a second operation control in the first interface in an activated state.
Continuing the example of step S503 in which the mobile phone displays the return button 602 in an activated state: after the user completes the first sliding process of the sliding operation in the home page 601, the mobile phone displays the return button 602 in an activated state in response to that process. At this time, the sliding operation input by the user has not ended and the user's finger has not left the home page 601. If the currently activated return button 602 is not the operation control the user needs to click, the user may continue sliding the finger to input the second sliding process of the sliding operation, triggering the mobile phone to switch the activated operation control in the home page 601.
For example, as shown in fig. 11, the mobile phone may take the end position A1 of the first sliding process as the start position of the second sliding process and detect the sliding distance of the user's finger along the x-axis or the y-axis in real time. If the sliding distance of the user's finger along the x-axis or the y-axis is greater than or equal to a preset value D2 (D1 and D2 may be the same or different), it indicates that the user has completed the second sliding process of the sliding operation. At this time, the user's finger is still in contact with the touch screen, that is, the sliding operation in the home page 601 has not yet ended.
For example, the mobile phone may switch the activated first operation control in the home page 601 to the second operation control according to the sliding direction of the user's finger in the second sliding process.
For example, after the first sliding process ends, if the user continues the sliding operation with a second sliding process along the x-axis, the mobile phone may switch the activated first operation control in the home page 601 to a second operation control along the x-axis.
As also shown in fig. 11, the mobile phone may start to detect whether the user inputs the second sliding process, taking the end position A1(x1, y1) of the first sliding process as the start position. When the sliding distance on the x-axis between the start position A1(x1, y1) and the touch point A2(x2, y2) satisfies x2-x1 ≧ D2, the user's finger has moved a distance in the positive x direction, completing the second sliding process of the sliding operation. At this point, in response to the second sliding process, the mobile phone may restore the return button 602 to a deactivated state while displaying the search button 604, adjacent to the return button 602 in the positive x direction, in an activated state. That is, in response to the user's second sliding process, the mobile phone may switch the activated operation control from the return button 602 (i.e., the first operation control) to the search button 604 (i.e., the second operation control).
For example, after detecting that the user has completed the second sliding process of the sliding operation, the mobile phone may invoke the view system to query, according to the sliding direction of the second sliding process (i.e., the positive x direction), the operation control closest to the return button 602 in that direction, namely the search button 604. The mobile phone may then call the view system to display the search button 604 in the home page 601 in an activated state while restoring the activated return button 602 to a deactivated state.
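Reusing the hypothetical ControlInfo record again, the "closest control in the sliding direction" query might be sketched as follows; a null result models the "no operable control in that direction" case handled next.

```java
import java.util.List;

// Sketch: find the operation control nearest to the currently activated one
// in the sliding direction (dirX/dirY are -1, 0 or +1 on each axis).
class NeighborFinder {
    static ControlInfo nearest(List<ControlInfo> controls, ControlInfo from,
                               int dirX, int dirY) {
        ControlInfo best = null;
        int bestDist = Integer.MAX_VALUE;
        for (ControlInfo c : controls) {
            if (c == from) continue;
            int dx = c.left - from.left, dy = c.top - from.top;
            // keep only candidates lying in the requested direction
            if (dirX != 0 && Integer.signum(dx) != dirX) continue;
            if (dirY != 0 && Integer.signum(dy) != dirY) continue;
            int dist = Math.abs(dx) + Math.abs(dy);   // Manhattan distance
            if (dist < bestDist) { bestDist = dist; best = c; }
        }
        return best;  // null: no operation control in that direction
    }
}
```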
Or, when the mobile phone detects that the user's finger has completed the second sliding process of the sliding operation in the negative x direction, taking the end position A1(x1, y1) of the first sliding process as the start position, then since there is no operation control adjacent to the return button 602 in the negative x direction, the mobile phone may prompt the user by text, vibration, or the like that there is no operable control there. Alternatively, the mobile phone may display the activated return button 602 in a deactivated state and exit the one-handed operation mode.
That is, by determining the positional relationship between the abscissa x1 of the start position A1 and the abscissa x2 of the end position A2 of the second sliding process, the mobile phone can switch the activated operation control laterally along the x-axis for the user, making it convenient to quickly activate the operation control that needs to be clicked in the first interface.
Similarly, after the first sliding process ends, if the user continues the sliding operation with a second sliding process along the y-axis, the mobile phone may switch the activated first operation control in the home page 601 to a second operation control along the y-axis.
For example, after the mobile phone activates the return button 602 in the home page 601, it may start to detect whether the user inputs the second sliding process in the y direction, taking the end position A1(x1, y1) of the first sliding process as the start position. As shown in fig. 12, when the sliding distance on the y-axis between the start position A1(x1, y1) and the touch point A3(x3, y3) satisfies y3-y1 ≧ D2, the user's finger has moved a distance in the positive y direction, completing the second sliding process of the current sliding operation.
In response to the second sliding process, as shown in fig. 12, the mobile phone may take the option 606 closest to the return button 602 in the home page 601 in the y direction as the operation control to be activated, display the option 606 in an activated state, and at the same time restore the activated return button 602 to a deactivated state.
For example, if the user inputs the second sliding process of the sliding operation in the y-axis direction, the mobile phone may determine the operation control that needs to be activated at this time according to the position of the activated return button 602 in the home page 601. For example, if the return button 602 is located on the left side of the home page 601, the mobile phone may preferentially determine the operation control located below the return button 602 on the left side of the home page 601 as the operation control that needs to be activated; if the return button 602 is located on the right side of the home page 601, the cellular phone may preferentially determine the operation control located below the return button 602 on the right side of the home page 601 as the operation control that needs to be activated.
That is to say, by determining the positional relationship between the ordinate y1 of the start position A1 and the ordinate y3 of the end position A3 of the second sliding process, the mobile phone can switch the activated operation control longitudinally along the y-axis for the user, making it convenient to quickly activate the operation control that needs to be clicked in the first interface.
It can be understood that, after entering the one-handed operation mode, the user may continuously input a plurality of sliding processes in one sliding operation according to the method described in step S504 to switch the activated operation control in the first interface until the activated operation control is the operation control that the user wishes to click this time.
It should be noted that, after the mobile phone switches the activated operation control from the first operation control (e.g., the return button 602) to the second operation control (e.g., the option 606) in response to the second sliding process input by the user, the user may also trigger the mobile phone to switch the activation back from the option 606 to the return button 602 by inputting a sliding process in the direction opposite to the second sliding process. For example, as shown in (a) of fig. 13, after the mobile phone displays the option 606 in an activated state in response to the second sliding process, it may continue to detect the sliding processes input by the user, taking the end position A3 of the second sliding process as the start position. If the user's finger is detected to have moved the preset distance D2 in the negative y direction from the start position A3, the mobile phone may redisplay the return button 602 as activated and display the previously activated option 606 as deactivated, as shown in (b) of fig. 13.
In addition, as shown in fig. 14 (a), the mobile phone displays the return button 602 at the top of the home page 601 in an activated state in response to the first or second sliding process of the current sliding operation. At this time the user's finger has not left the home page 601. If the mobile phone then detects that the user's finger continues to slide in the negative y direction, then since the home page 601 has no other operation controls in the negative y direction of the return button 602, the user's intent is likely to be to exit the one-handed operation mode. The mobile phone may display the activated return button 602 in a deactivated state; with no activated operation control left in the home page 601, the mobile phone has exited the one-handed operation mode.
Accordingly, as shown in (b) of fig. 14, the mobile phone displays the add button 605 at the top of the home page 601 in an activated state in response to the first or second sliding process of the current sliding operation. At this time the user's finger has not left the home page 601. If the mobile phone detects that the user's finger continues to slide in the positive x direction, then since the home page 601 has no other operation controls in the positive x direction of the add button 605, the mobile phone may display the activated add button 605 in a deactivated state, thereby exiting the one-handed operation mode. Or, after the mobile phone displays the add button 605 at the top of the home page 601 in an activated state, if it detects that the user's finger continues to slide in the positive x direction, the mobile phone may instead display the operation control closest to the add button 605 in the y direction (for example, the option 606) in an activated state.
For example, while the mobile phone performs steps S503-S504 in response to the sliding operation input by the user, if the operation control currently activated in the home page 601 is the one the user wishes to click (for example, the return button 602), then, as shown in fig. 15, the user may lift the finger off the home page 601, ending the sliding operation. At this time, the mobile phone may perform the operation of clicking the return button 602 in response to the lifting of the user's finger. For example, the mobile phone may send a touch event to the running WeChat APP; the coordinates of the touch point carried in the touch event may be any coordinate point on the return button 602, and the type of the touch event may be a single-click event. The WeChat APP can then implement the function of clicking the return button 602 in response to the touch event, and the mobile phone can display the upper-level menu reached after the return button 602 is clicked. As shown in fig. 15, the upper-level menu of the home page 601 in the WeChat APP may be the desktop 701.
Therefore, by inputting a single sliding operation, the user can activate the operation control that needs to be controlled in the display interface and have the corresponding click operation performed on it, improving the one-handed user experience while preserving the display content and display effect.
S505, in response to a stay operation of the user's finger in the sliding operation, the mobile phone displays, in the first interface, the secondary menu opened by clicking the second operation control.
In some embodiments, the activated operation control may be an operation control that has a secondary menu. That is, when the user clicks such an operation control, the mobile phone may display, in response to the click, the secondary menu triggered by that control in the current display interface.
Taking the add button 605 as the second operation control activated by the mobile phone in the home page 601, as shown in (a) of fig. 16, the mobile phone may display the add button 605 in an activated state in response to the second sliding process of the sliding operation; at this time, the user's finger has not left the home page 601, that is, the sliding operation input this time has not ended. Since the add button 605 is an operation control with a secondary menu, after the add button 605 is activated, the mobile phone can continue to detect how long the user's finger stays at the end position of the second sliding process (e.g., the touch point A4).
If the stay time of the user's finger at the touch point A4 exceeds a preset stay duration (for example, 1 s), it indicates that the user has input the preset stay operation in the current sliding operation and most likely wants to click the currently activated operation control, i.e., the add button 605. At this time, as shown in (b) of fig. 16, the mobile phone may perform the operation of clicking the add button 605 in response to the stay operation of the user's finger, thereby displaying in the home page 601 the secondary menu 1601 opened by clicking the add button 605. For example, after detecting that the stay time of the user's finger at the touch point A4 exceeds 1 s, the mobile phone may report a click event of the add button 605 to the WeChat APP, so that the WeChat APP displays the secondary menu 1601 in the home page 601 in response to the click event. At this time, the user's finger still has not left the touch screen, that is, the sliding operation input this time has not ended.
In addition to opening the secondary menu 1601 of the add button 605 according to the stay time of the user's finger, the mobile phone may provide other ways to open it. For example, the mobile phone may detect the pressing force of the user's finger at the end position of the second sliding process (e.g., the touch point A4). When the pressing force at the touch point A4 is greater than a pressure threshold, the mobile phone can perform the operation of clicking the add button 605 and display in the home page 601 the secondary menu 1601 opened by the click.
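Both triggers, the 1 s dwell and the hard press, can be sketched over standard MotionEvent fields; the pressure threshold and the click callback are assumptions for illustration.

```java
import android.os.Handler;
import android.os.Looper;
import android.view.MotionEvent;

// Sketch: click the activated control either when the finger stays put for
// the preset stay duration, or when the pressing force exceeds a threshold.
class DwellDetector {
    private static final long DWELL_MS = 1000;              // preset stay time
    private static final float PRESSURE_THRESHOLD = 0.8f;   // assumed value
    private final Handler handler = new Handler(Looper.getMainLooper());
    private final Runnable clickActivated = () -> {
        // dwell expired without further movement: click the activated control
    };

    void onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_MOVE:
                handler.removeCallbacks(clickActivated);       // movement resets
                handler.postDelayed(clickActivated, DWELL_MS); // the dwell timer
                if (event.getPressure() > PRESSURE_THRESHOLD) {
                    handler.removeCallbacks(clickActivated);
                    clickActivated.run();                      // hard press clicks now
                }
                break;
            case MotionEvent.ACTION_UP:
                handler.removeCallbacks(clickActivated);
                break;
        }
    }
}
```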
For another example, after the second sliding process ends, the mobile phone may continue to detect the sliding of the user's finger on the touch screen. If a further sliding process is detected, the mobile phone may query whether the currently activated add button 605 contains a secondary menu. If it does, the mobile phone can perform the operation of clicking the add button 605 and display the secondary menu 1601 in the home page 601. Correspondingly, if the currently activated add button 605 does not contain a secondary menu, the mobile phone may switch the activation to an operation control around the add button 605 according to the user's sliding direction.
In some embodiments, for an operation control without a secondary menu, such as the above-mentioned return button 602, the mobile phone may likewise keep detecting the stay time of the user's finger in the home page 601 after displaying that control in an activated state. If the stay time of the user's finger exceeds the preset stay duration, the mobile phone is triggered to perform the operation of clicking that control. That is to say, the mobile phone may perform a click operation on the activated operation control in response to the user's finger leaving the touch screen, or in response to a stay operation of the user's finger, which is not limited in this embodiment of the present application.
S506, in response to a third sliding process of the user's finger in the sliding operation, the mobile phone displays a third operation control in the secondary menu in an activated state.
Continuing with the secondary menu 1601 of the add button 605 as an example, as shown in fig. 17, the stay of the user's finger at the touch point A4 triggers the mobile phone to display the secondary menu 1601 on the home page 601. The operation controls in the secondary menu 1601 (e.g., the scan button, the receipt and payment button, and the add button in fig. 17) can be arranged vertically along the y-axis, or horizontally along the x-axis (the latter not shown in fig. 17). The mobile phone can continue to detect the sliding distance of the user's finger on the x-axis or the y-axis, taking the touch point A4 as the start position. Similar to the second sliding process, if the sliding distance of the user's finger on the x-axis or the y-axis is greater than or equal to a preset value D3 (D1, D2 and D3 may be the same or different), it indicates that the user has completed the third sliding process of the sliding operation.
Still as shown in fig. 17, when the mobile phone detects that the user's finger has moved from the start position A4(x4, y4) to the touch point A5(x5, y5), the sliding distance on the y-axis can be calculated as y5-y4 ≧ D3, indicating that after the secondary menu 1601 was displayed the user's finger continued to move a distance in the positive y direction, completing the third sliding process of the current sliding operation. At this time, in response to the third sliding process, the mobile phone may enter the one-handed operation mode within the secondary menu 1601 of the home page 601 and display the operation control closest to the add button 605 in the secondary menu 1601 along the positive y direction (for example, the scan button 1701) in an activated state.
For example, after the mobile phone displays the scan button 1701 in the secondary menu 1601 in an activated state, it may continue to switch the activated operation control in the secondary menu 1601 in response to further sliding processes input by the user's finger in the home page 601. The specific method is the same as the method of switching the activated operation control in the home page 601 in step S504, and is therefore not repeated here.
For example, as shown in (a) of fig. 18, when the receipt and payment button 1801 in the secondary menu 1601 is activated, the mobile phone may detect the sliding distance of the user's finger on the x-axis or the y-axis, taking the touch point B1 of the user's finger in the home page 601 as the start position. If the user's finger is detected to move a preset distance in the negative y direction from the touch point B1 to the touch point B2, the mobile phone may switch the activation back to the scan button 1701 in the secondary menu 1601 and display the receipt and payment button 1801 in a deactivated state, as shown in (b) of fig. 18.
As also shown in (b) of fig. 18, when the scan button 1701 in the secondary menu 1601 is activated, the mobile phone can detect the sliding distance of the user's finger on the x-axis or the y-axis, taking the touch point B2 as the start position. If the user's finger is detected to move a preset distance in the negative y direction from the touch point B2 to the touch point B3, then since the scan button 1701 is already the topmost operation control of the secondary menu 1601 in the y direction, the mobile phone may exit the secondary menu 1601 in response to this sliding process and return to the state in which the add button 605 in the home page 601 is activated, as shown in (c) of fig. 18.
It can be seen that, in the embodiment of the application, by inputting a pull-down sliding operation the user can trigger the mobile phone to activate the operation controls in the display interface, to open a secondary menu of an operation control, and to activate the corresponding operation control in that secondary menu according to the sliding processes of the user's finger. In this way, with a single sliding operation the user can enter a secondary menu of the display interface and control the corresponding operation control with one hand, which simplifies the interaction of one-handed operation across multi-level menus.
S507, in response to the user's finger lifting off in the sliding operation, the mobile phone displays the second interface opened by clicking the third operation control.
For example, after the user inputs the third sliding process of the sliding operation and thereby triggers the mobile phone to activate the scan button 1701 in the secondary menu 1601, as shown in (a) of fig. 19, if the currently activated scan button 1701 is the operation control the user needs to click this time, the user may lift the finger off the home page 601, ending the sliding operation.
At this time, the mobile phone may perform the operation of clicking the scan button 1701 in response to the lifting of the user's finger. For example, the mobile phone may send a click event of the scan button 1701 to the running WeChat APP, and the WeChat APP may display the scan interface 1901 entered after the scan button 1701 is clicked in response to that event, as shown in (b) of fig. 19.
In some embodiments, some APPs may hide a pull-down refresh control in their display interface. As shown in fig. 20 (a), the mobile phone is displaying the home page 2001 of a music APP. The top of the home page 2001 contains three operation controls: a settings button 2002, a search bar 2003, and a more-options control 2004. A pull-down refresh control is hidden below these three operation controls. In general, if it is detected that the user inputs a sliding operation in the positive y direction (i.e., a pull-down operation) in the home page 2001, the mobile phone may display the hidden pull-down refresh control and perform a refresh operation.
In the embodiment of the present application, as also shown in fig. 20 (a), if the mobile phone detects that the user's finger moves the distance D1 from the touch point C1 to the touch point C2 in the positive y direction, indicating that the user has completed the first sliding process of the current sliding operation, the mobile phone may, in response to that process and as described in step S503, display the top operation control (for example, the settings button 2002) in the home page 2001 in an activated state.
Further, if the user wants to switch the activated operation control in the home page 2001 longitudinally along the y-axis, then as shown in (b) of fig. 20, the user can continue sliding from the touch point C2, as the start position, a distance D2 in the positive y direction to the touch point C3, completing the second sliding process of the current sliding operation. At this time, as described in step S504, the mobile phone may query the operation control closest to the activated settings button 2002 in the positive y direction, which is the hidden pull-down refresh control 2005. As also shown in (b) of fig. 20, the mobile phone may then show the hidden pull-down refresh control 2005 in the home page 2001, display it in an activated state, and display the settings button 2002 in a deactivated state. If it is then detected that the user's finger leaves the home page 2001, that is, the current sliding operation ends, the mobile phone may execute the refresh operation corresponding to the pull-down refresh control 2005.
Alternatively, as shown in (c) of fig. 20, if the user's finger does not leave the home page 2001, the mobile phone may continue to detect sliding processes of the user's finger in the positive y direction starting from the touch point C3; in that case the mobile phone may still show the hidden pull-down refresh control 2005 in the home page 2001 and display it in an activated state. Of course, if the user's finger is detected to slide in the positive x direction starting from the touch point C3, the mobile phone may switch the activated operation control from the settings button 2002 to the search bar 2003 or the more-options control 2004 in the positive x direction. For the specific method of switching the activated operation control, refer to the related description of step S504.
That is, after the one-handed operation mode is entered, the pull-down refresh control 2005 is the last operation control in the home page 2001 that can be reached by switching longitudinally along the y-axis. In this way, after the user inputs a pull-down operation, the mobile phone can both be triggered to enter the one-handed operation mode to control the operation controls at the top of the display interface and still support the pull-down refresh operation of the display interface, avoiding a conflict, within the same display interface, between the sliding operation that enters the one-handed operation mode and the pull-down refresh operation the interface supports.
Alternatively, in other embodiments, as shown in fig. 21 (a), after the mobile phone detects that the sliding of the user's finger from the touch point C1 to the touch point C2 completes the first sliding process of the sliding operation, it may query whether a pull-down refresh control 2005 is hidden in the current home page 2001. If not, the mobile phone may enter the one-handed operation mode in response to the first sliding process as described in step S503 and display the top operation control (for example, the settings button 2002) in the home page 2001 in an activated state.
Accordingly, if the pull-down refresh control 2005 is hidden in the home page 2001, as shown in (b) of fig. 21, the mobile phone may display the hidden pull-down refresh control 2005 in the home page 2001 and show it in an activated state, thereby prompting the user that the pull-down refresh control 2005 has been selected. If the user's finger is then detected leaving the home page 2001, the mobile phone may perform the refresh operation corresponding to the pull-down refresh control 2005.
Alternatively, as also shown in (b) of fig. 21, if the mobile phone detects that the user's finger continues to slide along the y-axis from touch point C2 to touch point C3 after the pull-down refresh control 2005 is activated, this indicates that the user does not intend the current sliding operation to refresh the home page 2001, but rather intends to enter the one-handed operation mode to control the operation controls in the home page 2001. In this case, as shown in (c) of fig. 21, the mobile phone may hide the pull-down refresh control 2005 again in the home page 2001 and display the operation control at the top of the home page 2001 (for example, the setting button 2002) in an activated state according to the method described in step S503. Subsequently, the mobile phone can control the operation controls in the home page 2001 in response to the sliding operation input by the user according to the method described in steps S504-S507.
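The branch between refreshing and entering the one-handed operation mode can be summarized as a small state machine. The sketch below is a non-authoritative reading of the flow in fig. 21; the class RefreshOrOneHanded and its method names are assumptions, not an implementation prescribed by the patent.

// Alternative flow of fig. 21: a hidden pull-down refresh control takes
// priority after the first sliding process, and a continued slide re-hides
// it and activates the top operation control instead.
class RefreshOrOneHanded(private val hasHiddenRefresh: Boolean) {
    private var refreshActive = false

    // First sliding process complete (C1 -> C2), as in (a)/(b) of fig. 21.
    fun firstSlideDone(): String {
        refreshActive = hasHiddenRefresh
        return if (refreshActive) "show and activate the pull-down refresh control"
        else "enter one-handed mode: activate the top control (step S503)"
    }

    // Finger keeps sliding along the y-axis (C2 -> C3), as in (c) of fig. 21.
    fun continuedSlide(): String {
        refreshActive = false
        return "hide the refresh control again; activate the top control"
    }

    // Finger leaves the interface: the sliding operation ends.
    fun fingerLifted(): String =
        if (refreshActive) "execute the refresh operation"
        else "perform the click on the activated control (steps S504-S507)"
}

fun main() {
    val flow = RefreshOrOneHanded(hasHiddenRefresh = true)
    println(flow.firstSlideDone())  // refresh control shown and activated
    println(flow.continuedSlide())  // user keeps sliding: one-handed mode instead
    println(flow.fingerLifted())    // click-equivalent on the activated control
}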
It should be noted that the foregoing embodiment is described by using an example in which the pull-down refresh control 2005 is hidden in the home page 2001. It can be understood that the mobile phone may also display a corresponding refresh control in the display interface of an APP. In that case, the method for activating the refresh control in the display interface through a sliding operation input by the user is the same as the method for activating the hidden pull-down refresh control 2005 in the foregoing embodiment, and details are not described here again.
It can be seen that, in this embodiment of the application, the user inputs a single complete sliding operation, and the mobile phone can activate, in response to the sliding processes at different stages of that operation, the operation control that the user needs to click in the display interface (or in a secondary menu of the display interface), and then perform the corresponding click operation on the activated control. This implements a one-handed control function for the operation controls without affecting the display content or display effect of the display interface, thereby improving the user experience during one-handed operation.
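To make the click-equivalent at the end of the slide concrete, here is a minimal sketch with invented names (OneHandedDispatcher, ClickListener): when the touch body lifts, a click event for the currently activated control is reported to the application, mirroring what a real tap on that control would produce.

// When the sliding operation ends, report a click event for the activated
// control so the application opens the corresponding "second interface".
fun interface ClickListener { fun onClick(controlName: String) }

class OneHandedDispatcher(private val app: ClickListener) {
    private var activated: String? = null

    // A sliding process has marked this control as activated.
    fun activate(controlName: String) { activated = controlName }

    // The touch body leaves the screen: the sliding operation is over.
    fun onTouchUp() {
        activated?.let(app::onClick)  // equivalent to tapping the control
        activated = null
    }
}

fun main() {
    val dispatcher = OneHandedDispatcher { name ->
        println("Application opens the interface behind '$name'")
    }
    dispatcher.activate("setting button 2002")
    dispatcher.onTouchUp()
}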
An embodiment of the present application discloses an electronic device, including a processor, and a memory, an input device, and an output device that are connected to the processor. The input device and the output device may be integrated into one device; for example, a touch sensor may serve as the input device, a display screen may serve as the output device, and the touch sensor and the display screen may be integrated into a touch screen.
In this case, as shown in fig. 22, the electronic device may include: a touch screen 2201, where the touch screen 2201 includes a touch sensor 2206 and a display 2207; one or more processors 2202; a memory 2203; one or more application programs (not shown); and one or more computer programs 2204, where the foregoing components may be connected via one or more communication buses 2205. The one or more computer programs 2204 are stored in the memory 2203 and configured to be executed by the one or more processors 2202, and include instructions that may be used to perform the steps in the foregoing embodiments. For all relevant content of those steps, refer to the functional description of the corresponding physical device; details are not described here again.
For example, the processor 2202 may be specifically the processor 110 shown in fig. 1B, the memory 2203 may be specifically the internal memory 121 shown in fig. 1B, the display 2207 may be specifically the display 194 shown in fig. 1B, and the touch sensor 2206 may be specifically the touch sensor in the sensor module 180 shown in fig. 1B. This is not limited in this embodiment of the present application.
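Purely as an illustration of the composition in fig. 22 (every name below is an assumption for this sketch), the device can be modeled as a touch screen integrating a touch sensor and a display, plus processors and a memory holding the program instructions.

// Touch screen 2201 = touch sensor 2206 + display 2207; one or more
// processors 2202; memory 2203 storing the computer programs 2204.
class TouchSensor
class Display
class TouchScreen(val sensor: TouchSensor, val display: Display)

class ElectronicDevice(
    val touchScreen: TouchScreen,
    val processors: List<String>,
    val memory: MutableList<String>
)

fun main() {
    val device = ElectronicDevice(
        touchScreen = TouchScreen(TouchSensor(), Display()),
        processors = listOf("processor 2202"),
        memory = mutableListOf("computer program 2204: control-operation steps")
    )
    println("Device ready with ${device.processors.size} processor(s)")
}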
Through the above description of the embodiments, a person skilled in the art can clearly understand that, for convenience and brevity of description, the division into the foregoing functional modules is merely used as an example. In practical applications, the foregoing functions may be allocated to different functional modules as required; that is, the internal structure of the device may be divided into different functional modules to implement all or some of the functions described above. For the specific working processes of the system, apparatus, and units described above, refer to the corresponding processes in the foregoing method embodiments; details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present application essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for instructing a computer device (which may be a personal computer, a server, or a network device) or a processor to perform all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium that can store program code, such as a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The above descriptions are merely specific implementations of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto. Any variation or replacement within the technical scope disclosed in the embodiments of the present application shall fall within the protection scope of the embodiments of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the protection scope of the claims.

Claims (28)

1. A method for operating a control, comprising:
displaying a first interface of an application, wherein the application is installed in an electronic device with a touch screen, and a first operation control is arranged at the top of the first interface;
receiving a sliding operation input by a user on the first interface;
when the sliding operation is detected to meet a first sliding process, displaying the first operation control in an activated state, wherein the first sliding process is a process in which a touch body slides a first preset distance along a first direction after contacting the first interface;
and in response to the end of the sliding operation, displaying a second interface, wherein the second interface is an interface to be displayed after the first operation control is clicked.
2. The method of claim 1, wherein after displaying the first operation control in an activated state, the method further comprises:
detecting that the sliding operation meets a second sliding process, wherein the starting position of the second sliding process is the end position of the first sliding process, and the touch body keeps touching the touch screen of the electronic device during the first sliding process and the second sliding process;
and in response to detecting that the sliding operation meets the second sliding process, displaying a second operation control in the first interface as an activated state, and displaying the first operation control as a deactivated state.
3. The method of claim 2, wherein
when the projection of the sliding distance in the second sliding process in the second direction is greater than a second preset distance, the second operation control is the operation control closest to the first operation control along the second direction, and the second direction is the same as or different from the first direction.
4. The method of claim 2 or 3, wherein after displaying the second operation control in the first interface in an activated state and the first operation control in a deactivated state, the method further comprises:
detecting that the sliding operation meets a third sliding process, wherein the sliding direction of the third sliding process is opposite to the sliding direction of the second sliding process, and the difference between the sliding distance of the third sliding process and the sliding distance of the second sliding process is smaller than a preset value;
in response to detecting that the sliding operation satisfies the third sliding process, displaying the first operation control as the activated state again, and displaying the second operation control as the deactivated state.
5. The method according to any one of claims 1-4, wherein the first operation control is a control comprising a secondary menu; and after the first operation control is displayed in an activated state, the method further comprises:
detecting the staying time of the touch body at the end position of the first sliding process;
and if the staying time is longer than a preset time, displaying a secondary menu of the first operation control in the first interface.
6. The method of claim 5, wherein after displaying the secondary menu of the first operation control in the first interface, the method further comprises:
detecting that the sliding operation meets a fourth sliding process, wherein the starting position of the fourth sliding process is the end position of the first sliding process, and the touch body keeps touching the touch screen of the electronic device during the fourth sliding process;
and displaying a third operation control in the secondary menu as an activated state in response to the sliding operation meeting the fourth sliding process.
7. The method of any one of claims 1-6, wherein a refresh button is hidden within the first interface, the refresh button being adjacent to the first operation control in a third direction, and the third direction being the same as or different from the first direction;
wherein after displaying the first operation control in an activated state, the method further comprises:
detecting that the sliding operation meets a fifth sliding process, wherein the fifth sliding process is a process of moving a third preset distance in the third direction from the end position of the first sliding process, and the touch body keeps touching the touch screen of the electronic device during the fifth sliding process;
in response to detecting that the sliding operation satisfies the fifth sliding process, displaying the refresh button in the first interface, and displaying the refresh button in an activated state.
8. The method of claim 7, wherein after displaying the refresh button in the first interface and displaying the refresh button in an activated state, the method further comprises:
if the touch body is detected to leave the first interface, executing a refreshing operation corresponding to the clicking of the refreshing button;
and if the touch body is detected to continue sliding, continuing to display the refresh button in an activated state.
9. The method according to any one of claims 1 to 8,
when the starting position of the first sliding process is located in the left half part of the first interface, the first operation control is the operation control at the leftmost side of the top of the first interface; when the starting position of the first sliding process is located in the right half part of the first interface, the first operation control is the operation control at the rightmost side of the top of the first interface; or,
when the end position of the first sliding process is located in the left half part of the first interface, the first operation control is the operation control at the leftmost side of the top of the first interface; when the end position of the first sliding process is located in the right half part of the first interface, the first operation control is the operation control at the rightmost side of the top of the first interface.
10. The method according to any one of claims 1 to 8,
when the holding posture of the user is a left-hand holding posture, the first operation control is an operation control on the rightmost side of the top of the first interface;
when the holding gesture of the user is a right-hand holding gesture, the first operation control is an operation control on the leftmost side of the top of the first interface.
11. The method of any one of claims 1-10, wherein displaying the first operation control in an activated state comprises:
displaying the first operation control in an activated state by changing the shape, size, color or background of the first operation control.
12. The method according to any one of claims 1-11, wherein the end of the sliding operation means that the touch body leaves the touch screen of the electronic device;
wherein, in response to the end of the sliding operation, displaying a second interface includes:
when it is detected that the touch body leaves the touch screen of the electronic device, reporting a click event of the first operation control to the application, so that the application displays, in response to the click event, the second interface to be displayed after the first operation control is clicked.
13. The method of any one of claims 1-12, wherein the first direction is a direction pointing from a top of the first interface to a bottom of the first interface.
14. An electronic device, comprising:
a touch screen, wherein the touch screen comprises a touch sensor and a display screen;
one or more processors;
a memory;
wherein the memory has stored therein one or more computer programs, the one or more computer programs comprising instructions, which when executed by the electronic device, cause the electronic device to perform the steps of:
displaying a first interface of an application, wherein a first operation control is arranged at the top of the first interface;
receiving a sliding operation input by a user on the first interface;
when the sliding operation is detected to meet a first sliding process, displaying the first operation control in an activated state, wherein the first sliding process is a process in which a touch body slides a first preset distance along a first direction after contacting the first interface;
and in response to the end of the sliding operation, displaying a second interface, wherein the second interface is an interface to be displayed after the first operation control is clicked.
15. The electronic device of claim 14, wherein after displaying the first operation control in the activated state, the electronic device is further configured to perform:
detecting that the sliding operation meets a second sliding process, wherein the starting position of the second sliding process is the end position of the first sliding process, and the touch body keeps touching the touch screen in the first sliding process and the second sliding process;
and in response to detecting that the sliding operation meets the second sliding process, displaying a second operation control in the first interface as an activated state, and displaying the first operation control as a deactivated state.
16. The electronic device according to claim 15, wherein when a projection of a sliding distance in the second sliding process in a second direction is greater than a second preset distance, the second operation control is an operation control closest to the first operation control in the second direction, and the second direction is the same as or different from the first direction.
17. The electronic device according to claim 15 or 16, wherein after displaying the second operation control in the first interface in an activated state and the first operation control in a deactivated state, the electronic device is further configured to perform:
detecting that the sliding operation meets a third sliding process, wherein the sliding direction of the third sliding process is opposite to the sliding direction of the second sliding process, and the difference between the sliding distance of the third sliding process and the sliding distance of the second sliding process is smaller than a preset value;
in response to detecting that the sliding operation satisfies the third sliding process, displaying the first operation control as the activated state again, and displaying the second operation control as the deactivated state.
18. The electronic device of any one of claims 14-17, wherein the first operation control is a control comprising a secondary menu;
after displaying the first operation control in the activated state, the electronic device is further configured to perform:
detecting the staying time of the touch body at the end position of the first sliding process;
and if the staying time is longer than a preset time, displaying a secondary menu of the first operation control in the first interface.
19. The electronic device of claim 18, wherein after displaying the secondary menu of the first operation control in the first interface, the electronic device is further configured to perform:
detecting that the sliding operation meets a fourth sliding process, wherein the starting position of the fourth sliding process is the end position of the first sliding process, and the touch body keeps touching the touch screen in the fourth sliding process;
and displaying a third operation control in the secondary menu as an activated state in response to the sliding operation meeting the fourth sliding process.
20. The electronic device of any one of claims 14-19, wherein a refresh button is hidden within the first interface, the refresh button being adjacent to the first operation control in a third direction, and the third direction being the same as or different from the first direction;
after displaying the first operation control in the activated state, the electronic device is further configured to perform:
detecting that the sliding operation meets a fifth sliding process, wherein the fifth sliding process is a process of moving a third preset distance in the third direction from the end position of the first sliding process, and in the fifth sliding process, the touch body keeps touching the touch screen;
in response to detecting that the sliding operation satisfies the fifth sliding process, displaying the refresh button in the first interface, and displaying the refresh button in an activated state.
21. The electronic device of claim 20, wherein after displaying the refresh button in the first interface and displaying the refresh button in an activated state, the electronic device is further configured to perform:
if the touch body is detected to leave the first interface, executing a refreshing operation corresponding to the clicking of the refreshing button;
and if the touch body is detected to continue sliding, continuing to display the refresh button in an activated state.
22. The electronic device of any of claims 14-21,
when the starting position of the first sliding process is located in the left half part of the first interface, the first operation control is the operation control at the leftmost side of the top of the first interface; when the starting position of the first sliding process is located in the right half part of the first interface, the first operation control is the operation control at the rightmost side of the top of the first interface; or,
when the end position of the first sliding process is located in the left half part of the first interface, the first operation control is the operation control at the leftmost side of the top of the first interface; when the end position of the first sliding process is located in the right half part of the first interface, the first operation control is the operation control at the rightmost side of the top of the first interface.
23. The electronic device of any of claims 14-21,
when the holding posture of the user is a left-hand holding posture, the first operation control is an operation control on the rightmost side of the top of the first interface;
when the holding gesture of the user is a right-hand holding gesture, the first operation control is an operation control on the leftmost side of the top of the first interface.
24. The electronic device according to any one of claims 14 to 23, wherein the displaying, by the electronic device, the first operation control in the activated state specifically includes:
displaying the first operation control in an activated state by changing the shape, size, color or background of the first operation control.
25. The electronic device according to any one of claims 14 to 24, wherein the end of the sliding operation means that the touch body leaves the touch screen;
wherein, in response to the end of the sliding operation, the electronic device displays a second interface, which specifically includes:
when it is detected that the touch body leaves the touch screen, reporting a click event of the first operation control to the application, so that the application displays, in response to the click event, the second interface to be displayed after the first operation control is clicked.
26. The electronic device of any one of claims 14-25, wherein the first direction is a direction from a top of the first interface to a bottom of the first interface.
27. A computer-readable storage medium having instructions stored therein, which, when run on an electronic device, cause the electronic device to perform the method for operating a control according to any one of claims 1-13.
28. A computer program product containing instructions that, when the computer program product is run on an electronic device, cause the electronic device to perform the method for operating a control according to any one of claims 1-13.
CN201911230192.4A 2019-12-04 2019-12-04 Control operation method and electronic equipment Active CN111147660B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911230192.4A CN111147660B (en) 2019-12-04 2019-12-04 Control operation method and electronic equipment
PCT/CN2020/133858 WO2021110133A1 (en) 2019-12-04 2020-12-04 Control operation method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911230192.4A CN111147660B (en) 2019-12-04 2019-12-04 Control operation method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111147660A true CN111147660A (en) 2020-05-12
CN111147660B CN111147660B (en) 2021-06-15

Family

ID=70517659

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911230192.4A Active CN111147660B (en) 2019-12-04 2019-12-04 Control operation method and electronic equipment

Country Status (2)

Country Link
CN (1) CN111147660B (en)
WO (1) WO2021110133A1 (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111147660B (en) * 2019-12-04 2021-06-15 华为技术有限公司 Control operation method and electronic equipment

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080291162A1 (en) * 2004-02-27 2008-11-27 Research In Motion Limited Track wheel with reduced space requirements
US10282080B2 (en) * 2005-02-18 2019-05-07 Apple Inc. Single-handed approach for navigation of application tiles using panning and zooming
EP2020634A1 (en) * 2007-07-31 2009-02-04 Palo Alto Research Institute Incorporated User interface for a context-aware leisure-activity recommendation system
CN102625931A (en) * 2009-07-20 2012-08-01 惠普发展公司,有限责任合伙企业 User interface for initiating activities in an electronic device
US9367205B2 (en) * 2010-02-19 2016-06-14 Microsoft Technolgoy Licensing, Llc Radial menus with bezel gestures
CN107111423A (en) * 2015-01-06 2017-08-29 甲骨文国际公司 Select the operable item in the graphic user interface of mobile computer system
CN106155452A (en) * 2015-03-23 2016-11-23 华为技术有限公司 The implementation method of a kind of one-handed performance and terminal
CN105094801A (en) * 2015-06-12 2015-11-25 阿里巴巴集团控股有限公司 Application function activating method and application function activating device
CN105224187A (en) * 2015-09-23 2016-01-06 北京金山安全软件有限公司 Menu execution control method and terminal equipment
CN106406656A (en) * 2016-08-30 2017-02-15 维沃移动通信有限公司 Application program toolbar control method and mobile terminal
CN109753326A (en) * 2017-11-06 2019-05-14 阿里巴巴集团控股有限公司 Processing method, device, equipment and machine readable media

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021110133A1 (en) * 2019-12-04 2021-06-10 华为技术有限公司 Control operation method and electronic device
CN111679778A (en) * 2020-06-01 2020-09-18 掌阅科技股份有限公司 Interactive display method of electronic book comment information, electronic equipment and storage medium
CN111782327A (en) * 2020-06-30 2020-10-16 百度在线网络技术(北京)有限公司 Display control method, display control device, electronic equipment and readable medium
CN113093795A (en) * 2021-03-30 2021-07-09 华南理工大学 Semi-automatic wireless control method and device for unmanned surface vehicle
CN113190230A (en) * 2021-05-21 2021-07-30 广东群创信息科技有限公司 Medical display method and system for setting UI interface in user-defined mode
CN113190230B (en) * 2021-05-21 2024-05-03 广东群创信息科技有限公司 Medical display method and system for user-defined setting of UI (user interface)
WO2024087940A1 (en) * 2022-10-28 2024-05-02 Oppo广东移动通信有限公司 Application interface control method and apparatus, electronic device, and storage medium

Also Published As

Publication number Publication date
CN111147660B (en) 2021-06-15
WO2021110133A1 (en) 2021-06-10

Similar Documents

Publication Publication Date Title
JP7414842B2 (en) How to add comments and electronic devices
CN110351422B (en) Notification message preview method, electronic equipment and related products
US12001612B2 (en) Air gesture-based interaction method and electronic device
CN111147660B (en) Control operation method and electronic equipment
WO2021115194A1 (en) Application icon display method and electronic device
CN111371949A (en) Application program switching method and device, storage medium and touch terminal
CN110806831A (en) Touch screen response method and electronic equipment
CN112068907A (en) Interface display method and electronic equipment
WO2021057699A1 (en) Method for controlling electronic device with flexible screen, and electronic device
CN113746961A (en) Display control method, electronic device, and computer-readable storage medium
CN114035721B (en) Touch screen display method and device and storage medium
CN114816200A (en) Display method and electronic equipment
JP2024513773A (en) Display methods, electronic devices, storage media, and program products
US20230186013A1 (en) Annotation method and electronic device
CN115904160A (en) Icon moving method, related graphical interface and electronic equipment
EP4365722A1 (en) Method for displaying dock bar in launcher and electronic device
WO2022222688A1 (en) Window control method and device
CN115700431A (en) Desktop display method and electronic equipment
WO2022188632A1 (en) Theme display method and apparatus, terminal, and computer storage medium
CN116610248A (en) Gesture control method and electronic device
CN117008772A (en) Display method of application window and electronic equipment
CN116166156A (en) Icon moving method and related device
CN117648040A (en) Method for generating desktop folder and electronic equipment
CN117411964A (en) Implementation method of single-hand operation and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant