WO2022252788A1 - Control Method and Electronic Device (一种控制方法及电子设备) - Google Patents

Control Method and Electronic Device (一种控制方法及电子设备)

Info

Publication number
WO2022252788A1
WO2022252788A1 (PCT/CN2022/084089, CN2022084089W)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
touch screen
touch
electronic device
control center
Prior art date
Application number
PCT/CN2022/084089
Other languages
English (en)
French (fr)
Inventor
丁晨鹏 (Ding Chenpeng)
李旭 (Li Xu)
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to EP22814835.9A (published as EP4332744A1)
Priority to BR112023023988A (published as BR112023023988A2)
Publication of WO2022252788A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F3/0484: For the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F3/0485: Scrolling or panning
    • G06F3/0487: Using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: For inputting data by handwriting, e.g. gesture or text
    • G06F3/04886: By partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present application relates to the technical field of terminals, and in particular to a control method and an electronic device.
  • the notification center may be an entrance for managing pushes from applications (APPs) on an electronic device or for displaying resident status information.
  • the control center may be the entrance for controlling the state of the device.
  • in some cases, the notification center and the control center on an electronic device are displayed simultaneously in one window.
  • for example, as shown in FIG. 1, sliding down from the top of the screen of the electronic device can call out the control center 11 and the notification center 12 at the same time, where the control center 11 is on the upper side and the notification center 12 is on the lower side; the control center 11 can be collapsed by default to display the most commonly used shortcut switches and supports expanding to view more information, and the notification center 12 can support scrolling up and down like a list.
  • however, the display interface of the electronic device is overcrowded when the notification center and the control center are displayed in this way, and control centers on electronic devices are not designed to carry so much information at once. Therefore, how to make it convenient for the user to operate the notification center and the control center on the electronic device is an urgent technical problem to be solved.
  • to this end, the present application provides a control method and an electronic device, which make it convenient for a user to operate the notification center and the control center on the electronic device.
  • in a first aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, where the first operation is an operation in which the initial position at which a touch body contacts the touch screen is located in a first area of the touch screen and the touch body slides on the touch screen along a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, where the second operation is an operation in which the touch body re-contacts the touch screen in a second area of the touch screen and slides on the touch screen along a second direction; and after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the third interface to the first interface, where the third operation is an operation in which the initial position at which the touch body re-contacts the touch screen is located in a third area of the touch screen and the touch body slides on the touch screen along a third direction. The second interface is the display interface of the notification center and the third interface is the display interface of the control center; or the second interface is the display interface of the control center and the third interface is the display interface of the notification center.
  • in this way, the user can switch between the notification center and the control center on the electronic device, and can return directly from the switched interface to the initial upper-level interface of the electronic device (such as the desktop or the display interface of an application), which improves the convenience of user operation and the user experience.
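  • to make this three-operation flow concrete, the following is a minimal, hypothetical sketch of it as a state machine in Java; the type and method names are illustrative assumptions and do not come from the patent.

```java
// A minimal, hypothetical sketch of the three-operation flow described above;
// all names here are illustrative, not from the patent.
enum PanelState { FIRST_INTERFACE, SECOND_INTERFACE, THIRD_INTERFACE }

final class PanelStateMachine {
    private PanelState state = PanelState.FIRST_INTERFACE;

    // op = 1, 2 or 3, corresponding to the first/second/third operations.
    PanelState onOperation(int op) {
        switch (state) {
            case FIRST_INTERFACE:
                if (op == 1) state = PanelState.SECOND_INTERFACE; // e.g. desktop -> notification center
                break;
            case SECOND_INTERFACE:
                if (op == 2) state = PanelState.THIRD_INTERFACE;  // e.g. notification center -> control center
                break;
            case THIRD_INTERFACE:
                if (op == 3) state = PanelState.FIRST_INTERFACE;  // e.g. control center -> back to desktop
                break;
        }
        return state;
    }
}
```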
  • in a second aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, where the first operation is an operation in which the initial position at which a touch body contacts the touch screen is located in a first area of the touch screen and the touch body slides on the touch screen along a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, where the second operation is an operation in which the touch body re-contacts the touch screen in a second area of the touch screen and slides on the touch screen along a second direction; and after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving the first operation again, switching the third interface to the second interface. The second interface is the display interface of the notification center and the third interface is the display interface of the control center; or the second interface is the display interface of the control center and the third interface is the display interface of the notification center.
  • in some embodiments, after the third interface is switched to the second interface, the method may further include: after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the second interface to the first interface, where the third operation is an operation in which the initial position at which the touch body re-contacts the touch screen is located in a third area of the touch screen and the touch body slides on the touch screen along a third direction.
  • in this way, the user can return directly from the switched interface to the initial upper-level interface of the electronic device (such as the desktop or the display interface of an application), which improves the convenience of user operation and the user experience.
  • in some embodiments, the first interface includes the display interface of the desktop on the electronic device, or the first interface includes the display interface of an application on the electronic device.
  • in some embodiments, both the second interface and the third interface are displayed in a first window. Therefore, when switching between the two interfaces, the content of one interface can be replaced with the content of the other interface, so there is no need to close one window and open another, which improves switching efficiency.
  • in some embodiments, the first window is a status bar window.
  • in some embodiments, the first interface and the second interface are displayed in different windows.
  • for example, when the first interface is the display interface of an application, the first interface may be displayed in the display window of the application, and the second interface may be displayed in the status bar window.
  • in some embodiments, the first area is located on a first side of the top of the touch screen, and the first direction is the direction from the top of the touch screen toward the bottom of the touch screen; the second area is located on a second side of the top of the touch screen, and the second direction is the same as the first direction; the third area is the area on the touch screen other than the first area and the second area, and the third direction is opposite to the first direction. A sketch of this region-and-direction classification is given below.
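  • a minimal sketch of the classification just described, assuming the first and second areas are the left and right halves of a band at the top edge of the screen; the class name, the band height, and the halving are illustrative assumptions, not details taken from the patent.

```java
// Hypothetical region/direction classifier for the areas described above.
final class GestureRegionClassifier {
    private final int screenWidth;
    private final int topBandHeight; // height of the "top of the screen" band, an assumption

    GestureRegionClassifier(int screenWidth, int topBandHeight) {
        this.screenWidth = screenWidth;
        this.topBandHeight = topBandHeight;
    }

    /** Returns 1, 2 or 3 for the first/second/third area of the touch-down point. */
    int classifyArea(float downX, float downY) {
        if (downY <= topBandHeight) {
            return downX < screenWidth / 2f ? 1 : 2; // top-left vs top-right
        }
        return 3; // anywhere outside the top band
    }

    /** Downward slide matches the first/second direction; upward matches the third. */
    boolean isDownwardSlide(float downY, float currentY) {
        return currentY > downY;
    }
}
```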
  • in some embodiments, the notification center is an entry on the electronic device for managing pushes from applications on the electronic device or for displaying resident status information; the control center is an entry on the electronic device for controlling the state of the electronic device.
  • in some embodiments, before switching a first target interface to a second target interface, the method may further include: determining that the operation of the touch body on the touch screen meets a trigger condition, the trigger condition being a condition for triggering interface switching; where the first target interface is the first interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the third interface; or the first target interface is the third interface and the second target interface is the first interface; or the first target interface is the third interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the first interface.
  • in this way, the interface is switched only when the trigger condition is met, which improves the switching effect.
  • in some embodiments, the trigger condition may specifically include: the distance between the position at which the touch body touches the touch screen at the current moment and the initial position is greater than or equal to a preset distance threshold.
  • in some embodiments, the trigger condition may specifically include: the position at which the touch body touches the touch screen at the current moment reaches a preset position on the touch screen.
  • in some embodiments, the trigger condition may specifically include: the distance between the position at which the touch body leaves the touch screen and the initial position is less than the preset distance threshold, and the speed of the touch body when it leaves the touch screen is greater than or equal to a preset speed threshold. A sketch evaluating these three conditions is given below.
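  • the three trigger conditions above can be expressed as simple predicates. The following Java sketch is hypothetical, and the threshold values are illustrative placeholders rather than values from the patent.

```java
// Hypothetical evaluation of the three trigger conditions listed above;
// any one of them may be used to trigger the interface switch.
final class TriggerCondition {
    static final float DISTANCE_THRESHOLD_PX = 300f;  // illustrative value
    static final float POSITION_THRESHOLD_Y  = 600f;  // y of a preset position, illustrative
    static final float SPEED_THRESHOLD_PX_S  = 1000f; // illustrative value

    // Condition 1: slid far enough from the touch-down (initial) point.
    static boolean byDistance(float downY, float currentY) {
        return Math.abs(currentY - downY) >= DISTANCE_THRESHOLD_PX;
    }

    // Condition 2: reached a preset position on the screen.
    static boolean byPosition(float currentY) {
        return currentY >= POSITION_THRESHOLD_Y;
    }

    // Condition 3: short slide, but released fast enough (a "fling").
    static boolean byFling(float downY, float upY, float liftOffSpeed) {
        return Math.abs(upY - downY) < DISTANCE_THRESHOLD_PX
                && liftOffSpeed >= SPEED_THRESHOLD_PX_S;
    }
}
```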
  • in some embodiments, the process of switching the first target interface to the second target interface may further include: increasing the transparency of the first target interface, or reducing the clarity of the first target interface. In this way, transition processing can be performed during the switching of the two interfaces to enhance the switching effect.
  • in some embodiments, switching the first interface to the second interface may include: covering the first interface with the second interface; or blurring the first interface and then covering the blurred first interface with the second interface. Switching the second interface to the third interface may include: closing the second interface and opening the third interface; or closing the second interface and opening the third interface, where the third interface is overlaid on the first interface. Switching the third interface to the first interface may include: closing the third interface overlaid on the first interface and presenting the first interface. Switching the third interface to the second interface may include: closing the third interface and opening the second interface; or closing the third interface and opening the second interface, where the second interface is overlaid on the first interface. Switching the second interface to the first interface may include: closing the second interface overlaid on the first interface and presenting the first interface.
  • in a third aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen, where the first interface includes the display interface of the desktop on the electronic device or the display interface of an application on the electronic device; in response to the touch screen receiving a first operation, covering the first interface with a second interface, where the first operation is an operation in which the initial position at which the touch body contacts the touch screen is located in a first area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, and the second interface includes the display interface of the notification center or the display interface of the control center; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, closing the second interface and opening a third interface, where the opened third interface covers the first interface, the second operation is an operation in which the initial position at which the touch body re-contacts the touch screen is located in a second area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, the third interface includes the display interface of the notification center or the display interface of the control center, and the third interface is different from the second interface; and after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, closing the third interface and presenting the first interface.
  • the third operation is an operation in which the initial position at which the touch body re-contacts the touch screen is located in a third area other than the top of the touch screen, and the touch body slides toward the top of the touch screen.
  • in some embodiments, before covering the first interface with the second interface, the method may further include: reducing the clarity of the first interface.
  • in some embodiments, the method may further include: increasing the transparency of the second interface.
  • in a fourth aspect, the present application provides an electronic device, which may include: a touch screen; one or more processors; and a memory.
  • one or more computer programs are stored in the memory, and the one or more computer programs include instructions.
  • when the instructions are executed by the electronic device, the electronic device executes the method provided in the first aspect, the second aspect, or the third aspect.
  • in a fifth aspect, the present application provides a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • when the computer program runs on an electronic device, the electronic device executes the method provided in the first aspect, the second aspect, or the third aspect.
  • in a sixth aspect, the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the method provided in the first aspect, the second aspect, or the third aspect.
  • FIG. 1 is a schematic diagram of a display interface of a mobile phone in the related art;
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided in an embodiment of the present application;
  • FIG. 3 is a schematic diagram of coordinate axes on a screen of an electronic device provided in an embodiment of the present application;
  • FIG. 4 is a schematic diagram of the architecture of an operating system in an electronic device provided in an embodiment of the present application;
  • FIG. 5 is a schematic diagram of an application scenario of a method for operating a mobile phone provided in an embodiment of the present application;
  • FIG. 6 is a schematic diagram of the touch-down point, the trigger threshold point, and the lift-off point when the user's finger slides on the mobile phone provided in an embodiment of the present application;
  • FIG. 7 is a schematic diagram of the areas where the touch-down point and the trigger threshold point are located when the user's finger slides on the mobile phone according to an embodiment of the present application;
  • FIG. 8 is a schematic diagram of a process of switching from the desktop to the notification center on the mobile phone according to an embodiment of the present application;
  • FIG. 9 is a schematic diagram of a process of switching from the desktop to the control center on the mobile phone according to an embodiment of the present application;
  • FIG. 10 is a schematic diagram of a process of switching from the notification center to the control center on the mobile phone according to an embodiment of the present application;
  • FIG. 11 is a schematic diagram of a process of switching from the control center to the notification center on the mobile phone provided in an embodiment of the present application;
  • FIG. 12 is a schematic diagram of a process of switching from the desktop to the notification center, then to the control center, and then back to the desktop on the mobile phone provided in an embodiment of the present application;
  • FIG. 13 is a schematic diagram of a process of switching from the desktop to the control center, then to the notification center, and then back to the desktop on the mobile phone provided in an embodiment of the present application;
  • FIG. 14 is a schematic diagram of a system architecture of a mobile phone provided in an embodiment of the present application;
  • FIG. 15 is a schematic flowchart of calling out the notification center and/or the control center provided in an embodiment of the present application;
  • FIG. 16 is a schematic structural diagram of a chip provided in an embodiment of the present application.
  • the terms “first” and “second” and the like in the specification and claims herein are used to distinguish different objects, not to describe a specific order of the objects.
  • for example, a first response message and a second response message are used to distinguish different response messages, rather than to describe a specific order of the response messages.
  • in the embodiments of the present application, words such as “exemplary” or “for example” are used to present examples, illustrations, or explanations. Any embodiment or design scheme described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as “exemplary” or “for example” is intended to present related concepts in a concrete manner.
  • “multiple” means two or more; for example, multiple processing units means two or more processing units, and multiple components means two or more components.
  • in the related art, the notification center and the control center on the electronic device can be controlled independently, and in this case the two can be displayed in different windows. For example, sliding down from the top left of the screen of the electronic device can call out the notification center, and sliding down from the top right of the screen can call out the control center; or, sliding down from the top of the screen can call out the notification center and sliding up from the bottom of the screen can call out the control center, and so on.
  • although this method allows users to choose whether to call out the notification center or the control center based on their own needs, different windows have different priorities in the system software architecture of the electronic device, and the priority of the display window of the notification center is often lower than the priority of the display window of the control center.
  • this makes it possible to call out the display window of the control center while the notification center is being displayed on the electronic device, in which case the control center is overlaid on the notification center.
  • if it is then necessary to return to the interface shown before the notification center was called out, the display window of the control center must be closed first, and then the display window of the notification center. The return operation is therefore very inconvenient.
  • conversely, when the control center is being displayed on the electronic device, the display window of the notification center cannot be called out. It can be seen that such a scheme, in which the notification center and the control center are set separately, causes great inconvenience to the user's operation, and the user experience is poor.
  • in view of this, the embodiments of the present application implement a flat (horizontal) design for the notification center and the control center on the electronic device, so that the notification center and the control center can be switched alternately based on user needs, which improves the convenience of user operation and the user experience.
  • specifically, when one of the notification center and the control center is in the display state and the electronic device detects an operation of calling out the other, the electronic device can close the one being displayed and display the other one the user is currently calling out, thereby avoiding the problem of overlapping nesting between the notification center and the control center, so that the user can quickly return to the interface shown before the notification center or the control center was called out.
  • for example, when the electronic device is displaying the control center and detects an operation of calling out the notification center, the electronic device can close the control center and display the notification center; when the electronic device is displaying the notification center and detects an operation of calling out the control center, the electronic device can close the notification center and display the control center.
  • in the embodiments of the present application, the electronic device can be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device, etc.
  • exemplary embodiments of the electronic device include, but are not limited to, electronic devices equipped with iOS, Android, Windows, HarmonyOS, or other operating systems. This solution places no special restriction on the specific type of the electronic device.
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • the electronic device 100 may include a processor 110 , a memory 120 , a display screen 130 and a sensor 140 .
  • the structure shown in the embodiment of this solution does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may be a general purpose processor or a special purpose processor.
  • the processor 110 may include a central processing unit (central processing unit, CPU) and/or a baseband processor.
  • the baseband processor can be used to process communication data
  • the CPU can be used to implement corresponding control and processing functions, execute software programs, and process data of the software programs.
  • a program (or an instruction or code) may be stored in the memory 120, and the program may be executed by the processor 110, so that the processor 110 executes the method described in this solution.
  • the memory 120 may store instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to use the instructions or data again, they can be called directly from the memory 120, which avoids repeated access, reduces the waiting time of the processor 110, and improves the efficiency of the system.
  • the display screen 130 is used to display images, videos and the like.
  • the display screen 130 includes a display panel.
  • the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 130 , where N is a positive integer greater than 1.
  • the sensor 140 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, or a bone conduction sensor.
  • the sensor 140 may also include a touch sensor.
  • the touch sensor can be used to detect a touch operation on or near it.
  • the touch sensor can collect a touch event of the user on or near it (such as an operation performed by the user on the surface of the touch sensor with a finger, a stylus, or any other suitable object), and send the collected touch information to other components, such as the processor 110.
  • the touch sensor can be implemented in various ways such as resistive, capacitive, infrared, and surface acoustic wave.
  • the touch sensor can be arranged on the display screen 130, and the touch sensor and the display screen 130 form a touch screen, also called a “touchscreen”; alternatively, the touch sensor and the display screen 130 can be implemented as two independent components to realize the input and output functions of the electronic device 100.
  • a Cartesian (rectangular) coordinate system may be preset on the touch screen that includes the touch sensor.
  • for example, a rectangular coordinate system can be established with the upper left corner of touch screen A as the origin (0, 0), or a rectangular coordinate system can be established with the geometric center of touch screen A as the origin (0, 0) (not shown in the figure).
  • in this way, the touch sensor in the touch screen can continuously collect a series of touch events generated by the touch body on the touch screen (such as the coordinates of the touch points, touch event types, etc.) and report this series of touch events to the processor 110.
  • the above-mentioned touch body can be a touch pen, a user's finger or joint, or an object such as a touch glove with a touch function, which is not limited in this solution.
  • in the embodiments of the present application, the user's finger is taken as the touch body for exemplary description.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the Android system with a layered architecture is taken as an example to illustrate the software structure of the electronic device 100 .
  • FIG. 4 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • in some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer. Huawei's self-developed mobile terminal operating system can also follow this structure.
  • the application layer can consist of a series of application packages.
  • the application package may include applications (applications, APPs) such as camera, gallery, calendar, call, map, navigation, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions.
  • the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • Content providers are used to store and retrieve data and make it accessible to applications.
  • Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
  • the view system can be used to build the display interface of the application.
  • Each display interface can consist of one or more controls.
  • controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables an application to display notification information in the notification center; it can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction.
  • for example, the notification manager is used to notify of download completion, message reminders, and the like.
  • the notification manager can also present notifications that appear in the status bar at the top of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the notification center, a prompt sound is issued, the electronic device vibrates, or the indicator light flashes.
  • Activity Manager can be used to manage the life cycle of each application. Applications usually run in the operating system in the form of activities. The activity manager can schedule the activity process of the application to manage the life cycle of each application.
  • the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
  • a system library can include multiple functional modules, for example: a surface manager, media libraries (Media Libraries), a 3D graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • when a user performs a touch operation, the user's finger can touch the touch screen and slide on the touch screen.
  • for example, the user's finger can touch and slide on the top left side of the touch screen, or touch and slide on the top right side of the touch screen, or touch and slide on the lower left or right side of the touch screen, etc.
  • during the touch operation, information about a series of touch points related to this touch operation can be obtained, for example, the coordinates (x, y) of the touch points, touch events, and the like.
  • the touch screen can report the original touch events generated by the user's touch operation to the kernel layer.
  • the kernel layer can encapsulate the original touch events into advanced touch events that can be read by the application framework layer (i.e., the framework layer); an advanced touch event includes the coordinates of the touch point, the time, and the type of the touch event at the current moment, for example, an action down event, an action move event, or an action up event.
  • the kernel layer can send the advanced touch events to the input manager (InputManager) in the application framework layer.
  • after the input manager acquires the above advanced touch events, it can calculate in real time, according to the touch point coordinates, times, and event types in the advanced touch events, the start point of the user's finger slide, the sliding trajectory, the sliding distance, the sliding speed, or the speed at the lift-off point. For example, when the input manager detects an action down event, it means that the user's finger has touched the touch screen, and when it detects an action up event, it means that the user's finger has left the touch screen.
  • the input manager can recognize the sliding trajectory and sliding distance of the user's finger on the touch screen according to the touch point coordinates between adjacent action down and action up events, and/or recognize the sliding trajectory of the user's finger on the touch screen and the speed at the lift-off point according to the touch point coordinates and times between adjacent action down and action up events. A sketch of this event handling is given below.
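  • on Android, such advanced touch events surface as MotionEvent objects with ACTION_DOWN / ACTION_MOVE / ACTION_UP types. The following is a minimal sketch of deriving the start point, slide distance, and lift-off point from them; the class and field names are illustrative assumptions.

```java
import android.view.MotionEvent;

// Hypothetical handler mirroring the pipeline above: down/move/up events
// yield the start point, the trajectory, and the slide distance.
final class SwipeTracker {
    private float downX, downY;
    private long downTime;

    void onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:   // finger touches the screen
                downX = event.getX();
                downY = event.getY();
                downTime = event.getEventTime();
                break;
            case MotionEvent.ACTION_MOVE: { // finger slides on the screen
                float distance = (float) Math.hypot(event.getX() - downX,
                                                    event.getY() - downY);
                long elapsedMs = event.getEventTime() - downTime;
                // distance / elapsedMs gives an average sliding speed
                break;
            }
            case MotionEvent.ACTION_UP:     // finger leaves the screen
                // lift-off point: (event.getX(), event.getY())
                break;
        }
    }
}
```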
  • in the embodiments of the present application, when the user calls out the notification center or the control center on the mobile phone, the user's finger operates on the mobile phone, and the user's operation may include: the user's finger touching the screen of the mobile phone, the user's finger sliding on the screen of the mobile phone, the user's finger leaving the screen of the mobile phone, and so on.
  • the position where the user's finger first touches the mobile phone screen can be called the touch-down point; the position at which the notification center or the control center is called out during the sliding of the user's finger can be called the trigger threshold point; and the position at which the user's finger leaves the screen of the mobile phone can be called the lift-off point.
  • after being called out, the notification center or the control center can be located at an initial position; if the user's finger does not leave the screen and continues to slide after the notification center or the control center has been called out, the notification center or the control center can follow the direction in which the finger slides; when the finger then leaves the screen, the notification center or the control center can return to the initial position. For example, after the finger pulls down to call out the notification center, if the finger continues to slide down without leaving the screen, the notification center follows the finger and slides down; after the finger leaves the screen, the notification center bounces back up to the balance point (that is, the initial position when the notification center first appeared).
  • the touch-down point may be a point located in a preset area on the mobile phone; the trigger threshold point may be a point at a preset distance from the touch-down point, or a point in a preset area, or a combination of both (in which case meeting either one triggers calling out the notification center or the control center). Exemplarily, as shown in FIG. 7, the touch-down point may be located in area z1 and/or z2 on the mobile phone 100, where, when the user's finger touches area z1 on the mobile phone 100, the touch-down point is located in area z1; the trigger threshold point is a point on area z3, and the lift-off point may be a point between area z3 and the bottom of the screen of the mobile phone 100, where area z3 may be a line.
  • in some embodiments, the area where the trigger threshold point is located may also be a surface. For example, area z4 on the mobile phone 100 may be the surface where the trigger threshold point is located.
  • in some embodiments, when the speed of the user's finger at the lift-off point is greater than a preset speed threshold, or when the sliding speed of the user's finger is greater than a certain preset speed threshold, calling out the notification center or the control center is triggered.
  • the speed at the lift-off point and/or the sliding speed of the user's finger can be calculated by a velocity tracker (VelocityTracker), as sketched below.
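  • a short sketch of measuring the lift-off speed with Android's VelocityTracker, which is the real framework class the text refers to; the speed threshold is an illustrative assumption.

```java
import android.view.MotionEvent;
import android.view.VelocityTracker;

// Sketch of a fling detector built on VelocityTracker.
final class FlingDetector {
    private static final float SPEED_THRESHOLD_PX_S = 1000f; // illustrative value
    private VelocityTracker tracker;

    /** Returns true when the finger lifts off fast enough to trigger the panel. */
    boolean onTouchEvent(MotionEvent event) {
        if (event.getActionMasked() == MotionEvent.ACTION_DOWN) {
            tracker = VelocityTracker.obtain();
        }
        if (tracker != null) tracker.addMovement(event);
        if (event.getActionMasked() == MotionEvent.ACTION_UP && tracker != null) {
            tracker.computeCurrentVelocity(1000); // velocity in pixels per second
            float ySpeed = Math.abs(tracker.getYVelocity());
            tracker.recycle();
            tracker = null;
            return ySpeed >= SPEED_THRESHOLD_PX_S;
        }
        return false;
    }
}
```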
  • in the following, calling out the notification center, calling out the control center, and switching between the notification center and the control center are introduced respectively.
  • taking the case where the touch-down point is located in area z1 on the mobile phone 100, the user's finger slides across the trigger threshold point, and the user's finger then leaves the mobile phone screen to call out the notification center as an example, the process of calling out the notification center is described; taking the case where the touch-down point is located in area z2, the user's finger slides across the trigger threshold point, and the user's finger then leaves the mobile phone screen to call out the control center as an example, the process of calling out the control center is described.
  • at first, the screen of the mobile phone 100 may be in a lit state.
  • for example, the screen of the mobile phone 100 may show the standby interface or the display interface of the desktop, or the screen of the mobile phone 100 may show the display interface of an application program in the mobile phone.
  • here, the case where the screen of the mobile phone 100 shows the display interface of the desktop is taken as an example.
  • as shown in (a) in FIG. 8, the user's finger may touch the top left side of the screen of the mobile phone 100.
  • next, the user's finger slides down without reaching the trigger threshold point.
  • at this time, following the slide of the user's finger, the display interface of the desktop can be gradually blurred, as shown in (b) in FIG. 8; in this process, the closer the user's finger is to the touch-down point, the smaller the degree of blurring.
  • complete blurring means that the specific content displayed on the display interface is not visible; when the display interface of the desktop is completely blurred, the user's finger may have slid to the trigger threshold point.
  • when the user's finger slides to the trigger threshold point, the notification center can be displayed, as shown in (c) in FIG. 8. After the notification center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen of the mobile phone 100, the interface shown in (d) in FIG. 8 may be displayed on the mobile phone 100.
  • next, when the user's finger slides upward on the screen, the mobile phone 100 can return to the interface shown before the notification center was displayed, as shown in (e) in FIG. 8. Continuing with (e) in FIG. 8, after the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and the interface shown in (f) in FIG. 8 is displayed.
  • similarly, for calling out the control center, the screen of the mobile phone 100 may be in a lit state. Take the case where the screen of the mobile phone 100 shows the display interface of the desktop as an example. As shown in (a) of FIG. 9, the user's finger may touch the top right side of the screen of the mobile phone 100. Next, the user's finger slides down without reaching the trigger threshold point. At this time, following the slide of the user's finger, the display interface of the desktop is gradually blurred, as shown in (b) in FIG. 9. When the user's finger slides to the trigger threshold point, the control center can be displayed, as shown in (c) in FIG. 9.
  • after the control center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the interface shown in (d) in FIG. 9 may be displayed on the mobile phone 100.
  • next, when the user's finger slides upward on the screen, the mobile phone 100 can return to the interface shown before the control center was displayed, as shown in (e) in FIG. 9.
  • continuing with (e) in FIG. 9, after the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and the interface shown in (f) in FIG. 9 is displayed.
  • for switching from the notification center to the control center, as shown in (a) in FIG. 10, while the notification center is displayed, the user's finger may touch the top right side of the screen of the mobile phone 100.
  • next, the user's finger slides down, and the trigger threshold point is not yet reached.
  • at this time, the transparency of the notification center can be gradually increased following the slide of the user's finger, as shown in (b) in FIG. 10.
  • wherein, the closer the user's finger is to the touch-down point, the lower the transparency, and the closer the user's finger is to the trigger threshold point, the higher the transparency; that is to say, as the user's finger slides, the notification center gradually changes from clear to transparent.
  • in some embodiments, the notification center can also be blurred during the slide of the user's finger. It can be understood that, during the sliding of the user's finger, transition processing such as adjusting the transparency or the aforementioned blurring is optional. In some embodiments, the pull-down content may be displayed directly during the slide without any transition processing, blocking the background or the notification/control center behind it. In some embodiments, blurring can be understood as adjusting the clarity of the interface so that the interface becomes blurred, and adjusting transparency can be understood as adjusting the transparency of the interface so that the interface becomes transparent. A sketch of such a slide-driven transition is given below.
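  • a hypothetical helper showing how slide progress between the touch-down point and the trigger threshold point could drive such a transition; only View.setAlpha is used here (blurring, for example via RenderEffect on newer Android versions, would be the alternative the text mentions), and the names and the linear mapping are assumptions, not details from the patent.

```java
import android.view.View;

// Hypothetical transition helper: the outgoing panel fades out as the finger
// slides from the touch-down point toward the trigger threshold point.
final class PanelTransition {
    static void applyProgress(View outgoingPanel, float downY, float currentY,
                              float triggerY) {
        // Assumes triggerY != downY. Progress is 0 at the touch-down point
        // and 1 at the trigger threshold point.
        float progress = (currentY - downY) / (triggerY - downY);
        progress = Math.max(0f, Math.min(1f, progress));
        outgoingPanel.setAlpha(1f - progress); // clear -> transparent while sliding
    }
}
```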
  • when the user's finger slides to the trigger threshold point, the control center can be displayed, as shown in (c) in FIG. 10.
  • after the control center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the interface shown in (d) in FIG. 10 may be displayed on the mobile phone 100.
  • next, when the user's finger slides upward on the screen, the mobile phone 100 can return to the interface shown before the control center was displayed, as shown in (e) in FIG. 10.
  • for switching from the control center to the notification center, as shown in (a) in FIG. 11, while the control center is displayed, the user's finger may touch the top left side of the screen of the mobile phone 100.
  • next, the user's finger slides down, and the trigger threshold point is not yet reached.
  • at this time, the transparency of the control center can be gradually increased following the slide of the user's finger, as shown in (b) in FIG. 11.
  • in some embodiments, the control center can also be blurred during the slide of the user's finger.
  • when the user's finger slides to the trigger threshold point, the notification center can be displayed, as shown in (c) in FIG. 11.
  • after the notification center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the interface shown in (d) in FIG. 11 may be displayed on the mobile phone 100.
  • next, when the user's finger slides upward on the screen, the mobile phone 100 can return to the interface shown before the notification center was displayed, as shown in (e) in FIG. 11.
  • further, the user can call out the notification center first, then call out the control center, and then return directly from the interface of the control center to the interface shown before the notification center was called out, that is, the interface when the screen of the mobile phone 100 was lit.
  • alternatively, the user can call out the control center first, then call out the notification center, and then return directly from the interface of the notification center to the interface shown before the control center was called out, that is, the interface when the screen of the mobile phone 100 was lit.
  • below, the case where the screen of the mobile phone 100 shows the display interface of the desktop is taken as an example to introduce the two processes respectively.
  • first, the user's finger can touch the top left side of the screen of the mobile phone 100 and slide down to call out the notification center, as shown in (b) in FIG. 12.
  • next, the user can touch the top right side of the screen of the mobile phone 100 with a finger and slide down to call out the control center, as shown in (d) in FIG. 12.
  • next, the user's finger can touch the lower area of the screen of the mobile phone 100 and slide upward.
  • when the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and at this time the screen of the mobile phone 100 can return to the interface showing the desktop, that is, the interface shown in (a) in FIG. 12 is displayed.
  • alternatively, the user's finger can touch the top right side of the screen of the mobile phone 100 and slide down to call out the control center, as shown in (b) in FIG. 13.
  • next, the user's finger can touch the top left side of the screen of the mobile phone 100 and slide down to call out the notification center, as shown in (d) in FIG. 13.
  • next, the user's finger can touch the lower area of the screen of the mobile phone 100 and slide upward.
  • when the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and at this time the screen of the mobile phone 100 can return to the interface showing the desktop, that is, the interface shown in (a) in FIG. 13 is displayed.
  • it should be noted that the notification center and the control center can be called out by pull-down repeatedly and alternately; no matter which one is currently displayed on the mobile phone 100, when the mobile phone 100 receives an instruction to close the currently displayed interface (for example, the user's finger slides upward on the screen), the mobile phone 100 returns directly to the desktop, that is, returns to the interface shown before the pull-down.
  • in the embodiments of the present application, the notification center and the control center may be displayed in different windows, or may be displayed in the same window.
  • when they are displayed in different windows, closing the notification center can be understood as closing the window to which the notification center belongs, and closing the control center can be understood as closing the window to which the control center belongs; likewise, opening the notification center can be understood as opening the window to which the notification center belongs, and opening the control center can be understood as opening the window to which the control center belongs.
  • when they are displayed in the same window, closing the notification center and opening the control center can be understood as replacing the content of the notification center with the content of the control center, and closing the control center and opening the notification center can be understood as replacing the content of the control center with the content of the notification center. In this case, opening the notification center can be understood as displaying the content of the notification center in the window, and opening the control center can be understood as displaying the content of the control center in the window.
  • exemplarily, both the notification center and the control center can be displayed in the status bar window, as sketched below.
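  • a hypothetical sketch of hosting both panels in one status-bar window and swapping their content rather than closing and opening windows; the class, enum, and method names are illustrative and not from the patent.

```java
// Hypothetical single-window host for both panels.
final class StatusBarPanelHost {
    enum Panel { NONE, NOTIFICATION_CENTER, CONTROL_CENTER }
    private Panel shown = Panel.NONE;

    void show(Panel target) {
        if (shown == target) return;
        // Replacing one panel's content with the other avoids the cost of
        // closing one window and opening another.
        detachContent(shown);
        attachContent(target);
        shown = target;
    }

    private void detachContent(Panel p) { /* remove p's views from the window */ }
    private void attachContent(Panel p) { /* add p's views to the window */ }
}
```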
  • exemplarily, FIG. 14 shows a schematic diagram of a system architecture and a processing procedure of the mobile phone 100.
  • as shown in FIG. 14, the system architecture of the mobile phone 100 may include a status bar window 1401, a panel container 1402, a panel mutual pull controller 1403, a notification center panel controller 1404, a control center panel controller 1405, a notification center panel 1406, and a control center panel 1407.
  • the status bar window 1401 may receive the user's operation event detected by the touch screen on the mobile phone 100, where the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 may determine whether the user's operation event is a pull-down sliding event based on the position coordinates of the user's finger at different times in the operation event. In addition, after the status bar window 1401 determines that the user's operation event is a pull-down sliding event, it can determine whether a panel (such as the notification center panel or the control center panel) is currently in the open state.
  • if the status bar window 1401 determines that no panel is currently open, it sends the received operation event to the panel container 1402, so that the panel container 1402 processes the operation event; otherwise, it sends the received operation event to the panel mutual pull controller 1403, so that the panel mutual pull controller 1403 processes the operation event. A sketch of this dispatch is given below.
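  • the routing just described can be sketched as follows; the interface and class names are illustrative stand-ins for the components numbered 1401 to 1403 in FIG. 14.

```java
// Hypothetical dispatch mirroring FIG. 14: a pull-down event goes to the panel
// container when nothing is open, and to the mutual-pull controller otherwise.
final class StatusBarWindowDispatcher {
    interface EventSink { void handle(Object operationEvent); }

    private final EventSink panelContainer;       // cf. panel container 1402
    private final EventSink mutualPullController; // cf. panel mutual pull controller 1403

    StatusBarWindowDispatcher(EventSink container, EventSink mutualPull) {
        this.panelContainer = container;
        this.mutualPullController = mutualPull;
    }

    void onPullDownEvent(Object operationEvent, boolean anyPanelOpen) {
        if (anyPanelOpen) {
            mutualPullController.handle(operationEvent); // switch panels
        } else {
            panelContainer.handle(operationEvent);       // open a panel
        }
    }
}
```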
  • the panel container 1402 can determine the overhand point of the user's finger based on the position coordinates of the user's finger in the operation event, and then determine whether the user's current operation purpose is to open the notification center or the control center based on the overhand point. Wherein, when the panel container 1402 determines that the user's current operation purpose is to open the notification center, the panel container 1402 may send an operation event to the notification center panel controller 1404 . When the panel container 1402 determines that the user's current operation purpose is to open the control center, the panel container 1402 may send an operation event to the control center panel controller 1405 .
  • when the status bar window 1401 determines that a panel is currently expanded, the panel mutual-pull controller 1403 can determine, based on the position coordinates of the user's finger in the operation event, whether the user's current operation purpose is to close the notification center and open the control center, or to close the control center and open the notification center. When the panel mutual-pull controller 1403 determines that the purpose is to close the control center and open the notification center, it may send the operation event to the notification center panel controller 1404. When it determines that the purpose is to close the notification center and open the control center, it may send the operation event to the control center panel controller 1405.
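  • a minimal sketch of this touch-down-point routing, assuming the z1/z2 split of FIG. 7 maps to the left and right halves of the top edge (the concrete areas and thresholds are assumptions, not taken from the application); when a panel is already open, the same mapping tells the mutual-pull controller which panel to close and which to open:

```java
final class TouchDownRouter {
    enum Target { NOTIFICATION_CENTER, CONTROL_CENTER, NONE }

    private final float screenWidth;
    private final float topAreaHeight; // how far down the top areas z1/z2 extend

    TouchDownRouter(float screenWidth, float topAreaHeight) {
        this.screenWidth = screenWidth;
        this.topAreaHeight = topAreaHeight;
    }

    /** Maps the touch-down point (first sample of the operation event) to a target panel. */
    Target targetFor(float downX, float downY) {
        if (downY > topAreaHeight) {
            return Target.NONE;              // touch-down outside z1/z2
        }
        return downX < screenWidth / 2f
                ? Target.NOTIFICATION_CENTER // area z1: top left
                : Target.CONTROL_CENTER;     // area z2: top right
    }
}
```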
  • in addition, to enhance the call-out effect of the target panel, the panel mutual-pull controller 1403 may blur the interface displayed before the target panel is opened based on the operation event, or adjust the transparency of the content displayed before the target panel is opened.
  • when the status bar window 1401 determines that no panel is open, the panel mutual-pull controller 1403 can obtain the operation event from the target controller corresponding to the user's current operation purpose; that is to say, the target controller can send the operation event to the panel mutual-pull controller 1403.
  • the notification center controller 1404 can open the notification center panel 1406 according to the operation event.
  • in addition, when the status bar window 1401 determines that no panel is open and the notification center controller 1404 receives the operation event, the notification center controller 1404 can send the operation event to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can blur the interface displayed before the notification center panel is opened based on the operation event, or adjust the transparency of the content displayed before the notification center panel is opened.
  • the control center controller 1405 can open the control center panel 1407 according to the operation event.
  • in addition, when the status bar window 1401 determines that no panel is open and the control center controller 1405 receives the operation event, the control center controller 1405 can send the operation event to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can blur the interface displayed before the control center panel is opened based on the operation event, or adjust the transparency of the content displayed before the control center is opened.
  • FIG. 15 shows a schematic flowchart of calling out the notification center and/or the control center. As shown in FIG. 15, the following steps are included:
  • Step 1501: the status bar window 1401 responds to the received operation event and determines that the operation event is a pull-down event.
  • the status bar window 1401 may receive an operation event sent by the touch screen of the mobile phone 100, and the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 can determine whether the user's operation event is a pull-down event based on the position coordinates of the user's finger at different times in the operation event.
  • Step 1502: the status bar window 1401 judges whether any panel is open.
  • the status bar window 1401 can determine whether a panel is currently open from the panel record information.
  • the panel record information may include opening and/or closing records of the notification center panel and the control center panel. For example, when the panel record information records that both the notification center and the control center are closed, it can be determined that no panel is currently open; when it records that the notification center is closed and the control center is open, it can be determined that a panel is currently open. If the status bar window 1401 determines that no panel is open, step 1503 is executed; otherwise, step 1508 is executed.
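  • the panel record information could be as simple as two flags plus a query, as in this illustrative sketch (the application does not specify the data structure):

```java
/** Illustrative "panel record information": tracks the open/closed state of both panels. */
final class PanelRecord {
    private boolean notificationCenterOpen;
    private boolean controlCenterOpen;

    void recordNotificationCenter(boolean open) { notificationCenterOpen = open; }
    void recordControlCenter(boolean open)      { controlCenterOpen = open; }

    /** Step 1502: true if either panel is currently open. */
    boolean anyPanelOpen() { return notificationCenterOpen || controlCenterOpen; }
}
```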
  • Step 1503: the panel container 1402 responds to the operation event sent by the status bar window 1401 and determines the position of the touch-down point in the operation event.
  • the panel container 1402 can determine the area where the touch-down point of the user's finger is located based on the coordinate positions of the finger in the operation event, and can then determine, according to the preset correspondence between the area where the touch-down point is located and the notification center and the control center, whether the user's current operation purpose is to open the notification center or the control center.
  • for example, the preset correspondence between the area where the touch-down point is located and the notification center and the control center can be: a touch-down point in area z1 corresponds to calling out the notification center, and a touch-down point in area z2 corresponds to calling out the control center. When the panel container determines that the touch-down point is located in area z1, it can determine that the user's current operation purpose is to open the notification center.
  • if the position of the touch-down point corresponds to calling out the notification center, step 1504 is executed; if it corresponds to calling out the control center, step 1506 is executed.
  • Step 1504: the notification center controller 1404 responds to the operation event sent by the panel container 1402 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 responds to the operation event sent by the notification center controller 1404 and blurs the background. If it is determined that the trigger threshold point is reached, step 1505 is executed; otherwise, the judgment continues.
  • the notification center controller 1404 may determine whether the finger has slid to a preset trigger threshold point according to the coordinate positions of the user's finger during the sliding process in the operation event.
  • for example, if the position coordinates of the touch-down point are (2, 1) and the current position coordinates of the user's finger are (2, 5), the sliding distance is 4 units. In this case, if the preset trigger threshold point is a point 4 units away from the touch-down point, the user's finger has slid to the trigger threshold point; if the coordinates of the preset trigger threshold point are (k, 5), where k ≥ 0, the current coordinates of the user's finger are exactly at the trigger threshold point.
  • in addition, if the notification center controller 1404 determines, from the coordinate positions during the sliding process, that the user's finger left the screen before sliding to the trigger threshold point, the notification center controller 1404 can also calculate the speed of the finger at the lift-off point from the operation event. When the speed at the lift-off point is greater than a preset speed threshold, the trigger threshold point can also be considered reached.
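  • putting the two criteria together, a controller might treat the threshold as reached either when the slide covers the preset distance or, on an early lift-off, when the lift-off speed exceeds the preset speed threshold. The sketch below is a plain-Java approximation (on Android one would more likely use a VelocityTracker); all constants and names are illustrative:

```java
final class TriggerThresholdDetector {
    private final float distanceThreshold; // e.g. "4 units" from the touch-down point
    private final float speedThreshold;    // pixels per second at the lift-off point

    TriggerThresholdDetector(float distanceThreshold, float speedThreshold) {
        this.distanceThreshold = distanceThreshold;
        this.speedThreshold = speedThreshold;
    }

    /** Reached while sliding: the distance from the touch-down point meets the preset distance. */
    boolean reachedBySliding(float downX, float downY, float curX, float curY) {
        return Math.hypot(curX - downX, curY - downY) >= distanceThreshold;
    }

    /**
     * Reached at lift-off despite a short slide: estimate the lift-off speed from the last
     * two samples (positions in pixels, times in milliseconds) and compare it with the threshold.
     */
    boolean reachedByLiftOff(float prevY, long prevMs, float lastY, long lastMs) {
        if (lastMs <= prevMs) return false; // guard against bad timestamps
        float pxPerSecond = (lastY - prevY) / (lastMs - prevMs) * 1000f;
        return pxPerSecond >= speedThreshold;
    }
}
```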
  • during this judgment, the notification center controller 1404 may send information about whether the trigger threshold point is reached to the panel mutual-pull controller 1403, and send the operation event to the panel mutual-pull controller 1403.
  • after receiving the information sent by the notification center controller 1404, if the information indicates that the trigger threshold point has not been reached, the panel mutual-pull controller 1403 can blur the background (such as the interface displayed before the notification center is called out) based on the operation event, for example by reducing the sharpness of the currently displayed interface, so as to enhance the call-out effect of the notification center panel.
  • for example, the panel mutual-pull controller 1403 can adjust the degree of blurring of the background based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event: when the distance is relatively long, the degree of blur is small; when the distance is relatively short, the degree of blur is large.
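  • one way to realize this, shown as a hedged sketch: normalize the finger's travel toward the trigger threshold point into a 0..1 progress value and scale the blur by it (the linear mapping and the radius parameter are assumptions; the application only states that blur grows as the finger approaches the threshold):

```java
final class PullDownBlur {
    private final float maxBlurRadiusPx; // blur applied when progress reaches 1.0

    PullDownBlur(float maxBlurRadiusPx) { this.maxBlurRadiusPx = maxBlurRadiusPx; }

    /** 0.0 at the touch-down point, 1.0 at the trigger threshold point. */
    static float progress(float downY, float currentY, float thresholdY) {
        float span = thresholdY - downY;
        if (span <= 0f) return 1f;            // degenerate geometry: treat as reached
        float p = (currentY - downY) / span;
        return Math.max(0f, Math.min(1f, p)); // clamp to [0, 1]
    }

    /** Far from the threshold: little blur. Near the threshold: close to maxBlurRadiusPx. */
    float blurRadiusFor(float downY, float currentY, float thresholdY) {
        return maxBlurRadiusPx * progress(downY, currentY, thresholdY);
    }
}
```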
  • Step 1505: the notification center controller 1404 opens the notification center panel 1406.
  • Step 1506: the control center controller 1405 responds to the operation event sent by the panel container 1402 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 responds to the operation event sent by the control center controller 1405 and blurs the background. If it is determined that the trigger threshold point is reached, step 1507 is executed; otherwise, the judgment continues.
  • the control center controller 1405 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event. At the same time, the control center controller 1405 can send information about whether the trigger threshold point is reached to the panel mutual-pull controller 1403, and send the operation event to the panel mutual-pull controller 1403.
  • after receiving the information sent by the control center controller 1405, if the information indicates that the trigger threshold point has not been reached, the panel mutual-pull controller 1403 can blur the background (such as the interface displayed before the control center is called out) based on the operation event, for example by reducing the sharpness of the currently displayed interface, so as to enhance the call-out effect of the control center panel.
  • Step 1507: the control center controller 1405 opens the control center panel 1407.
  • Step 1508: the panel mutual-pull controller 1403 responds to the operation event sent by the status bar window 1401 and determines the position of the touch-down point in the operation event.
  • if the position of the touch-down point corresponds to calling out the notification center, step 1509 is executed; if it corresponds to calling out the control center, step 1511 is executed.
  • Step 1509: the notification center controller 1404 responds to the operation event sent by the panel mutual-pull controller 1403 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 adjusts the transparency of the control center panel 1407 based on the operation event. If it is judged that the trigger threshold point is reached, step 1510 is executed; otherwise, the judgment continues.
  • the notification center controller 1404 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event.
  • at the same time, during this judgment, the panel mutual-pull controller 1403 can adjust the transparency of the control center panel 1407 based on the operation event, so as to enhance the call-out effect of the notification center panel.
  • for example, the panel mutual-pull controller 1403 can adjust the transparency of the control center panel 1407 based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event: when the distance is relatively long, the transparency of the control center panel 1407 is low; when the distance is relatively short, the transparency of the control center panel 1407 is high.
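  • the same normalized progress value can drive this cross-fade; below is a small assumed mapping in which the open panel's opacity falls linearly as the finger nears the threshold (the exact curve is not specified in the application). At progress 1 the panel is fully transparent, at which point the mutual-pull controller can have the other controller close it and open the target panel (steps 1510 and 1512):

```java
final class MutualPullFade {
    /**
     * Opacity of the currently open panel (1.0 fully opaque, 0.0 fully transparent),
     * given how far the finger has progressed toward the trigger threshold point:
     * progress 0 (far from the threshold) -> high opacity, i.e. low transparency;
     * progress 1 (at the threshold)       -> zero opacity, i.e. full transparency.
     */
    static float openPanelAlpha(float progress) {
        float clamped = Math.max(0f, Math.min(1f, progress));
        return 1f - clamped;
    }
}
```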
  • in addition, when the notification center controller 1404 determines that the trigger threshold point is reached, the notification center controller 1404 may also feed back the information of reaching the trigger threshold point to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can instruct the control center controller 1405 to close the control center panel 1407 when the notification center controller 1404 determines that the trigger threshold point is reached.
  • Step 1510: the notification center controller 1404 opens the notification center panel 1406.
  • Step 1511: the control center controller 1405 responds to the operation event sent by the panel mutual-pull controller 1403 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 adjusts the transparency of the notification center panel 1406 based on the operation event. If it is judged that the trigger threshold point is reached, step 1512 is executed; otherwise, the judgment continues.
  • the control center controller 1405 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event.
  • at the same time, during this judgment, the panel mutual-pull controller 1403 can adjust the transparency of the notification center panel 1406 based on the operation event, so as to enhance the call-out effect of the control center panel.
  • in addition, when the control center controller 1405 determines that the trigger threshold point is reached, the control center controller 1405 may also feed back the information of reaching the trigger threshold point to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can instruct the notification center controller 1404 to close the notification center panel 1406 when the control center controller 1405 determines that the trigger threshold point is reached.
  • Step 1512: the control center controller 1405 opens the control center panel 1407.
  • FIG. 16 is a schematic structural diagram of a chip provided by an embodiment of the present application.
  • as shown in FIG. 16, the chip 1600 includes one or more processors 1601 and an interface circuit 1602.
  • optionally, the chip 1600 may also include a bus 1603.
  • the processor 1601 may be an integrated circuit chip with signal processing capability. In the implementation process, the control process involved in the above solutions may be completed by an integrated logic circuit of hardware in the processor 1601 or instructions in the form of software.
  • the interface circuit 1602 can be used for sending or receiving data, instructions or information.
  • the processor 1601 can process the data, instructions, or other information received by the interface circuit 1602, and can send out the processed information through the interface circuit 1602.
  • the chip further includes a memory, which may include a read-only memory and a random access memory, and provides operation instructions and data to the processor.
  • a portion of the memory may also include non-volatile random access memory (NVRAM).
  • the memory stores executable software modules or data structures, and the processor can execute corresponding operations by calling operation instructions stored in the memory (the operation instructions can be stored in the operating system).
  • the interface circuit 1602 may be used to output an execution result of the processor 1601 .
  • the corresponding functions of the processor 1601 and the interface circuit 1602 can be realized by hardware design, software design, or a combination of software and hardware, which is not limited here.
  • the chip may be applied to the electronic device 100 shown in FIG. 2 to implement the method provided in the embodiment of the present application.
  • the processor in the embodiments of the present application may be a central processing unit (central processing unit, CPU), or may be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
  • a general-purpose processor can be a microprocessor, or any conventional processor.
  • the method steps in the embodiments of the present application may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
  • the software instructions may consist of corresponding software modules, and the software modules may be stored in a random access memory (random access memory, RAM), a flash memory, a read-only memory (read-only memory, ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • an exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • in the foregoing embodiments, the implementation may be entirely or partially by software, hardware, firmware, or any combination thereof.
  • when implemented using software, the implementation may be entirely or partially in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in or transmitted via a computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or in a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a control method and an electronic device. The method may include: displaying a first interface on a touch screen of the electronic device; in response to the touch screen receiving a first operation, switching the first interface to a second interface; after a touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the third interface to the first interface; where both the second interface and the third interface comprise a display interface of a notification center or a display interface of a control center, and the second interface and the third interface are different. In this way, a user can switch between the notification center and the control center on the electronic device and can return directly from the switched interface to the initial interface of the electronic device (such as the desktop or a display interface of an application), improving operation convenience and user experience.

Description

Control method and electronic device
This application claims priority to Chinese Patent Application No. 202110611045.2, filed with the China National Intellectual Property Administration on June 1, 2021 and entitled "Control Method and Electronic Device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the field of terminal technologies, and in particular, to a control method and an electronic device.
Background
As the screens of electronic devices such as smartphones and tablet computers grow larger, a notification center and a control center are often provided on an electronic device to make it easier for users to operate the device. The notification center may be an entry for managing pushes from applications (application, APP) on the electronic device or for displaying resident status information. The control center may be an entry for controlling the device status.
At present, the notification center and the control center on an electronic device can be displayed simultaneously in one window. For example, as shown in FIG. 1, sliding down from the top of the screen can call out the control center 11 and the notification center 12 at the same time, with the control center 11 above and the notification center 12 below; the control center 11 may by default show the most commonly used shortcut switches in a collapsed form and support expansion to view more information, and the notification center 12 may support scrolling up and down like a list. However, as electronic devices can control more and more devices, if controls for multiple devices are added to the control center at the same time, the display interface becomes overcrowded when the notification center and the control center are displayed; that is, the control center is not suitable for carrying too much information at once. Therefore, how to make it convenient for users to operate the notification center and the control center on an electronic device is a technical problem that urgently needs to be solved.
Summary
This application provides a control method and an electronic device, which make it convenient for users to operate the notification center and the control center on the electronic device.
According to a first aspect, this application provides a control method applied to an electronic device having a touch screen. The method may include: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, where the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area of the touch screen and slides on the touch screen in a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, where the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area of the touch screen and slides on the touch screen in a second direction; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the third interface to the first interface, where the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area of the touch screen and slides on the touch screen in a third direction. The second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of the control center and the third interface is a display interface of the notification center. In this way, the user can switch between the notification center and the control center on the electronic device and can return directly from the switched interface to the initial interface of the electronic device (such as the desktop or a display interface of an application), improving operation convenience and user experience.
According to a second aspect, this application provides a control method applied to an electronic device having a touch screen. The method may include: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, where the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area of the touch screen and slides on the touch screen in a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, where the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area of the touch screen and slides on the touch screen in a second direction; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving the first operation again, switching the third interface to the second interface. The second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of the control center and the third interface is a display interface of the notification center. In this way, the user can switch back and forth between the notification center and the control center on the electronic device, improving operation convenience and user experience.
In a possible implementation of the second aspect, after the third interface is switched to the second interface, the method may further include: after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the second interface to the first interface, where the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area of the touch screen and slides on the touch screen in a third direction. In this way, the user can return directly from the switched interface to the initial interface of the electronic device (such as the desktop or a display interface of an application), improving operation convenience and user experience.
In a possible implementation of the first or second aspect, the first interface includes a display interface of a desktop on the electronic device, or the first interface includes a display interface of an application on the electronic device.
In a possible implementation of the first or second aspect, both the second interface and the third interface are displayed in a first window. In this way, when switching between the two interfaces, the content of one interface can replace the content of the other, so that there is no need to close one window and open another, which improves switching efficiency.
In a possible implementation of the first or second aspect, the first window is a status bar window.
In a possible implementation of the first or second aspect, the first interface and the second interface are displayed in different windows. For example, when the first interface is a display interface of an application, the first interface may be displayed in the display window of the application, and the second interface may be displayed in the status bar window.
In a possible implementation of the first or second aspect, the first area is located on a first side of the top of the touch screen, and the first direction is a direction from the top of the touch screen toward the bottom of the touch screen; the second area is located on a second side of the top of the touch screen, and the second direction is the same as the first direction; the third area is an area on the touch screen other than the first area and the second area, and the third direction is opposite to the first direction.
In a possible implementation of the first or second aspect, the notification center is an entry on the electronic device for managing pushes from applications on the electronic device or for displaying resident status information; the control center is an entry on the electronic device for controlling the status of the electronic device.
In a possible implementation of the first or second aspect, before a first target interface is switched to a second target interface, the method may further include: determining that the operation of the touch body on the touch screen meets a trigger condition, the trigger condition being a condition for triggering interface switching; where the first target interface is the first interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the third interface; or the first target interface is the third interface and the second target interface is the first interface; or the first target interface is the third interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the first interface. In this way, the interface is switched when the trigger condition is met, improving the switching effect.
In a possible implementation of the first or second aspect, the condition for triggering interface switching may specifically include: the distance between the position at which the touch body currently contacts the touch screen and the start position is greater than or equal to a preset distance threshold.
In a possible implementation of the first or second aspect, the condition for triggering interface switching may specifically include: the position at which the touch body currently contacts the touch screen reaches a preset position on the touch screen.
In a possible implementation of the first or second aspect, the condition for triggering interface switching may specifically include: the distance between the position at which the touch body leaves the touch screen and the start position is less than a preset distance threshold, and the speed of the touch body when leaving the touch screen is greater than or equal to a preset speed threshold.
In a possible implementation of the first or second aspect, in the process of switching the first target interface to the second target interface, the method may further include: increasing the transparency of the first target interface, or reducing the sharpness of the first target interface. In this way, a transition can be applied during the switch between the two interfaces, enhancing the switching effect.
In a possible implementation of the first or second aspect, switching the first interface to the second interface may include: overlaying the second interface on the first interface; or switching the first interface to the second interface includes: blurring the first interface and then overlaying the second interface on the blurred first interface; or switching the second interface to the third interface includes: closing the second interface and opening the third interface; or switching the second interface to the third interface includes: closing the second interface and opening the third interface, the third interface being overlaid on the first interface; or switching the third interface to the first interface includes: closing the third interface overlaid on the first interface and presenting the first interface; or switching the third interface to the second interface includes: closing the third interface and opening the second interface; or switching the third interface to the second interface includes: closing the third interface and opening the second interface, the second interface being overlaid on the first interface; or switching the second interface to the first interface includes: closing the second interface overlaid on the first interface and presenting the first interface.
According to a third aspect, this application provides a control method applied to an electronic device having a touch screen. The method may include: displaying a first interface on the touch screen, where the first interface includes a display interface of a desktop on the electronic device or a display interface of an application on the electronic device; in response to the touch screen receiving a first operation, overlaying a second interface on the first interface, where the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area at the top of the touch screen and slides toward the bottom of the touch screen, and the second interface includes a display interface of a notification center or a display interface of a control center; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, closing the second interface and opening a third interface, where the opened third interface is overlaid on the first interface, the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area at the top of the touch screen and slides toward the bottom of the touch screen, the third interface includes a display interface of the notification center or a display interface of the control center, and the third interface is different from the second interface; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, closing the third interface and presenting the first interface, where the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area outside the top of the touch screen and slides toward the top of the touch screen.
In a possible implementation of the third aspect, before the second interface is overlaid on the first interface, the method may further include: reducing the sharpness of the first interface.
In a possible implementation of the third aspect, before the second interface is closed, the method may further include: increasing the transparency of the second interface.
According to a fourth aspect, this application provides an electronic device, which may include a touch screen, one or more processors, and a memory. The memory stores one or more computer programs, the one or more computer programs include instructions, and when the instructions are executed by the electronic device, the electronic device performs the method provided in the first, second, or third aspect.
According to a fifth aspect, this application provides a computer-readable storage medium storing a computer program that, when run on an electronic device, causes the electronic device to perform the method provided in the first, second, or third aspect.
According to a sixth aspect, this application provides a computer program product that, when run on an electronic device, causes the electronic device to perform the method provided in the first, second, or third aspect.
Brief Description of Drawings
FIG. 1 is a schematic diagram of a display interface of a mobile phone in the related art;
FIG. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application;
FIG. 3 is a schematic diagram of coordinate axes on a screen of an electronic device according to an embodiment of this application;
FIG. 4 is a schematic diagram of the architecture of an operating system in an electronic device according to an embodiment of this application;
FIG. 5 is a schematic diagram of an application scenario of an operation method for a mobile phone according to an embodiment of this application;
FIG. 6 is a schematic diagram of a touch-down point, a trigger threshold point, and a lift-off point when a user's finger slides on a mobile phone according to an embodiment of this application;
FIG. 7 is a schematic diagram of the areas in which a touch-down point and a trigger threshold point are located when a user's finger slides on a mobile phone according to an embodiment of this application;
FIG. 8 is a schematic diagram of the process of switching from the desktop to the notification center on a mobile phone according to an embodiment of this application;
FIG. 9 is a schematic diagram of the process of switching from the desktop to the control center on a mobile phone according to an embodiment of this application;
FIG. 10 is a schematic diagram of the process of switching from the notification center to the control center on a mobile phone according to an embodiment of this application;
FIG. 11 is a schematic diagram of the process of switching from the control center to the notification center on a mobile phone according to an embodiment of this application;
FIG. 12 is a schematic diagram of the process of switching from the desktop to the notification center, then to the control center, and then back to the desktop on a mobile phone according to an embodiment of this application;
FIG. 13 is a schematic diagram of the process of switching from the desktop to the control center, then to the notification center, and then back to the desktop on a mobile phone according to an embodiment of this application;
FIG. 14 is a schematic diagram of a system architecture of a mobile phone according to an embodiment of this application;
FIG. 15 is a schematic flowchart of calling out the notification center and/or the control center according to an embodiment of this application;
FIG. 16 is a schematic structural diagram of a chip according to an embodiment of this application.
Description of Embodiments
In this document, the term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist. For example, A and/or B may indicate the following three cases: only A exists, both A and B exist, and only B exists. The symbol "/" in this document indicates an "or" relationship between associated objects; for example, A/B means A or B.
The terms "first" and "second" in the specification and claims of this document are used to distinguish different objects, not to describe a specific order of objects. For example, a first response message and a second response message are used to distinguish different response messages, not to describe a specific order of response messages.
In the embodiments of this application, words such as "exemplary" or "for example" are used to give an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of this application should not be construed as being preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
In the description of the embodiments of this application, unless otherwise stated, "multiple" means two or more; for example, multiple processing units means two or more processing units, and multiple elements means two or more elements.
To facilitate user operation, the notification center and the control center on an electronic device can be controlled separately, in which case the two can be displayed through different windows. For example, sliding down from the top left of the screen can call out the notification center and sliding down from the top right can call out the control center; or sliding down from the top of the screen can call out the notification center and sliding up from the bottom can call out the control center, and so on. Although this allows the user to choose, based on their own needs, whether to call out the notification center or the control center, different windows have different priorities in the system software architecture of the electronic device, and the priority of the notification center's display window is usually lower than that of the control center's display window. As a result, when the notification center is being displayed, the control center's display window can be called out, and the control center then covers the notification center. To return to the interface shown before the notification center was called out, the control center's display window must be closed first and then the notification center's display window, so the return operation is very inconvenient. In addition, when the control center is being displayed, the notification center's display window cannot be called out because of the window priorities. It can be seen that such a scheme, in which the notification center and the control center are set up separately, causes great inconvenience to the user and results in a poor user experience.
Further, to facilitate user operation and improve user experience, the embodiments of this application design the notification center and the control center on the electronic device as peers, so that the notification center and the control center can be switched back and forth repeatedly according to user needs, improving operation convenience and user experience. In addition, when one of the notification center and the control center is being displayed and the electronic device detects an operation of calling out the other, the electronic device can close the one being displayed and display the other one currently called out by the user, thereby avoiding overlapping and nesting between the notification center and the control center and allowing the user to return quickly to the interface shown before the notification center or the control center was called out. For example, when the electronic device is displaying the control center and detects an operation of calling out the notification center, it can close the control center and display the notification center; when the electronic device is displaying the notification center and detects an operation of calling out the control center, it can close the notification center and display the control center.
It can be understood that in this solution the electronic device may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (ultra-mobile personal computer, UMPC), a netbook, a cellular phone, a personal digital assistant (personal digital assistant, PDA), an augmented reality (augmented reality, AR) device, a virtual reality (virtual reality, VR) device, an artificial intelligence (artificial intelligence, AI) device, a wearable device, an in-vehicle device, a smart home device, and/or a smart city device. Exemplary embodiments of the electronic device include, but are not limited to, electronic devices running iOS, Android, Windows, Harmony OS, or other operating systems. This solution places no particular restriction on the specific type of the electronic device.
The following describes a schematic hardware structure of an electronic device provided in this solution.
FIG. 2 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of this application. As shown in FIG. 2, the electronic device 100 may include a processor 110, a memory 120, a display screen 130, and a sensor 140. It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of this solution, the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may be a general-purpose processor or a dedicated processor. For example, the processor 110 may include a central processing unit (central processing unit, CPU) and/or a baseband processor. The baseband processor may be used to process communication data, and the CPU may be used to implement corresponding control and processing functions, execute software programs, and process data of the software programs.
The memory 120 may store a program (which may also be instructions or code), and the program may be run by the processor 110 so that the processor 110 performs the method described in this solution. Optionally, the memory 120 may also store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory 120, which avoids repeated access, reduces the waiting time of the processor 110, and improves system efficiency.
The display screen 130 is used to display images, videos, and the like. The display screen 130 includes a display panel. The display panel may be a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 130, where N is a positive integer greater than 1.
The sensor 140 may include a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, a bone conduction sensor, or the like.
In this solution, the sensor 140 may also include a touch sensor. The touch sensor can be used to detect touch operations performed on or near it. The touch sensor can collect touch events performed by the user on or near it (for example, operations performed by the user on the surface of the touch sensor with a finger, a stylus, or any other suitable object) and send the collected touch information to other components, such as the processor 110.
Exemplarily, the touch sensor may be implemented in various ways, such as resistive, capacitive, infrared, or surface acoustic wave. The touch sensor may be disposed on the display screen 130, and the touch sensor and the display screen 130 together form a touch screen, also called a "touch-control screen"; alternatively, the touch sensor and the display screen 130 may be implemented as two independent components to implement the input and output functions of the electronic device 100.
In some embodiments of this solution, a rectangular coordinate system may be preset on the touch screen containing the touch sensor. For example, as shown in FIG. 3, a rectangular coordinate system may be established with the upper left corner of touch screen A as the origin (0, 0), or with the geometric center of touch screen A as the origin (0, 0) (not shown in the figure). When a touch body slides on the touch screen, the touch sensor in the touch screen can continuously collect a series of touch events generated by the touch body on the touch screen (for example, coordinates of touch points and touch events) and report this series of touch events to the processor 110. The touch body may be a stylus, a user's finger or knuckle, or an object such as a touch glove with a touch function; this solution places no restriction on this, and in the following embodiments a user's finger is used as the touch body for exemplary description.
It can be understood that the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this solution, an Android system with a layered architecture is taken as an example to describe the software structure of the electronic device 100.
FIG. 4 is a block diagram of the software structure of the electronic device 100 according to an embodiment of this application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers: from top to bottom, the application layer, the application framework layer, Android runtime (Android runtime) and system libraries, and the kernel layer. The mobile terminal operating system developed by Huawei may also refer to this structure.
1. Application layer
The application layer may include a series of application packages. As shown in FIG. 4, the application packages may include applications (application, APP) such as Camera, Gallery, Calendar, Phone, Maps, Navigation, Bluetooth, Music, Video, and Messages.
2. Application framework layer
The application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer. The application framework layer includes some predefined functions.
As shown in FIG. 4, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, and so on.
The content provider is used to store and retrieve data and make the data accessible to applications. The data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, and so on.
The view system can be used to build the display interface of an application. Each display interface may consist of one or more controls. Generally, controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets (widget).
The telephony manager is used to provide the communication functions of the electronic device 100, for example, the management of call states (including connected, hung up, and the like).
The resource manager provides various resources for applications, such as localized strings, icons, pictures, layout files, and video files.
The notification manager enables applications to display notification information in the notification center and can be used to convey notification-type messages, which can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, and so on. The notification manager may also present notifications in the status bar at the top of the system in the form of charts or scroll bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the notification center, a prompt sound is emitted, the electronic device vibrates, or an indicator light blinks.
The activity manager can be used to manage the lifecycle of each application. An application usually runs in the operating system in the form of an activity. The activity manager can schedule the activity processes of applications to manage the lifecycle of each application.
3. Android Runtime and system libraries
Android Runtime includes core libraries and a virtual machine. Android Runtime is responsible for the scheduling and management of the Android system.
The core libraries consist of two parts: one part is the functions that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system libraries may include multiple functional modules, for example, a surface manager (surface manager), media libraries (Media Libraries), a 3D graphics processing library (for example, OpenGL ES), and a 2D graphics engine (for example, SGL).
The surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications. The media libraries support playback and recording of many common audio and video formats, as well as static image files. The media libraries can support multiple audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. The 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, layer processing, and the like. The 2D graphics engine is a drawing engine for 2D drawing.
4. Kernel layer
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
In some embodiments of this solution, the user's finger can touch the touch screen and slide on it. For example, the user's finger can touch and slide on the top left of the touch screen, on the top right of the touch screen, or on the bottom left or bottom right of the touch screen, and so on. Exemplarily, taking the user's finger touching the top left of the touch screen as an example, as shown in FIG. 5, a call display interface 41 is running in the application layer. When the user's finger slides down on the top left of the touch screen, the touch screen can obtain information about a series of touch points related to this touch operation, for example, coordinates (x, y) of the touch points and touch events. The touch screen can then report the raw touch events generated by this touch operation to the kernel layer. After obtaining the raw touch events, the kernel layer can encapsulate them into high-level touch events that the application framework layer (that is, the framework layer) can read, namely touch events (touch event) containing the coordinates of the touch point, the time, and the type of the touch event, for example, an action down event, an action move event, and an action up event. The kernel layer can then send the high-level touch events to the panel manager (input manager) in the application framework layer.
After obtaining the high-level touch events, the panel manager can compute, in real time, the sliding start point, sliding trajectory, sliding distance, sliding speed, or the speed at the lift-off point of the user's finger, based on the touch point coordinates, the times, and the types of the touch events. For example, when the panel manager detects an action down event, the user's finger has touched the touch screen; when the panel manager detects an action up event, the user's finger has left the touch screen. The panel manager can then identify the sliding trajectory and sliding distance of the user's finger on the touch screen based on the touch point coordinates between adjacent action down and action up events, and/or identify the sliding trajectory and the speed at the lift-off point based on the touch point coordinates and times between adjacent action down and action up events.
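As a hedged illustration of what such a computation might look like (plain Java rather than the actual framework code, which this application does not disclose), the sketch below derives the start point, total sliding distance, and lift-off speed from a down/move/up sample sequence:

```java
import java.util.List;

final class SwipeMetrics {
    enum Action { DOWN, MOVE, UP } // mirrors the action down / action move / action up events

    record Event(Action action, float x, float y, long timeMs) {}

    record Result(float startX, float startY, float distancePx, float liftOffSpeedPxPerSec) {}

    /** Expects a well-formed gesture: one DOWN, zero or more MOVEs, one final UP. */
    static Result analyze(List<Event> events) {
        Event down = events.get(0);
        Event up = events.get(events.size() - 1);
        float distance = (float) Math.hypot(up.x() - down.x(), up.y() - down.y());

        // Lift-off speed: estimated from the last movement before the UP event.
        Event beforeUp = events.get(events.size() - 2);
        long dtMs = Math.max(1, up.timeMs() - beforeUp.timeMs()); // avoid division by zero
        float stepPx = (float) Math.hypot(up.x() - beforeUp.x(), up.y() - beforeUp.y());
        float liftOffSpeed = stepPx / dtMs * 1000f;

        return new Result(down.x(), down.y(), distance, liftOffSpeed);
    }
}
```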
Next, taking the electronic device 100 as a mobile phone and taking calling out the notification center and the control center on the mobile phone as an example, the control process involved in the embodiments of this application is described in detail based on the content described above and with reference to the accompanying drawings. It can be understood that the notification center and the control center can also be replaced with other content, which is not limited here.
In the embodiments of this application, when the user calls out the notification center or the control center on the mobile phone, the user's operations may include: the user's finger touching the screen of the mobile phone, the user's finger sliding on the screen, and the user's finger leaving the screen. The position where the user's finger touches the phone may be called the touch-down point; the position during the slide at which calling out the notification center or the control center is triggered may be called the trigger threshold point; and the position where the user's finger leaves the screen may be called the lift-off point. When the notification center or the control center is called out, it can be located at an initial position; if, after the notification center or the control center is called out, the user's finger continues to slide without leaving the screen, the notification center or the control center can move following the direction of the finger's slide; when the finger leaves the screen, the notification center or the control center can return to the initial position. For example, after the finger pulls down and calls out the notification center, if the finger continues to slide down without leaving the screen, the notification center slides down following the finger; after the finger leaves the screen, the notification center bounces back up to the equilibrium point (that is, back to the initial position where the notification center first appeared).
In one example, the touch-down point may be a point located in a preset area on the mobile phone; the trigger threshold point may be a point at a preset distance from the touch-down point, or a point in a preset area, or a combination of the two (in which case meeting either condition triggers calling out the notification center or the control center). Exemplarily, as shown in FIG. 6, the touch-down point may be located in area z1 and/or z2 on the mobile phone 100; when the user's finger touches area z1 on the mobile phone 100, the touch-down point is located in area z1. The trigger threshold point is a point on area z3, and the lift-off point may be a point between area z3 and the bottom of the screen of the mobile phone 100, where area z3 may be a line. Exemplarily, the area where the trigger threshold point is located may also be a surface rather than a line; as shown in FIG. 7, area z4 on the mobile phone 100 may be the area where the trigger threshold point is located, and in this case area z4 is a surface.
In one example, if the user's finger has not slid to the specified trigger threshold point when it leaves the screen, calling out the notification center or the control center may also be triggered when the speed at the lift-off point of the user's finger is greater than a certain preset speed threshold, or when the sliding speed of the user's finger is greater than a certain preset speed threshold. Exemplarily, the speed at the lift-off point and/or the sliding speed of the user's finger can both be calculated by a velocity tracker (velocity tracker).
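On Android specifically, the lift-off speed mentioned above can be obtained with the framework's VelocityTracker class; the following minimal sketch assumes an Android build environment and omits the surrounding gesture plumbing:

```java
import android.view.MotionEvent;
import android.view.VelocityTracker;

/** Illustrative wrapper: feeds touch events to a VelocityTracker and reads the fling speed. */
final class LiftOffSpeedTracker {
    private VelocityTracker tracker;

    void onTouchEvent(MotionEvent ev) {
        if (ev.getActionMasked() == MotionEvent.ACTION_DOWN) {
            if (tracker != null) tracker.recycle();
            tracker = VelocityTracker.obtain(); // start tracking a fresh gesture
        }
        if (tracker != null) tracker.addMovement(ev);
    }

    /** Vertical speed in pixels per second at lift-off; call when ACTION_UP is received. */
    float yVelocityAtLiftOff() {
        tracker.computeCurrentVelocity(1000); // 1000 -> units of pixels per second
        return tracker.getYVelocity();
    }
}
```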
Next, calling out the notification center, calling out the control center, and switching between the notification center and the control center are introduced separately. For ease of description, the process of calling out the notification center is described below using as an example the case where the touch-down point is located in area z1 on the mobile phone 100 in FIG. 7, the user's finger slides past the trigger threshold point, and the notification center is called out when the user's finger leaves the screen; the process of calling out the control center is described using as an example the case where the touch-down point is located in area z2, the user's finger slides past the trigger threshold point, and the control center is called out when the user's finger leaves the screen.
(1) Calling out the notification center (the control center has not been called out at this time)
Exemplarily, when the control center has not been called out on the mobile phone 100 and the notification center currently needs to be called out, the screen of the mobile phone 100 may be in the lit state. For example, the screen of the mobile phone 100 may be on the standby interface, on the display interface of the desktop, or on the display interface of an application in the mobile phone.
Exemplarily, take the screen of the mobile phone 100 being on the display interface of the desktop as an example. As shown in (a) of FIG. 8, the user's finger can touch the top left of the screen of the mobile phone 100. Then the user's finger slides down without reaching the trigger threshold point; at this time, following the slide of the user's finger, the display interface of the desktop can be gradually blurred, as shown in (b) of FIG. 8. During this process, the closer the user's finger is to the touch-down point, the smaller the degree of blur, and the closer the user's finger is to the trigger threshold point, the greater the degree of blur; that is, as the user's finger slides, the display interface of the desktop gradually changes from clear to blurred until it is completely blurred, where completely blurred means that the specific content shown in the display interface is not visible. When the display interface of the desktop is completely blurred, the user's finger may have slid to the trigger threshold point. When the user's finger slides to the trigger threshold point, the notification center can be displayed, as shown in (c) of FIG. 8. After the notification center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the mobile phone 100 can display the interface shown in (d) of FIG. 8. If, after the notification center is displayed, the user's finger does not leave the screen but slides in the opposite direction, the mobile phone 100 can return to the interface shown before the notification center was displayed, as shown in (e) of FIG. 8. Continuing with (e) of FIG. 8, after the user's finger slides up, it can leave the screen of the mobile phone 100, and the interface shown in (f) of FIG. 8 can then be displayed, that is, the display interface of the desktop of the mobile phone 100.
Optionally, after the interface shown in (d) of FIG. 8 is displayed on the mobile phone 100, the process can proceed to (a) of FIG. 10 to switch from the notification center to the control center.
(2) Calling out the control center (the notification center has not been called out at this time)
Exemplarily, when the notification center has not been called out on the mobile phone 100 and the control center currently needs to be called out, the screen of the mobile phone 100 may be in the lit state. Take the screen being on the display interface of the desktop as an example. As shown in (a) of FIG. 9, the user's finger can touch the top right of the screen of the mobile phone 100. Then the user's finger slides down without reaching the trigger threshold point; at this time, following the slide of the user's finger, the display interface of the desktop can be gradually blurred, as shown in (b) of FIG. 9. When the user's finger slides to the trigger threshold point, the control center can be displayed, as shown in (c) of FIG. 9. After the control center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the mobile phone 100 can display the interface shown in (d) of FIG. 9. If, after the control center is displayed, the user's finger does not leave the screen but slides in the opposite direction, the mobile phone 100 can return to the interface shown before the control center was displayed, as shown in (e) of FIG. 9. Continuing with (e) of FIG. 9, after the user's finger slides up, it can leave the screen, and the interface shown in (f) of FIG. 9 can then be displayed, that is, the display interface of the desktop of the mobile phone 100.
Optionally, after the interface shown in (d) of FIG. 9 is displayed on the mobile phone 100, the process can proceed to (a) of FIG. 11 to switch from the control center to the notification center.
(3) Switching between the notification center and the control center
a) Switching from the notification center to the control center (the notification center has been called out at this time)
Exemplarily, after the notification center is called out, as shown in (a) of FIG. 10, the user's finger can touch the top right of the screen of the mobile phone 100. Then the user's finger slides down without reaching the trigger threshold point; at this time, following the slide of the user's finger, the transparency of the notification center can be gradually increased, as shown in (b) of FIG. 10. During this process, the closer the user's finger is to the touch-down point, the lower the transparency, and the closer the user's finger is to the trigger threshold point, the higher the transparency; that is, as the user's finger slides, the notification center gradually changes from clear to transparent until it is completely invisible. When the notification center is completely invisible, the user's finger may have slid to the trigger threshold point. In addition, the notification center can also be blurred while the user's finger slides. It can be understood that transition actions during the slide of the user's finger, such as adjusting transparency or the blurring described above, are all optional. In some embodiments, the pulled-down content can be displayed directly during the slide without any transition, covering the background or the notification/control center behind it. In some embodiments, blurring can be understood as adjusting the sharpness of an interface so that the interface becomes blurred; adjusting transparency can be understood as adjusting the transparency of an interface so that the interface becomes transparent.
When the user's finger slides to the trigger threshold point, the control center can be displayed, as shown in (c) of FIG. 10. After the control center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the mobile phone 100 can display the interface shown in (d) of FIG. 10. If, after the control center is displayed, the user's finger does not leave the screen but slides in the opposite direction, the mobile phone 100 can return to the interface shown before the control center was displayed, as shown in (e) of FIG. 10. Continuing with (e) of FIG. 10, after the user's finger slides up, it can leave the screen, and the interface shown in (f) of FIG. 10 can then be displayed, that is, the display interface of the notification center on the mobile phone 100.
Optionally, after the interface shown in (d) of FIG. 10 is displayed on the mobile phone 100, the process can proceed to (a) of FIG. 11 to switch from the control center to the notification center.
b) Switching from the control center to the notification center (the control center has been called out at this time)
Exemplarily, after the control center is called out, as shown in (a) of FIG. 11, the user's finger can touch the top left of the screen of the mobile phone 100. Then the user's finger slides down without reaching the trigger threshold point; at this time, following the slide of the user's finger, the transparency of the control center can be gradually increased, as shown in (b) of FIG. 11. In addition, the control center can also be blurred while the user's finger slides.
When the user's finger slides to the trigger threshold point, the notification center can be displayed, as shown in (c) of FIG. 11. After the notification center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen, the mobile phone 100 can display the interface shown in (d) of FIG. 11. If, after the notification center is displayed, the user's finger does not leave the screen but slides in the opposite direction, the mobile phone 100 can return to the interface shown before the notification center was displayed, as shown in (e) of FIG. 11. Continuing with (e) of FIG. 11, after the user's finger slides up, it can leave the screen, and the interface shown in (f) of FIG. 11 can then be displayed, that is, the display interface of the control center on the mobile phone 100.
Optionally, after the interface shown in (d) of FIG. 11 is displayed on the mobile phone 100, the process can proceed to (a) of FIG. 10 to switch from the notification center to the control center.
(4) Switching among the interface displayed when the mobile phone 100 is in the lit state, the notification center, and the control center
When the mobile phone 100 is in the lit state, the user can first call out the notification center, then call out the control center, and then return directly from the control center interface to the interface shown before the notification center was called out, that is, the interface displayed when the mobile phone 100 is in the lit state. In addition, the user can first call out the control center, then call out the notification center, and then return directly from the notification center interface to the interface shown before the control center was called out, that is, the interface displayed when the mobile phone 100 is in the lit state. The following takes the screen of the mobile phone 100 being on the display interface of the desktop as an example to introduce each case.
a) Desktop → notification center → control center → desktop
Exemplarily, as shown in (a) of FIG. 12, when the screen of the mobile phone 100 is on the display interface of the desktop, the user's finger can touch the top left of the screen and slide down to call out the notification center, as shown in (b) of FIG. 12. As shown in (c) of FIG. 12, after the notification center is called out, the user's finger can touch the top right of the screen and slide down to call out the control center, as shown in (d) of FIG. 12. As shown in (e) and (f) of FIG. 12, after the control center is called out, the user's finger can touch the lower area of the screen and slide up. As the user's finger slides up, it can leave the screen of the mobile phone 100, and the screen can then return to the display interface of the desktop, that is, the interface shown in (a) of FIG. 12.
b) Desktop → control center → notification center → desktop
Exemplarily, as shown in (a) of FIG. 13, when the screen of the mobile phone 100 is on the display interface of the desktop, the user's finger can touch the top right of the screen and slide down to call out the control center, as shown in (b) of FIG. 13. As shown in (c) of FIG. 13, after the control center is called out, the user's finger can touch the top left of the screen and slide down to call out the notification center, as shown in (d) of FIG. 13. As shown in (e) and (f) of FIG. 13, after the notification center is called out, the user's finger can touch the lower area of the screen and slide up. As the user's finger slides up, it can leave the screen of the mobile phone 100, and the screen can then return to the display interface of the desktop, that is, the interface shown in (a) of FIG. 13.
It can be understood that in FIG. 12 and/or FIG. 13 the notification center and the control center can be pulled down repeatedly and alternately, but no matter which one is currently displayed on the mobile phone 100, when the mobile phone 100 receives an instruction to close the currently displayed interface (for example, the user's finger slides up on the screen), the mobile phone 100 returns directly to the desktop, that is, to the interface shown before the pull-down.
It can be understood that the content displayed in the notification center and/or the control center shown in FIG. 8 to FIG. 13 is only an exemplary illustration; in other embodiments of this application, the notification center and/or the control center may include more or less content than illustrated, for example, content related to music may not be included.
It can be understood that in the embodiments of this application the notification center and the control center may be displayed in different windows or in the same window. When different windows are used, closing the notification center can be understood as closing the window to which the notification center belongs, closing the control center can be understood as closing the window to which the control center belongs, opening the notification center can be understood as opening the window to which the notification center belongs, and opening the control center can be understood as opening the window to which the control center belongs.
When the same window is used to display the notification center and the control center, closing the notification center and opening the control center can be understood as replacing the content of the notification center with the content of the control center, and closing the control center and opening the notification center can be understood as replacing the content of the control center with the content of the notification center. If the control center was not open before the notification center is called out, opening the notification center can be understood as displaying the content of the notification center in the window. If the notification center was not open before the control center is called out, opening the control center can be understood as displaying the content of the control center in the window. Exemplarily, the notification center and the control center can both be displayed in the status bar window.
The following introduces a system architecture that can implement the terminal control method, taking the case where the notification center and the control center are both displayed in the status bar window of the mobile phone 100 as an example.
Exemplarily, FIG. 14 shows a schematic diagram of the system architecture and the processing procedure of a mobile phone 100. As shown in FIG. 14, the system architecture of the mobile phone 100 may include a status bar window 1401, a panel container 1402, a panel mutual-pull controller 1403, a notification center panel controller 1404, a control center panel controller 1405, a notification center panel 1406, and a control center panel 1407.
During the process of the user calling out the notification center or the control center, the status bar window 1401 can receive the user's operation event detected by the touch screen on the mobile phone 100, where the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 can determine, based on the position coordinates of the user's finger at different times in the operation event, whether the user's operation event is a pull-down sliding event. In addition, after determining that the user's operation event is a pull-down sliding event, the status bar window 1401 can judge whether any panel (such as the notification panel or the control panel) is currently open. If the status bar window 1401 judges that no panel is currently open, it sends the received operation event to the panel container 1402 so that the panel container 1402 processes the operation event; if it judges that a panel is currently open, it sends the received operation event to the panel mutual-pull controller 1403 so that the panel mutual-pull controller 1403 processes the operation event.
The panel container 1402 can determine the touch-down point of the user's finger from the position coordinates of the finger in the operation event, and can then judge from the touch-down point whether the user's current operation purpose is to open the notification center or the control center. When the panel container 1402 judges that the user's current operation purpose is to open the notification center, it can send the operation event to the notification center panel controller 1404. When it judges that the purpose is to open the control center, it can send the operation event to the control center panel controller 1405.
When the status bar window 1401 judges that a panel is currently in the expanded state, the panel mutual-pull controller 1403 can judge, based on the position coordinates of the user's finger in the operation event, whether the user's current operation purpose is to close the notification center and open the control center, or to close the control center and open the notification center. When the panel mutual-pull controller 1403 judges that the purpose is to close the control center and open the notification center, it can send the operation event to the notification center panel controller 1404. When it judges that the purpose is to close the notification center and open the control center, it can send the operation event to the control center panel controller 1405.
In addition, to enhance the call-out effect of the target panel, the panel mutual-pull controller 1403 can blur the interface shown before the target panel is opened based on the operation event, or adjust the transparency of the content shown before the target panel is opened. When the status bar window 1401 determines that no panel is open, the panel mutual-pull controller 1403 can obtain the operation event from the target controller corresponding to the user's current operation purpose; that is, the target controller can send the operation event to the panel mutual-pull controller 1403.
The notification center controller 1404 can open the notification center panel 1406 according to the operation event. In addition, when the status bar window 1401 determines that no panel is open and the notification center controller 1404 receives the operation event, the notification center controller 1404 can send the operation event to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can blur the interface shown before the notification center panel is opened based on the operation event, or adjust the transparency of the content shown before the notification center panel is opened.
The control center controller 1405 can open the control center panel 1407 according to the operation event. In addition, when the status bar window 1401 determines that no panel is open and the control center controller 1405 receives the operation event, the control center controller 1405 can send the operation event to the panel mutual-pull controller 1403, so that the panel mutual-pull controller 1403 can blur the interface shown before the control center panel is opened based on the operation event, or adjust the transparency of the content shown before the control center is opened.
The process of calling out the notification center and/or the control center is described in detail below with reference to FIG. 14 and FIG. 15.
Exemplarily, FIG. 15 shows a schematic flowchart of calling out the notification center and/or the control center. As shown in FIG. 15, the following steps are included:
Step 1501: The status bar window 1401 responds to a received operation event and determines that the operation event is a pull-down event.
The status bar window 1401 can receive the operation event sent by the touch screen of the mobile phone 100, and the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 can determine, based on the position coordinates of the user's finger at different times in the operation event, whether the user's operation event is a pull-down event.
Step 1502: The status bar window 1401 judges whether any panel is open.
The status bar window 1401 can determine from the panel record information whether any panel is currently open. The panel record information may include opening and/or closing records of the notification center panel and the control center panel. For example, when the panel record information records that both the notification center and the control center are closed, it can be determined that no panel is currently open; when it records that the notification center is closed and the control center is open, it can be determined that a panel is currently open. If the status bar window 1401 judges that no panel is open, step 1503 is executed; otherwise, step 1508 is executed.
Step 1503: The panel container 1402 responds to the operation event sent by the status bar window 1401 and determines the position of the touch-down point in the operation event.
The panel container 1402 can determine the area where the touch-down point of the user's finger is located based on the coordinate positions of the finger in the operation event, and can then determine, according to the preset correspondence between touch-down areas and the notification center and the control center, whether the user's current operation purpose is to open the notification center or the control center. For example, referring again to FIG. 7, the preset correspondence may be: a touch-down point in area z1 corresponds to calling out the notification center, and a touch-down point in area z2 corresponds to calling out the control center. When the panel container determines that the touch-down point is located in area z1, it can determine that the user's current operation purpose is to open the notification center. If the position of the touch-down point corresponds to calling out the notification center, step 1504 is executed; if it corresponds to calling out the control center, step 1506 is executed.
Step 1504: The notification center controller 1404 responds to the operation event sent by the panel container 1402 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 responds to the operation event sent by the notification center controller 1404 and blurs the background. If it is judged that the trigger threshold point is reached, step 1505 is executed; otherwise, the judgment continues.
The notification center controller 1404 can determine, according to the coordinate positions of the user's finger during the slide in the operation event, whether the finger has slid to the preset trigger threshold point. Exemplarily, if the position coordinates of the touch-down point are (2, 1) and the current position coordinates of the user's finger are (2, 5), the sliding distance is 4 units; in this case, if the preset trigger threshold point is a point 4 units away from the touch-down point, the user's finger has slid to the trigger threshold point; if the coordinates of the preset trigger threshold point are (k, 5), where k ≥ 0, the current coordinates of the user's finger are exactly at the trigger threshold point. In addition, if the notification center controller 1404 determines from the coordinate positions during the slide that the user's finger left the screen before sliding to the trigger threshold point, it can also calculate the speed at the lift-off point of the user's finger from the operation event; when the speed at the lift-off point is greater than the preset speed threshold, the trigger threshold point can also be considered reached.
During the judgment by the notification center controller 1404, the notification center controller 1404 can send information about whether the trigger threshold point is reached to the panel mutual-pull controller 1403, and send the operation event to the panel mutual-pull controller 1403. After receiving the information sent by the notification center controller 1404, if the information indicates that the trigger threshold point has not been reached, the panel mutual-pull controller 1403 can blur the background (such as the interface shown before the notification center is called out) based on the operation event (for example, by reducing the sharpness of the currently displayed interface), so as to enhance the call-out effect of the notification center panel. Exemplarily, the panel mutual-pull controller 1403 can adjust the degree of background blur based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event; for example, when the distance is relatively long, the degree of blur is small, and when the distance is relatively short, the degree of blur is large.
Step 1505: The notification center controller 1404 opens the notification center panel 1406.
Step 1506: The control center controller 1405 responds to the operation event sent by the panel container 1402 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 responds to the operation event sent by the control center controller 1405 and blurs the background. If it is judged that the trigger threshold point is reached, step 1507 is executed; otherwise, the judgment continues.
The control center controller 1405 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the slide, the sliding speed during the slide, or the speed at the lift-off point in the operation event. At the same time, the control center controller 1405 can send information about whether the trigger threshold point is reached to the panel mutual-pull controller 1403, and send the operation event to the panel mutual-pull controller 1403. After receiving the information sent by the control center controller 1405, if the information indicates that the trigger threshold point has not been reached, the panel mutual-pull controller 1403 can blur the background (such as the interface shown before the control center is called out) based on the operation event (for example, by reducing the sharpness of the currently displayed interface), so as to enhance the call-out effect of the control center panel.
Step 1507: The control center controller 1405 opens the control center panel 1407.
Step 1508: The panel mutual-pull controller 1403 responds to the operation event sent by the status bar window 1401 and determines the position of the touch-down point in the operation event.
If the position of the touch-down point corresponds to calling out the notification center, step 1509 is executed; if it corresponds to calling out the control center, step 1511 is executed.
Step 1509: The notification center controller 1404 responds to the operation event sent by the panel mutual-pull controller 1403 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 adjusts the transparency of the control center panel 1407 based on the operation event. If it is judged that the trigger threshold point is reached, step 1510 is executed; otherwise, the judgment continues.
The notification center controller 1404 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the slide, the sliding speed during the slide, or the speed at the lift-off point in the operation event. At the same time, during the judgment by the notification center controller 1404, the panel mutual-pull controller 1403 can adjust the transparency of the control center panel 1407 based on the operation event, so as to enhance the call-out effect of the notification center panel. Exemplarily, the panel mutual-pull controller 1403 can adjust the transparency of the control center panel 1407 based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event; for example, when the distance is relatively long, the transparency of the control center panel 1407 is low, and when the distance is relatively short, the transparency of the control center panel 1407 is high.
In addition, when the notification center controller 1404 judges that the trigger threshold point is reached, it can also feed back to the panel mutual-pull controller 1403 the information that the trigger threshold point is reached, so that the panel mutual-pull controller 1403 instructs the control center controller 1405 to close the control center panel 1407 when the notification center controller 1404 judges that the trigger threshold point is reached.
Step 1510: The notification center controller 1404 opens the notification center panel 1406.
Step 1511: The control center controller 1405 responds to the operation event sent by the panel mutual-pull controller 1403 and judges whether the trigger threshold point is reached, and the panel mutual-pull controller 1403 adjusts the transparency of the notification center panel 1406 based on the operation event. If it is judged that the trigger threshold point is reached, step 1512 is executed; otherwise, the judgment continues.
The control center controller 1405 can determine whether the finger has slid to the preset trigger threshold point according to the coordinate positions during the slide, the sliding speed during the slide, or the speed at the lift-off point in the operation event. At the same time, during the judgment by the control center controller 1405, the panel mutual-pull controller 1403 can adjust the transparency of the notification center panel 1406 based on the operation event, so as to enhance the call-out effect of the control center panel.
In addition, when the control center controller 1405 judges that the trigger threshold point is reached, it can also feed back to the panel mutual-pull controller 1403 the information that the trigger threshold point is reached, so that the panel mutual-pull controller 1403 instructs the notification center controller 1404 to close the notification center panel 1406 when the control center controller 1405 judges that the trigger threshold point is reached.
Step 1512: The control center controller 1405 opens the control center panel 1407.
Based on the solutions described in the foregoing embodiments, an embodiment of this application further provides a chip. Refer to FIG. 16, which is a schematic structural diagram of a chip according to an embodiment of this application. As shown in FIG. 16, the chip 1600 includes one or more processors 1601 and an interface circuit 1602. Optionally, the chip 1600 may further include a bus 1603. Specifically:
The processor 1601 may be an integrated circuit chip with signal processing capability. In the implementation process, the control process involved in the above solutions may be completed by an integrated logic circuit of hardware in the processor 1601 or by instructions in the form of software. The interface circuit 1602 can be used to send or receive data, instructions, or information; the processor 1601 can process the data, instructions, or other information received by the interface circuit 1602 and can send out the processed information through the interface circuit 1602.
Optionally, the chip further includes a memory, which may include a read-only memory and a random access memory and provides operation instructions and data to the processor. A portion of the memory may also include a non-volatile random access memory (NVRAM). Optionally, the memory stores executable software modules or data structures, and the processor can perform corresponding operations by calling the operation instructions stored in the memory (the operation instructions may be stored in an operating system).
Optionally, the interface circuit 1602 can be used to output the execution result of the processor 1601.
It should be noted that the functions corresponding to the processor 1601 and the interface circuit 1602 can each be implemented through hardware design, software design, or a combination of hardware and software, which is not limited here. Exemplarily, the chip can be applied to the electronic device 100 shown in FIG. 2 to implement the methods provided in the embodiments of this application.
It can be understood that the processor in the embodiments of this application may be a central processing unit (central processing unit, CPU), or may be another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. A general-purpose processor may be a microprocessor or any conventional processor.
The method steps in the embodiments of this application may be implemented by hardware or by a processor executing software instructions. The software instructions may consist of corresponding software modules, and the software modules may be stored in a random access memory (random access memory, RAM), a flash memory, a read-only memory (read-only memory, ROM), a programmable read-only memory (programmable ROM, PROM), an erasable programmable read-only memory (erasable PROM, EPROM), an electrically erasable programmable read-only memory (electrically EPROM, EEPROM), a register, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor so that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be a component of the processor. The processor and the storage medium may be located in an ASIC.
In the foregoing embodiments, the implementation may be entirely or partially by software, hardware, firmware, or any combination thereof. When implemented using software, the implementation may be entirely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the embodiments of this application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted via a computer-readable storage medium. The computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, via coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, via infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (solid state disk, SSD)), or the like.
It can be understood that the various numerical designations involved in the embodiments of this application are merely distinctions made for ease of description and are not intended to limit the scope of the embodiments of this application.

Claims (21)

  1. A control method, applied to an electronic device having a touch screen, the method comprising:
    displaying a first interface on the touch screen;
    in response to the touch screen receiving a first operation, switching the first interface to a second interface, wherein the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area of the touch screen and slides on the touch screen in a first direction;
    after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, wherein the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area of the touch screen and slides on the touch screen in a second direction;
    after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the third interface to the first interface, wherein the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area of the touch screen and slides on the touch screen in a third direction;
    wherein the second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of a control center and the third interface is a display interface of a notification center.
  2. A control method, applied to an electronic device having a touch screen, the method comprising:
    displaying a first interface on the touch screen;
    in response to the touch screen receiving a first operation, switching the first interface to a second interface, wherein the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area of the touch screen and slides on the touch screen in a first direction;
    after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, wherein the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area of the touch screen and slides on the touch screen in a second direction;
    after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving the first operation again, switching the third interface to the second interface;
    wherein the second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of a control center and the third interface is a display interface of a notification center.
  3. The method according to claim 2, wherein after the switching the third interface to the second interface, the method further comprises:
    after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the second interface to the first interface, wherein the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area of the touch screen and slides on the touch screen in a third direction.
  4. The method according to any one of claims 1 to 3, wherein the first interface comprises a display interface of a desktop on the electronic device, or the first interface comprises a display interface of an application on the electronic device.
  5. The method according to any one of claims 1 to 4, wherein both the second interface and the third interface are displayed in a first window.
  6. The method according to claim 5, wherein the first window is a status bar window.
  7. The method according to any one of claims 1 to 6, wherein the first interface and the second interface are displayed in different windows.
  8. The method according to any one of claims 1 to 7, wherein the first area is located on a first side of the top of the touch screen, and the first direction is a direction from the top of the touch screen toward the bottom of the touch screen;
    the second area is located on a second side of the top of the touch screen, and the second direction is the same as the first direction;
    the third area is an area on the touch screen other than the first area and the second area, and the third direction is opposite to the first direction.
  9. The method according to any one of claims 1 to 8, wherein the notification center is an entry on the electronic device for managing pushes from applications on the electronic device or for displaying resident status information;
    the control center is an entry on the electronic device for controlling the status of the electronic device.
  10. The method according to any one of claims 1 to 9, wherein before a first target interface is switched to a second target interface, the method further comprises:
    determining that an operation of the touch body on the touch screen meets a trigger condition, the trigger condition being a condition for triggering interface switching;
    wherein the first target interface is the first interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the third interface; or the first target interface is the third interface and the second target interface is the first interface; or the first target interface is the third interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the first interface.
  11. The method according to claim 10, wherein the condition for triggering interface switching specifically comprises:
    a distance between a position at which the touch body currently contacts the touch screen and the start position is greater than or equal to a preset distance threshold.
  12. The method according to claim 10, wherein the condition for triggering interface switching specifically comprises:
    a position at which the touch body currently contacts the touch screen reaches a preset position on the touch screen.
  13. The method according to claim 10, wherein the condition for triggering interface switching specifically comprises:
    a distance between a position at which the touch body leaves the touch screen and the start position is less than a preset distance threshold, and a speed of the touch body when leaving the touch screen is greater than or equal to a preset speed threshold.
  14. The method according to any one of claims 10 to 13, wherein in the process of switching the first target interface to the second target interface, the method further comprises:
    increasing the transparency of the first target interface, or reducing the sharpness of the first target interface.
  15. The method according to any one of claims 1 to 14, wherein
    the switching the first interface to the second interface comprises: overlaying the second interface on the first interface; or
    the switching the first interface to the second interface comprises: blurring the first interface, and then overlaying the second interface on the blurred first interface; or
    the switching the second interface to the third interface comprises: closing the second interface and opening the third interface; or
    the switching the second interface to the third interface comprises: closing the second interface and opening the third interface, the third interface being overlaid on the first interface; or
    the switching the third interface to the first interface comprises: closing the third interface overlaid on the first interface, and presenting the first interface; or
    the switching the third interface to the second interface comprises: closing the third interface and opening the second interface; or
    the switching the third interface to the second interface comprises: closing the third interface and opening the second interface, the second interface being overlaid on the first interface; or
    the switching the second interface to the first interface comprises: closing the second interface overlaid on the first interface, and presenting the first interface.
  16. A control method, applied to an electronic device having a touch screen, the method comprising:
    displaying a first interface on the touch screen, wherein the first interface comprises a display interface of a desktop on the electronic device, or the first interface comprises a display interface of an application on the electronic device;
    in response to the touch screen receiving a first operation, overlaying a second interface on the first interface, wherein the first operation is an operation in which a touch body contacts the touch screen at a start position located in a first area at the top of the touch screen and slides toward the bottom of the touch screen, and the second interface comprises a display interface of a notification center or a display interface of a control center;
    after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, closing the second interface and opening a third interface, wherein the opened third interface is overlaid on the first interface, the second operation is an operation in which the touch body re-contacts the touch screen at a start position located in a second area at the top of the touch screen and slides toward the bottom of the touch screen, the third interface comprises a display interface of the notification center or a display interface of the control center, and the third interface is different from the second interface;
    after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, closing the third interface and presenting the first interface, wherein the third operation is an operation in which the touch body re-contacts the touch screen at a start position located in a third area outside the top of the touch screen and slides toward the top of the touch screen.
  17. The method according to claim 16, wherein before the overlaying a second interface on the first interface, the method further comprises:
    reducing the sharpness of the first interface.
  18. The method according to claim 16 or 17, wherein before the closing the second interface, the method further comprises: increasing the transparency of the second interface.
  19. An electronic device, comprising:
    a touch screen;
    one or more processors;
    a memory;
    wherein the memory stores one or more computer programs, the one or more computer programs comprise instructions, and when the instructions are executed by the one or more processors, the electronic device is caused to perform the method according to any one of claims 1 to 18.
  20. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 18.
  21. A computer program product, wherein when the computer program product is run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 18.
PCT/CN2022/084089 2021-06-01 2022-03-30 Control method and electronic device WO2022252788A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP22814835.9A EP4332744A1 (en) 2021-06-01 2022-03-30 Control method and electronic device
BR112023023988A BR112023023988A2 (pt) 2021-06-01 2022-03-30 Método de controle, dispositivo eletrônico, mídia de armazenamento legível por computador e produto de programa de computador

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110611045.2 2021-06-01
CN202110611045.2A CN115495002A (zh) 2021-06-01 2021-06-01 Control method and electronic device

Publications (1)

Publication Number Publication Date
WO2022252788A1 true WO2022252788A1 (zh) 2022-12-08

Family

ID=84322737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/084089 WO2022252788A1 (zh) 2021-06-01 2022-03-30 一种控制方法及电子设备

Country Status (4)

Country Link
EP (1) EP4332744A1 (zh)
CN (1) CN115495002A (zh)
BR (1) BR112023023988A2 (zh)
WO (1) WO2022252788A1 (zh)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8473871B1 (en) * 2012-10-16 2013-06-25 Google Inc. Multiple seesawing panels
US20140365945A1 (en) * 2013-06-09 2014-12-11 Apple Inc. Device, method, and graphical user interface for providing navigation and search functionalities
US20150082257A1 (en) * 2013-09-17 2015-03-19 Lg Electronics Inc. Mobile terminal and control method thereof
CN106406728A (zh) * 2016-08-31 2017-02-15 瓦戈科技(上海)有限公司 移动终端桌面手势的操作方法
CN107632757A (zh) * 2017-08-02 2018-01-26 努比亚技术有限公司 一种终端控制方法、终端及计算机可读存储介质
CN108255404A (zh) * 2018-01-19 2018-07-06 广东欧珀移动通信有限公司 用户界面显示方法、装置及终端
US20200363937A1 (en) * 2018-01-19 2020-11-19 Guangdong Oppo Mobile Telecommunications Corp., Ltd. User interface display method, device, and apparatus
CN108563388A (zh) * 2018-02-27 2018-09-21 努比亚技术有限公司 屏幕操作方法、移动终端及计算机可读存储介质
US20190369861A1 (en) * 2018-06-01 2019-12-05 Apple Inc. Accessing system user interfaces on an electronic device
CN110554828A (zh) * 2018-06-01 2019-12-10 苹果公司 访问电子设备上的系统用户界面

Also Published As

Publication number Publication date
EP4332744A1 (en) 2024-03-06
BR112023023988A2 (pt) 2024-01-30
CN115495002A (zh) 2022-12-20

Similar Documents

Publication Publication Date Title
US11709560B2 (en) Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator
DK180317B1 (en) Systems, methods, and user interfaces for interacting with multiple application windows
US20230244317A1 (en) Proxy Gesture Recognizer
US20230409165A1 (en) User interfaces for widgets
US20240152223A1 (en) Systems and Methods for Interacting with Multiple Applications that are Simultaneously Displayed on an Electronic Device with a Touch-Sensitive Display
CA2909730C (en) Event recognition by device with touch-sensitive display using multiple gesture recognizers
JP5859508B2 (ja) 対話型ポップアップビューを備えたデバイス、方法、およびグラフィカルユーザインタフェース
US8826164B2 (en) Device, method, and graphical user interface for creating a new folder
AU2010339633B2 (en) Apparatus and method having multiple application display modes including mode with display resolution of another apparatus
US8839122B2 (en) Device, method, and graphical user interface for navigation of multiple applications
US20160253086A1 (en) Device, method, and graphical user interface for managing multiple display windows
EP3594793A1 (en) Device, method, and graphical user interface for managing folders
US20110163967A1 (en) Device, Method, and Graphical User Interface for Changing Pages in an Electronic Document
US11829591B2 (en) User interface for managing input techniques
AU2018269159A1 (en) Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display
WO2022028310A1 (zh) Method for adding annotations, electronic device, and related apparatus
WO2022252788A1 (zh) Control method and electronic device
CN114461312B (zh) Display method, electronic device, and storage medium
KR102619538B1 (ko) Method for operating multi-window and electronic device supporting the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 22814835; Country of ref document: EP; Kind code of ref document: A1
REG Reference to national code. Ref country code: BR; Ref legal event code: B01A; Ref document number: 112023023988; Country of ref document: BR
WWE Wipo information: entry into national phase. Ref document number: 2022814835; Country of ref document: EP
WWE Wipo information: entry into national phase. Ref document number: 18565936; Country of ref document: US; Ref document number: 2023574265; Country of ref document: JP
ENP Entry into the national phase. Ref document number: 2022814835; Country of ref document: EP; Effective date: 20231128
NENP Non-entry into the national phase. Ref country code: DE
ENP Entry into the national phase. Ref document number: 112023023988; Country of ref document: BR; Kind code of ref document: A2; Effective date: 20231116