WO2022252788A1 - A control method and electronic device - Google Patents
A control method and electronic device
- Publication number
- WO2022252788A1 (PCT/CN2022/084089)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- interface
- touch screen
- touch
- electronic device
- control center
- Prior art date
Classifications
- G—PHYSICS; G06—COMPUTING; CALCULATING OR COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING; G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements; G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer; G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
- G06F3/0484—Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
- G06F3/0485—Scrolling or panning
- G06F3/0487—Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G06F3/04886—Interaction techniques using a touch-screen or digitiser by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Definitions
- the present application relates to the technical field of terminals, and in particular to a control method and electronic equipment.
- the notification center may be an entrance for managing pushes from applications (APPs) in the electronic device or displaying resident state information.
- the control center can be the portal for controlling the state of the device.
- a notification center and a control center on an electronic device can be simultaneously displayed in one window.
- sliding down from the top of the screen of the electronic device can call out the control center 11 and the notification center 12 at the same time, wherein the control center 11 is on the upper side and the notification center 12 is on the lower side; the control center 11 can be collapsed by default to display the most commonly used shortcut switches and supports expanding to view more information, and the notification center 12 can support scrolling up and down like a list.
- when displaying the notification center and the control center together, the display interface of the electronic device becomes overcrowded, and the control center on the electronic device is not designed to carry too much information at once. Therefore, how to make it convenient for the user to operate the notification center and the control center on the electronic device is an urgent technical problem to be solved.
- the present application provides a control method and an electronic device, which can facilitate a user to operate a notification center and a control center on the electronic device.
- in a first aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen; switching the first interface to a second interface in response to receiving a first operation on the touch screen, where the first operation refers to an operation in which the initial position of the touch body touching the touch screen is located in a first area of the touch screen and the touch body slides along a first direction on the touch screen; after the touch body completes the first operation and leaves the touch screen, switching the second interface to a third interface in response to the touch screen receiving a second operation, where the second operation refers to an operation in which the touch body re-contacts the touch screen in a second area of the touch screen and slides along a second direction on the touch screen; and after the touch body completes the second operation and leaves the touch screen, switching the third interface to the first interface in response to the touch screen receiving a third operation, where the third operation refers to an operation in which the initial position of the touch body re-contacting the touch screen is located in a third area of the touch screen and the touch body slides along a third direction on the touch screen; wherein the second interface is the display interface of the notification center and the third interface is the display interface of the control center; or, the second interface is the display interface of the control center and the third interface is the display interface of the notification center.
- in this way, the user can switch between the notification center and the control center on the electronic device, and can directly return from the switched interface to the upper initial interface of the electronic device (such as the desktop, the display interface of an application, etc.), which improves the convenience of user operation and the user experience.
- in a second aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen; switching the first interface to a second interface in response to receiving a first operation on the touch screen, where the first operation refers to an operation in which the initial position of the touch body touching the touch screen is located in a first area of the touch screen and the touch body slides along a first direction on the touch screen; after the touch body completes the first operation and leaves the touch screen, switching the second interface to a third interface in response to the touch screen receiving a second operation, where the second operation refers to an operation in which the touch body re-contacts the touch screen in a second area of the touch screen and slides along a second direction on the touch screen; and after the touch body completes the second operation and leaves the touch screen, switching the third interface to the second interface in response to the touch screen receiving the first operation again; wherein the second interface is the display interface of the notification center and the third interface is the display interface of the control center; or, the second interface is the display interface of the control center and the third interface is the display interface of the notification center.
- in a possible implementation manner, after switching the third interface to the second interface, the method may further include: after the touch body completes the first operation and leaves the touch screen, switching the second interface to the first interface in response to the touch screen receiving a third operation, where the third operation refers to an operation in which the initial position of the touch body re-contacting the touch screen is located in a third area of the touch screen and the touch body slides along a third direction on the touch screen.
- in this way, the user can directly return from the switched interface to the upper initial interface of the electronic device (such as the desktop, the display interface of an application, etc.), which improves the convenience of user operation and the user experience.
- the first interface includes a display interface of a desktop on the electronic device, or, the first interface includes a display interface of an application on the electronic device.
- both the second interface and the third interface are displayed in the first window. Therefore, when switching between the two interfaces, the content in one interface can be used to replace the content in the other interface, so that there is no need to close one window and open the other window, which improves the switching efficiency.
- the first window is a status bar window.
- the first interface and the second interface are displayed in different windows.
- for example, when the first interface is the display interface of an application, the first interface may be displayed in the display window of the application, and the second interface may be displayed in the status bar window.
- in a possible implementation manner, the first area is located on a first side of the top of the touch screen, and the first direction is the direction from the top of the touch screen toward the bottom of the touch screen; the second area is located on a second side of the top of the touch screen, and the second direction is the same as the first direction; the third area is an area on the touch screen other than the first area and the second area, and the third direction is opposite to the first direction.
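- As an illustration of the three operations just defined, the sketch below classifies a gesture by its start region and slide direction. It is a minimal Java sketch; the half-screen split between the first and second areas and the names (GestureClassifier, topRegionHeight) are assumptions, since the application does not specify concrete boundaries.

```java
// Hypothetical sketch: classify the first/second/third operations by where the
// touch starts and which way it slides. Region boundaries are assumptions.
public final class GestureClassifier {

    public enum Gesture { FIRST_OPERATION, SECOND_OPERATION, THIRD_OPERATION, NONE }

    private final int screenWidth;
    private final int topRegionHeight; // height of the strip treated as "the top of the touch screen"

    public GestureClassifier(int screenWidth, int topRegionHeight) {
        this.screenWidth = screenWidth;
        this.topRegionHeight = topRegionHeight;
    }

    /** (startX, startY): initial touch position; dy > 0 means sliding toward the bottom. */
    public Gesture classify(float startX, float startY, float dy) {
        boolean inTop = startY <= topRegionHeight;
        boolean inFirstArea = inTop && startX < screenWidth / 2f;   // e.g. the top-left side
        boolean inSecondArea = inTop && startX >= screenWidth / 2f; // e.g. the top-right side
        if (inFirstArea && dy > 0) return Gesture.FIRST_OPERATION;   // calls out the second interface
        if (inSecondArea && dy > 0) return Gesture.SECOND_OPERATION; // switches to the third interface
        if (!inTop && dy < 0) return Gesture.THIRD_OPERATION;        // returns to the first interface
        return Gesture.NONE;
    }
}
```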
- the notification center is an entry on the electronic device for managing pushes from applications on the electronic device or displaying resident status information;
- the control center is an entry on the electronic device for controlling the state of the electronic device.
- in a possible implementation manner, before switching a first target interface to a second target interface, the method may further include: determining that the operation of the touch body on the touch screen meets a trigger condition, the trigger condition being the condition for triggering interface switching; wherein the first target interface is the first interface and the second target interface is the second interface; or, the first target interface is the second interface and the second target interface is the third interface; or, the first target interface is the third interface and the second target interface is the first interface; or, the first target interface is the third interface and the second target interface is the second interface; or, the first target interface is the second interface and the second target interface is the first interface.
- in this way, the interface is switched only when the trigger condition is met, which improves the switching effect.
- in a possible implementation manner, the condition for triggering interface switching may specifically include: the distance between the position where the touch body touches the touch screen at the current moment and the initial position is greater than or equal to a preset distance threshold.
- in a possible implementation manner, the condition for triggering interface switching may specifically include: the position where the touch body touches the touch screen at the current moment reaches a preset position on the touch screen.
- in a possible implementation manner, the condition for triggering interface switching may specifically include: the distance between the position of the touch body when it leaves the touch screen and the initial position is less than a preset distance threshold, and the speed of the touch body when it leaves the touch screen is greater than or equal to a preset speed threshold.
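- The three trigger conditions above can be expressed as simple predicates, as in the illustrative Java sketch below. The threshold values are placeholder assumptions; the application does not disclose the preset distance, position, or speed thresholds.

```java
// Illustrative predicates for the three trigger conditions; the thresholds
// are placeholder assumptions, not values from the application.
public final class TriggerConditions {
    private static final float DISTANCE_THRESHOLD_PX = 300f; // assumed preset distance threshold
    private static final float PRESET_Y_PX = 400f;           // assumed preset position on the touch screen
    private static final float SPEED_THRESHOLD_PX_S = 1000f; // assumed preset speed threshold

    /** Condition 1: the touch has moved far enough from the initial position. */
    static boolean distanceMet(float startX, float startY, float x, float y) {
        return Math.hypot(x - startX, y - startY) >= DISTANCE_THRESHOLD_PX;
    }

    /** Condition 2: the current touch position has reached a preset position. */
    static boolean positionMet(float y) {
        return y >= PRESET_Y_PX;
    }

    /** Condition 3: lifted before reaching the distance threshold, but fast enough. */
    static boolean flickMet(double slideDistance, float liftOffSpeed) {
        return slideDistance < DISTANCE_THRESHOLD_PX && liftOffSpeed >= SPEED_THRESHOLD_PX_S;
    }
}
```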
- in a possible implementation manner, the process of switching the first target interface to the second target interface may further include: increasing the transparency of the first target interface, or reducing the clarity of the first target interface. In this way, transition processing can be performed during the switching of the two interfaces to enhance the switching effect.
- in a possible implementation manner, switching the first interface to the second interface may include: covering the second interface on the first interface; or, switching the first interface to the second interface may include: blurring the first interface, and then covering the second interface on the blurred first interface; or, switching the second interface to the third interface may include: closing the second interface and opening the third interface; or, switching the second interface to the third interface may include: closing the second interface and opening the third interface, where the third interface is overlaid on the first interface; or, switching the third interface to the first interface may include: closing the third interface overlaid on the first interface and presenting the first interface; or, switching the third interface to the second interface may include: closing the third interface and opening the second interface; or, switching the third interface to the second interface may include: closing the third interface and opening the second interface, where the second interface is overlaid on the first interface; or, switching the second interface to the first interface may include: closing the second interface overlaid on the first interface and presenting the first interface.
- in a third aspect, the present application provides a control method applied to an electronic device with a touch screen. The method may include: displaying a first interface on the touch screen, where the first interface includes the display interface of the desktop on the electronic device or the display interface of an application on the electronic device; in response to receiving a first operation on the touch screen, covering a second interface on the first interface, where the first operation refers to an operation in which the initial position of the touch body touching the touch screen is located in a first area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, and the second interface includes the display interface of the notification center or the display interface of the control center; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, closing the second interface and opening a third interface, where the opened third interface covers the first interface, the second operation refers to an operation in which the initial position of the touch body re-contacting the touch screen is located in a second area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, and the third interface includes the display interface of the notification center or the display interface of the control center and is different from the second interface; and after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, closing the third interface and presenting the first interface, where the third operation refers to an operation in which the touch body re-contacts the touch screen at a starting position located in a third area other than the top of the touch screen and slides toward the top of the touch screen.
- in a possible implementation manner, before covering the second interface on the first interface, the method may further include: reducing the clarity of the first interface.
- the method may further include: increasing the transparency of the second interface.
- in a fourth aspect, the present application provides an electronic device, which may include: a touch screen; one or more processors; and a memory.
- one or more computer programs are stored in the memory, and the one or more computer programs include instructions; when the instructions are executed by the electronic device, the electronic device is caused to execute the method provided in the first aspect, the second aspect, or the third aspect.
- in a fifth aspect, the present application provides a computer-readable storage medium. The computer-readable storage medium stores a computer program; when the computer program runs on an electronic device, the electronic device is caused to execute the method provided in the first aspect, the second aspect, or the third aspect.
- in a sixth aspect, the present application provides a computer program product which, when run on an electronic device, causes the electronic device to execute the method provided in the first aspect, the second aspect, or the third aspect.
- FIG. 1 is a schematic diagram of a display interface of a mobile phone in the related art
- FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided in an embodiment of the present application.
- FIG. 3 is a schematic diagram of coordinate axes on a screen of an electronic device provided in an embodiment of the present application
- FIG. 4 is a schematic diagram of the architecture of an operating system in an electronic device provided in an embodiment of the present application.
- FIG. 5 is a schematic diagram of an application scenario of a method for operating a mobile phone provided in an embodiment of the present application
- Fig. 6 is a schematic diagram of the touch-down point, the trigger threshold point, and the lift-off point when the user's finger slides on the mobile phone provided by the embodiment of the present application;
- Fig. 7 is a schematic diagram of the areas where the touch-down point and the trigger threshold point are located when the user's finger slides on the mobile phone according to the embodiment of the present application;
- Fig. 8 is a schematic diagram of a process of switching from the desktop to the notification center on the mobile phone according to the embodiment of the present application;
- Fig. 9 is a schematic diagram of the process of switching from the desktop to the control center on the mobile phone according to the embodiment of the present application.
- Fig. 10 is a schematic diagram of a process of switching from a notification center to a control center on a mobile phone according to an embodiment of the present application
- Fig. 11 is a schematic diagram of the process of switching from the control center to the notification center on the mobile phone provided by the embodiment of the present application;
- Fig. 12 is a schematic diagram of the process of switching from the desktop to the notification center, then to the control center, and then back to the desktop on the mobile phone provided by the embodiment of the present application;
- Fig. 13 is a schematic diagram of the process of switching from the desktop to the control center, then switching to the notification center, and then returning to the desktop on the mobile phone provided by the embodiment of the present application;
- FIG. 14 is a schematic diagram of a system architecture of a mobile phone provided by an embodiment of the present application.
- Fig. 15 is a schematic flow diagram of calling out the notification center and/or control center provided by an embodiment of the present application
- FIG. 16 is a schematic structural diagram of a chip provided by an embodiment of the present application.
- terms such as "first" and "second" in the specification and claims herein are used to distinguish different objects, not to describe a specific order of objects.
- for example, the first response message and the second response message are used to distinguish different response messages, rather than describing a specific order of the response messages.
- words such as "exemplary" or "for example" are used to present examples or illustrations. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as more preferred or more advantageous than other embodiments or design schemes. Rather, the use of such words is intended to present related concepts in a concrete manner.
- "multiple" means two or more; for example, multiple processing units refer to two or more processing units, and multiple components refer to two or more components, or the like.
- in some cases, the notification center and control center on the electronic device can be controlled independently, and the two can be displayed through different windows. For example, sliding down from the top left of the screen of the electronic device can call out the notification center, and sliding down from the top right of the screen can call out the control center; or, sliding down from the top of the screen can call out the notification center, and sliding up from the bottom of the screen can call out the control center, and so on.
- although this method allows users to choose whether to call out the notification center or the control center based on their own needs, the priority of different windows in the system software architecture on the electronic device is different, and the priority of the display window of the notification center is often lower than the priority of the display window of the control center. This makes it possible to call out the display window of the control center while the notification center is being displayed on the electronic device, in which case the control center is overlaid on the notification center.
- if it is necessary to return to the interface before the notification center was called out, the display window of the control center must be closed first, and then the display window of the notification center. It can be seen that the return operation is very inconvenient.
- conversely, when the control center is being displayed, the display window of the notification center cannot be called out. It can be seen that such a scheme in which the notification center and the control center are set up separately causes great inconvenience to the user's operation, and the user experience is poor.
- the embodiment of the present application implements a peer-level design for the notification center and control center on the electronic device, so that the notification center and control center can be switched alternately based on user needs, which improves the convenience of user operation and the user experience.
- when one of the notification center and the control center is in the display state and the electronic device detects the operation of calling out the other, the electronic device can close the one that is being displayed and display the other one that the user is currently calling out, thereby avoiding the overlapping nesting problem between the notification center and the control center, so that the user can quickly return to the interface before the notification center or the control center was called out.
- for example, when the electronic device is displaying the control center and detects the operation of calling out the notification center, the electronic device can close the control center and display the notification center; when the electronic device is displaying the notification center and detects the operation of calling out the control center, the electronic device can close the notification center and display the control center.
- the electronic device can be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device and/or a smart city device, etc.
- Exemplary embodiments of the electronic equipment include but are not limited to electronic equipment equipped with iOS, Android, Windows, HarmonyOS or other operating systems; this solution does not place special restrictions on the specific type of the electronic equipment.
- FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
- the electronic device 100 may include a processor 110 , a memory 120 , a display screen 130 and a sensor 140 .
- the structure shown in the embodiment of this solution does not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
- the illustrated components can be realized in hardware, software or a combination of software and hardware.
- the processor 110 may be a general purpose processor or a special purpose processor.
- the processor 110 may include a central processing unit (central processing unit, CPU) and/or a baseband processor.
- the baseband processor can be used to process communication data
- the CPU can be used to implement corresponding control and processing functions, execute software programs, and process data of the software programs.
- a program (or an instruction or code) may be stored in the memory 120, and the program may be executed by the processor 110, so that the processor 110 executes the method described in this solution.
- the memory 120 may store instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory 120, so as to avoid repeated access, reduce the waiting time of the processor 110, and improve the efficiency of the system.
- the display screen 130 is used to display images, videos and the like.
- the display screen 130 includes a display panel.
- the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
- the electronic device 100 may include 1 or N display screens 130 , where N is a positive integer greater than 1.
- the sensor 140 may include a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, an ambient light sensor, or a bone conduction sensor.
- the sensor 140 may also include a touch sensor.
- the touch sensor can be used to detect a touch operation on or near it.
- the touch sensor can collect touch events of the user on or near it (such as an operation performed by the user on the surface of the touch sensor with a finger, a stylus, or any other suitable object), and send the collected touch information to other components, such as the processor 110.
- the touch sensor can be implemented in various ways such as resistive, capacitive, infrared, and surface acoustic wave.
- the touch sensor can be arranged on the display screen 130, and the touch sensor and the display screen 130 together compose a touch screen, also called a "touch control screen"; or, the touch sensor and the display screen 130 can be used as two independent components to realize the input and output functions.
- a Cartesian coordinate system may be pre-set in the touch screen including the touch sensor.
- for example, a rectangular coordinate system can be established with the upper left corner of the touch screen as the origin (0,0), or a rectangular coordinate system can be established with the geometric center of the touch screen as the origin (0,0) (not shown in the figure).
- the touch sensor in the touch screen can continuously collect a series of touch events generated by the touch body on the touch screen (such as the coordinates of the touch point, the touch event type, etc.), and report this series of touch events to the processor 110.
- the above-mentioned touch body can be a touch pen, a user's finger or joint, or an object such as a touch glove with a touch function, which is not limited in this solution.
- in the following embodiments, the user's finger is used as the touch body for exemplary description.
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
- the Android system with a layered architecture is taken as an example to illustrate the software structure of the electronic device 100 .
- FIG. 4 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
- the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
- in some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer. Huawei's self-developed mobile terminal operating system can also follow this structure.
- the application layer can consist of a series of application packages.
- the application package may include applications (APPs) such as camera, gallery, calendar, call, map, navigation, Bluetooth, music, video, and short message.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer can include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
- a window manager is used to manage window programs.
- the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
- Content providers are used to store and retrieve data and make it accessible to applications.
- Said data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebook, etc.
- the view system can be used to build the display interface of the application.
- Each display interface can consist of one or more controls.
- controls may include interface elements such as icons, buttons, menus, tabs, text boxes, dialog boxes, status bars, navigation bars, and widgets.
- the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
- the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
- the notification manager enables the application to display notification information in the notification center, which can be used to convey notification-type messages, and can automatically disappear after a short stay without user interaction.
- the notification manager is used to notify the download completion, message reminder, etc.
- the notification manager can also present notifications that appear in the status bar at the top of the system in the form of a chart or scroll-bar text, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the notification center, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
- Activity Manager can be used to manage the life cycle of each application. Applications usually run in the operating system in the form of activities. The activity manager can schedule the activity process of the application to manage the life cycle of each application.
- the Android Runtime includes core library and virtual machine. The Android runtime is responsible for the scheduling and management of the Android system.
- the core library consists of two parts: one part is the function libraries that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in virtual machines.
- the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
- the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
- a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
- the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
- the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
- 2D graphics engine is a drawing engine for 2D drawing.
- the kernel layer is the layer between hardware and software.
- the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
- the user's finger can touch the touch screen and slide on the touch screen.
- the user's finger can touch and slide on the top left side of the touch screen, or touch and slide on the top right side of the touch screen, or touch and slide on the bottom left or right side of the touch screen, etc.
- Information about a series of touch points related to this touch operation can be obtained, for example, coordinates (x, y) of the touch points, touch events, and the like.
- the touch screen can report the original touch event generated by the user's touch operation to the kernel layer.
- the kernel layer can encapsulate the raw touch event into an advanced touch event that can be read by the application framework layer (i.e., the framework layer), which includes the coordinates of the touch point, the time, and the type of the touch event at the current moment, for example, an action down event, an action move event, or an action up event.
- the kernel layer can send the advanced touch event to the panel manager (input manager) in the application framework layer.
- after the panel manager acquires the above-mentioned advanced touch events, it can calculate in real time the slide start point, slide track, slide distance, slide speed, or the speed at the lift-off point of the user's finger according to the touch point coordinates, time, and touch event type in the advanced touch events.
- for example, when the panel manager detects an action down event, it means that the user's finger has touched the touch screen, and when the panel manager detects an action up event, it means that the user's finger has left the touch screen.
- specifically, the panel manager can recognize the sliding track and sliding distance of the user's finger on the touch screen according to the touch point coordinates between adjacent action down and action up events, and/or recognize the sliding track of the user's finger on the touch screen and the speed at the lift-off point according to the touch point coordinates and times between adjacent action down and action up events.
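- As a concrete illustration of this pipeline, the sketch below shows how a component like the panel manager could derive the slide start point, track, distance, and average speed from action down/move/up events. It is a minimal Java sketch against Android's MotionEvent API; the class and field names are hypothetical, not the actual framework code.

```java
import android.view.MotionEvent;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: derive slide metrics from down/move/up touch events.
public final class SlideTracker {
    private float downX, downY;
    private long downTime;
    private final List<float[]> track = new ArrayList<>(); // recorded (x, y) points of the slide track

    public void onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN: // finger touches the touch screen
                downX = event.getX();
                downY = event.getY();
                downTime = event.getEventTime();
                track.clear();
                break;
            case MotionEvent.ACTION_MOVE: // finger slides on the touch screen
                track.add(new float[] { event.getX(), event.getY() });
                break;
            case MotionEvent.ACTION_UP: { // finger leaves the touch screen
                double distance = Math.hypot(event.getX() - downX, event.getY() - downY);
                long elapsedMs = event.getEventTime() - downTime;
                double avgSpeedPxPerS = elapsedMs > 0 ? distance * 1000.0 / elapsedMs : 0;
                // distance, track and avgSpeedPxPerS can now be compared against
                // the trigger conditions described earlier.
                break;
            }
        }
    }
}
```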
- when the user calls out the notification center or the control center on the mobile phone, the user's finger operates on the mobile phone, and the user's operation may include: the user's finger touching the screen of the mobile phone, the user's finger sliding on the screen of the mobile phone, the user's finger leaving the screen of the mobile phone, etc.
- the touch position when the user's finger first touches the mobile phone can be called the touch-down point; the position at which the notification center or the control center is called out during the sliding of the user's finger can be called the trigger threshold point; and the position at which the user's finger leaves the screen of the mobile phone can be called the lift-off point.
- after being called out, the notification center or the control center can be located at its initial position; if the user's finger does not leave the screen and continues to slide after the notification center or the control center is called out, the notification center or the control center can follow the direction in which the finger slides; when the finger leaves the screen, the notification center or the control center can return to the initial position. For example, after the finger pulls down to call out the notification center, if the finger continues to slide down without leaving the screen, the notification center will follow the finger and slide down; after the finger leaves the screen, the notification center will bounce back to the balance point (that is, the initial position of the notification center when it first appeared).
- the touch-down point may be a point located in a preset area on the mobile phone;
- the trigger threshold point may be a point at a preset distance from the touch-down point, or a point in a preset area, or a combination of both (in this case, meeting either one triggers calling out the notification center or the control center). Exemplarily, as shown in FIG. 7, the touch-down point may be located in area z1 and/or area z2 on the mobile phone 100, wherein, when the user's finger touches area z1 on the mobile phone 100, the touch-down point is located in area z1; the trigger threshold point is a point in area z3, and the lift-off point may be a point between area z3 and the bottom of the screen of the mobile phone 100, wherein area z3 may be a line.
- in some other embodiments, the area where the trigger threshold point is located may also be a surface. For example, area z4 on the mobile phone 100 may be the surface where the trigger threshold point is located.
- in addition, when the speed of the user's finger at the lift-off point is greater than a preset speed threshold, or when the sliding speed of the user's finger is greater than a certain preset speed threshold, calling out the notification center or the control center is triggered.
- wherein, the speed at the lift-off point and/or the sliding speed of the user's finger can be calculated by a velocity tracker.
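- Android exposes such a tracker as android.view.VelocityTracker. The sketch below is a minimal example of measuring the lift-off speed with it; the speed threshold is an assumed placeholder, not a value from the application.

```java
import android.view.MotionEvent;
import android.view.VelocityTracker;

// Minimal sketch: measure the finger's speed at the lift-off point with
// Android's VelocityTracker. The threshold is a placeholder assumption.
public final class LiftOffSpeedDetector {
    private static final float SPEED_THRESHOLD_PX_S = 1000f;
    private VelocityTracker tracker;

    /** Returns true on ACTION_UP if the lift-off speed exceeds the threshold. */
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                tracker = VelocityTracker.obtain();
                tracker.addMovement(event);
                break;
            case MotionEvent.ACTION_MOVE:
                if (tracker != null) tracker.addMovement(event);
                break;
            case MotionEvent.ACTION_UP: {
                if (tracker == null) return false;
                tracker.addMovement(event);
                tracker.computeCurrentVelocity(1000); // velocity in pixels per second
                float vy = Math.abs(tracker.getYVelocity());
                tracker.recycle();
                tracker = null;
                return vy >= SPEED_THRESHOLD_PX_S; // fast enough to trigger calling out a panel
            }
        }
        return false;
    }
}
```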
- in the following, calling out the notification center, calling out the control center, and switching between the notification center and the control center are introduced respectively.
- taking the case where the touch-down point is located in area z1 on the mobile phone 100 in FIG. 7, the user's finger slides across the trigger threshold point, and the user's finger leaves the screen as an example, the process of calling out the notification center is described; taking the case where the touch-down point is located in area z2, the user's finger slides across the trigger threshold point, and the user's finger leaves the screen to call out the control center as an example, the process of calling out the control center is described.
- the screen of the mobile phone 100 may be in a lighted state.
- the screen of the mobile phone 100 may be on the standby interface, or on the display interface of the desktop, or the screen of the mobile phone 100 may be on the display interface of the application program in the mobile phone.
- the case where the screen of the mobile phone 100 is on the display interface of the desktop is taken as an example.
- the user's finger may touch the top left side of the screen of the mobile phone 100 .
- the user's finger slides down without reaching the trigger threshold point.
- at this time, following the slide of the user's finger, the display interface of the desktop can be gradually blurred, as shown in FIG. 8(b); wherein, in this process, the closer the user's finger is to the touch-down point, the smaller the blur.
- the complete blurring means that the specific content displayed on the display interface is not visible; when the display interface of the desktop is completely blurred, the user's finger may have slid to the trigger threshold point.
- the notification center can be displayed, as shown in FIG. 8( c ). After the notification center is displayed on the screen of the mobile phone 100, if the user's finger leaves the screen of the mobile phone 100, the interface shown in (d) in FIG. 8 may be displayed on the mobile phone 100.
- next, the mobile phone 100 can return to the interface before the notification center was displayed, as shown in (e) in FIG. 8. Continuing with (e) in FIG. 8, after the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and the interface shown in (f) in FIG. 8 can be displayed.
- the screen of the mobile phone 100 may be in a lighted state. Take the display interface where the screen of the mobile phone 100 is on the desktop as an example. As shown in (a) of FIG. 9 , the user's finger may touch the top right side of the screen of the mobile phone 100 . Next, the user's finger slides down without reaching the trigger threshold point. At this time, following the slide of the user's finger, the display interface of the desktop is gradually blurred, as shown in FIG. 9( b ). When the user's finger slides to the trigger threshold point, the control center can be displayed, as shown in FIG. 9( c ).
- the interface shown in (d) in FIG. 9 may be displayed on the mobile phone 100.
- next, the mobile phone 100 can return to the interface before the control center was displayed, as shown in (e) in FIG. 9.
- after the user's finger slides upward, the user's finger can leave the screen of the mobile phone 100, and the interface shown in (f) in FIG. 9 can be displayed.
- the user's finger may touch the top right side of the screen of the mobile phone 100 .
- the user's finger slides down, and the trigger threshold point is not reached.
- the transparency of the notification center can be gradually increased following the slide of the user's finger, as shown in FIG. 10(b).
- wherein, the closer the user's finger is to the touch-down point, the lower the transparency, and the closer the user's finger is to the trigger threshold point, the higher the transparency; that is to say, as the user's finger slides, the notification center gradually changes from clear to transparent.
- in other embodiments, the notification center can also be blurred during the user's finger swipe. It can be understood that, during the sliding process of the user's finger, transition processing actions such as adjusting transparency or the aforementioned blurring are optional. In some embodiments, during the sliding process of the user's finger, the pulled-down content may be displayed directly without any transition processing, covering the background behind it or the notification/control center. In some embodiments, blurring can be understood as adjusting the clarity of the interface so that the interface becomes blurred, and adjusting transparency can be understood as adjusting the transparency of the interface so that the interface becomes transparent.
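- One way to drive these optional transition effects is to map the slide progress between the touch-down point and the trigger threshold point onto the panel's alpha. The Java sketch below is illustrative; mapping progress linearly to View alpha is an assumption, not the application's stated implementation.

```java
import android.view.View;

// Illustrative transition helper: as the finger moves from the touch-down
// point toward the trigger threshold point, fade the currently shown panel out.
public final class PanelTransition {

    /** progress in [0, 1]: 0 at the touch-down point, 1 at the trigger threshold point. */
    static float progress(float startY, float currentY, float triggerY) {
        if (triggerY == startY) return 1f;
        float p = (currentY - startY) / (triggerY - startY);
        return Math.max(0f, Math.min(1f, p));
    }

    /** The outgoing panel is fully clear at the start and fully transparent at the threshold. */
    static void applyFade(View outgoingPanel, float progress) {
        outgoingPanel.setAlpha(1f - progress);
    }
}
```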
- the control center can be displayed, as shown in FIG. 10(c).
- the interface shown in (d) in FIG. 10 may be displayed on the mobile phone 100.
- next, the mobile phone 100 can return to the interface before the control center was displayed, as shown in (e) in FIG. 10.
- the user's finger may touch the top left side of the screen of the mobile phone 100.
- the user's finger slides down, and the trigger threshold point is not reached.
- the transparency of the control center can be gradually increased following the slide of the user's finger, as shown in Figure 11(b).
- the control center can also be blurred during the user's finger swipe.
- the notification center can be displayed, as shown in FIG. 11(c).
- if the user's finger leaves the screen, the interface shown in (d) in FIG. 11 may be displayed on the mobile phone 100.
- next, the mobile phone 100 can return to the interface before the notification center was displayed, as shown in (e) in FIG. 11.
- the user can call out the notification center first, then call out the control center, and then return directly from the interface of the control center to the interface before calling out the notification center, that is, the interface when the mobile phone 100 is in the on state.
- the user can first call out the control center, then call out the notification center, and then directly return from the interface of the notification center to the interface before calling out the control center, that is, the interface when the mobile phone 100 is in the on state.
- the case where the screen of the mobile phone 100 is on the display interface of the desktop is taken as an example for the respective introductions.
- first, the user's finger can touch the top left side of the screen of the mobile phone 100 and slide down to call out the notification center, as shown in (b) in FIG. 12.
- the user can touch the top right side of the screen of the mobile phone 100 with a finger, and slide down to call out the control center, as shown in (d) in FIG. 12 .
- the user's finger can touch the lower area of the screen of the mobile phone 100 and slide upward.
- when the user's finger slides upward and then leaves the screen of the mobile phone 100, the screen of the mobile phone 100 can return to the interface showing the desktop, that is, the interface shown in (a) in FIG. 12 is displayed.
- first, the user's finger can touch the top right side of the screen of the mobile phone 100 and slide down to call out the control center, as shown in (b) in FIG. 13.
- then, the user's finger can touch the top left side of the screen of the mobile phone 100 and slide down to call out the notification center, as shown in (d) in FIG. 13.
- the user's finger can touch the lower area of the screen of the mobile phone 100 and slide upward.
- when the user's finger slides upward and then leaves the screen of the mobile phone 100, the screen of the mobile phone 100 can return to the interface showing the desktop, that is, the interface shown in (a) in FIG. 13 is displayed.
- in this way, the notification center and the control center can be called out repeatedly and alternately by pulling down; but no matter which one is currently displayed on the mobile phone 100, when the mobile phone 100 receives an instruction to close the currently displayed interface (for example, the user's finger slides upward on the screen), the mobile phone 100 directly returns to the desktop, that is, returns to the interface before the pull-down.
- the notification center and the control center may be displayed in different windows, or may be displayed in the same window.
- closing the notification center can be understood as closing the window to which the notification center belongs
- closing the control center can be understood as closing the window to which the control center belongs
- opening the notification center can be understood as opening the window to which the notification center belongs
- opening the control center can be understood as opening the window to which the control center belongs.
- closing the notification center and opening the control center can be understood as replacing the content of the notification center with the content of the control center
- closing the control center and opening the notification center can be understood as replacing the content of the control center with the content of the notification center.
- opening the notification center can be understood as displaying the content of the notification center in the window.
- opening the control center can be understood as displaying the content of the control center in the window.
- both the notification center and the control center can be displayed in the status bar window.
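- A minimal sketch of this same-window arrangement is shown below: instead of closing one window and opening another, the content view hosted by the status bar window is swapped. The class and method names are hypothetical, assumed for illustration.

```java
import android.view.View;
import android.view.ViewGroup;

// Hypothetical sketch: both panels live in one window, so switching replaces
// the hosted content view rather than closing/opening windows.
public final class StatusBarWindowContent {
    private final ViewGroup statusBarWindowRoot; // root view of the status bar window
    private View currentPanel;

    public StatusBarWindowContent(ViewGroup statusBarWindowRoot) {
        this.statusBarWindowRoot = statusBarWindowRoot;
    }

    /** Replace the displayed panel (e.g. notification center) with another (e.g. control center). */
    public void showPanel(View panel) {
        if (currentPanel != null) statusBarWindowRoot.removeView(currentPanel);
        statusBarWindowRoot.addView(panel);
        currentPanel = panel;
    }
}
```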
- FIG. 14 shows a schematic diagram of a system architecture and a processing procedure of a mobile phone 100 .
- the system architecture of the mobile phone 100 may include a status bar window 1401, a panel container 1402, a panel mutual pull controller 1403, a notification center panel controller 1404, a control center panel controller 1405, a notification center panel 1406, and a control center panel 1407.
- the status bar window 1401 may receive the user's operation event detected by the touch screen on the mobile phone 100, wherein the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 may determine whether the user's operation event is a pull-down sliding event based on the position coordinates of the user's finger at different times in the operation event. In addition, after the status bar window 1401 determines that the user's operation event is a pull-down sliding operation event, it can determine whether there is currently a panel (such as a notification panel or a control panel) in an open state.
- if the status bar window 1401 determines that no panel is currently open, it sends the received operation event to the panel container 1402 so that the panel container 1402 processes the operation event; if the status bar window 1401 determines that a panel is currently open, it sends the received operation event to the panel mutual pull controller 1403 so that the panel mutual pull controller 1403 processes the operation event.
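- A minimal sketch of this dispatch logic follows; the names OperationEvent, EventHandler and StatusBarWindow are assumptions for illustration, and the pull-down detection itself is omitted:

```kotlin
// Sketch of the routing described above: once an operation event has been
// classified as a pull-down, it goes to the panel container (cf. 1402) when
// no panel is open, and to the panel mutual pull controller (cf. 1403) when
// a panel is already open. All names are hypothetical.
data class OperationEvent(val fingerPositions: List<Pair<Float, Float>>)

interface EventHandler { fun handle(event: OperationEvent) }

class StatusBarWindow(
    private val panelContainer: EventHandler,
    private val mutualPullController: EventHandler,
    private val isAnyPanelOpen: () -> Boolean  // backed by the panel record information
) {
    fun onPullDown(event: OperationEvent) {
        if (isAnyPanelOpen()) mutualPullController.handle(event)
        else panelContainer.handle(event)
    }
}
```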
- the panel container 1402 can determine the touch-down point of the user's finger based on the position coordinates of the user's finger in the operation event, and then determine, based on the touch-down point, whether the user's current operation purpose is to open the notification center or the control center. When the panel container 1402 determines that the user's current operation purpose is to open the notification center, the panel container 1402 may send the operation event to the notification center panel controller 1404. When the panel container 1402 determines that the user's current operation purpose is to open the control center, the panel container 1402 may send the operation event to the control center panel controller 1405.
- the panel mutual pull controller 1403 can determine whether the user's current operation purpose is to close the notification center and open the control center, or to close the control center and open the notification center. When the panel mutual pull controller 1403 determines that the user's current operation purpose is to close the control center and open the notification center, the panel mutual pull controller 1403 may send the operation event to the notification center panel controller 1404. When the panel mutual pull controller 1403 determines that the user's current operation purpose is to close the notification center and open the control center, the panel mutual pull controller 1403 may send the operation event to the control center panel controller 1405.
- the panel mutual pull controller 1403 may, based on the operation event, blur the interface displayed before the target panel is opened, or adjust the transparency of the content displayed before the target panel is opened.
- the panel mutual pull controller 1403 can obtain the operation event from the target controller corresponding to the user's current operation purpose; that is to say, the target controller can send the operation event to the panel mutual pull controller 1403.
- the notification center controller 1404 can open the notification center panel 1406 according to the operation event.
- the notification center controller 1404 can send the operation event to the panel mutual pull controller 1403, so that the panel mutual pull controller 1403 may blur the interface displayed before the notification center panel is opened, or adjust the transparency of the content displayed before the notification center panel is opened.
- the control center controller 1405 can open the control center panel 1407 according to the operation event.
- the control center controller 1405 can send the operation event to the panel mutual pull controller 1403, so that the panel mutual pull controller 1403 can blur the interface displayed before the control center panel is opened, or adjust the transparency of the content displayed before the control center panel is opened.
- FIG. 15 shows a schematic flowchart of calling out the notification center and/or the control center. As shown in FIG. 15, the following steps are included:
- Step 1501 the status bar window 1401 responds to the received operation event, and determines that the operation event is a pull-down event.
- the status bar window 1401 may receive an operation event sent by the touch screen of the mobile phone 100, and the operation event may include the position coordinates of the user's finger. After receiving the operation event, the status bar window 1401 can determine whether the user's operation event is a pull-down event based on the position coordinates of the user's finger at different times in the operation event.
- Step 1502 the status bar window 1401 determines whether any panel is currently open.
- the status bar window 1401 can determine whether a panel is currently open from the panel record information.
- the panel record information may include opening records and/or closing records of the notification center panel and the control center panel. For example, when the panel record information records that both the notification center and the control center are closed, it can be determined that no panel is currently open; when the panel record information records that the notification center is closed and the control center is open, it can be determined that a panel is currently open. If the status bar window 1401 determines that no panel is open, execute step 1503; otherwise, execute step 1508.
- Step 1503 the panel container 1402 responds to the operation event sent by the status bar window 1401, and determines the position of the touch-down point in the operation event.
- the panel container 1402 can determine the area where the touch-down point of the user's finger is located based on the coordinate positions of the user's finger in the operation event, and can then determine, according to the preset correspondence between the area where the touch-down point is located and the notification center and the control center, whether the user's current operation purpose is to open the notification center or the control center.
- the preset correspondence between the area where the touch-down point is located and the notification center and the control center can be: when the touch-down point is in area z1, it corresponds to calling out the notification center; when the touch-down point is in area z2, it corresponds to calling out the control center. Then, when the panel container determines that the touch-down point is located in area z1, it can determine that the user's current operation purpose is to open the notification center.
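- A minimal sketch of this mapping follows, assuming (as one possible layout) that area z1 is the left half of the top edge and area z2 the right half; the half-screen boundary, the top-area height, and the names are illustrative assumptions:

```kotlin
// Maps the touch-down point of a pull-down gesture to the panel it should
// call out: top-left -> notification center (area z1), top-right -> control
// center (area z2), anywhere else -> no panel.
enum class Target { NOTIFICATION_CENTER, CONTROL_CENTER, NONE }

fun targetForTouchDown(x: Float, y: Float, screenWidth: Float, topAreaHeight: Float): Target =
    when {
        y > topAreaHeight -> Target.NONE                   // touch-down not in the top area
        x < screenWidth / 2 -> Target.NOTIFICATION_CENTER  // area z1: top left
        else -> Target.CONTROL_CENTER                      // area z2: top right
    }
```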
- if the location of the touch-down point corresponds to calling out the notification center, execute step 1504; if the location of the touch-down point corresponds to calling out the control center, execute step 1506.
- Step 1504 the notification center controller 1404 responds to the operation event sent by the panel container 1402 and determines whether the trigger threshold point is reached, and the panel mutual pull controller 1403 blurs the background in response to the operation event sent by the notification center controller 1404. If it is determined that the trigger threshold point is reached, step 1505 is performed; otherwise, the determination continues.
- the notification center controller 1404 may determine whether the finger has slid to a preset trigger threshold point according to the coordinate positions of the user's finger during the sliding process in the operation event.
- for example, if the position coordinates of the touch-down point are (2, 1) and the current position coordinates of the user's finger are (2, 5), the sliding distance is 4. If the preset trigger threshold point is a point at a distance of 4 from the touch-down point, the user's finger has slid to the trigger threshold point; likewise, if the coordinates of the preset trigger threshold point are (k, 5), where k ≥ 0, the current coordinates of the user's finger are exactly at the trigger threshold point.
- if the notification center controller 1404 determines, according to the coordinate positions during the sliding process, that the user's finger has left the screen before sliding to the trigger threshold point, the notification center controller 1404 can also calculate the speed of the user's finger at the lift-off point according to the operation event. When the speed at the lift-off point is greater than the preset speed threshold, it can also be considered that the trigger threshold point has been reached.
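- The two conditions can be sketched together as follows; the threshold values, units and names are assumptions, not taken from the embodiment:

```kotlin
import kotlin.math.hypot

// Sketch of the trigger test: the threshold counts as reached either when the
// finger has slid far enough from the touch-down point, or when the finger
// lifts off early but with a speed above the preset speed threshold (a fling).
fun triggerThresholdReached(
    touchDown: Pair<Float, Float>,
    current: Pair<Float, Float>,
    liftOffSpeed: Float?,            // non-null once the finger has left the screen
    distanceThreshold: Float = 4f,   // e.g. touch-down (2,1) -> current (2,5) gives 4
    speedThreshold: Float = 1000f    // assumed units, e.g. pixels per second
): Boolean {
    val slid = hypot(current.first - touchDown.first, current.second - touchDown.second)
    return slid >= distanceThreshold || (liftOffSpeed != null && liftOffSpeed > speedThreshold)
}
```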
- the notification center controller 1404 may send information about whether the trigger threshold point is reached to the panel mutual pull controller 1403 , and send an operation event to the panel mutual pull controller 1403 .
- the panel mutual pull controller 1403 can, based on the operation event, blur the background (such as the interface displayed before the notification center is called out), for example by reducing the clarity of the currently displayed interface, so as to enhance the call-out effect of the notification center panel.
- the panel mutual pull controller 1403 can adjust the blurring degree of the background based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event; for example, when the distance between the current position coordinates of the user's finger and the trigger threshold point is relatively large, the degree of blurring is small, and when that distance is relatively small, the degree of blurring is large.
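- A minimal sketch of this distance-dependent blurring follows, assuming a linear mapping onto a 0..1 blur degree (the mapping and the range are assumptions):

```kotlin
// Far from the trigger threshold point -> small blur degree; at the threshold
// point -> full blur. totalDistance is the distance from the touch-down point
// to the trigger threshold point.
fun blurDegree(distanceToThreshold: Float, totalDistance: Float): Float {
    if (totalDistance <= 0f) return 1f
    val remaining = (distanceToThreshold / totalDistance).coerceIn(0f, 1f)
    return 1f - remaining  // 0 = no blur at touch-down, 1 = full blur at the threshold
}
```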
- Step 1505 the notification center controller 1404 opens the notification center panel 1406 .
- Step 1506 the control center controller 1405 responds to the operation event sent by the panel container 1402 and determines whether the trigger threshold point is reached, and the panel mutual pull controller 1403 blurs the background in response to the operation event sent by the control center controller 1405. If it is determined that the trigger threshold point is reached, step 1507 is performed; otherwise, the determination continues.
- the control center controller 1405 can determine whether the finger has slid to a preset trigger threshold point according to the coordinate positions of the user's finger during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event. At the same time, the control center controller 1405 can send information on whether the trigger threshold point is reached to the panel mutual pull controller 1403, and send the operation event to the panel mutual pull controller 1403.
- the panel mutual pull controller 1403 may, based on the operation event, blur the background (such as the interface displayed before the control center is called out), for example by reducing the clarity of the currently displayed interface, so as to enhance the call-out effect of the control center panel.
- Step 1507 the control center controller 1405 opens the control center panel 1407 .
- Step 1508 the panel mutual pull controller 1403 responds to the operation event sent by the status bar window 1401, and determines the position of the touch-down point in the operation event.
- if the location of the touch-down point corresponds to calling out the notification center, execute step 1509; if the location of the touch-down point corresponds to calling out the control center, execute step 1511.
- Step 1509 the notification center controller 1404 responds to the operation event sent by the panel mutual pull controller 1403 and determines whether the trigger threshold point is reached, and the panel mutual pull controller 1403 adjusts the transparency of the control center panel 1407 based on the operation event. If it is determined that the trigger threshold point is reached, execute step 1510; otherwise, continue the determination.
- the notification center controller 1404 can determine whether the finger has slid to a preset trigger threshold point according to the coordinate positions of the user's finger during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event.
- the panel mutual pull controller 1403 can adjust the transparency of the control center panel 1407 based on the operation event, so as to enhance the call-out effect of the notification center panel.
- the panel mutual pull controller 1403 can adjust the transparency of the control center panel 1407 based on the distance between the current position coordinates of the user's finger and the trigger threshold point in the operation event; for example, when the distance between the current position coordinates of the user's finger and the trigger threshold point is relatively large, the transparency of the control center panel 1407 is small; when that distance is relatively small, the transparency of the control center panel 1407 is large.
- the notification center controller 1404 may also feed back the information that the trigger threshold point has been reached to the panel mutual pull controller 1403, so that the panel mutual pull controller 1403 instructs the control center controller 1405 to close the control center panel 1407 when it is determined that the trigger threshold point has been reached.
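- A minimal sketch of this mutual-pull fade and handoff (steps 1509 and 1510) follows, assuming a linear alpha mapping and hypothetical callbacks for closing and opening the panels:

```kotlin
// The outgoing panel (here the control center panel 1407) grows more
// transparent as the finger nears the trigger threshold point; once the
// threshold is reached, the outgoing panel is closed and the incoming
// panel opened. Mapping and names are illustrative assumptions.
fun outgoingPanelAlpha(distanceToThreshold: Float, totalDistance: Float): Float {
    if (totalDistance <= 0f) return 0f
    // Far from the threshold -> alpha near 1 (low transparency);
    // at the threshold -> alpha 0 (fully transparent).
    return (distanceToThreshold / totalDistance).coerceIn(0f, 1f)
}

fun onMutualPull(
    distanceToThreshold: Float,
    totalDistance: Float,
    thresholdReached: Boolean,
    closeOutgoingPanel: () -> Unit,
    openIncomingPanel: () -> Unit
): Float {
    if (thresholdReached) { closeOutgoingPanel(); openIncomingPanel() }
    return outgoingPanelAlpha(distanceToThreshold, totalDistance)  // alpha to apply to the outgoing panel
}
```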
- Step 1510 the notification center controller 1404 opens the notification center panel 1406 .
- Step 1511 the control center controller 1405 responds to the operation event sent by the panel mutual pull controller 1403 and determines whether the trigger threshold point is reached, and the panel mutual pull controller 1403 adjusts the transparency of the notification center panel 1406 based on the operation event. If it is determined that the trigger threshold point is reached, execute step 1512; otherwise, continue the determination.
- the control center controller 1405 can determine whether the finger has slid to a preset trigger threshold point according to the coordinate positions of the user's finger during the sliding process, the sliding speed during the sliding process, or the speed at the lift-off point in the operation event.
- the panel mutual pull controller 1403 can adjust the transparency of the notification center panel 1406 based on the operation event, so as to enhance the call-out effect of the control center panel.
- the control center controller 1405 may also feed back the information that the trigger threshold point has been reached to the panel mutual pull controller 1403, so that the panel mutual pull controller 1403 instructs the notification center controller 1404 to close the notification center panel 1406 when it is determined that the trigger threshold point has been reached.
- Step 1512 the control center controller 1405 opens the control center panel 1407 .
- FIG. 16 is a schematic structural diagram of a chip provided by an embodiment of the present application.
- a chip 1600 includes one or more processors 1601 and an interface circuit 1602 .
- the chip 1600 may also include a bus 1603, wherein:
- the processor 1601 may be an integrated circuit chip with signal processing capability. In the implementation process, the control process involved in the above solutions may be completed by an integrated logic circuit of hardware in the processor 1601 or instructions in the form of software.
- the interface circuit 1602 can be used for sending or receiving data, instructions or information.
- the processor 1601 can process the data, instructions or other information received through the interface circuit 1602, and can send the processing result through the interface circuit 1602.
- the chip further includes a memory, which may include a read-only memory and a random access memory, and provides operation instructions and data to the processor.
- a portion of the memory may also include non-volatile random access memory (NVRAM).
- the memory stores executable software modules or data structures, and the processor can execute corresponding operations by calling operation instructions stored in the memory (the operation instructions can be stored in the operating system).
- the interface circuit 1602 may be used to output an execution result of the processor 1601 .
- the corresponding functions of the processor 1601 and the interface circuit 1602 can be realized by hardware design, software design, or a combination of software and hardware, which is not limited here.
- the chip may be applied to the electronic device 100 shown in FIG. 2 to implement the method provided in the embodiment of the present application.
- the processor in the embodiments of the present application may be a central processing unit (central processing unit, CPU), or another general-purpose processor, a digital signal processor (digital signal processor, DSP), an application-specific integrated circuit (application specific integrated circuit, ASIC), a field programmable gate array (field programmable gate array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof.
- a general-purpose processor can be a microprocessor, or any conventional processor.
- the method steps in the embodiments of the present application may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
- the software instructions can be composed of corresponding software modules, and the software modules can be stored in random access memory (random access memory, RAM), flash memory, read-only memory (read-only memory, ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
- An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
- the storage medium may also be a component of the processor.
- the processor and storage medium can be located in the ASIC.
- all or part of the foregoing embodiments may be implemented by software, hardware, firmware or any combination thereof.
- when implemented using software, they may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, the processes or functions according to the embodiments of the present application will be generated in whole or in part.
- the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
- the computer instructions may be stored in or transmitted via a computer-readable storage medium.
- the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (such as infrared, radio, or microwave) manner.
- the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
- the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
Claims (21)
- A control method, characterized in that it is applied to an electronic device having a touch screen, the method comprising: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, wherein the first operation is an operation in which a starting position at which a touch body contacts the touch screen is located in a first area of the touch screen and the touch body slides on the touch screen in a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, wherein the second operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a second area of the touch screen and the touch body slides on the touch screen in a second direction; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the third interface to the first interface, wherein the third operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a third area of the touch screen and the touch body slides on the touch screen in a third direction; wherein the second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of a control center and the third interface is a display interface of a notification center.
- A control method, characterized in that it is applied to an electronic device having a touch screen, the method comprising: displaying a first interface on the touch screen; in response to the touch screen receiving a first operation, switching the first interface to a second interface, wherein the first operation is an operation in which a starting position at which a touch body contacts the touch screen is located in a first area of the touch screen and the touch body slides on the touch screen in a first direction; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, switching the second interface to a third interface, wherein the second operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a second area of the touch screen and the touch body slides on the touch screen in a second direction; after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving the first operation again, switching the third interface to the second interface; wherein the second interface is a display interface of a notification center and the third interface is a display interface of a control center; or the second interface is a display interface of a control center and the third interface is a display interface of a notification center.
- The method according to claim 2, characterized in that after the switching of the third interface to the second interface, the method further comprises: after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a third operation, switching the second interface to the first interface, wherein the third operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a third area of the touch screen and the touch body slides on the touch screen in a third direction.
- The method according to any one of claims 1 to 3, characterized in that the first interface comprises a display interface of a desktop on the electronic device, or the first interface comprises a display interface of an application on the electronic device.
- The method according to any one of claims 1 to 4, characterized in that the second interface and the third interface are both displayed in a first window.
- The method according to claim 5, characterized in that the first window is a status bar window.
- The method according to any one of claims 1 to 6, characterized in that the first interface and the second interface are displayed in different windows.
- The method according to any one of claims 1 to 7, characterized in that the first area is located on a first side of the top of the touch screen, and the first direction is a direction from the top of the touch screen toward the bottom of the touch screen; the second area is located on a second side of the top of the touch screen, and the second direction is the same as the first direction; and the third area is an area on the touch screen other than the first area and the second area, and the third direction is opposite to the first direction.
- The method according to any one of claims 1 to 8, characterized in that the notification center is an entry on the electronic device for managing pushes from applications on the electronic device or for displaying resident status information, and the control center is an entry on the electronic device for controlling a state of the electronic device.
- The method according to any one of claims 1 to 9, characterized in that before a first target interface is switched to a second target interface, the method further comprises: determining that an operation of the touch body on the touch screen reaches a trigger condition, wherein the trigger condition is a condition for triggering interface switching; and wherein the first target interface is the first interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the third interface; or the first target interface is the third interface and the second target interface is the first interface; or the first target interface is the third interface and the second target interface is the second interface; or the first target interface is the second interface and the second target interface is the first interface.
- The method according to claim 10, characterized in that the condition for triggering interface switching specifically comprises: a distance between a position at which the touch body contacts the touch screen at a current moment and the starting position is greater than or equal to a preset distance threshold.
- The method according to claim 10, characterized in that the condition for triggering interface switching specifically comprises: a position at which the touch body contacts the touch screen at a current moment reaches a preset position on the touch screen.
- The method according to claim 10, characterized in that the condition for triggering interface switching specifically comprises: a distance between a position at which the touch body leaves the touch screen and the starting position is less than a preset distance threshold, and a speed of the touch body when it leaves the touch screen is greater than or equal to a preset speed threshold.
- The method according to any one of claims 10 to 13, characterized in that in a process of switching the first target interface to the second target interface, the method further comprises: increasing a transparency of the first target interface, or reducing a clarity of the first target interface.
- The method according to any one of claims 1 to 14, characterized in that the switching of the first interface to the second interface comprises: overlaying the second interface on the first interface; or the switching of the first interface to the second interface comprises: blurring the first interface, and then overlaying the second interface on the blurred first interface; or the switching of the second interface to the third interface comprises: closing the second interface and opening the third interface; or the switching of the second interface to the third interface comprises: closing the second interface and opening the third interface, wherein the third interface is overlaid on the first interface; or the switching of the third interface to the first interface comprises: closing the third interface overlaid on the first interface to present the first interface; or the switching of the third interface to the second interface comprises: closing the third interface and opening the second interface; or the switching of the third interface to the second interface comprises: closing the third interface and opening the second interface, wherein the second interface is overlaid on the first interface; or the switching of the second interface to the first interface comprises: closing the second interface overlaid on the first interface to present the first interface.
- A control method, characterized in that it is applied to an electronic device having a touch screen, the method comprising: displaying a first interface on the touch screen, wherein the first interface comprises a display interface of a desktop on the electronic device, or the first interface comprises a display interface of an application on the electronic device; in response to the touch screen receiving a first operation, overlaying a second interface on the first interface, wherein the first operation is an operation in which a starting position at which a touch body contacts the touch screen is located in a first area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, and the second interface comprises a display interface of a notification center or a display interface of a control center; after the touch body completes the first operation and leaves the touch screen, in response to the touch screen receiving a second operation, closing the second interface and opening a third interface, wherein the opened third interface is overlaid on the first interface, the second operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a second area at the top of the touch screen and the touch body slides toward the bottom of the touch screen, the third interface comprises a display interface of a notification center or a display interface of a control center, and the third interface is different from the second interface; and after the touch body completes the second operation and leaves the touch screen, in response to the touch screen receiving a third operation, closing the third interface and presenting the first interface, wherein the third operation is an operation in which a starting position at which the touch body re-contacts the touch screen is located in a third area outside the top of the touch screen and the touch body slides toward the top of the touch screen.
- The method according to claim 16, characterized in that before the second interface is overlaid on the first interface, the method further comprises: reducing a clarity of the first interface.
- The method according to claim 16 or 17, characterized in that before the second interface is closed, the method further comprises: increasing a transparency of the second interface.
- An electronic device, characterized by comprising: a touch screen; one or more processors; and a memory, wherein the memory stores one or more computer programs, the one or more computer programs comprising instructions which, when executed by the one or more processors, cause the electronic device to perform the method according to any one of claims 1 to 18.
- A computer-readable storage medium, storing a computer program which, when run on an electronic device, causes the electronic device to perform the method according to any one of claims 1 to 18.
- A computer program product, characterized in that, when the computer program product is run on an electronic device, the electronic device is caused to perform the method according to any one of claims 1 to 18.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22814835.9A EP4332744A1 (en) | 2021-06-01 | 2022-03-30 | Control method and electronic device |
BR112023023988A BR112023023988A2 (pt) | 2021-06-01 | 2022-03-30 | Método de controle, dispositivo eletrônico, mídia de armazenamento legível por computador e produto de programa de computador |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110611045.2 | 2021-06-01 | ||
CN202110611045.2A CN115495002A (zh) | 2021-06-01 | 2021-06-01 | 一种控制方法及电子设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022252788A1 true WO2022252788A1 (zh) | 2022-12-08 |
Family
ID=84322737
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/084089 WO2022252788A1 (zh) | 2021-06-01 | 2022-03-30 | 一种控制方法及电子设备 |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP4332744A1 (zh) |
CN (1) | CN115495002A (zh) |
BR (1) | BR112023023988A2 (zh) |
WO (1) | WO2022252788A1 (zh) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8473871B1 (en) * | 2012-10-16 | 2013-06-25 | Google Inc. | Multiple seesawing panels |
US20140365945A1 (en) * | 2013-06-09 | 2014-12-11 | Apple Inc. | Device, method, and graphical user interface for providing navigation and search functionalities |
US20150082257A1 (en) * | 2013-09-17 | 2015-03-19 | Lg Electronics Inc. | Mobile terminal and control method thereof |
CN106406728A (zh) * | 2016-08-31 | 2017-02-15 | 瓦戈科技(上海)有限公司 | 移动终端桌面手势的操作方法 |
CN107632757A (zh) * | 2017-08-02 | 2018-01-26 | 努比亚技术有限公司 | 一种终端控制方法、终端及计算机可读存储介质 |
CN108255404A (zh) * | 2018-01-19 | 2018-07-06 | 广东欧珀移动通信有限公司 | 用户界面显示方法、装置及终端 |
CN108563388A (zh) * | 2018-02-27 | 2018-09-21 | 努比亚技术有限公司 | 屏幕操作方法、移动终端及计算机可读存储介质 |
US20190369861A1 (en) * | 2018-06-01 | 2019-12-05 | Apple Inc. | Accessing system user interfaces on an electronic device |
US20200363937A1 (en) * | 2018-01-19 | 2020-11-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | User interface display method, device, and apparatus |
- 2021
  - 2021-06-01 CN CN202110611045.2A patent/CN115495002A/zh active Pending
- 2022
  - 2022-03-30 EP EP22814835.9A patent/EP4332744A1/en active Pending
  - 2022-03-30 WO PCT/CN2022/084089 patent/WO2022252788A1/zh active Application Filing
  - 2022-03-30 BR BR112023023988A patent/BR112023023988A2/pt unknown
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8473871B1 (en) * | 2012-10-16 | 2013-06-25 | Google Inc. | Multiple seesawing panels |
US20140365945A1 (en) * | 2013-06-09 | 2014-12-11 | Apple Inc. | Device, method, and graphical user interface for providing navigation and search functionalities |
US20150082257A1 (en) * | 2013-09-17 | 2015-03-19 | Lg Electronics Inc. | Mobile terminal and control method thereof |
CN106406728A (zh) * | 2016-08-31 | 2017-02-15 | 瓦戈科技(上海)有限公司 | 移动终端桌面手势的操作方法 |
CN107632757A (zh) * | 2017-08-02 | 2018-01-26 | 努比亚技术有限公司 | 一种终端控制方法、终端及计算机可读存储介质 |
CN108255404A (zh) * | 2018-01-19 | 2018-07-06 | 广东欧珀移动通信有限公司 | 用户界面显示方法、装置及终端 |
US20200363937A1 (en) * | 2018-01-19 | 2020-11-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | User interface display method, device, and apparatus |
CN108563388A (zh) * | 2018-02-27 | 2018-09-21 | 努比亚技术有限公司 | 屏幕操作方法、移动终端及计算机可读存储介质 |
US20190369861A1 (en) * | 2018-06-01 | 2019-12-05 | Apple Inc. | Accessing system user interfaces on an electronic device |
CN110554828A (zh) * | 2018-06-01 | 2019-12-10 | 苹果公司 | 访问电子设备上的系统用户界面 |
Also Published As
Publication number | Publication date |
---|---|
EP4332744A1 (en) | 2024-03-06 |
BR112023023988A2 (pt) | 2024-01-30 |
CN115495002A (zh) | 2022-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11709560B2 (en) | Device, method, and graphical user interface for navigating through a user interface using a dynamic object selection indicator | |
DK180317B1 (en) | Systems, methods, and user interfaces for interacting with multiple application windows | |
US20230244317A1 (en) | Proxy Gesture Recognizer | |
US20230409165A1 (en) | User interfaces for widgets | |
US20240152223A1 (en) | Systems and Methods for Interacting with Multiple Applications that are Simultaneously Displayed on an Electronic Device with a Touch-Sensitive Display | |
CA2909730C (en) | Event recognition by device with touch-sensitive display using multiple gesture recognizers | |
- JP5859508B2 (ja) | Device, method, and graphical user interface with interactive pop-up views | |
US8826164B2 (en) | Device, method, and graphical user interface for creating a new folder | |
AU2010339633B2 (en) | Apparatus and method having multiple application display modes including mode with display resolution of another apparatus | |
US8839122B2 (en) | Device, method, and graphical user interface for navigation of multiple applications | |
US20160253086A1 (en) | Device, method, and graphical user interface for managing multiple display windows | |
EP3594793A1 (en) | Device, method, and graphical user interface for managing folders | |
US20110163967A1 (en) | Device, Method, and Graphical User Interface for Changing Pages in an Electronic Document | |
US11829591B2 (en) | User interface for managing input techniques | |
AU2018269159A1 (en) | Systems and methods for interacting with multiple applications that are simultaneously displayed on an electronic device with a touch-sensitive display | |
- WO2022028310A1 (zh) | Method for adding annotation, electronic device and related apparatus | |
- WO2022252788A1 (zh) | Control method and electronic device | |
- CN114461312B (zh) | Display method, electronic device and storage medium | |
- KR102619538B1 (ko) | Multi-window operation method and electronic device supporting the same | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22814835 Country of ref document: EP Kind code of ref document: A1 |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112023023988 Country of ref document: BR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022814835 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18565936 Country of ref document: US Ref document number: 2023574265 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 2022814835 Country of ref document: EP Effective date: 20231128 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 112023023988 Country of ref document: BR Kind code of ref document: A2 Effective date: 20231116 |