CN116991274B - Slide-up animation exception handling method and electronic device


Info

Publication number: CN116991274B
Authority: CN (China)
Prior art keywords: electronic device, sliding, touch event, event, touch
Legal status: Active (granted)
Application number: CN202311265723.XA
Other languages: Chinese (zh)
Other versions: CN116991274A
Inventor: 徐超利 (Xu Chaoli)
Assignee (original and current): Honor Device Co Ltd
Application CN202311265723.XA filed by Honor Device Co Ltd
Publication of CN116991274A (application); application granted; publication of CN116991274B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448: User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454: User interfaces adapting the functionality of the device according to context-related or environment-related conditions


Abstract

The application provides a slide-up animation (up-slide effect) exception handling method and an electronic device. In the method, the electronic device receives a first slide-up operation on a first display interface, where the first display interface is an application page; in response to the first slide-up operation, the electronic device displays the animation of entering the multitask mode; and while this animation is being displayed, the electronic device triggers a preset processing flow in response to a second slide-up operation by the user. In the preset processing flow, the value of the current-animation-running flag is modified before the action of clearing all states is executed. Because the flag has been modified, executing the clear-all-states action no longer triggers cancellation of the current animation. This prevents the display interface of the electronic device from falling back to the application page because the enter-multitask animation was canceled, effectively resolves the slide-up animation anomaly, and lets the electronic device display the desktop smoothly after the preset processing flow completes.

Description

Slide-up animation exception handling method and electronic device
Technical Field
The present application relates to the field of terminal technologies, and in particular to a slide-up animation exception handling method and an electronic device.
Background
When a terminal displays an application page, the user can slide up from the bottom edge of the screen and pause; in response, the terminal displays the animation (dynamic effect) of entering the multitask management page. Normally, if the user slides up from the bottom edge of the screen again while this animation is playing, the terminal leaves the multitask page and enters the desktop. In practice, however, a second slide-up during the enter-multitask animation can hit the processing logic that cancels the current animation: the multitask animation is canceled, the terminal falls back to the application page instead of entering the desktop, the slide-up animation behaves abnormally, and the user experience suffers.
Disclosure of Invention
To solve the above technical problem, the present application provides a slide-up animation exception handling method and an electronic device. In this method, while the enter-multitask animation is being displayed, the electronic device does not trigger cancellation of the current animation when handling the user's second slide-up operation, which effectively prevents the abnormal scenario in which the electronic device returns to the application page instead of displaying the desktop on the second slide-up.
In a first aspect, the present application provides a slide-up animation exception handling method. An electronic device receives a first slide-up operation on a first display interface, where the first display interface is an application page; in response to the first slide-up operation, the electronic device displays the animation of entering the multitask mode; and while this animation is being displayed, the electronic device triggers a preset processing flow in response to a second slide-up operation by the user. In the preset processing flow, the value of the current-animation-running flag is modified before the action of clearing all states is executed. Because the flag has been modified, executing the clear-all-states action does not trigger cancellation of the current animation. This prevents the display interface of the electronic device from falling back to the application page because the enter-multitask animation was canceled, effectively resolves the slide-up animation anomaly, improves the user experience, and lets the electronic device display the desktop smoothly after the preset processing flow completes.
According to the first aspect, modifying the value of the current-animation-running flag before executing the action of clearing all states includes: modifying the value of the flag before executing the action of clearing the listening state.
In this embodiment of the application, the action of clearing all states clears several states, but only clearing the listening state triggers cancellation of the current animation. It is therefore sufficient to modify the value of the current-animation-running flag before the listening state is cleared: once the flag has been modified, clearing the listening state no longer triggers cancellation of the current animation, which precisely prevents the display interface from falling back to the application page because the enter-multitask animation was canceled, and improves the user experience.
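To make the ordering concrete, the following Java sketch shows a flag-guarded clear-state routine. It is a minimal illustration under assumed names: SwipeSharedState, ANIM_RUNNING, ANIM_PROTECTED, and the method names are invented stand-ins, since the patent text does not publish source code.

    // Minimal sketch (hypothetical names) of the flag-guarded state clearing:
    // the flag is modified BEFORE the listening state is cleared, so clearing
    // it no longer cancels the running animation.
    public class SwipeSharedState {
        public static final int ANIM_RUNNING = 1;   // "second value": animation in progress
        public static final int ANIM_PROTECTED = 0; // "first value": do not cancel on clear

        private int animRunningFlag = ANIM_PROTECTED;
        private boolean listening;

        /** Called on a move event of the first slide-up (see the later sketches). */
        public void markAnimationRunning() {
            animRunningFlag = ANIM_RUNNING;
            listening = true;
        }

        /** Preset-flow entry, triggered by the second slide-up. */
        public void onClearStateInstruction() {
            animRunningFlag = ANIM_PROTECTED; // key step: modify the flag first
            clearAllStates();
        }

        private void clearAllStates() {
            clearListeningState();
            // ...clearing of the other states omitted; none of them can
            // cancel the animation...
        }

        private void clearListeningState() {
            listening = false;
            // Only this action could cancel the enter-multitask animation.
            if (animRunningFlag == ANIM_RUNNING) {
                cancelCurrentAnimation(); // not reached in the preset flow
            }
        }

        private void cancelCurrentAnimation() {
            // Would roll the UI back to the application page (the original bug).
        }
    }

With this ordering, the second slide-up clears the listening state without reaching cancelCurrentAnimation(), so the desktop can be displayed once the flow completes.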
According to the first aspect or any implementation of the first aspect, modifying the value of the current-animation-running flag includes: modifying the value of the flag to a first value.
In this embodiment of the application, when the current-animation-running flag holds the first value, executing the action of clearing all states (or clearing the listening state) does not trigger cancellation of the current animation, so the display interface cannot fall back to the application page because the enter-multitask animation was canceled, and the user experience is improved.
According to the first aspect or any implementation of the first aspect, displaying the enter-multitask animation in response to the first slide-up operation includes: the input management module converts the received first slide-up operation into a first touch event; the input management module dispatches the first touch event to the touch interaction management module; when the touch interaction management module determines that the first touch event was triggered in the gesture navigation hot zone, it sends the event to the corresponding gesture processing module; and the gesture processing module, based on the first touch event, triggers the view system to display the enter-multitask animation.
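The dispatch chain can be pictured with a small Java sketch. All names and the hot-zone geometry are assumptions made for illustration, not the actual framework API; the GestureHandler type is sketched after the next implementation point.

    import android.view.MotionEvent;

    // Hypothetical sketch of the routing step: touch events that start inside
    // the gesture navigation hot zone are forwarded to the gesture handler.
    public final class TouchInteractionManager {
        private static final float HOT_ZONE_HEIGHT_PX = 64f; // assumed value

        private final GestureHandler gestureHandler;
        private final float screenHeightPx;

        public TouchInteractionManager(GestureHandler handler, float screenHeightPx) {
            this.gestureHandler = handler;
            this.screenHeightPx = screenHeightPx;
        }

        /** Entry point: the input management module delivers converted touch events here. */
        public void onInputEvent(MotionEvent event) {
            if (isInGestureNavHotZone(event.getRawY())) {
                gestureHandler.onTouchEvent(event);
            }
            // Events outside the hot zone go to the normal view hierarchy (omitted).
        }

        private boolean isInGestureNavHotZone(float y) {
            // Illustration only: a strip along the bottom edge of the screen.
            return y >= screenHeightPx - HOT_ZONE_HEIGHT_PX;
        }
    }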
According to the first aspect or any implementation of the first aspect, the first touch event is any one of a down event, a move event, and an up event, and triggering the view system to display the enter-multitask animation based on the first touch event includes: when the gesture processing module determines that the first touch event is an up event, it triggers the view system to display the enter-multitask animation.
In this embodiment of the application, when the first touch event is an up event, the user has lifted the finger, that is, the first slide-up operation is complete and will generate no further events, so the subsequent animation needs to be displayed; the view system can therefore be triggered to display the enter-multitask animation.
According to the first aspect or any implementation of the first aspect, before the gesture processing module triggers the view system to display the enter-multitask animation based on the first touch event, the method further includes: when the first touch event is a move event, the gesture processing module modifies the value of the current-animation-running flag to a second value.
In this embodiment of the application, a move event as the first touch event means that the gesture processing module may trigger the finger-following animation, so the value of the current-animation-running flag is modified to the second value to record that an animation is currently being displayed on the electronic device and to avoid animation conflicts.
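Putting the two refinements together, a gesture handler might classify the three event types as below. This reuses the hypothetical SwipeSharedState sketched earlier; ViewSystem is likewise an invented placeholder, not a real framework class.

    import android.view.MotionEvent;

    // Hypothetical gesture handler: a move event marks the animation as running
    // (the second value); an up event hands off to the view system to play the
    // enter-multitask animation.
    public final class GestureHandler {
        /** Placeholder for the view-system dependency. */
        public interface ViewSystem {
            void startEnterMultitaskAnimation();
        }

        private final SwipeSharedState sharedState;
        private final ViewSystem viewSystem;

        public GestureHandler(SwipeSharedState sharedState, ViewSystem viewSystem) {
            this.sharedState = sharedState;
            this.viewSystem = viewSystem;
        }

        public void onTouchEvent(MotionEvent event) {
            switch (event.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    // Gesture starts; nothing to animate yet.
                    break;
                case MotionEvent.ACTION_MOVE:
                    // The finger-following animation may start here, so record
                    // that an animation is running to avoid animation conflicts.
                    sharedState.markAnimationRunning();
                    break;
                case MotionEvent.ACTION_UP:
                    // Finger lifted: the first slide-up is complete and produces
                    // no further events, so play the enter-multitask animation.
                    viewSystem.startEnterMultitaskAnimation();
                    break;
                default:
                    break;
            }
        }
    }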
According to the first aspect or any implementation of the first aspect, triggering the preset processing flow in response to the second slide-up operation includes: the input management module converts the second slide-up operation into a second touch event; the input management module dispatches the second touch event to the touch interaction management module; when the touch interaction management module determines that the second touch event was triggered in the gesture navigation hot zone, it sends the event to the corresponding gesture processing module; the gesture processing module sends a clear-state instruction to the swipe shared state module based on the second touch event; and the swipe shared state module, in response to the clear-state instruction, modifies the value of the current-animation-running flag and then executes the action of clearing all states.
In this embodiment of the application, within the preset processing flow, the action of clearing all states is executed only after the value of the current-animation-running flag has been modified, so cancellation of the current animation is not triggered. This prevents the display interface from falling back to the application page because the enter-multitask animation was canceled, effectively resolves the slide-up animation anomaly, and improves the user experience.
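A short driver ties the pieces together. It reuses the hypothetical SwipeSharedState from the earlier sketch and is only a walk-through of the ordering, not real device code.

    // Walk-through of the preset flow using the hypothetical classes above.
    public final class PresetFlowDemo {
        public static void main(String[] args) {
            SwipeSharedState state = new SwipeSharedState();

            // First slide-up: a move event marks the animation as running.
            state.markAnimationRunning();

            // Second slide-up while the enter-multitask animation plays: the
            // gesture module sends the clear-state instruction, and the shared
            // state module modifies the flag before clearing any state.
            state.onClearStateInstruction();

            // The listening state is now cleared, the animation was not
            // canceled, and the device can go on to display the desktop.
        }
    }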
According to the first aspect or any implementation of the first aspect, the first slide-up operation is an operation of sliding up from the bottom edge of the screen and pausing, and the second slide-up operation is an operation of sliding up from the bottom edge of the screen.
In a second aspect, the present application provides an electronic device, comprising: one or more processors; a memory; and a computer program stored on the memory which, when executed by the one or more processors, causes the electronic device to perform the method of the first aspect or of any possible implementation of the first aspect.
The second aspect and any implementation of the second aspect correspond to the first aspect and the corresponding implementation of the first aspect, respectively. For the technical effects of the second aspect and any of its implementations, refer to the technical effects of the first aspect and its corresponding implementations, which are not repeated here.
In a third aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of the first aspect or of any possible implementation of the first aspect.
The third aspect and any implementation of the third aspect correspond to the first aspect and the corresponding implementation of the first aspect, respectively. For the technical effects of the third aspect and any of its implementations, refer to the technical effects of the first aspect and its corresponding implementations, which are not repeated here.
Drawings
Fig. 1 is a schematic diagram of a first scenario provided in an embodiment of the present application;
Fig. 2 is a schematic diagram of a second scenario provided in an embodiment of the present application;
Fig. 3 is a schematic diagram of a third scenario provided in an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 5 is a software structure block diagram of an electronic device according to an embodiment of the present application;
Fig. 6 is a flowchart of a slide-up animation exception handling method according to an embodiment of the present application;
Fig. 7 is a schematic diagram of the enter-multitask animation according to an embodiment of the present application;
Fig. 8 is a schematic diagram of the processing flow for the first slide-up operation according to an embodiment of the present application;
Fig. 9 is a schematic diagram of the multitask animation processing flow according to an embodiment of the present application;
Fig. 10 is a schematic diagram of the preset processing flow according to an embodiment of the present application.
Detailed Description
The following clearly and completely describes the technical solutions in the embodiments of the present application with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The term "and/or" is herein merely an association relationship describing an associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone.
The terms "first", "second", and the like in the description and claims of the embodiments of the present application are used to distinguish between different objects, not to describe a particular order of objects. For example, a first target object and a second target object are different target objects, not a particular order of target objects.
In the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present application, unless otherwise indicated, the meaning of "a plurality" means two or more. For example, the plurality of processing units refers to two or more processing units; the plurality of systems means two or more systems.
Fig. 1 is a schematic diagram of a first scenario provided in an embodiment of the present application. Fig. 2 is a schematic diagram of a second scenario provided in an embodiment of the present application. Fig. 3 is a schematic diagram of a third scenario provided in an embodiment of the present application.
Before describing the embodiments of the present application, the application scenarios of the embodiments are first described based on Fig. 1, Fig. 2, and Fig. 3, each of which takes a mobile phone as an example of the electronic device.
(1) of Fig. 1 shows a first display interface 101 of the mobile phone, where the first display interface 101 is a Memo application page. The first display interface 101 includes a control 1011 for managing to-do events, a prompt 1012 indicating that there are no to-do events, a control 1013 for adding a to-do event, a control 1014 for entering the notes page, and a control 1015 for entering the to-do page. While the mobile phone displays the first display interface 101, the user slides up from the bottom edge of the screen and pauses; the operation track is shown by the dotted line in (1) of Fig. 1, with the user's finger sliding in the direction indicated by the arrow and stopping at the black dot at the top of the dotted line. In response to this slide-up-and-pause operation, the mobile phone displays the animation of entering the multitask mode and finally displays the second display interface 102 shown in (2) of Fig. 1. Here, an animation (dynamic effect) refers to the dynamic change of the elements in the display interface.
The second display interface 102 shown in (2) of Fig. 1 is a multitask interface. The multitask interface includes preview interfaces of several applications the user recently used, arranged from right to left in order of most recent use. For example, if the user used Phone first and then Memo, and performs the slide-up-and-pause operation while the mobile phone displays the Memo application page, the mobile phone displays the second display interface 102, which includes the preview interface 1022 of Phone and the preview interface 1024 of Memo. The preview interface 1024 of Memo is at the far right and is partially displayed; the preview interface 1022 of Phone is to its left and may be displayed in full. The multitask interface also includes an identifier (application icon and application name) for each preview interface, such as the Phone identifier 1021 and the Memo identifier 1023 (the Memo name is not shown) in the second display interface 102. The second display interface 102 further includes a clear-background control 1025.
Normally, as shown in Fig. 2, when the mobile phone displays the second display interface 102, the user slides up from the bottom edge of the screen; the operation track is shown by the dotted line in (1) of Fig. 2, with the finger sliding in the direction indicated by the arrow. As shown in (2) of Fig. 2, in response to this slide-up operation, the mobile phone enters the desktop and displays the third display interface 103. For the second display interface 102, refer to the description of (2) in Fig. 1, which is not repeated here. The third display interface 103 contains a number of applications, such as Clock, Calendar, Gallery, Memo, Files, Email, Music, Calculator, Camera, Contacts, Phone, and Messages.
However, as shown in Fig. 3, when the mobile phone displays the first display interface 101 and the user slides up from the bottom edge of the screen and pauses (the first slide-up operation), the mobile phone displays the enter-multitask animation in response and finally displays the second display interface 102. If the user slides up from the bottom edge of the screen again (the second slide-up operation) within a very short time after the first slide-up pause, that is, the interval between the two operations is smaller than a time threshold corresponding to the duration of the enter-multitask animation (for example, 0.3 seconds), the processing logic that cancels the current animation is hit: the enter-multitask animation is canceled, the mobile phone falls back to the first display interface 101 instead of displaying the desktop, and the slide-up animation behaves abnormally, degrading the user experience.
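For illustration, the timing condition can be written as a simple check; the 300 ms value is only the example animation duration mentioned above, not a fixed system constant.

    // Illustrative check: does the second slide-up begin before the
    // enter-multitask animation (e.g. 0.3 s) has finished?
    final class SwipeTiming {
        private static final long ANIMATION_DURATION_MS = 300; // example value

        static boolean secondSwipeDuringAnimation(long firstUpTimeMs, long secondDownTimeMs) {
            return (secondDownTimeMs - firstUpTimeMs) < ANIMATION_DURATION_MS;
        }
    }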
Therefore, an embodiment of the present application provides a slide-up animation exception handling method, applied to the following scenario: the electronic device displays the animation of entering the multitask page in response to the user sliding up from the bottom edge of the screen and pausing (the first slide-up operation), and then receives another slide up from the bottom edge of the screen (the second slide-up operation). In this embodiment, the electronic device triggers a preset processing flow in response to the second slide-up operation, and within that flow sets the current-animation-running flag to the first value before executing the action of clearing all states, so that cancellation of the current animation is not triggered. This effectively avoids the abnormal scenario of returning to the application page when the user performs the second slide-up during the enter-multitask animation, solves the slide-up animation anomaly, and improves the user experience.
The slide-up animation exception handling method provided in the embodiments of the present application can be applied to electronic devices such as wearable electronic devices (e.g., watches), mobile phones, tablet computers, notebook computers, personal computers (PCs), augmented reality (AR)/virtual reality (VR) devices, and in-vehicle computers; the following embodiments do not specifically limit the form of the electronic device.
Before describing the technical solutions of the embodiments of the present application, the electronic device of the embodiments is first described with reference to the accompanying drawings. Fig. 4 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. It should be understood that the electronic device 100 shown in Fig. 4 is only one example of an electronic device; the electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in Fig. 4 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal-processing and/or application-specific integrated circuits.
The electronic device 100 may include: processor 110, external memory interface 120, internal memory 121, universal serial bus (universal serial bus, USB) interface 130, charge management module 140, power management module 141, battery 142, antenna 1, antenna 2, mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headset interface 170D, sensor module 180, keys 190, motor 191, indicator 192, camera 193, display 194, and subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be a neural hub and a command center of the electronic device 100, among others. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bidirectional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, and the like through different I2C bus interfaces. For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface to implement the touch function of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the I2S interface, to implement a function of answering a call through the bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, to implement a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display 194 communicate via a DSI interface to implement the display functionality of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques.
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so the electronic device 100 can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may listen to music, or to hands-free conversations, through the speaker 170A.
A receiver 170B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When electronic device 100 is answering a telephone call or voice message, voice may be received by placing receiver 170B in close proximity to the human ear.
Microphone 170C, also referred to as a "mouthpiece" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak close to the microphone 170C to input a sound signal into it. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement directional recording functions, and so on.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense pressure signals and can convert a pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensor 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates bearing conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A; the electronic device 100 may also calculate the touch position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is smaller than a first pressure threshold acts on the Messages application icon, an instruction to view the message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the Messages application icon, an instruction to create a new message is executed.
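As a minimal Java sketch of the intensity-dependent behaviour just described (the threshold value and method names are invented for illustration):

    // Hypothetical mapping from touch intensity to an instruction, mirroring
    // the Messages-icon example above. The threshold value is assumed.
    final class PressureDispatch {
        private static final float FIRST_PRESSURE_THRESHOLD = 0.6f; // assumed

        void onMessageIconTouched(float pressure) {
            if (pressure < FIRST_PRESSURE_THRESHOLD) {
                viewMessage();      // light press: view the message
            } else {
                createNewMessage(); // firm press: create a new message
            }
        }

        private void viewMessage() { /* ... */ }
        private void createNewMessage() { /* ... */ }
    }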
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the electronic device 100 through the reverse motion, so as to realize anti-shake. The gyro sensor 180B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude from barometric pressure values measured by barometric pressure sensor 180C, aiding in positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip cover. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D, and then set features such as automatic unlocking on flip-open according to the detected open or closed state of the case or of the flip.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, the electronic device 100 may range using the distance sensor 180F to achieve quick focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object in the vicinity of the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object in the vicinity of the electronic device 100. The electronic device 100 can detect that the user holds the electronic device 100 close to the ear by using the proximity light sensor 180G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 180G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense ambient light level. The electronic device 100 may adaptively adjust the brightness of the display 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust white balance when taking a photograph. Ambient light sensor 180L may also cooperate with proximity light sensor 180G to detect whether electronic device 100 is in a pocket to prevent false touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 may utilize the collected fingerprint feature to unlock the fingerprint, access the application lock, photograph the fingerprint, answer the incoming call, etc.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 executes a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J to lower power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the electronic device 100 heats the battery 142 to prevent a low temperature from shutting the electronic device 100 down abnormally. In other embodiments, when the temperature is below a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". It may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form what is called a "touch screen". The touch sensor 180K detects touch operations acting on or near it and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a position different from that of the display 194.
The bone conduction sensor 180M may acquire vibration signals. In some embodiments, the bone conduction sensor 180M may acquire the vibration signal of the vibrating bone mass of the human voice part. The bone conduction sensor 180M may also contact the human pulse to receive a blood-pressure beat signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset to form a bone conduction headset. The audio module 170 may parse out a voice signal from the vibration signal of the vibrating bone mass obtained by the bone conduction sensor 180M, to implement a voice function. The application processor may parse heart-rate information from the blood-pressure beat signal obtained by the bone conduction sensor 180M, to implement a heart-rate detection function.
The keys 190 include a power key, a volume key, etc. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration alerting as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also correspond to different vibration feedback effects by touching different areas of the display screen 194. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light and may be used to indicate a charging status, a change in battery level, a message, a missed call, a notification, etc.
The software system of the electronic device 100 may employ a layered architecture, an event driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture. In this embodiment, taking an Android system with a layered architecture as an example, a software structure of the electronic device 100 is illustrated.
Fig. 5 is a software configuration block diagram of the electronic device 100 according to the embodiment of the present application.
The layered architecture of the electronic device 100 divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android Runtime (Android runtime) and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 5, the application package may include applications for cameras, gallery, calendar, phone, map, navigation, WLAN, music, memo, etc.
In some embodiments, as shown in fig. 5, the application layer further includes a desktop launcher (Launcher) that includes a plurality of classes, such as a touch interaction management module (TouchInteractionManager), a gesture processing module, a window conversion sliding processing module (WindowTransformSwipeHandler), a multitasking animation wrapper module (RecentsAnimationWrapper), a remote animation object processing module (RemoteAnimationTargetSet), a sliding shared state module (SwipeSharedState), an animation (dynamic effect) management module, and the like.
The touch interaction management module is used to determine in which gesture area of the electronic device a received touch event is triggered, so that the touch event can be sent to the corresponding consumer. For example, the touch interaction management module invokes the onInputEvent method, which serves as the entry point of the system input event callbacks monitored by the view system and covers the judgment of the entire return gesture, the refreshing of views and animations, and the triggering of the return event. For example, the MotionEvent type of the touch event is determined and then preprocessed by the onMotionEvent method. The onInputEvent method processes sliding events (including down events, move events, and up events) and, when a down event occurs, creates different input consumers (inputConsumers) according to the scene; for example, the inputConsumer differs depending on whether the gesture is used on the desktop or on another interface.
The gesture processing module may be an input consumer created in the onMotionEvent method, such as OverviewInputConsumer, OtherActivityInputConsumer, or a newly built consumer (newConsumer). The OverviewInputConsumer is the consumer that processes events under the desktop or multitasking interface; the OtherActivityInputConsumer is the consumer that processes events under a non-desktop interface. The newly built consumer may be a newly built special consumer or a newly built base consumer (newBaseConsumer). The gesture processing module can process a received touch event and trigger the dynamic effect corresponding to the touch event, such as the follow-hand effect of an up-slide or the dynamic effect of entering multitasking.
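For illustration only, the selection of an input consumer might be sketched as follows; the class shapes are simplified assumptions based on the modules described above, not the actual launcher source:

    // Simplified sketch: creating an input consumer according to the
    // current interface (an assumption-based illustration, not the
    // actual launcher implementation).
    interface InputConsumer {
        void onMotionEvent(android.view.MotionEvent ev);
    }

    class OverviewInputConsumer implements InputConsumer {
        public void onMotionEvent(android.view.MotionEvent ev) {
            // Processes events under the desktop or multitasking interface.
        }
    }

    class OtherActivityInputConsumer implements InputConsumer {
        public void onMotionEvent(android.view.MotionEvent ev) {
            // Processes events under a non-desktop interface, e.g. an app page.
        }
    }

    class ConsumerFactory {
        InputConsumer createConsumer(boolean onHomeOrRecents) {
            return onHomeOrRecents ? new OverviewInputConsumer()
                                   : new OtherActivityInputConsumer();
        }
    }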
The window conversion sliding processing module is used to set a multitasking flag bit (isToRecents) to true after the gesture processing module triggers the dynamic effect of entering multitasking, so as to identify that the electronic device has entered the multitasking processing logic; it is further configured to send an animation complete instruction to the multitasking animation wrapper module when the electronic device finishes displaying the dynamic effect of entering multitasking, for example after displaying (2) in fig. 7, where the animation complete instruction includes the information isToRecents=true.
The multitasking animation wrapper module and the remote animation object processing module are used to assist the window conversion sliding processing module in passing the information isToRecents=true to the application framework layer.
The sliding shared state module is used to perform the action of clearing all states, such as clearing the listening state. In this embodiment of the present application, the sliding shared state module is further configured to, in response to a state clearing instruction, set the current active operation flag bit to a first value (false) and then execute the action of clearing all states, so as not to trigger the processing logic for canceling the dynamic effect.
The animation management module, for example a remote animation object processing module (RemoteAnimationTargetSet) and a multitasking animation control compatible module (RecentsAnimationControllerCompat), passes an animation cancellation instruction to the view system of the application framework layer to control the cancellation of the dynamic effect of entering multitasking.
The application framework layer provides an application programming interface (application programming interface, API) and programming framework for application programs of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 5, the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide the communication functions of the electronic device 100. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows an application to display notification information in the status bar; it can be used to convey notification-type messages that automatically disappear after a short stay and require no user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is emitted, the electronic device vibrates, or an indicator light blinks.
In some embodiments, the application framework layer further includes an input management module for retrieving input from hardware of the electronic device and converting the input into events (events), and distributing the events to the respective processing modules.
The Android Runtime includes a core library and a virtual machine. The Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part contains the functional interfaces that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes java files of the application program layer and the application program framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, etc. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
It will be appreciated that the layers and the components contained in the layers in the software structure shown in fig. 5 do not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer layers than shown, and each layer may include more or fewer components, which is not limited in the present application.
FIG. 6 is a flowchart of an embodiment of an up-slip effect exception handling method. As shown in fig. 6, the exception handling method is applied to an electronic device, and the electronic device displays a first display interface, where the first display interface is indicated as an application page. The exception handling method comprises the following steps: step S601 to step S604.
Step S601, in response to the first sliding operation, displaying the dynamic effect entering the multitasking.
In the embodiment of the application, the electronic device receives a first sliding operation on a first display interface, where the first display interface is indicated as an application page. It may be appreciated that the user performs a first sliding operation on the first display interface, where the purpose is to enter the multitasking page, and the action of the first sliding operation may be set according to the actual application, for example, the first sliding operation indicates an operation of sliding and pausing from the bottom edge of the screen of the electronic device.
The dynamic effects of this entry into the multitasking include, but are not limited to: the preview page of the application program moves to a preset position from left to right, the identification of the application program indicated by the preview page is displayed, and the desktop background is blurred.
Fig. 7 is a schematic diagram of the dynamic effect of entering multitasking according to an embodiment of the present application. The user performs the first up-slide operation on the electronic device, the electronic device displays the follow-hand effect, and when the user raises the hand, the interface displayed by the electronic device is the fourth display interface 104 shown in fig. 7 (1). The fourth display interface 104 includes preview pages of recently used applications, for example, a preview page 1041 of the phone that is not fully displayed and a preview page 1042 of the memo that is fully displayed. The electronic device displays the dynamic effect of entering multitasking in response to the first up-slide operation, where the dynamic effect refers to the dynamic change of each element in the display interface, for example, the dynamic change of the preview page 1041 of the phone and the preview page 1042 of the memo: the preview page 1041 of the phone and the preview page 1042 of the memo move from left to right to a preset position in the direction indicated by the dotted arrow in fig. 7 (1).
Referring to the fifth display interface 105 in fig. 7 (2), the preview page of the phone after movement is shown as page 1052, the preview page of the memo after movement is shown as page 1054, and the identity 1051 of the phone is displayed above the preview page 1052 of the phone, and the identity 1053 of the memo (application name is hidden and displayed on the right side) is displayed above the preview page 1054 of the memo; a clean-up background control 1055 is displayed below the interface on which the preview page of the application program is located. In this embodiment of the present application, after the user performs the first sliding operation on the electronic device and lifts his hand, the electronic device is automatically changed from the fourth display interface 104 to the fifth display interface 105, that is, the process of displaying the electronic device with the dynamic effect of entering the multitasking.
Fig. 8 is a schematic diagram of a processing method of a first sliding operation according to an embodiment of the present application. In one embodiment, as shown in fig. 8, the electronic device displays a dynamic effect of entering a multitasking in response to a first up-slide operation, including: step S6011-step S6014.
In step S6011, the input management module receives a first sliding operation and converts the first sliding operation into a first touch event.
In some embodiments, the electronic device currently displays an application page, and the user makes an operation to slide up and pause from the bottom edge of the screen of the electronic device (a first slide-up operation). The first up-sliding operation includes a pressing operation of a touch screen, one or more sliding operations, and a lifting operation. The input management module converts the first up-slide operation into a first touch event, including: the press operation is converted into a press (down) event, the slide operation is converted into a move (move) event, and the raise operation is converted into a raise (up) event.
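A minimal sketch of this conversion might look as follows; TouchEvent is a hypothetical launcher-side type introduced only for illustration, not a framework class:

    import android.view.MotionEvent;

    // Hypothetical launcher-side event type (illustration only).
    class TouchEvent {
        static final int DOWN = 0, MOVE = 1, UP = 2;
        final int action;
        final float x, y;

        TouchEvent(int action, float x, float y) {
            this.action = action;
            this.x = x;
            this.y = y;
        }
    }

    // Sketch: mapping the press/slide/raise phases of an up-slide
    // operation to down/move/up events.
    class InputManagementModule {
        void onRawTouch(MotionEvent ev) {
            switch (ev.getActionMasked()) {
                case MotionEvent.ACTION_DOWN:
                    dispatch(new TouchEvent(TouchEvent.DOWN, ev.getX(), ev.getY()));
                    break;
                case MotionEvent.ACTION_MOVE:
                    dispatch(new TouchEvent(TouchEvent.MOVE, ev.getX(), ev.getY()));
                    break;
                case MotionEvent.ACTION_UP:
                    dispatch(new TouchEvent(TouchEvent.UP, ev.getX(), ev.getY()));
                    break;
            }
        }

        void dispatch(TouchEvent e) { /* distribute to the desktop launcher */ }
    }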
In step S6012, the input management module distributes the first touch event to the touch interaction management module in the desktop starter.
In some embodiments, the input management module may send a series of events included in the touch event to the desktop initiator, respectively, for example, the input management module may convert a received press operation into a down event and send the down event to the desktop initiator when the user touches the screen; when a user slides, converting the received sliding operation into a move event, and sending the move event to a desktop starter; after the user lifts his/her hand, the received lifting operation is converted into an up event, and the up event is sent to the desktop launcher.
In step S6013, the touch interaction management module sends the first touch event to the corresponding gesture processing module when determining that the first touch event triggers in the gesture navigation hot zone.
In some embodiments, the touch interaction management module first receives a down event in the first touch event, and determines whether the down event triggers a gesture navigation hotspot according to a pressing position of the down event. And under the condition that a down event in the first touch event is triggered in the gesture navigation hot zone, the touch interaction management module sends the down event to the corresponding gesture processing module. The touch interaction management module may also send both the move event and the up event associated with the down event that are subsequently received to the same gesture processing module. It should be noted that, when the touch interaction management module determines that the first touch event is not triggered in the gesture navigation hot zone, other processing flows corresponding to the first touch event are executed, which is not limited in this embodiment.
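The hot-zone judgment by pressing position might be sketched as follows; the hot-zone height is an assumed value chosen only for illustration:

    // Sketch: judging whether a down event triggers the gesture
    // navigation hot zone at the bottom of the screen.
    class GestureHotZone {
        static final int HOT_ZONE_HEIGHT_PX = 100; // assumed value

        static boolean isTriggered(float downY, int screenHeightPx) {
            // The down position must fall within the bottom strip.
            return downY >= screenHeightPx - HOT_ZONE_HEIGHT_PX;
        }
    }

If isTriggered returns false, the event would follow the other processing flows mentioned above rather than being sent to a gesture processing module.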
In some examples, the gesture processing module may be an input consumer (inputConsumer), such as OverviewInputConsumer, OtherActivityInputConsumer, or a newly built consumer (newConsumer). The OverviewInputConsumer is the consumer that processes events under the desktop or multitasking interface; the OtherActivityInputConsumer is the consumer that processes events under a non-desktop interface. The newly built consumer may be a newly built special consumer or a newly built base consumer (newBaseConsumer).
In some embodiments, the first touch event is processed by the OtherActivityInputConsumer in the gesture processing module because the first touch event is a touch event received by the electronic device while the application interface (non-desktop interface) is displayed.
Step S6014, the gesture processing module triggers the view system to display the dynamic effect entering the multi-task based on the first touch event.
In some embodiments, before triggering the view system to display the dynamic effect of entering multitasking based on the first touch event, the gesture processing module triggers display of the up-slide follow-hand effect based on the move events in the received first touch event, and sets a current active operation flag bit (mIsLastAnimationRunning) to a second value to indicate that a dynamic effect is currently running; the second value may be true. Normally, the current active operation flag bit is set to the first value only after all dynamic effects triggered by the first touch event have ended, i.e. after the dynamic effect of entering multitasking ends, and the first value may be false.
In some embodiments, the gesture processing module receives an up event in the first touch event, ends tracking for the first touch event, and triggers the view system to display a move into multitasking effect.
In some embodiments, the gesture processing module may process the first touch event by invoking methods such as doActionUp, processForUpWhenHomeEnable, and processGestureWhenUp, so as to trigger the view system to display the dynamic effect of entering multitasking. For example, invoking processGestureWhenUp starts processing of the first touch event once the up event is received; invoking processForUpWhenHomeEnable obtains and analyzes information such as the touch position and sliding velocity in the first touch event; and invoking doActionUp triggers the display of each animation of the enter-multitasking effect.
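The division of labor among these three methods might be sketched as follows; only the method names come from the text, while bodies and parameters are illustrative assumptions (reusing the hypothetical TouchEvent type sketched earlier):

    // Sketch of the up-event processing chain; bodies are assumptions.
    class GestureProcessingModule {
        void processGestureWhenUp(TouchEvent upEvent) {
            // Start processing the first touch event once up is received.
            processForUpWhenHomeEnable(upEvent);
        }

        void processForUpWhenHomeEnable(TouchEvent upEvent) {
            // Obtain and analyze the touch position and sliding velocity.
            float endY = upEvent.y;
            float velocity = estimateVelocity(upEvent); // assumed helper
            doActionUp(endY, velocity);
        }

        void doActionUp(float endY, float velocity) {
            // Trigger each animation of the enter-multitasking effect.
        }

        float estimateVelocity(TouchEvent e) { return 0f; } // placeholder
    }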
Fig. 9 is a schematic diagram of a multitasking dynamic effect processing flow according to an embodiment of the present application. In some embodiments, after the gesture processing module triggers the view system to display the dynamic effect of entering multitasking based on the first touch event, as shown in fig. 9, the window conversion sliding processing module (WindowTransformSwipeHandler) sets the multitasking flag bit (isToRecents) to true to identify that the electronic device has entered the multitasking processing logic.
Referring to fig. 9, in step S901, the window conversion sliding processing module calls the "finish current transition to multitasking" method (finishCurrentTransToRecentProc), setting isToRecents to true.
In step S902, the window conversion sliding processing module also invokes the finish method and sends an animation complete instruction to the multitasking animation wrapper module (RecentsAnimationWrapper), where the animation complete instruction includes the information isToRecents=true, so that the multitasking animation wrapper module passes the information isToRecents=true to the application framework layer.
In one example, the window conversion sliding processing module sends the animation complete instruction to the multitasking animation wrapper module when the electronic device has finished displaying the dynamic effect of entering multitasking, for example after displaying (2) in fig. 7.
In step S903, the multitasking animation wrapper module sends the animation complete instruction to the remote animation object processing module (RemoteAnimationTargetSet) so that the remote animation object processing module can pass the information isToRecents=true to the application framework layer.
In step S904, the remote animation object processing module sends the animation complete instruction to the application framework layer by calling the finish method, so as to pass the information isToRecents=true to the application framework layer.
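Put together, the finish chain of steps S901 to S904 might be sketched as follows; the class names follow fig. 9, while the method shapes are assumptions:

    // Sketch of the finish chain in fig. 9 (S901-S904).
    class WindowTransformSwipeHandler {
        boolean isToRecents;
        RecentsAnimationWrapper wrapper = new RecentsAnimationWrapper();

        void finishCurrentTransToRecentProc() { // S901
            isToRecents = true;
        }

        void onEnterRecentsAnimationDisplayed() { // S902: finish()
            wrapper.finish(isToRecents); // animation complete instruction
        }
    }

    class RecentsAnimationWrapper {
        RemoteAnimationTargetSet targetSet = new RemoteAnimationTargetSet();

        void finish(boolean isToRecents) { // S903
            targetSet.finish(isToRecents);
        }
    }

    class RemoteAnimationTargetSet {
        void finish(boolean isToRecents) { // S904
            // Pass isToRecents=true to the application framework layer.
        }
    }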
In the processing flow of the multitasking dynamic effect shown in fig. 9, after triggering the view system to display the dynamic effect of entering multitasking, the desktop launcher sets the multitasking flag bit to true; when the dynamic effect of entering multitasking reaches the multitask management page ((2) in fig. 7), the information that the multitasking flag bit is set to true is passed to the view system of the application framework layer, ending this display of the dynamic effect of entering multitasking. In other words, in the embodiment of the present application, the process of displaying the dynamic effect of entering multitasking lasts from the end of the user's first up-slide operation until the desktop launcher passes the information that the multitasking flag bit is set to true to the application framework layer.
Step S602, in the process of displaying the dynamic effect entering the multitasking, the preset processing flow is triggered in response to the second sliding operation.
Wherein the second up-slide operation is a gesture operation for entering the desktop, for example, the second up-slide operation is indicated as an operation of sliding up from a screen bottom edge of the electronic device.
Fig. 10 is a schematic diagram of a preset process flow provided in an embodiment of the present application. As shown in fig. 10, in one embodiment, the steps of the preset process flow include: step S6021 to step S6025.
In step S6021, the input management module receives the second up-slide operation and converts the second up-slide operation into a second touch event.
Wherein the second up-sliding operation includes a pressing operation of contacting the screen, one or more sliding operations, and a hand-lifting operation. The input management module converts the second up-slide operation into a second touch event, comprising: the press operation is converted into a press (down) event, the slide operation is converted into a move (move) event, and the raise operation is converted into a raise (up) event.
Step S6022, the input management module distributes the second touch event to the touch interaction management module in the desktop starter.
In some embodiments, the input management module may send a series of events contained in the touch event to the touch interaction management module of the desktop launcher, respectively, e.g., the input management module may send a down event to the desktop launcher when the user touches the screen; when a user slides, a move event is sent to a desktop starter; after the user lifts his hand, the up event is sent to the desktop launcher.
In step S6023, the touch interaction management module sends the second touch event to the corresponding gesture processing module when determining that the second touch event triggers in the gesture navigation hot zone.
In some embodiments, the touch interaction management module first receives a down event in the second touch event, and determines whether the down event triggers a gesture navigation hotspot according to a pressing position of the down event. And under the condition that the down event is triggered in the gesture navigation hot zone, the touch interaction management module sends the down event in the second touch event to the corresponding gesture processing module. It should be noted that, when the touch interaction management module determines that the second touch event is not triggered in the gesture navigation hot zone, other processing flows corresponding to the second touch event are executed, which is not limited in this embodiment.
In step S6024, the gesture processing module sends a state clearing instruction to the sliding shared state module (SwipeSharedState) based on the second touch event.
In some embodiments, the second touch event is processed by the newBaseConsumer in the gesture processing module.
In some embodiments, the processing of the second touch event by the newBaseConsumer in the gesture processing module includes the following steps one and two.
Step one, according to the second touch event, the isOverviewConsumer sub-module in the newBaseConsumer returns a true value (true), and sets the value of the multitasking flag bit (isToRecents) to false.
Here, the isOverviewConsumer sub-module in the newBaseConsumer returning a true value (true) indicates that the isOverviewConsumer sub-module can process the second touch event.
Step two, based on the second touch event, the newBaseConsumer sends a state clearing instruction to the sliding shared state module (SwipeSharedState), as sketched below.
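Steps one and two might be sketched as follows; only the names newBaseConsumer, isOverviewConsumer, isToRecents, and SwipeSharedState come from the text, and the structure is an illustrative assumption (SwipeSharedState is sketched after step S603 below):

    // Sketch of steps one and two in the newBaseConsumer.
    class NewBaseConsumer {
        private final SwipeSharedState swipeSharedState;
        private boolean isToRecents = true;

        NewBaseConsumer(SwipeSharedState state) {
            this.swipeSharedState = state;
        }

        boolean isOverviewConsumer() {
            // Step one: report that this consumer can process the second
            // touch event and reset the multitasking flag bit.
            isToRecents = false;
            return true;
        }

        void onSecondTouchEvent() {
            if (isOverviewConsumer()) {
                // Step two: send the state clearing instruction.
                swipeSharedState.onClearStateInstruction();
            }
        }
    }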
In step S6025, the sliding shared state module executes the action of clearing all states (clearState) in response to the state clearing instruction.
The action of clearing all states includes state clearing actions, which may include, for example, the action of clearing the listening state (clearListenerState), the action of restoring flag bit states, and the like.
It should be noted that, in the related art, when the sliding shared state module performs the action of clearing the listening state, if it determines that the multitasking flag bit is true (isToRecents=true) and the current active operation flag bit is true (mIsLastAnimationRunning=true), the sliding shared state module triggers the process of canceling the dynamic effect: it sends an animation cancellation instruction, which includes the information isToRecents=false, to the animation management module. The animation management module, such as the remote animation object processing module (RemoteAnimationTargetSet) and the multitasking animation control compatible module (RecentsAnimationControllerCompat), passes the animation cancellation instruction to the view system of the application framework layer to control the cancellation of the dynamic effect of entering multitasking. However, canceling the dynamic effect of entering multitasking causes the electronic device to return to displaying the application page instead of the desktop, resulting in an abnormal up-slide effect.
Therefore, in the method for handling an abnormal up-slide effect provided in the embodiment of the present application, the electronic device executes the following step S603 within the preset processing flow triggered in response to the second up-slide operation, so as to optimize the processing flow of the second up-slide operation and solve the problem of the abnormal up-slide effect.
Step S603, before executing the action of clearing all states, setting the current active operation flag bit to a first value.
After the current active operation flag bit is set to false, executing the action of clearing all states no longer triggers the process of canceling the current dynamic effect.
In the embodiment of the present application, the first value may be false. For example, the sliding shared state module sets the current active operation flag bit to false (i.e., mIsLastAnimationRunning=false) in response to the state clearing instruction and then performs the action of clearing all states. In this case, when the action of clearing the listening state is subsequently executed, the sliding shared state module determines that the current active operation flag bit is false, so the process of canceling the dynamic effect is not triggered, i.e., the sliding shared state module does not send an animation cancellation instruction to the animation management module.
It should be noted that the current active operation flag bit is normally set to false only after all dynamic effects triggered by the first touch event have ended, i.e. after the dynamic effect of entering multitasking ends. In this embodiment of the present application, however, once the sliding shared state module receives the state clearing instruction, the flag bit is changed in advance, so that the processing logic for canceling the dynamic effect is not triggered, avoiding the abnormal scenario in which the electronic device returns to the application program when the user performs the second up-slide operation during the dynamic effect of entering multitasking.
In one example, when the sliding shared state module performs the action of clearing the listening state, whether to trigger the process of canceling the dynamic effect may be determined by the condition if (mIsLastAnimationRunning && getLastAnimationTarget() != null); if the condition is satisfied, the cancel process is triggered, and if it is not satisfied, the cancel process is not triggered.
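Combining step S603 with the condition above, the optimized clearing flow might be sketched as follows; the field and method names follow the text where given, and everything else is an assumption:

    // Sketch of the optimized clearing flow (step S603): the current
    // active operation flag bit is reset before clearing state, so the
    // cancel-animation branch in clearListenerState cannot fire.
    class SwipeSharedState {
        boolean mIsLastAnimationRunning;

        void onClearStateInstruction() {
            // Fix: set the current active operation flag bit to false first.
            mIsLastAnimationRunning = false;
            clearAllStates();
        }

        void clearAllStates() {
            clearListenerState();
            // ...restore other flag bit states, etc.
        }

        void clearListenerState() {
            if (mIsLastAnimationRunning && getLastAnimationTarget() != null) {
                // Related-art branch: sends an animation cancellation
                // instruction (isToRecents=false), returning the display
                // to the application page. Not reached after the fix.
                cancelCurrentAnimation();
            }
        }

        Object getLastAnimationTarget() { return null; } // placeholder
        void cancelCurrentAnimation() { /* cancel enter-multitasking effect */ }
    }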
Step S604, displaying the desktop after the execution of the preset processing flow is completed.
In the embodiment of the present application, after the sliding shared state module completes the action of clearing all states, the interface displayed by the electronic device becomes the desktop.
In the method for handling an abnormal dynamic effect provided by the embodiment of the present application, the electronic device receives a first up-slide operation on a first display interface, where the first display interface is indicated as an application page; the electronic device displays the dynamic effect of entering multitasking in response to the first up-slide operation; and in the process of displaying the dynamic effect of entering multitasking, the electronic device triggers a preset processing flow in response to a second up-slide operation of the user. In the preset processing flow, the value of the current active operation flag bit is modified before the action of clearing all states is executed, so that executing that action does not trigger cancellation of the current dynamic effect. This avoids the display interface of the electronic device returning to the application page because the dynamic effect of entering multitasking was canceled, effectively solves the problem of the abnormal up-slide effect, improves user experience, and enables the electronic device to smoothly display the desktop based on the triggered preset processing flow.
It will be appreciated that the electronic device, in order to achieve the above-described functions, includes corresponding hardware and/or software modules that perform the respective functions. The steps of an algorithm for each example described in connection with the embodiments disclosed herein may be embodied in hardware or a combination of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application in conjunction with the embodiments, but such implementation is not to be considered as outside the scope of this application.
The steps executed by the electronic device 200 in the method for processing the abnormal upper sliding effect provided in the embodiment of the present application may also be executed by a chip system included in the electronic device 200, where the chip system may include a processor and a bluetooth chip. The chip system may be coupled to a memory such that the chip system, when running, invokes a computer program stored in the memory, implementing the steps performed by the electronic device 200 described above. The processor in the chip system can be an application processor or a non-application processor.
The present embodiment also provides a computer readable medium having stored therein computer instructions which, when run on an electronic device, cause the electronic device to perform the related method steps described above to implement the method of the above embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-mentioned related steps to implement the method in the above-mentioned embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is configured to store computer-executable instructions, and when the device is operated, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the methods in the above method embodiments.
The electronic device, the computer readable medium, the computer program product or the chip provided in this embodiment are configured to execute the corresponding method provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding method provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts shown as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment. In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
Any of the various embodiments of the application, as well as any of the same embodiments, may be freely combined. Any combination of the above is within the scope of the present application. The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on such understanding, the technical solution of the embodiments of the present application may be essentially or a part contributing to the prior art or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, including several instructions to cause a device (may be a single-chip microcomputer, a chip or the like) or a processor (processor) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read Only Memory (ROM), a random access memory (random access memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes. The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.
The steps of a method or algorithm described in connection with the disclosure of the embodiments disclosed herein may be embodied in hardware, or may be embodied in software instructions executed by a processor. The software instructions may be comprised of corresponding software modules that may be stored in random access memory (Random Access Memory, RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (Erasable Programmable ROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor.
Those skilled in the art will appreciate that in one or more of the examples described above, the functions described in the embodiments of the present application may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, these functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer-readable media and communication media including any medium that facilitates transfer of a computer program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose computer.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the corresponding technical solutions from the scope of the technical solutions of the embodiments of the present application.

Claims (10)

1. A method for handling an up-slip effect exception, the method comprising:
receiving a first sliding operation on a first display interface, wherein the first display interface is indicated as an application program page;
displaying a movement effect entering a multitasking mode in response to the first up-sliding operation;
in the process of displaying the dynamic effect entering the multitasking, responding to a second sliding operation, and triggering a preset processing flow;
after the execution of the preset processing flow is completed, displaying a desktop;
the preset processing flow comprises the following steps:
modifying the value of the current active operation flag bit before executing the action of clearing all states; after the value of the current active effect operation flag bit is modified, when the action of clearing all states is executed, the process of canceling the current active effect is not triggered.
2. The method of claim 1, wherein modifying the value of the current active run flag bit before performing the act of clearing all states comprises:
the value of the current active run flag bit is modified before the act of clearing the snoop state is performed.
3. The method according to claim 1 or 2, wherein said modifying the value of the current active run flag bit comprises:
and modifying the value of the current active operation zone bit to be a first value.
4. The method of claim 1, wherein displaying a move-on effect into a multitasking in response to the first slide-up operation comprises:
the input management module converts the received first sliding operation into a first touch event;
the input management module distributes the first touch event to a touch interaction management module;
the touch interaction management module sends the first touch event to a corresponding gesture processing module under the condition that the first touch event is determined to be triggered in a gesture navigation hot zone;
the gesture processing module triggers a view system to display the dynamic effect entering the multi-task based on the first touch event.
5. The method of claim 4, wherein the first touch event is indicated as any one of a down event, a move event, and an up event, wherein the gesture processing module triggers a view system to display the action to enter multitasking based on the first touch event, comprising:
and when the gesture processing module indicates that the first touch event is an up event, triggering a view system to display the dynamic effect entering the multitasking.
6. The method of claim 5, wherein the gesture processing module, based on the first touch event, further comprises, prior to triggering a view system to display the action to enter multitasking:
and when the first touch event is indicated to be the move event, the gesture processing module modifies the value of the current active operation zone bit into a second value.
7. The method of claim 1, wherein triggering the preset process flow in response to the second slide up operation comprises:
the input management module converts the second sliding operation into a second touch event;
the input management module distributes the second touch event to a touch interaction management module;
The touch interaction management module sends the second touch event to the corresponding gesture processing module under the condition that the second touch event is determined to be triggered in the gesture navigation hot zone;
the gesture processing module sends a state clearing instruction to the sliding sharing state module based on the second touch event;
the sliding shared state module is used for responding to the state clearing instruction, modifying the value of the current active operation zone bit and executing the action of clearing all states.
8. The method of claim 1, wherein the first up-slide operation is indicated as an operation of sliding up and pausing from a bottom edge of the screen;
the second up-slide operation is indicated as an operation of sliding upward from the bottom edge of the screen.
9. An electronic device, the electronic device comprising:
one or more processors;
a memory;
and a computer program, wherein the computer program is stored on the memory, which when executed by the one or more processors, causes the electronic device to perform the up-slip effect exception handling method of any one of claims 1-8.
10. A computer storage medium comprising computer instructions which, when executed on an electronic device, cause the electronic device to perform the up-slip effect exception handling method of any one of claims 1 to 8.
CN202311265723.XA 2023-09-28 2023-09-28 Upper sliding effect exception handling method and electronic equipment Active CN116991274B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311265723.XA CN116991274B (en) 2023-09-28 2023-09-28 Upper sliding effect exception handling method and electronic equipment

Publications (2)

Publication Number Publication Date
CN116991274A CN116991274A (en) 2023-11-03
CN116991274B true CN116991274B (en) 2023-12-19

Family

ID=88528751

Country Status (1)

Country Link
CN (1) CN116991274B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930339A (en) * 2010-08-11 2010-12-29 惠州Tcl移动通信有限公司 Method for switching interface of electronic equipment and device thereof
CN114518817A (en) * 2022-01-10 2022-05-20 荣耀终端有限公司 Display method, electronic equipment and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006103541A1 (en) * 2005-04-01 2006-10-05 Abb Research Ltd Method and system for providing a user interface

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101930339A (en) * 2010-08-11 2010-12-29 惠州Tcl移动通信有限公司 Method for switching interface of electronic equipment and device thereof
CN114518817A (en) * 2022-01-10 2022-05-20 荣耀终端有限公司 Display method, electronic equipment and storage medium
CN116501210A (en) * 2022-01-10 2023-07-28 荣耀终端有限公司 Display method, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN116991274A (en) 2023-11-03


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant