WO2023160455A1 - Object deletion method and electronic device

Object deletion method and electronic device

Info

Publication number
WO2023160455A1
Authority
WO
WIPO (PCT)
Prior art keywords
objects
electronic device
priority
user interface
displayed
Prior art date
Application number
PCT/CN2023/076461
Other languages
English (en)
Chinese (zh)
Inventor
华文
吴昊
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023160455A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • the present application relates to the field of computer technology, in particular to a method for deleting an object and an electronic device.
  • A user may install various application programs in an electronic device, and as the user uses the electronic device, the electronic device generates many objects: for example, notification messages from application programs displayed in the notification center, the songs the user has listened to stored by music software, the saved photos stored by gallery software, the user's shopping records stored by shopping software, and so on.
  • In the existing method, the above objects are deleted either one by one or all at once.
  • Deleting objects one by one is inefficient, while deleting them all at once makes it easy to delete important objects by mistake, causing the user to miss important information.
  • the embodiment of the present application discloses a method for deleting objects and an electronic device.
  • When deleting multiple objects, instead of deleting them all at once, the objects are deleted in batches, which leaves the user an opportunity to suspend the deletion and thereby improves the user experience.
  • the embodiment of the present application provides a method for deleting an object, which may include:
  • displaying a first user interface in which N objects are displayed, where N is a positive integer greater than or equal to 1 and each of the N objects is marked with a priority;
  • in response to a deletion operation for the N objects, deleting the N objects in batches according to the priorities corresponding to the N objects.
  • Each object displayed in the first user interface is marked with a priority level, so when the delete operation is performed on the above-mentioned objects, the objects are deleted in batches according to their priority levels instead of all being deleted at once. In this way, even if the user clicks the operation of deleting all objects, the user can be given an opportunity to suspend the deletion, thereby improving the user experience.
  • The method may also include: when the N objects are being deleted in batches according to the priorities corresponding to the N objects, in response to a touch operation on the interface, stopping the deletion and displaying a second user interface, where M objects are displayed in the second user interface and M is a positive integer smaller than N.
  • After the N objects begin to be deleted in batches according to the priorities corresponding to the N objects, L objects among the N objects have already been deleted before the deletion behavior is stopped; the priority of each object in the L objects is lower than the priority of each object in the M objects, and L is a positive integer smaller than N.
  • The notifications whose deletion is suspended in response to the touch operation are notifications with higher priority, and notifications with higher priority may be more important to the user; therefore, even if the user clicks the operation of deleting all notifications, it can still be ensured that important notifications are not deleted.
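  • By way of illustration only, the interruptible, priority-ordered batch deletion described above could be organized roughly as in the following sketch; the class, method, and field names (PriorityBatchDeleter, requestStop, Item, and so on) are hypothetical and are not taken from the embodiment.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.concurrent.atomic.AtomicBoolean;

// Illustrative sketch only: delete objects batch by batch, lowest priority first,
// and let a touch operation stop the remaining (higher-priority) batches.
public class PriorityBatchDeleter {

    /** A displayed object marked with a priority (a higher value means more important). */
    public static class Item {
        final String id;
        final int priority;
        Item(String id, int priority) { this.id = id; this.priority = priority; }
    }

    private final AtomicBoolean stopped = new AtomicBoolean(false);

    /** Called from a touch listener to suspend the remaining batches. */
    public void requestStop() { stopped.set(true); }

    /**
     * Deletes items in batches, lowest priority first, and returns the items that
     * were still not deleted when deletion was stopped (the "M objects" that the
     * second user interface would keep displaying).
     */
    public List<Item> deleteInBatches(List<Item> items) {
        List<Item> remaining = new ArrayList<>(items);
        // Lowest-priority items go into the earliest batches.
        remaining.sort(Comparator.comparingInt(i -> i.priority));

        while (!remaining.isEmpty() && !stopped.get()) {
            int batchPriority = remaining.get(0).priority;
            // Delete one priority level per batch.
            remaining.removeIf(i -> {
                if (i.priority == batchPriority) {
                    System.out.println("Deleting " + i.id);
                    return true;
                }
                return false;
            });
            // A real UI would refresh here and pause briefly so that the user can
            // touch the interface to stop before the next (higher-priority) batch.
        }
        return remaining;
    }
}
```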
  • Before the N objects are deleted in batches according to the priorities corresponding to the N objects in response to the deletion operation for the N objects, the method further includes:
  • a priority is marked for each of the N objects according to the number of times each of the N objects is opened.
  • the user's usage habits may be considered to mark the priority for each object.
  • Objects that are opened more times may be marked with a high priority, and objects that are opened fewer times may be marked with a low priority. In this way, the marked priorities conform to the user's usage habits, thereby improving the user's experience.
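  • For instance, marking a priority for each object from the number of times it has been opened might look like the following minimal sketch; the priority levels and thresholds (3 and 10 opens) are arbitrary illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch only: assign a priority level from how often each object was opened.
public class PriorityMarker {
    public static final int LOW = 0, MEDIUM = 1, HIGH = 2;

    /** Maps each object identifier to a priority based on its open count. */
    public static Map<String, Integer> markPriorities(Map<String, Integer> openCounts) {
        Map<String, Integer> priorities = new HashMap<>();
        for (Map.Entry<String, Integer> entry : openCounts.entrySet()) {
            int opens = entry.getValue();
            // Thresholds are arbitrary illustrative values.
            int priority = opens >= 10 ? HIGH : (opens >= 3 ? MEDIUM : LOW);
            priorities.put(entry.getKey(), priority);
        }
        return priorities;
    }
}
```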
  • deleting the N objects in batches according to the priorities corresponding to the N objects includes:
  • Objects of the third priority among the N objects are deleted, and a fifth user interface is displayed, in which the deleted objects are no longer displayed.
  • objects with low priority are deleted first, and objects with high priority are deleted after a period of time. That is, the objects with high priority are deleted later, leaving an opportunity for the user to view the objects with high priority, so as to prevent the objects with high priority from being ignored.
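  • On Android, such a pause between the low-priority batch and the later high-priority batch could, for example, be scheduled with a delayed post on the main thread, as in the hypothetical sketch below; the 3-second delay is an arbitrary illustrative value.

```java
import android.os.Handler;
import android.os.Looper;

// Illustrative sketch only: delete the low-priority batch immediately, then schedule
// the high-priority batch after a pause so the user can still view it or cancel.
public class DelayedBatchDeletion {
    private final Handler mainHandler = new Handler(Looper.getMainLooper());
    private boolean cancelled = false;

    public void start(Runnable deleteLowPriorityNow, Runnable deleteHighPriorityLater) {
        deleteLowPriorityNow.run(); // low-priority objects are deleted first
        // High-priority objects are deleted only after a pause (3 seconds here).
        mainHandler.postDelayed(() -> {
            if (!cancelled) {
                deleteHighPriorityLater.run();
            }
        }, 3000);
    }

    /** Called when the user touches the interface to suspend the remaining deletion. */
    public void cancel() {
        cancelled = true;
        mainHandler.removeCallbacksAndMessages(null);
    }
}
```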
  • the object includes one or more of the following: a notification message pushed by an application program, a selected picture in a gallery, a file, a note, a selected song in a music software, and a shopping record.
  • An embodiment of the present application provides an electronic device, which may include one or more processors, a memory, and a communication module; the memory and the communication module are coupled to the one or more processors, the memory is used to store a computer program, and the one or more processors execute the computer program to cause the electronic device to perform:
  • displaying a first user interface in which N objects are displayed, where N is a positive integer greater than or equal to 1 and each of the N objects is marked with a priority;
  • in response to a deletion operation for the N objects, deleting the N objects in batches according to the priorities corresponding to the N objects.
  • When the electronic device deletes the N objects in batches according to the priorities corresponding to the N objects, in response to a touch operation on the interface, the electronic device stops deleting and displays a second user interface; M objects are displayed in the second user interface, and M is a positive integer smaller than N.
  • After the N objects begin to be deleted in batches according to the priorities corresponding to the N objects, L objects among the N objects have already been deleted before the deletion behavior is stopped; the priority of each object in the L objects is lower than the priority of each object in the M objects, and L is a positive integer smaller than N.
  • the electronic device further executes:
  • a priority is marked for each of the N objects according to the number of times each of the N objects is opened.
  • the electronic device specifically executes:
  • Objects of the third priority among the N objects are deleted, and a fifth user interface is displayed, in which the deleted objects are no longer displayed.
  • the object includes one or more of the following: a notification message pushed by an application program, a selected picture in a gallery, a file, a note, a selected song in a music software, and a shopping record.
  • An embodiment of the present application provides a computer storage medium, the computer storage medium stores a computer program, and when the computer program is executed by a processor, the method for deleting objects provided by the first aspect of the embodiment of the present application and any implementation manner of the first aspect is realized.
  • An embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the method for deleting objects provided by the first aspect of the embodiment of the present application and any implementation manner of the first aspect.
  • FIG. 1A is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 1B is a software structural block diagram of an electronic device provided in an embodiment of the present application.
  • FIG. 2A is a user interface for an application program menu on an electronic device provided by an embodiment of the present application
  • FIG. 2B is a schematic diagram of an electronic device displaying a notification center provided in an embodiment of the present application.
  • FIG. 2C is a schematic diagram of an electronic device deletion notification message provided by an embodiment of the present application.
  • FIG. 3A is a schematic diagram of an electronic device displaying a first user interface provided in an embodiment of the present application
  • FIG. 3B is a schematic diagram of an electronic device displaying a second user interface according to an embodiment of the present application.
  • FIG. 3C is a third user interface, a fourth user interface, and a fifth user interface displayed by an electronic device provided in an embodiment of the present application;
  • FIG. 4A is a user interface of an application program displayed by an electronic device provided in an embodiment of the present application.
  • FIG. 4B is a schematic diagram of a user deleting pictures in the "Gallery" provided by the embodiment of the present application.
  • FIG. 4C is a schematic diagram of displaying a second user interface by another electronic device provided in an embodiment of the present application.
  • FIG. 4D is a third user interface, a fourth user interface, and a fifth user interface displayed by an electronic device according to an embodiment of the present application;
  • FIG. 5 is a schematic flow chart of deleting an object provided by the embodiment of the present application.
  • The electronic device may be a portable electronic device that also includes other functions such as a personal digital assistant and/or a music player, such as a mobile phone, a tablet computer, or a wearable electronic device with a wireless communication function (such as a smart watch), etc.
  • Portable electronic devices include, but are not limited to, portable electronic devices with various operating systems.
  • the aforementioned portable electronic device may also be other portable electronic devices, such as a laptop computer (Laptop) with a touch-sensitive surface or a touch panel. It should also be understood that, in some other embodiments, the above-mentioned electronic device may not be a portable electronic device, but a desktop computer with a touch-sensitive surface or a touch panel.
  • the term "user interface (UI)" in the specification, claims and drawings of this application is a medium interface for interaction and information exchange between an application program or an operating system and a user, and it realizes the internal form of information Conversion to and from a form acceptable to the user.
  • The user interface of an application program is source code written in specific computer languages such as Java and extensible markup language (XML); the interface source code is parsed and rendered on the electronic device and finally presented as content the user can recognize, such as pictures, text, buttons, and other controls. Controls, also known as widgets, are the basic elements of the user interface; typical controls include toolbars, menu bars, text boxes, buttons, scroll bars, images, and text.
  • the properties and contents of the controls in the interface are defined through labels or nodes.
  • XML specifies the controls contained in the interface through nodes such as <TextView>, <ImgView>, and <VideoView>.
  • a node corresponds to a control or property in the interface, and after the node is parsed and rendered, it is presented as the content visible to the user.
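  • As a small illustration of this node-to-control correspondence on Android (the layout resource and view ID below are hypothetical), a <TextView> node declared in an XML layout is parsed and rendered into a control object that application code can then look up and manipulate:

```java
import android.app.Activity;
import android.os.Bundle;
import android.widget.TextView;

// Illustrative sketch only: the layout file and view ID are hypothetical.
public class MainActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);          // parse and render the XML nodes
        TextView title = findViewById(R.id.title_text);  // the control defined by a <TextView> node
        title.setText("Hello");                          // change the content visible to the user
    }
}
```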
  • the interfaces of many applications, such as hybrid applications usually include web pages.
  • a web page, also called a page, can be understood as a special control embedded in an application program interface.
  • A web page is source code written in a specific computer language, such as hypertext markup language (HTML), cascading style sheets (CSS), JavaScript (JS), etc.; the web page source code can be loaded and displayed by a browser, or by a web page display component whose function is similar to that of a browser, as content recognizable to the user.
  • the specific content contained in the web page is also defined by the tags or nodes in the source code of the web page.
  • HTML defines the elements and attributes of the web page through tags such as <p>, <img>, <video>, and <canvas>.
  • FIG. 1A shows a schematic structural diagram of an electronic device 100 .
  • The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone jack 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a subscriber identification module (subscriber identification module, SIM) card interface 195, and the like.
  • The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example: the processor 110 may include an application processor (application processor, AP), a central processing unit (central processing unit, CPU), a graphics processing unit (graphics processing unit, GPU) , neural network processor (neural-network processing unit, NPU), modem processor, image signal processor (image signal processor, ISP), controller, memory, video codec, digital signal processor (digital signal processor, DSP), baseband processor, etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110 .
  • the controller may be the nerve center and command center of the electronic device 100 .
  • the controller can generate an operation control signal according to the instruction opcode and timing signal, and complete the control of fetching and executing the instruction.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
  • the memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated access is avoided, and the waiting time of the processor 110 is reduced, thereby improving the efficiency of the electronic device 100 .
  • processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transmitter (universal asynchronous receiver/transmitter, UART) interface, mobile industry processor interface (mobile industry processor interface, MIPI), general-purpose input and output (general-purpose input/output, GPIO) interface, subscriber identity module (subscriber identity module, SIM) interface, and /or universal serial bus (universal serial bus, USB) interface, etc.
  • The I2C interface is a bidirectional synchronous serial bus, including a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL).
  • processor 110 may include multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flashlight, the camera 193 and the like through different I2C bus interfaces.
  • For example, the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface to realize the touch function of the electronic device 100.
  • the I2S interface can be used for audio communication.
  • processor 110 may include multiple sets of I2S buses.
  • the processor 110 may be coupled to the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through the Bluetooth headset.
  • the PCM interface can also be used for audio communication, sampling, quantizing and encoding the analog signal.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is generally used to connect the processor 110 and the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to realize the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interface includes camera serial interface (camera serial interface, CSI), display serial interface (display serial interface, DSI), etc.
  • the processor 110 communicates with the camera 193 through the CSI interface to realize the camera function of the electronic device 100 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to realize the display function of the electronic device 100 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface can be used to connect the processor 110 with the camera 193 , the display screen 194 , the wireless communication module 160 , the audio module 170 , the sensor module 180 and so on.
  • the GPIO interface can also be configured as an I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface conforming to the USB standard specification, specifically, it can be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 100 , and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones and play audio through them. This interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules shown in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
  • the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
  • the charging management module 140 is configured to receive a charging input from a charger.
  • the charger may be a wireless charger or a wired charger.
  • the charging management module 140 can receive charging input from the wired charger through the USB interface 130 .
  • the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 is charging the battery 142 , it can also supply power to the electronic device through the power management module 141 .
  • the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
  • the power management module 141 receives the input from the battery 142 and/or the charging management module 140 to provide power for the processor 110 , the internal memory 121 , the external memory, the display screen 194 , the camera 193 , and the wireless communication module 160 .
  • the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 141 may also be disposed in the processor 110 .
  • the power management module 141 and the charging management module 140 may also be set in the same device.
  • The wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signals modulated by the modem processor, and convert them into electromagnetic waves and radiate them through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be set in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be set in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
  • the application processor outputs sound signals through audio equipment (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194 .
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent from the processor 110, and be set in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), the global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC) technology, infrared (IR) technology, and the like.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the wireless communication module 160 may include a Bluetooth module, a Wi-Fi module, and the like.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc.
  • The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the electronic device 100 can realize the display function through the GPU, the display screen 194 , and the application processor.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos and the like.
  • the display screen 194 includes a display panel.
  • The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 100 may include 1 or N display screens 194 , where N is a positive integer greater than 1.
  • the electronic device 100 can realize the camera function through the camera 193, ISP, video codec, GPU, display screen 194, application processor AP, neural network processor NPU, and the like.
  • the camera 193 can be used to collect color image data and depth data of the subject.
  • the ISP can be used to process the color image data collected by the camera 193 .
  • the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, etc. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 193 .
  • the camera 193 can be composed of a color camera module and a 3D sensing module.
  • the photosensitive element of the camera head of the color camera module may be a charge coupled device (charge coupled device, CCD) or a complementary metal-oxide-semiconductor (complementary metal-oxide-semiconductor, CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the 3D sensing module may be a time of flight (TOF) 3D sensing module or a structured light (structured light) 3D sensing module.
  • the structured light 3D sensing is an active depth sensing technology, and the basic components of the structured light 3D sensing module may include an infrared (Infrared) emitter, an IR camera module, and the like.
  • the working principle of the structured light 3D sensing module is to first emit a specific pattern of light spots (pattern) on the object to be photographed, and then receive the light spot pattern code (light coding) on the surface of the object, and then compare the similarities and differences with the original projected light spots. And use the principle of trigonometry to calculate the three-dimensional coordinates of the object.
  • the three-dimensional coordinates include the distance between the electronic device 100 and the object to be photographed.
  • TOF 3D sensing is also an active depth sensing technology.
  • the basic components of TOF 3D sensing module can include infrared (Infrared) emitters, IR camera modules, etc.
  • the working principle of the TOF 3D sensing module is to calculate the distance (that is, depth) between the TOF 3D sensing module and the object to be photographed through the time of infrared ray return to obtain a 3D depth map.
  • the structured light 3D sensing module can also be applied in fields such as face recognition, somatosensory game consoles, and industrial machine vision inspection.
  • TOF 3D sensing modules can also be applied to game consoles, augmented reality (augmented reality, AR)/virtual reality (virtual reality, VR) and other fields.
  • the camera 193 may also be composed of two or more cameras.
  • the two or more cameras can include a color camera, which can be used to collect color image data of the object being photographed.
  • the two or more cameras can use stereo vision (stereo vision) technology to collect depth data of the object being photographed.
  • Stereo vision technology is based on the principle of human-eye parallax: under natural light, two or more cameras capture images of the same object from different angles, and calculations such as triangulation are then performed to obtain the distance between the electronic device 100 and the photographed object, that is, the depth information.
  • the electronic device 100 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • the electronic device 100 may include a front camera 193 and a rear camera 193 .
  • The front camera 193 can usually be used to collect color image data and depth data of the photographer facing the display screen 194, and the rear 3D camera module can be used to collect color image data and depth data of the shooting objects (such as people, scenery, etc.) that the photographer faces.
  • the CPU, GPU or NPU in the processor 110 can process the color image data and depth data collected by the camera 193 .
  • The NPU can identify the color image data collected by the camera 193 (specifically, the color camera module) through a neural network algorithm based on bone point recognition technology, such as a convolutional neural network (CNN) algorithm, to determine the skeletal points of the person being photographed.
  • the CPU or GPU can also run the neural network algorithm to determine the skeleton points of the person being photographed according to the color image data.
  • The CPU, GPU, or NPU can also be used to determine the figure of the person being photographed (such as the body proportion and the body parts between the skeletal points), further determine body beautification parameters for the photographed person, and finally process the captured image according to the body beautification parameters so that the body shape of the photographed person in the captured image is beautified. Subsequent embodiments will introduce in detail how to perform body shaping processing on the image of the person being photographed based on the color image data and depth data collected by the camera 193, which will not be repeated here.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the electronic device 100 may support one or more video codecs.
  • the electronic device 100 can play or record videos in various encoding formats, for example: moving picture experts group (moving picture experts group, MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
  • the NPU is a neural-network (NN) computing processor.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, so as to expand the storage capacity of the electronic device 100.
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, save data such as music, photos, videos, etc. in an external memory card.
  • the internal memory 121 may be used to store one or more computer programs including instructions.
  • the processor 110 can execute the above-mentioned instructions stored in the internal memory 121, so that the electronic device 100 executes the photo preview method of the electronic device provided in some embodiments of the present application, as well as various functional applications and data processing.
  • the internal memory 121 may include an area for storing programs and an area for storing data. Wherein, the stored program area can store an operating system; the stored program area can also store one or more application programs (such as a gallery, contacts, etc.) and the like.
  • the data storage area can store data (such as photos, contacts, etc.) created during the use of the electronic device 100 .
  • the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the electronic device 100 can implement audio functions through the audio module 170 , the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signal.
  • the audio module 170 may also be used to encode and decode audio signals.
  • the audio module 170 may be set in the processor 110 , or some functional modules of the audio module 170 may be set in the processor 110 .
  • The speaker 170A, also referred to as a "horn", is used to convert audio electrical signals into sound signals.
  • Electronic device 100 can listen to music through speaker 170A, or listen to hands-free calls.
  • The receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • When the electronic device 100 answers a call or listens to a voice message, the receiver 170B can be placed close to the ear to hear the voice.
  • The microphone 170C, also called a "mike" or a "voice tube", is used to convert sound signals into electrical signals. When making a phone call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 100 may be provided with at least one microphone 170C. In some other embodiments, the electronic device 100 may be provided with two microphones 170C, which may also implement a noise reduction function in addition to collecting sound signals. In some other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions, etc.
  • the earphone interface 170D is used for connecting wired earphones.
  • the earphone interface 170D can be a USB interface 130, or a 3.5mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • the pressure sensor 180A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 180A may be disposed on display screen 194 .
  • There are many types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • the electronic device 100 determines the intensity of pressure according to the change in capacitance.
  • the electronic device 100 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 100 may also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view short messages is executed. When a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon of the short message application, the instruction of creating a new short message is executed.
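  • A rough sketch of such intensity-dependent handling on Android is shown below; the pressure threshold and the handler methods are illustrative assumptions, since real devices report touch pressure in device-specific ranges.

```java
import android.view.MotionEvent;
import android.view.View;

// Illustrative sketch only: act differently depending on the reported touch pressure.
public class PressureAwareTouchListener implements View.OnTouchListener {
    private static final float FIRST_PRESSURE_THRESHOLD = 0.8f; // illustrative value

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_UP) {
            if (event.getPressure() >= FIRST_PRESSURE_THRESHOLD) {
                createNewMessage(); // firm press: e.g. create a new short message
            } else {
                viewMessages();     // light press: e.g. view short messages
            }
        }
        return true;
    }

    private void createNewMessage() { /* hypothetical handler */ }

    private void viewMessages() { /* hypothetical handler */ }
}
```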
  • the gyro sensor 180B can be used to determine the motion posture of the electronic device 100 .
  • the angular velocity of the electronic device 100 around three axes may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 180B detects the shaking angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • the air pressure sensor 180C is used to measure air pressure.
  • the electronic device 100 calculates the altitude based on the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • the magnetic sensor 180D includes a Hall sensor.
  • The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip leather case. In some embodiments, the electronic device 100 may detect the opening and closing of a clamshell according to the magnetic sensor 180D, and then set features such as automatic unlocking of the flip cover according to the detected opening or closing state.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to identify the posture of electronic devices, and can be used in applications such as horizontal and vertical screen switching, pedometers, etc.
  • the distance sensor 180F is used to measure the distance.
  • the electronic device 100 may measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may use the distance sensor 180F for distance measurement to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
  • the light emitting diodes may be infrared light emitting diodes.
  • the electronic device 100 emits infrared light through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it may be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear, so as to automatically turn off the screen to save power.
  • the proximity light sensor 180G can also be used in leather case mode, automatic unlock and lock screen in pocket mode.
  • the ambient light sensor 180L is used for sensing ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket, so as to prevent accidental touch.
  • The fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access to application locks, take pictures with fingerprints, answer incoming calls with fingerprints, and the like.
  • the temperature sensor 180J is used to detect temperature.
  • the electronic device 100 uses the temperature detected by the temperature sensor 180J to implement a temperature treatment strategy. For example, when the temperature reported by the temperature sensor 180J exceeds the threshold, the electronic device 100 may reduce the performance of the processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection.
  • When the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 caused by the low temperature.
  • the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
  • the touch sensor 180K may also be called a touch panel or a touch-sensitive surface.
  • the touch sensor 180K can be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to the touch operation can be provided through the display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 , which is different from the position of the display screen 194 .
  • the bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human voice. The bone conduction sensor 180M can also contact the human pulse and receive the blood pressure beating signal. In some embodiments, the bone conduction sensor 180M can also be disposed in the earphone, combined into a bone conduction earphone.
  • the audio module 170 can analyze the voice signal based on the vibration signal of the vibrating bone mass of the vocal part acquired by the bone conduction sensor 180M, so as to realize the voice function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
  • the keys 190 include a power key, a volume key and the like.
  • the key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100 .
  • the motor 191 can generate a vibrating reminder.
  • the motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations applied to different applications may correspond to different vibration feedback effects.
  • the motor 191 may also correspond to different vibration feedback effects for touch operations acting on different areas of the display screen 194 .
  • Different application scenarios for example: time reminder, receiving information, alarm clock, games, etc.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, and can be used to indicate charging status, power change, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 195 is used for connecting a SIM card.
  • the SIM card can be connected and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195 .
  • the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • SIM card interface 195 can support Nano SIM card, Micro SIM card, SIM card and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards may be the same or different.
  • the SIM card interface 195 is also compatible with different types of SIM cards.
  • the SIM card interface 195 is also compatible with external memory cards.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication.
  • the electronic device 100 adopts an eSIM, that is, an embedded SIM card.
  • The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
  • the electronic device 100 exemplarily shown in FIG. 1A can display various user interfaces described in the following various embodiments through a display screen 194 .
  • The electronic device 100 can detect touch operations in each user interface through the touch sensor 180K, for example, a click operation in each user interface (such as a touch operation on an icon or a double-click operation), and, for example, an upward or downward swipe operation in each user interface, or a gesture operation of drawing a circle, etc.
  • the electronic device 100 can detect motion gestures performed by the user holding the electronic device 100 , such as shaking the electronic device, through the gyroscope sensor 180B, the acceleration sensor 180E, and the like.
  • the electronic device 100 can detect a non-touch gesture operation through the camera 193 (such as a 3D camera, a depth camera).
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-kernel architecture, a micro-service architecture, or a cloud architecture.
  • the software structure of the electronic device 100 is exemplarily described by taking an Android system with a layered architecture as an example.
  • FIG. 1B is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present invention.
  • the layered architecture divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate through software interfaces.
  • the Android system is divided into four layers, which are respectively the application program layer, the application program framework layer, the Android runtime (Android runtime) and system library, and the kernel layer from top to bottom.
  • the application layer can consist of a series of application packages.
  • the application package may include application programs such as camera, gallery, calendar, call, map, navigation, WLAN, Bluetooth, music, video, and short message.
  • the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
  • the application framework layer includes some predefined functions. As shown in Fig. 1B, the application framework layer may include window manager, content provider, view system, phone manager, resource manager, notification manager, etc.
  • a window manager is used to manage window programs.
  • the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, capture the screen, etc.
  • a display interface can consist of one or more views.
  • a display interface including a text message notification icon may include a view for displaying text and a view for displaying pictures.
  • the phone manager is used to provide communication functions of the electronic device 100 . For example, the management of call status (including connected, hung up, etc.).
  • the resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and so on.
  • the notification manager enables the application to display notification messages in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a short stay without user interaction.
  • the notification manager is used to notify the download completion, message reminder, etc.
  • the notification manager can also be a notification that appears on the top status bar of the system in the form of a chart or scroll bar text, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. For example, prompting text information in the status bar, issuing a prompt sound, vibrating the electronic device, and flashing the indicator light, etc.
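  • For reference, posting such a status-bar notification through the notification manager on Android might look roughly like the following sketch; the channel ID, texts, notification ID, and icon are illustrative assumptions.

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.content.Context;

// Illustrative sketch only: post a notification that appears in the status bar / notification center.
public class NotificationHelper {
    private static final String CHANNEL_ID = "demo_channel"; // illustrative channel ID

    public static void notifyDownloadComplete(Context context) {
        NotificationManager manager =
                (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);

        // On Android 8.0 and later, a channel must exist before a notification is posted.
        manager.createNotificationChannel(
                new NotificationChannel(CHANNEL_ID, "Demo", NotificationManager.IMPORTANCE_DEFAULT));

        Notification notification = new Notification.Builder(context, CHANNEL_ID)
                .setSmallIcon(android.R.drawable.stat_sys_download_done) // built-in icon, for illustration
                .setContentTitle("Download complete")
                .setContentText("The file has been saved.")
                .build();

        manager.notify(1, notification); // 1 is an arbitrary notification ID
    }
}
```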
  • Android Runtime includes core library and virtual machine.
  • Android Runtime is responsible for the scheduling and management of the Android system.
  • the core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
  • the application layer and the application framework layer run in virtual machines.
  • the virtual machine executes the java files of the application program layer and the application program framework layer as binary files.
  • The virtual machine is used to perform object life cycle management, stack management, thread management, security and exception management, as well as functions such as garbage collection.
  • a system library can include multiple function modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of various commonly used audio and video formats, as well as still image files, etc.
  • The media library can support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing, etc.
  • 2D graphics engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
  • The software system shown in FIG. 1B involves applications that use the sharing capability (such as the gallery and the file manager), an instant sharing module that provides the sharing capability, a print service and a print spooler that provide the printing capability, the printing framework, WLAN service, and Bluetooth service provided by the application framework layer, and the WLAN and Bluetooth capabilities and basic communication protocols provided by the kernel and the bottom layer.
  • FIG. 2A schematically shows an exemplary user interface 21 for an application program menu on the electronic device 100 .
  • the user interface 21 may include: a status bar 201, a tray 202 with frequently used application icons, and icons of other applications. Wherein:
  • the status bar 201 may include: one or more signal strength indicators 203 for mobile communication signals (also referred to as cellular signals), one or more signal strength indicators 207 for wireless high-fidelity (Wi-Fi) signals, a battery status indicator 209, and a time indicator 211.
  • the tray 202 with commonly used application program icons can display: a camera icon 241, an address book icon 243, a phone icon 245, and an information icon 247.
  • Other application program icons can be, for example: clock icon 210, calendar icon 211, gallery icon 213, memo icon 215, file management icon 217, email icon 219, music icon 221, wallet icon 223, Huawei Video icon 225, sports and health icon 227, weather icon 229, browser icon 231, smart life icon 233, setting icon 235, recorder icon 237, and application store icon 239.
  • the user interface 21 may also include a page indicator 249. Icons of other application programs may be distributed across multiple pages, and the page indicator 249 may be used to indicate which page of application programs the user is currently browsing. The user can slide left or right in the area of the other application icons to browse the application icons on other pages.
  • the user interface 21 exemplarily shown in FIG. 2A may be a main interface (Home screen).
  • the electronic device 100 may also include a home screen key.
  • the home screen key may be a physical key or a virtual key.
  • the home screen key can be used to receive an instruction from the user, and return the currently displayed UI to the home interface, so that the user can view the home screen at any time.
  • the above instruction may specifically be an operation instruction in which the user presses the home screen key once, an operation instruction in which the user presses the home screen key twice in a short period of time, or an operation instruction in which the user presses and holds the home screen key for a predetermined period of time.
  • the home screen key may also be integrated with a fingerprint reader, so that when the home screen key is pressed, fingerprint collection and identification are performed thereupon.
  • FIG. 2A only exemplarily shows the user interface of the electronic device 100, and should not constitute a limitation on the embodiments of this application.
  • FIG. 2B exemplarily shows the operation of the electronic device 100 for displaying the notification center.
  • the electronic device 100 may display a notification center 261 on the user interface 21 in response to the gesture.
  • the notification center 261 may display a delete control 271, a window 281 for managing notifications, and one or more notification messages from applications.
  • the electronic device 100 may jump to an application program corresponding to the notification message to display detailed information of the above notification message.
  • some of the notifications displayed in the notification center 261 are important information. With more and more application notifications, important notifications may be overwhelmed by non-important notifications, and it is becoming more and more difficult for users to quickly get important notifications from the notification center.
  • FIG. 2C exemplarily shows a schematic diagram of deleting a notification message of the electronic device 100 .
  • the user can choose to delete notification messages one by one, or quickly delete all notification messages.
  • the electronic device 100 may clear one or more notification messages from the application program displayed in the notification center 261 .
  • the electronic device 100 may display a deletion option of the notification center.
  • the electronic device 100 may delete the current notification message.
  • if the user chooses to delete the notification messages one by one, the deletion efficiency will be low.
  • if the user chooses to clear all notification messages through the delete control 271, it is easy to delete important notifications, resulting in missing important information.
  • the main problem to be solved in this application is to help the user quickly delete other notification messages while ensuring that the user does not miss important notifications, thereby ensuring the simplicity of the notification center page.
  • although the prior art can quickly delete notification messages, it is easy for users to ignore important notification messages. Over time, when users want to obtain important information, they will no longer go to the notification center to get it.
  • FIG. 3A exemplarily shows a first user interface displayed on an electronic device such as a smart phone, that is, a notification center 361 .
  • the notification center 361 is a window for displaying notification messages from one or more application programs on electronic devices such as smart phones and tablet computers, and can be obtained by sliding down the top of the display screen of the electronic device 100 .
  • various application programs can be installed in the electronic device 100 , and these application programs will push notification messages to the electronic device 100 .
  • the electronic device 100 receives the notification message pushed by the application program, it can load the notification message to the notification center 361 to display the notification message in the notification center 361 .
  • N notification messages may be displayed on the first user interface (that is, the notification center 361 ), where N is a positive integer greater than or equal to 1.
  • the notification message specifically includes N notification messages, and the above N notification messages may come from the same application program or different application programs.
  • the above N notification messages can be displayed in the notification center 361 from top to bottom, or from bottom to top, according to the time at which they were pushed to the electronic device 100 by the application programs.
  • for example, the message pushed to the notification center 361 by the 'Information' application is displayed at the top, and the message pushed to the notification center 361 by the 'XX group buying' application 40 minutes ago is displayed at the bottom.
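  • As a hedged illustration of this ordering, the following Java sketch (class and field names are assumptions) sorts the messages of a notification center so that the most recently pushed message is displayed at the top.

      import java.util.Comparator;
      import java.util.List;

      class PushedMessage {
          final String appName;
          final long pushedAtMillis; // time at which the application pushed the message
          PushedMessage(String appName, long pushedAtMillis) {
              this.appName = appName;
              this.pushedAtMillis = pushedAtMillis;
          }
      }

      class NotificationOrdering {
          /** Sorts in place so the newest message appears first when rendered top to bottom. */
          static void newestFirst(List<PushedMessage> messages) {
              messages.sort(Comparator.comparingLong((PushedMessage m) -> m.pushedAtMillis).reversed());
          }
      }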
  • the electronic device may update the notification message displayed in the notification center 361 .
  • the notification messages updated in response to the sliding operation may be messages further from the current time, or messages closer to the current time.
  • the window corresponding to the application program in the notification center 361 can be displayed in a folded manner.
  • the electronic device 100 may delete the N notification messages in batches according to the priorities corresponding to the N notification messages.
  • the notification center 361 includes a delete control 371 , and the delete control 371 is used to delete the N notification messages displayed by the notification center 361 .
  • the electronic device 100 detects the user's operation on the delete control 371 in the notification center 361 , and in response to the operation, the electronic device 100 can delete the N notification messages step by step according to the respective priorities corresponding to them.
  • the level-by-level deletion mentioned in the embodiment of the present application is not to clear the notification messages displayed in the notification center at once, but to delete the notification messages in batches according to the priority.
  • each notification message in the above N notification messages is marked with a priority. As shown in FIG. 3A, it is assumed that the priority of notification message 302, notification message 303 and notification message 306 is lower than the deletion priority of notification message 301, notification message 304 and notification message 305, so the electronic device 100 can first delete notification message 302, notification message 303 and notification message 306 displayed in the notification center 361; that is, notification message 302, notification message 303 and notification message 306 will first disappear from the notification center 361, while notification message 301, notification message 304 and notification message 305 will not disappear from the notification center 361 together with them.
  • the electronic device 100 may continue to respond to the above operation and delete notification message 301, notification message 304 and notification message 305; that is, notification message 301, notification message 304 and notification message 305 may disappear from the notification center 361 after a period of time.
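  • The batch behaviour described above can be sketched in plain Java as follows; this is only an illustrative reading of the embodiment (class names and the priority convention are assumptions): notifications are grouped by their marked priority, and only the lowest-priority group is removed in the first pass, while higher-priority notifications stay in the notification center.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.Map;
      import java.util.TreeMap;

      class Notif {
          final int id;
          final int priority; // assumed convention: higher value = more important, deleted later
          Notif(int id, int priority) { this.id = id; this.priority = priority; }
      }

      class BatchDeleter {
          /** Groups notifications by priority; a TreeMap iterates from the lowest key upward. */
          static TreeMap<Integer, List<Notif>> groupByPriority(List<Notif> shown) {
              TreeMap<Integer, List<Notif>> batches = new TreeMap<>();
              for (Notif n : shown) {
                  batches.computeIfAbsent(n.priority, k -> new ArrayList<>()).add(n);
              }
              return batches;
          }

          /** Removes only the lowest-priority batch and returns the ids that disappeared. */
          static List<Integer> deleteLowestBatch(List<Notif> shown) {
              TreeMap<Integer, List<Notif>> batches = groupByPriority(shown);
              List<Integer> deleted = new ArrayList<>();
              if (batches.isEmpty()) return deleted;
              Map.Entry<Integer, List<Notif>> lowest = batches.firstEntry();
              for (Notif n : lowest.getValue()) deleted.add(n.id);
              shown.removeIf(n -> n.priority == lowest.getKey());
              return deleted;
          }
      }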
  • FIG. 3B exemplarily shows a second user interface displayed on an electronic device such as a smart phone, that is, a notification center 362 .
  • the electronic device 100 detects a touch operation on the first user interface (ie, the area where the notification center 362 is located).
  • the electronic device 100 stops the deletion and displays the second user interface, that is, the notification center 362 .
  • M notification messages may be displayed on the second user interface, where M is a positive integer smaller than N.
  • the M notification messages may specifically include notification messages retained because the electronic device 100 stops the deletion.
  • after the electronic device starts deleting the N notification messages in batches according to the priorities corresponding to the N notification messages, it has already deleted L notification messages among the N notification messages before stopping the above deletion behavior.
  • the priority of each of the L deleted notification messages is lower than the priority of each of the M retained notification messages, and L is a positive integer smaller than N.
  • when the electronic device 100 responds to the deletion operation on the N notification messages displayed by the notification center 361, the electronic device 100 can first delete, according to the priorities corresponding to the N notification messages, notification message 302, notification message 303 and notification message 306 among the N notification messages displayed by the notification center 361.
  • in response to detecting a touch operation acting on the first user interface (that is, the area where the notification center 361 is located), the electronic device 100 may stop the deletion behavior for notification message 301, notification message 304 and notification message 305.
  • the electronic device 100 can then display the second user interface (that is, the notification center 362), and the notification messages that were not deleted when the user's touch operation occurred, that is, notification message 301, notification message 304 and notification message 305, can be displayed in the notification center 362.
  • in this way, notification message 301, notification message 304 and notification message 305 can be protected from being deleted, so as to prevent the user from missing messages with higher priority.
  • FIG. 3C exemplarily shows a third user interface, a fourth user interface, and a fifth user interface displayed by an electronic device such as a smart phone.
  • in response to the deletion operation on the above N notification messages, the electronic device 100 may sequentially delete the above N notification messages according to their corresponding priorities.
  • the notification center 361 includes a delete control 371 , and the delete control 371 is used to delete the N notification messages displayed by the notification center 361 . That is, the electronic device 100 may detect the user's operation on the delete control 371 in the notification center 361 , and delete the N notification messages displayed on the notification center 361 step by step and in batches in response to the above operation.
  • the electronic device 100 can first delete the notification messages with the first priority among the N notification messages according to the priority (for example, notification message 302, notification message 303 and notification message 306), and display the third user interface, that is, the notification center 363 shown in FIG. 3C. Therefore, the notification messages with the second priority and the notification messages with the third priority among the N notification messages are displayed on the third user interface (such as notification message 301, notification message 304 and notification message 305). The priority levels of the first priority, the second priority and the third priority increase sequentially.
  • a countdown control 391 is displayed in the third user interface (that is, the notification center 363), and a first time (for example, 30 seconds) is displayed in the countdown control 391.
  • the countdown control 391 is used to prompt the user that, after the above-mentioned first time elapses, the electronic device 100 will delete the notification message with the second priority (for example, notification message 305) among the N notification messages and display the fourth user interface, that is, the notification center 364 shown in FIG. 3C. Therefore, the notification messages with the third priority among the N notification messages (for example, notification message 301 and notification message 304) are displayed on the fourth user interface.
  • a countdown control 392 is displayed in the fourth user interface (that is, the notification center 364), and a second time (for example, 60 seconds) is displayed in the countdown control 392.
  • the countdown control 392 is used to prompt the user that, after the above-mentioned second time elapses, the electronic device 100 will delete the notification messages with the third priority (for example, notification message 301 and notification message 304) among the N notification messages and display the fifth user interface, that is, the notification center 365 shown in FIG. 3C.
  • the electronic device 100 deletes notification message 301, notification message 302, notification message 303, notification message 304, notification message 305 and notification message 306 displayed in the first user interface (that is, the notification center 361 shown in FIG. 3C) in order according to the deletion priority, so there is no notification message displayed in the fifth user interface.
  • FIG. 3C exemplarily shows three priority classifications (the first priority, the second priority and the third priority), but this embodiment of the present application is not limited to the above three classifications, and the priority levels may be classified in more detail, for example into 4 levels, 6 levels, and so on.
  • the notification messages displayed in the notification center are deleted in order according to the deletion priority, that is, the notification messages with lower priority are deleted first, and the notification messages with higher priority are deleted after a period of time. It can prevent users from missing high-priority notifications while keeping the notification center simple.
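  • A hedged sketch of the staged behaviour in FIG. 3C is given below in Java for Android (timings, view wiring, and class names are assumptions, not the claimed implementation): the next batch is scheduled after the countdown interval with Handler.postDelayed, and a touch on the notification-center view cancels the pending deletion so the remaining, higher-priority messages are kept.

      import android.os.Handler;
      import android.os.Looper;
      import android.view.View;

      class StagedDeletion {
          private final Handler handler = new Handler(Looper.getMainLooper());
          private final Runnable deleteNextBatch; // e.g. removes the next-lowest-priority batch

          StagedDeletion(Runnable deleteNextBatch) {
              this.deleteNextBatch = deleteNextBatch;
          }

          /** Schedules the next batch after delayMillis; a countdown control could display this delay. */
          void scheduleNextBatch(long delayMillis) {
              handler.postDelayed(deleteNextBatch, delayMillis);
          }

          /** Attached to the notification-center view so a user touch stops further deletion. */
          void cancelOnTouch(View notificationCenterView) {
              notificationCenterView.setOnClickListener(v -> handler.removeCallbacks(deleteNextBatch));
          }
      }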
  • the priority of each notification message is determined by the number of times the electronic device 100 opens the notification message in response to the first operation on the notification message. That is to say, the electronic device 100 responds to the first operations on the N notification messages respectively, thereby collecting the number of times each of the N notification messages is opened, and then marks a priority for each of the N notification messages according to the number of times it is opened.
  • the notification message is pushed to the notification center by one or more application programs installed in the electronic device 100 .
  • if the frequency with which the user opens the notification messages pushed by the first application program is high, it means that the electronic device has performed more operations in response to the notification messages pushed by the first application program, that is, the number of times the first application program is started, collected within a preset time, is greater than or equal to the first range value. This shows that the user pays more attention to the notification messages pushed by the first application program, and the notification messages pushed by the first application program may be important to the user.
  • therefore, the electronic device 100 may mark the notification messages pushed by the first application program as the third priority.
  • if the frequency with which the user opens the notification messages pushed by the second application program is lower, it means that the electronic device has performed fewer operations in response to the notification messages pushed by the second application program, that is, the number of times the second application program is started, collected within the preset time, is greater than or equal to the second range value and less than the first range value. This indicates that the user's attention to the notification messages pushed by the second application program is average, and the notification messages pushed by the second application program may be of average importance to the user.
  • the electronic device 100 may mark the notification message pushed by the second application as the second priority.
  • the electronic device 100 may mark the notification messages pushed by the third application program as the first priority.
  • the first range value is greater than the second range value.
  • the electronic device 100 may determine the corresponding priority by counting the number of times the user opens, from the notification center, the notification messages pushed by each application program.
  • the user can set the priority corresponding to the application program according to his actual needs.
  • the user can set the priority of the application when the application is installed on the electronic device 100; or, in the process of using the application, set the corresponding priority of the application through the 'Settings' option; or, set the corresponding priority of the application through the notification management of the notification center.
  • the notification message is pushed by the application program, so when the priority of the application program is determined, the priority of the notification message pushed by the application program is consistent with the priority of the application program.
  • when the priority corresponding to the application program is not set, the electronic device will use the priority corresponding to the application program matched according to the user's operation habits; when the user has set it, the priority corresponding to the application program set by the user is preferentially adopted. For example, assume that the user only sets the priorities corresponding to the first application program, the second application program, the third application program, the fourth application program and the fifth application program; the electronic device may still collect the number of times a received sixth application program is started and mark a priority for the sixth application program accordingly.
  • the first application program includes one or more applications for which the number of times the pushed notification messages are opened within the preset time is greater than the first range value; the second application program includes one or more applications for which the number of times the pushed notification messages are opened within the preset time is greater than or equal to the second range value and less than the first range value; and the third application program includes one or more applications for which the number of times the pushed notification messages are opened within the preset time is less than the second range value.
  • when a deletion operation is detected, the electronic device can first delete the notification messages pushed by the third application program according to their corresponding priorities, delete the notification messages pushed by the second application program after a period of time, and delete the notification messages pushed by the first application program after a further period of time.
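  • The threshold logic described above can be hedged into a small Java sketch (the range values and the numeric priority encoding are assumptions): an application whose pushed notifications are opened often is mapped to the third (highest) priority, one opened moderately often to the second priority, and one rarely opened to the first priority, while a priority the user has set explicitly is preferred over the habit-matched one.

      class PriorityClassifier {
          static final int FIRST_RANGE = 10;  // assumed number of opens within the preset time
          static final int SECOND_RANGE = 3;  // assumed, with FIRST_RANGE > SECOND_RANGE

          static final int FIRST_PRIORITY = 1;   // deleted first
          static final int SECOND_PRIORITY = 2;
          static final int THIRD_PRIORITY = 3;   // deleted last

          /** userSetPriority is null when the user has not configured the application. */
          static int priorityFor(Integer userSetPriority, int opensWithinPresetTime) {
              if (userSetPriority != null) return userSetPriority; // the user's own setting wins
              if (opensWithinPresetTime >= FIRST_RANGE) return THIRD_PRIORITY;
              if (opensWithinPresetTime >= SECOND_RANGE) return SECOND_PRIORITY;
              return FIRST_PRIORITY;
          }
      }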
  • FIG. 4A exemplarily shows a user interface 400 of an application program (such as "Gallery") displayed on an electronic device such as a smart phone.
  • "Gallery” is an application program for picture management on electronic devices such as smart phones and tablet computers, and it can also be called “album”.
  • the embodiment of the present application does not limit the name of the application program.
  • the APP can support users to perform various operations on pictures stored on electronic devices, such as browsing, editing, deleting, selecting and other operations. That is, the objects managed by "Gallery” are pictures. In other cases, the APP can also support the user to perform the above-mentioned various operations on the pictures stored on the cloud server. It can be understood that, in this embodiment, the picture may be captured by the electronic device 100 using the camera 193 , or may be obtained from other application programs or downloaded from a web page.
  • One or more pictures may be displayed in the user interface 400 .
  • if the electronic device detects an upward or downward sliding operation in the user interface 400, in response to the sliding operation, the electronic device can update the displayed pictures for the user to browse. That is to say, the user can slide up or down in the user interface 400 to browse more pictures. Not limited to the operation of sliding up or down, the user can also slide left or right in the user interface 400 to browse more pictures.
  • Picture 419 may be a thumbnail.
  • the original image corresponding to the image 419 can be stored in the electronic device, or can be stored in the cloud server.
  • the pictures referred to in the following embodiments may be stored in an electronic device or on a cloud server.
  • FIG. 4B exemplarily shows a schematic diagram of a user deleting a picture in the "Gallery".
  • the electronic device may detect, in the user interface 400, a selection operation in which the user selects N pictures, such as picture 401, picture 402, and so on.
  • the electronic device may display a first user interface 4011 .
  • the selected N pictures, a share control 431, a select-all control 432, an edit control 433, a delete control 434 and a more control 435 can be displayed in the first user interface 4011. It can be understood that the user can swipe up or down in the first user interface 4011 to browse more selected pictures.
  • the electronic device may delete the N pictures in batches according to the priorities corresponding to the N pictures.
  • the electronic device 100 may display the first user interface 4011 in the user interface 400 in response to the user's selection operation.
  • the selected N pictures are displayed in the first user interface 4011 .
  • the electronic device 100 detects the user's operation on the delete control 434 in the first user interface 4011 , and in response to the operation, the electronic device can delete the N pictures step by step according to the respective priorities corresponding to each of the N pictures.
  • the step-by-step deletion mentioned in the embodiment of the present application is not to clear the pictures displayed in the first user interface 4011 at once, but to delete the pictures in batches according to the priority. Wherein, each of the above N pictures is marked with a priority.
  • the priorities of picture 401, picture 402, picture 405, picture 410, picture 425 and picture 428 are lower than those of the other selected pictures displayed in the first user interface 4011, so the first user interface 4011 may display the operations performed by the electronic device 100 to delete picture 401, picture 402, picture 405, picture 410, picture 425 and picture 428 (for example, the edges of these pictures are gradually blurred, or these pictures gradually disappear, or other pictures move forward to fill the positions of picture 401, picture 402, picture 405, picture 410, picture 425 and picture 428).
  • that is, picture 401, picture 402, picture 405, picture 410, picture 425 and picture 428 will first disappear from the first user interface 4011, and the other photos will not disappear together with them.
  • the electronic device may continue to respond to the above operation and perform deletion operations on other photos.
  • FIG. 4C schematically shows a second user interface 4012 displayed on an electronic device such as a smart phone.
  • the electronic device 100 detects a touch operation on the first user interface 4011 during the process of deleting the pictures in batches according to the priorities corresponding to the selected N pictures displayed on the first user interface 4011 .
  • the electronic device 100 stops the deletion and displays the second user interface 4012 .
  • M pictures may be displayed on the second user interface 4012, where M is a positive integer smaller than N.
  • the M pictures may specifically include pictures that are retained because the electronic device 100 stops the deletion.
  • before the deletion stops, L pictures among the N pictures have been deleted.
  • the priority of each of the L deleted pictures is lower than the priority of each of the M retained pictures, and L is a positive integer smaller than N.
  • when the electronic device 100 responds to the deletion operation on the selected N pictures displayed on the first user interface 4011, the electronic device can delete them in batches according to the priorities corresponding to the selected N pictures displayed on the first user interface 4011; for example, the electronic device first deletes picture 401, picture 402, picture 405, picture 410, picture 425 and picture 428.
  • in response to a touch operation, the electronic device may stop deleting the other pictures and display the second user interface 4012, in which the pictures that were not deleted when the user's touch operation occurred can be displayed. It can be understood that the above-mentioned undeleted pictures have a higher priority than the deleted pictures, so that the high-priority pictures can be protected from being accidentally deleted.
  • FIG. 4D exemplarily shows a third user interface, a fourth user interface, and a fifth user interface displayed by an electronic device such as a smart phone.
  • the electronic device 100 may delete the above N pictures sequentially according to the corresponding priorities.
  • the first user interface 4011 includes a delete control 434, and the delete control 434 is used to perform a deletion operation on the selected N pictures displayed on the first user interface 4011. That is, the electronic device may detect the user's operation on the delete control 434 in the first user interface 4011, and delete the N pictures displayed on the first user interface 4011 step by step and in batches in response to the above operation.
  • the electronic device 100 may first delete the first-priority picture among the N pictures according to the priority (for example, the picture 401 , the picture 402 , the picture 405 , the picture 410 , the picture 425 , and the picture 428 ), and display the third user interface 4013 . Therefore, the picture with the second priority and the picture with the third priority among the N pictures are displayed on the third user interface 4013 .
  • the priority levels of the first priority, the second priority and the third priority increase sequentially.
  • a countdown control 491 is displayed in the third user interface 4013, and a first time (for example, 20 seconds) is displayed in the countdown control 491.
  • the countdown control 491 is used to remind the user that, after the above-mentioned first time elapses, the electronic device 100 will delete the pictures with the second priority among the N pictures (such as picture 403, picture 406, picture 407, picture 411, picture 412, picture 413, picture 418, picture 420, picture 422, picture 423 and picture 424) and display the fourth user interface 4014.
  • the fourth user interface 4014 displays the pictures with the third priority among the N pictures (such as picture 404, picture 408, picture 409, picture 414, picture 415, picture 416, picture 417, picture 419, picture 421, picture 426 and picture 427).
  • a countdown control 492 is displayed in the fourth user interface 4014, and a second time (for example, 40 seconds) is displayed in the countdown control 492.
  • the countdown control 492 is used to remind the user that, after the above-mentioned second time elapses, the electronic device 100 will delete the pictures with the third priority among the N pictures displayed in the fourth user interface 4014 (such as picture 404, picture 408, picture 409, picture 414, picture 415, picture 416, picture 417, picture 419, picture 421, picture 426 and picture 427) and display the fifth user interface 4015. No selected pictures are displayed in the fifth user interface 4015.
  • the electronic device 100 deletes the selected pictures displayed on the first user interface 4011 sequentially according to the deletion priority, so no selected pictures are displayed on the fifth user interface, and unselected pictures are displayed.
  • FIG. 4D exemplarily shows three priority classifications (the first priority, the second priority and the third priority), but this embodiment of the present application is not limited to the above three classifications, and the priority levels may be classified in more detail, for example into 4 levels, 6 levels, and so on.
  • the selected pictures in the gallery are deleted in order according to the deletion priority, that is, the pictures with lower priority are deleted first, and the pictures with higher priority are deleted after a period of time. This keeps the gallery concise and prevents users from missing important pictures.
  • the priority of each picture is determined by the number of times the electronic device opens the picture in response to the first operation on the picture in the gallery. That is to say, the electronic device 100 responds to the first operations on the N pictures respectively, thereby collecting the number of times each of the N pictures is opened, and then marks a priority for each of the N pictures according to the number of times it is opened.
  • the picture may be captured by the electronic device 100 using the camera 193, or may be obtained from other application programs or downloaded from a web page.
  • the user uses the camera 193 to continuously capture photos during travel, and the continuously captured photos may be repeated photos, which occupy a large amount of device memory.
  • the photos obtained from other applications or downloaded from the web may also be duplicate photos. Duplicate photos, or photos with high similarity, are not very important to the user, and it is enough to keep one of them.
  • the electronic device may mark the pictures of the third type as the third priority.
  • the electronic device may mark the pictures of the second type as the second priority.
  • the electronic device may mark repeated pictures or pictures of the first type as the first priority.
  • the fourth range value is greater than the third range value.
  • the electronic device 100 can determine the priority corresponding to a picture by counting the number of times the user opens the picture in the gallery.
  • the user can set the priority corresponding to the pictures in the gallery according to his actual needs.
  • for example, the priority may be set according to the time when the pictures were acquired: pictures belonging to a first time period are marked as the first priority, and pictures belonging to a second time period are marked as the second priority.
  • the priority can also be set according to the type of picture, for example, food pictures are marked as the first priority, people pictures are marked as the second priority, and landscape pictures are marked as the third priority.
  • when the priority corresponding to a picture in the gallery is not set, the electronic device will adopt the priority of the picture matched according to the user's operation habits; otherwise, the priority of the picture set by the user is preferentially adopted.
  • the first type of picture includes repeated pictures or one or more pictures whose number of times of being opened within the preset time is less than the third range value; the second type of picture includes one or more pictures whose number of times of being opened within the preset time is greater than or equal to the third range value and less than the fourth range value; and the third type of picture includes one or more pictures whose number of times of being opened within the preset time is greater than or equal to the fourth range value. Therefore, when the above-mentioned pictures of the first type, pictures of the second type and pictures of the third type are selected and the electronic device detects a deletion operation on them, the electronic device can first delete the pictures of the first type according to their corresponding priorities.
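  • A hedged Java sketch of this picture marking follows (the duplicate check, the threshold values, and the ordering of the range values are illustrative assumptions only): duplicates and rarely opened pictures receive the first priority and are deleted first, while frequently opened pictures receive the third priority and are kept longest.

      import java.util.Arrays;
      import java.util.List;

      class Picture {
          final long id;
          final int openCount;          // times the picture was opened within the preset time
          final byte[] contentHash;     // e.g. a perceptual hash computed elsewhere (assumed)
          Picture(long id, int openCount, byte[] contentHash) {
              this.id = id;
              this.openCount = openCount;
              this.contentHash = contentHash;
          }
      }

      class PicturePriority {
          static final int THIRD_RANGE = 2;   // assumed thresholds, with FOURTH_RANGE > THIRD_RANGE
          static final int FOURTH_RANGE = 8;

          static int priorityFor(Picture p, List<Picture> gallery) {
              if (isDuplicate(p, gallery) || p.openCount < THIRD_RANGE) return 1; // first priority
              if (p.openCount < FOURTH_RANGE) return 2;                           // second priority
              return 3;                                                           // third priority
          }

          private static boolean isDuplicate(Picture p, List<Picture> gallery) {
              for (Picture other : gallery) {
                  if (other.id != p.id && Arrays.equals(other.contentHash, p.contentHash)) return true;
              }
              return false;
          }
      }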
  • FIG. 5 is a schematic flowchart of deleting an object according to an embodiment of the present application.
  • the method can be applied to the electronic device 100 shown in FIG. 1A .
  • This method can be applied to the electronic device 100 shown in FIG. 1B .
  • the method may include, but is not limited to, the following steps:
  • Step S501: display a first user interface.
  • there are N objects displayed in the first user interface, and the objects may include one or more of the following: notification messages pushed by applications, pictures in the 'Gallery', songs in music software, shopping records in shopping software, files in folders, notes in a memo, and so on.
  • N is a positive integer greater than or equal to 1.
  • when the object is a notification message pushed by an application program, the notification message will be displayed in the notification center of the electronic device, so the first user interface is the window where the notification center is located.
  • the first user interface is an interface provided by the "Gallery" for displaying pictures.
  • the first user interface is an interface provided by the music software for displaying songs; the songs include but are not limited to: songs downloaded by the user, songs the user has recently listened to, songs collected by the user, and so on.
  • the first user interface is the interface provided by the shopping software for displaying shopping records; the shopping records include but are not limited to: shopping records browsed by the user, shopping records added by the user, the user's favorite shopping records, and so on.
  • the first user interface is an interface provided by the folder for displaying files.
  • the first user interface is an interface provided by the memo for displaying notes.
  • each of the above N objects is marked with a priority.
  • the priority can be marked by the user. If the user does not mark the priority, the electronic device can match the corresponding priority according to the usage habit of the user.
  • the electronic device may respond to the first operations on the N objects, thereby collecting the number of times each of the N objects is opened, and then mark a priority for each of the N objects according to the number of times it is opened. Taking the priority levels as an example, it is assumed that they are classified into the first priority, the second priority and the third priority, where the priority levels of the first priority, the second priority and the third priority increase sequentially.
  • the electronic device may mark the first object as a third priority.
  • the electronic device may mark the second object as a second priority.
  • the electronic device may mark the third object as the first priority.
  • the first range value is greater than the second range value.
  • the first object includes one or more objects whose number of times of being opened within the preset time is greater than the first range value; the second object includes one or more objects whose number of times of being opened within the preset time is greater than or equal to the second range value and less than the first range value; and the third object includes one or more objects whose number of times of being opened within the preset time is less than the second range value.
  • an object being opened can be understood as the behavior of clicking the object to perform operations such as viewing and editing on it. For example, clicking a pushed notification message to view its detailed content; clicking a thumbnail of a picture to view the picture; clicking a song to play the song; clicking a shopping record to view the details of the product.
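  • As a hedged illustration of how such open counts could be collected (the preference file name and keys are assumptions, not part of the application), the following Java sketch increments a per-object counter every time the object is clicked, using Android's SharedPreferences; the counts can later be fed into the priority marking described above.

      import android.content.Context;
      import android.content.SharedPreferences;

      class OpenCounter {
          private static final String PREFS = "open_counts"; // hypothetical preference file

          /** Call whenever the user clicks an object to view, play, or edit it. */
          static void recordOpen(Context context, String objectKey) {
              SharedPreferences prefs = context.getSharedPreferences(PREFS, Context.MODE_PRIVATE);
              prefs.edit().putInt(objectKey, prefs.getInt(objectKey, 0) + 1).apply();
          }

          /** Returns how many times the object has been opened so far. */
          static int openCount(Context context, String objectKey) {
              return context.getSharedPreferences(PREFS, Context.MODE_PRIVATE).getInt(objectKey, 0);
          }
      }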
  • Step S502: in response to a deletion operation on the N objects, delete the N objects in batches according to the priorities corresponding to the N objects.
  • in response to the deletion operation on the N objects, the electronic device first deletes the objects with lower priority among the N objects according to the priorities corresponding to the N objects, and then deletes the objects with higher priority among the N objects, until the above N objects are cleared.
  • when the electronic device deletes the N objects in batches according to the corresponding priorities, the user may click on the display screen of the electronic device, and the electronic device may respond to the click to stop the deletion. That is, deletion of the batch of objects that has not yet been deleted (for example, the objects with higher priority) is stopped, and the second user interface is displayed. There are M objects displayed in the second user interface, where M is a positive integer smaller than N. It can be understood that, after the electronic device starts deleting the N objects in batches according to the priorities corresponding to the N objects and before stopping the deletion behavior, L objects among the N objects have been deleted, leaving M objects not deleted. The priority of each of the L objects is lower than the priority of each of the M objects, and L is a positive integer smaller than N. Therefore, even during the deletion process, objects with higher priority can be protected from deletion through a user operation.
  • in response to the deletion operation on the N objects, the electronic device first deletes the objects with the first priority among the N objects according to the priorities corresponding to the N objects, and displays the third user interface, where the objects with the second priority and the objects with the third priority are displayed in the third user interface.
  • the electronic device deletes the second-priority object among the N objects after a first time interval, and displays a fourth user interface. Wherein, objects of the third priority are displayed on the fourth user interface.
  • the electronic device deletes the third-priority object among the N objects after a second time interval, and displays a fifth user interface. Wherein, there are no objects displayed in the fifth user interface.
  • if the electronic device responds to a touch operation on the first user interface during the process of deleting the objects with the first priority among the N objects, it will stop deleting the objects with the second priority and the objects with the third priority, and display a user interface containing the objects with the second priority and the objects with the third priority.
  • if the electronic device responds to a touch operation on the second user interface during the process of deleting the objects with the second priority among the N objects after the first time interval, it will stop deleting the objects with the third priority and display a user interface containing the objects with the third priority.
  • words such as “first” and “second” are only used for the purpose of distinguishing descriptions, and cannot be understood as indicating or implying relative importance, nor can they be understood as indicating or implying order .
  • Features defined as “first” and “second” may explicitly or implicitly include one or more of these features.
  • words such as “exemplary” or “for example” are used as examples, illustrations or descriptions. Any embodiment or design scheme described as “exemplary” or “for example” in the embodiments of the present application shall not be interpreted as being more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as “exemplary” or “such as” is intended to present related concepts in a concrete manner.
  • the processes can be completed by a computer program instructing the relevant hardware.
  • the computer programs can be stored in computer-readable storage media.
  • when the computer program is executed, it may include the processes of the foregoing method embodiments.
  • the aforementioned storage medium includes various media capable of storing computer program code, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present application relate to an object deletion method and an electronic device. The method may comprise: displaying a first user interface, where N objects are displayed in the first user interface, N is a positive integer greater than or equal to 1, and each of the N objects is marked with a priority; and, in response to a deletion operation for the N objects, deleting the N objects in batches according to the priorities corresponding to the N objects. By means of the embodiments of the present application, during the deletion of a plurality of objects, the plurality of objects can be deleted in batches rather than all at once, so that the user is given an opportunity to stop the deletion, thereby improving the user's experience.
PCT/CN2023/076461 2022-02-25 2023-02-16 Procédé de suppression d'objet et dispositif électronique WO2023160455A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210180310.0A CN116700568A (zh) 2022-02-25 2022-02-25 删除对象的方法及电子设备
CN202210180310.0 2022-02-25

Publications (1)

Publication Number Publication Date
WO2023160455A1 true WO2023160455A1 (fr) 2023-08-31

Family

ID=87764676

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/076461 WO2023160455A1 (fr) 2022-02-25 2023-02-16 Procédé de suppression d'objet et dispositif électronique

Country Status (2)

Country Link
CN (1) CN116700568A (fr)
WO (1) WO2023160455A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104793938A (zh) * 2015-04-23 2015-07-22 广州视源电子科技股份有限公司 通知栏消息显示方法和系统
CN105094538A (zh) * 2015-06-30 2015-11-25 联想(北京)有限公司 一种信息处理方法及电子设备
US20190035067A1 (en) * 2017-07-26 2019-01-31 Canon Kabushiki Kaisha Image data management method, production apparatus, production system, image data management method for production system, and non-transitory computer-readable recording medium
CN112260936A (zh) * 2020-10-22 2021-01-22 Oppo广东移动通信有限公司 通知消息的管理方法、装置、终端及存储介质

Also Published As

Publication number Publication date
CN116700568A (zh) 2023-09-05

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23759080

Country of ref document: EP

Kind code of ref document: A1