CN114153356A - Message processing method and electronic equipment - Google Patents

Message processing method and electronic equipment

Info

Publication number
CN114153356A
CN114153356A (application CN202111331750.3A)
Authority
CN
China
Prior art keywords
application
icon
user
message
user interface
Prior art date
Legal status
Pending
Application number
CN202111331750.3A
Other languages
Chinese (zh)
Inventor
Gao Xiong (高雄)
Current Assignee
Huawei Device Co Ltd
Original Assignee
Huawei Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Device Co Ltd
Priority to CN202111331750.3A
Publication of CN114153356A
Legal status: Pending

Classifications

    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04817: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, using icons
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. a touch-screen or digitiser, for inputting data by handwriting, e.g. gesture or text
    • H04M1/72406: User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by software upgrading or downloading
    • H04M1/72469: User interfaces specially adapted for cordless or mobile telephones for operating the device by selecting functions from two or more displayed items, e.g. menus or icons
    • H04M1/72484: User interfaces specially adapted for cordless or mobile telephones wherein functions are triggered by incoming communication events

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a message processing method and an electronic device, which can improve the processing efficiency of unread messages and quickly eliminate marks on application icons. The method includes the following steps: the electronic device displays a desktop on which a plurality of application icons are arranged; some of the application icons carry marks indicating that the corresponding applications have unread messages, while others do not, and the applications whose icons carry marks can be divided into a first application type and a second application type; after a first user operation is received, the displayed user interface contains only the marked application icons and a first control; when a user operation acting on the first control is detected, the marks on the desktop icons of applications belonging to the first application type can be dismissed.

Description

Message processing method and electronic equipment
This application is a divisional application of Chinese patent application No. 201811584684.9, filed on December 24, 2018 and entitled "Message processing method and electronic equipment".
Technical Field
The present application relates to the field of electronic devices, and in particular, to a message processing method and an electronic device.
Background
Currently, a plurality of applications (APPs) are usually installed on a mobile phone, and these APPs bring convenience to many aspects of a user's life. When an APP receives a new message, a red dot is generally displayed on the APP icon to remind the user of the new message and prevent the user from missing important information. However, as more and more APPs are installed on the mobile phone and the number of new messages they receive keeps growing, the red dots on the APP icons become a nuisance to users.
Disclosure of Invention
The embodiments of the application provide a message processing method and an electronic device, which can quickly eliminate red dots on APP icons, reduce the nuisance the red dots cause to the user, and improve message processing efficiency.
In a first aspect, an embodiment of the present application provides a message processing method, including: the electronic device displays a first user interface that comprises an icon of a first application, an icon of a second application, an icon of a third application, an icon of a fourth application and an icon of a fifth application; the icons of the first, second, third and fourth applications each carry a mark of an unread message, while the icon of the fifth application does not; the first application and the second application belong to a first application type, and the third application and the fourth application belong to a second application type; when the electronic device detects a first user operation, it displays a second user interface comprising the icon of the first application, the icon of the second application, the icon of the third application, the icon of the fourth application and a first control; when the electronic device detects a user operation acting on the first control, it displays the first user interface again, in which the icons of the first application and the second application no longer carry the mark of an unread message, while the icons of the third application and the fourth application still do.
In the embodiment of the application, the marked application icons can be displayed together in the second user interface, and different elimination strategies are applied to the marks on icons of different application types, so that the user can eliminate the marks on some of the application icons as needed, which improves the efficiency of handling the marks. The user does not need to open every application whose icon carries a mark, which reduces user operations and improves operation efficiency. A minimal data-model sketch of this behaviour is given below.
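As a concrete illustration of this first aspect, the following is a minimal, hedged Kotlin sketch of how marked applications might be tracked by application type and how the marks of the first application type could be cleared in one batch when the first control is operated. The names MarkManager, AppType and MarkedApp are invented for illustration and are not the patent's implementation.

```kotlin
enum class AppType { FIRST, SECOND }

data class MarkedApp(val packageName: String, var type: AppType, var unreadCount: Int = 0)

class MarkManager {
    private val markedApps = mutableMapOf<String, MarkedApp>()

    /** Called when an application reports an unread message, so its icon gets a mark. */
    fun onUnreadMessage(packageName: String, type: AppType) {
        val app = markedApps.getOrPut(packageName) { MarkedApp(packageName, type) }
        app.unreadCount++
    }

    /** Allows the second user operation to move an application to another type. */
    fun setType(packageName: String, type: AppType) {
        markedApps[packageName]?.type = type
    }

    /** Invoked when the user operates the first control in the second user interface. */
    fun clearFirstTypeMarks(): List<String> {
        val cleared = markedApps.values.filter { it.type == AppType.FIRST }.map { it.packageName }
        cleared.forEach { markedApps.remove(it) }
        return cleared // packages whose desktop icons should lose the unread mark
    }
}
```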
In one possible implementation, the first user interface may be a desktop, the first user operation may be a left-swipe gesture in the desktop, and the second user interface may be a negative one-screen.
In another possible implementation manner, the first user interface may be a desktop, and the first user operation may be clicking a third control.
Optionally, the third control may be an icon in the first user interface.
Optionally, the third control may be a floating button in any user interface.
Optionally, the third control may be a control in a fourth user interface; the fourth user interface, which may also be referred to as a switch page, may be the user interface displayed by the electronic device upon detecting a pull-down gesture in the first user interface.
In a possible implementation manner, the first user operation may be a preset gesture.
Optionally, the preset gesture may be a knuckle drawing a preset shape, for example a Z shape. The preset gesture can be input in any user interface.
Optionally, the preset gesture may be a hover gesture.
Alternatively, the first user operation may be a voice command, an eye movement, shaking the electronic device, pressing a key, or the like.
In one possible implementation, the above-mentioned mark with unread messages may be a red dot.
Further, a number may be included in the red dot to indicate the number of unread messages for the application.
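As a hedged sketch of how such an unread count could be surfaced next to an application icon on stock Android, the following uses the public notification-badge APIs (NotificationChannel.setShowBadge and NotificationCompat.setNumber). The channel id, notification id and text are illustrative; how the badge is rendered ultimately depends on the launcher, and the red-dot mechanism of this application need not use these APIs.

```kotlin
import android.app.NotificationChannel
import android.app.NotificationManager
import android.content.Context
import androidx.core.app.NotificationCompat

fun showUnreadBadge(context: Context, unreadCount: Int) {
    val channelId = "unread_messages" // hypothetical channel id
    val manager = context.getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
    val channel = NotificationChannel(
        channelId, "Unread messages", NotificationManager.IMPORTANCE_DEFAULT
    ).apply { setShowBadge(true) } // allow the launcher to draw a dot/number on the icon
    manager.createNotificationChannel(channel)

    val notification = NotificationCompat.Builder(context, channelId)
        .setSmallIcon(android.R.drawable.ic_dialog_info)
        .setContentTitle("You have $unreadCount unread messages")
        .setNumber(unreadCount) // the count some launchers show inside the badge
        .setBadgeIconType(NotificationCompat.BADGE_ICON_SMALL)
        .build()
    manager.notify(1, notification)
}
```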
In a possible implementation manner, the second user interface includes a first area and a second area; the icon of the first application and the icon of the second application are displayed in the first area, and the icon of the third application and the icon of the fourth application are displayed in the second area.
In the embodiment of the application, all marked application icons are displayed together and grouped by application type, and the marks on application icons belonging to the same application type can be retained or eliminated as a group. In addition, the user can view the unread messages of applications of the same type in one place.
In a possible implementation manner, after the second user interface is displayed, the method further includes: when the electronic device detects a second user operation acting on the icon of the third application in the second user interface, changing the application type of the third application from the second application type to the first application type. Different application types can be regarded as different mark elimination policies: for example, the mark on an application icon may be eliminated after the user clicks the first control, or it may be retained after the user clicks the first control.
In one possible implementation manner, the second user operation is an operation of displaying an icon of the third application in the first area.
Specifically, the operation of displaying the icon of the third application in the first area may be dragging the icon of the third application from the second area to the first area.
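The drag-to-reclassify operation described above could be wired up with Android's standard drag-and-drop callbacks, roughly as sketched below; the assumption that the dragged icon carries its package name as plain-text ClipData, and the callback name, are purely illustrative.

```kotlin
import android.view.DragEvent
import android.view.View

fun bindFirstAreaDropTarget(firstArea: View, onReclassify: (packageName: String) -> Unit) {
    firstArea.setOnDragListener { _, event ->
        if (event.action == DragEvent.ACTION_DROP) {
            // The dragged icon is assumed to carry its package name as plain-text ClipData.
            val packageName = event.clipData.getItemAt(0).text.toString()
            onReclassify(packageName) // e.g. move the application from the second type to the first type
        }
        true // keep receiving the remaining drag events (STARTED, ENTERED, ENDED, ...)
    }
}
```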
In a possible implementation manner, after the electronic device displays the second user interface, before the electronic device detects a user operation acting on the first control, the method further includes: and when the electronic equipment detects the user operation acting on the icon of the first application in the second user interface, displaying a third user interface. The user operation may be a click operation.
In a possible implementation manner, the third user interface is an interface of the first application.
In the embodiment of the application, the user can click the application icon in the second user interface to start the application, so that the electronic device displays the user interface of the application. The user may view the unread messages of the application in the user interface of the application, thereby eliminating the mark on the application icon in the first user interface.
In a possible implementation manner, the third user interface includes a third area, and the third area includes one or more unread messages of the first application.
In the embodiment of the application, the electronic device can display all unread messages of the first application together in the third user interface, so that the user can conveniently view them all, which reduces user operations and improves operation efficiency; a minimal data model for this is sketched below.
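A minimal, purely illustrative data model for the third and fourth areas might look like the following; the names UnreadMessage and ThirdAreaModel are assumptions, and the newest-first sort order is a display choice not taken from this application.

```kotlin
data class UnreadMessage(
    val packageName: String,
    val sender: String,
    val preview: String,
    val timestamp: Long
)

class ThirdAreaModel(private val allUnread: MutableList<UnreadMessage>) {
    /** Messages to render in the third area for the application selected in the fourth area. */
    fun messagesFor(packageName: String): List<UnreadMessage> =
        allUnread.filter { it.packageName == packageName }
            .sortedByDescending { it.timestamp }

    /** Cancel displaying a single unread message after the user acts on it. */
    fun dismiss(message: UnreadMessage) {
        allUnread.remove(message)
    }

    /** The second control: cancel displaying all unread messages of one application. */
    fun dismissAll(packageName: String) {
        allUnread.removeAll { it.packageName == packageName }
    }
}
```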
In a possible implementation manner, when the electronic device detects a user operation applied to the third area, the interface of the first application is displayed. The user operation may be a click operation on the unread message in the third area.
In the embodiment of the application, the user can click an unread message in the third area to make the electronic device display the interface of the first application, so as to view more messages related to that unread message. For example, an unread message sent by a WeChat contact can be displayed in the third area, and the user can click the unread message to make the electronic device display the chat interface with that contact in WeChat, so as to view the chat history before the unread message.
In a possible implementation manner, the third user interface further includes a fourth area, and the icon of the first application and the icon of the second application are displayed in the fourth area; the method further comprises the following steps: and when the electronic equipment detects a user operation acting on the icon of the second application in the fourth area, displaying one or more unread messages of the second application in the third area. The user operation may be a click operation.
Further, the electronic device may also detect a slide operation acting in the third area and switch the content displayed in the third area accordingly.
In a possible implementation manner, when the electronic device detects a user operation acting on any one of the one or more unread messages, the electronic device cancels the display of that unread message in the third area.
In the embodiment of the application, the electronic device can stop displaying an unread message in the third area in response to a user operation, so that only the unread messages the user chooses to keep remain in the third area, which reduces the interference that some unread messages cause when the user views others.
Furthermore, the electronic device can record the user's usage habits over a period of time, for example a record of which unread messages the user deletes, and selectively display unread messages in the third area according to these habits, so that the displayed messages match the user's expectations. This improves the user experience, reduces the operations needed to check unread messages, and improves operation efficiency. A hedged sketch of such habit-based filtering follows.
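As one possible way such usage habits could be recorded, the sketch below counts recent dismissals per sender and hides senders the user almost always dismisses; the one-week window and the threshold of 5 are invented illustration values, not parameters of this application.

```kotlin
class DismissalHabits(private val windowMillis: Long = 7L * 24 * 60 * 60 * 1000) {
    private val dismissals = mutableListOf<Pair<String, Long>>() // (sender, timestamp)

    /** Record that the user deleted/dismissed an unread message from this sender. */
    fun recordDismissal(sender: String, now: Long = System.currentTimeMillis()) {
        dismissals += sender to now
        dismissals.removeAll { now - it.second > windowMillis } // keep only the recent window
    }

    /** Senders dismissed at least `threshold` times recently are filtered out of the third area. */
    fun shouldShow(sender: String, threshold: Int = 5): Boolean =
        dismissals.count { it.first == sender } < threshold
}
```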
In a possible implementation manner, the third user interface further includes a second control; after displaying the one or more unread messages of the second application in the third area, the method further includes: when the electronic device detects a user operation acting on the second control, canceling the display of the one or more unread messages of the second application in the third area. The user operation may be a click operation.
In a possible implementation manner, after canceling the display of the one or more unread messages of the second application in the third area, the method further includes: displaying the first user interface; wherein the icon of the second application does not have the mark with the unread message.
In the embodiment of the application, the user can quickly eliminate the mark on the second application icon by clicking the second control, so that the operation is simple and convenient, and the efficiency is high.
In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors, memory; the memory is coupled to the one or more processors and is configured to store computer program code, which includes computer instructions that, when executed by the one or more processors, cause the electronic device to perform the message processing method as provided in the first aspect or any one of the implementations of the first aspect.
In a third aspect, an embodiment of the present application provides a computer storage medium, which includes computer instructions, and when the computer instructions are executed on an electronic device, the electronic device is caused to execute the message processing method as provided in the first aspect or any one of the implementation manners of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer program product, which when run on a computer, causes the computer to execute the message processing method as provided in the first aspect or any one of the implementation manners of the first aspect.
It is to be understood that the electronic device of the second aspect, the computer storage medium of the third aspect, and the computer program product of the fourth aspect are all provided to execute the message processing method of the first aspect. Therefore, for the beneficial effects they achieve, reference may be made to the beneficial effects of the corresponding method, which are not repeated here.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a usage scenario provided by an embodiment of the present application;
figs. 4 to 18 are schematic diagrams of some user interfaces provided by embodiments of the present application;
fig. 19 is a schematic diagram illustrating a process in which the electronic device displays a red dot on an icon of a short message APP according to an embodiment of the present application;
fig. 20 is a schematic process diagram of displaying a red dot on an icon of a short message APP by an electronic device according to another embodiment of the present application;
fig. 21 is a schematic process diagram of an electronic device displaying a user interface 40 according to an embodiment of the present application;
fig. 22 is a schematic process diagram of centralized displaying of messages by an electronic device according to an embodiment of the present application;
fig. 23 is a schematic diagram of a process of eliminating a red dot on an APP icon by the electronic device according to the embodiment of the present application;
fig. 24 is a schematic flowchart of a message processing method according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described in detail and clearly with reference to the accompanying drawings.
In the embodiments of the application, a new message received by an APP can be classified as a first-type message or a second-type message. After the APP receives a first-type message, the electronic device may display a mark, such as a red dot, on the icon of the APP; after the user has viewed the first-type message, the electronic device may stop displaying the mark on the icon of the APP. After the APP receives a second-type message, the electronic device does not display a mark on the icon of the APP. The first-type message may be a message that requires user processing, and the second-type message may be a message that does not. The following examples are described by taking the display of a red dot on an APP icon as an example.
For example, when a smart-home APP receives a message about a device update, for example that an upgradable firmware version exists and the user needs to upgrade it, a red dot is displayed on the icon of the smart-home APP; such device-update messages belong to the first type. When a map APP receives a road-condition message, no user processing is required, so no red dot needs to be displayed on the icon of the map APP; road-condition messages belong to the second type. The messages mentioned in the subsequent embodiments of the present application belong to the first type of messages. A hedged classification sketch follows.
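A toy classification of incoming messages into the two types might look like this; the category strings are invented, and a real device could use any other signal (message flags, server metadata, etc.) to decide whether user processing is required.

```kotlin
enum class MessageType {
    FIRST,  // needs user action, a red dot is shown
    SECOND  // informational, no red dot
}

// Assumed categories of messages that require user processing (illustration only).
private val actionableCategories = setOf("firmware_update", "payment_due", "chat")

fun classify(category: String): MessageType =
    if (category in actionableCategories) MessageType.FIRST else MessageType.SECOND

// Example: a smart-home "firmware_update" message yields FIRST (dot shown),
// while a map "traffic_report" message yields SECOND (no dot).
```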
In one possible implementation, a number may also be included within the red dot on the APP icon to indicate the number of unviewed messages. A red dot that contains a number may also be referred to as a quantity red dot. In the following embodiments, red dots without a number and red dots with a number are collectively referred to as red dots, and red dots with a number are used as the example for explanation.
The message processing method provided by the embodiments of the application enables the electronic device to display, in one place, the messages that APPs have received but the user has not yet viewed, and to quickly eliminate the red dots on the APP icons, thereby improving message processing efficiency.
The electronic device related to the embodiment of the present application may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a notebook computer, an ultra-mobile personal computer (UMPC), a handheld computer, a netbook, a Personal Digital Assistant (PDA), a wearable electronic device, a virtual reality device, or the like.
Fig. 1 shows a schematic structural diagram of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1. In this embodiment of the application, the display screen 194 may be configured to display red dots or quantity red dots on APP icons to prompt the user that new messages are to be processed.
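As an illustration of drawing such a red dot (or quantity red dot) onto an icon bitmap with the standard Canvas API, a hedged sketch follows; the dot radius, position and colors are arbitrary illustration values rather than the rendering parameters of the display described here.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

fun addRedDot(icon: Bitmap, count: Int? = null): Bitmap {
    val out = icon.copy(Bitmap.Config.ARGB_8888, true) // mutable copy so the original icon is untouched
    val canvas = Canvas(out)
    val radius = out.width * 0.16f
    val cx = out.width - radius // top-right corner of the icon
    val cy = radius
    val dotPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply { color = Color.RED }
    canvas.drawCircle(cx, cy, radius, dotPaint)
    if (count != null) { // "quantity red dot": draw the unread count inside the dot
        val textPaint = Paint(Paint.ANTI_ALIAS_FLAG).apply {
            color = Color.WHITE
            textSize = radius * 1.2f
            textAlign = Paint.Align.CENTER
        }
        val baseline = cy - (textPaint.descent() + textPaint.ascent()) / 2
        canvas.drawText(count.toString(), cx, baseline, textPaint)
    }
    return out
}
```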
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications and data processing of the electronic device 100 by running the instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, or a universal flash storage (UFS). In this embodiment of the application, the internal memory 121 may be configured to store the message data of each APP, and may also be configured to store a red dot removal policy corresponding to each APP; a sketch of such per-APP policy storage is given after this paragraph.
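One plausible, purely illustrative way to persist such a per-APP red dot removal policy is a small SharedPreferences wrapper like the following; the preference file name, the policy values and the default are assumptions, not the storage format of this application.

```kotlin
import android.content.Context

enum class RedDotPolicy { CLEAR_WITH_FIRST_CONTROL, KEEP_UNTIL_VIEWED }

fun savePolicy(context: Context, packageName: String, policy: RedDotPolicy) {
    context.getSharedPreferences("red_dot_policies", Context.MODE_PRIVATE)
        .edit()
        .putString(packageName, policy.name)
        .apply()
}

fun loadPolicy(context: Context, packageName: String): RedDotPolicy =
    context.getSharedPreferences("red_dot_policies", Context.MODE_PRIVATE)
        .getString(packageName, null)
        ?.let { RedDotPolicy.valueOf(it) }
        ?: RedDotPolicy.KEEP_UNTIL_VIEWED // default policy, an arbitrary choice for this sketch
```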
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike" or "mic", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, to collect sound signals and also achieve a noise reduction function. In other embodiments, the electronic device 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
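The two-threshold behaviour described for the short message icon could be dispatched from the reported touch pressure roughly as follows; the threshold value and the two callbacks are illustrative, since the actual first pressure threshold is device-specific.

```kotlin
import android.view.MotionEvent

const val FIRST_PRESSURE_THRESHOLD = 0.6f // assumed normalized pressure threshold

fun handleSmsIconTouch(event: MotionEvent, viewMessages: () -> Unit, composeMessage: () -> Unit) {
    if (event.action == MotionEvent.ACTION_UP) {
        // Below the threshold: view short messages; at or above it: create a new short message.
        if (event.pressure < FIRST_PRESSURE_THRESHOLD) viewMessages() else composeMessage()
    }
}
```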
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected opening or closing state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
The distance sensor 180F is used for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance to implement fast focusing.
The proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light outward through the light emitting diode and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
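As a sketch of how the touch events reported by the touch sensor could be turned into the gestures used earlier in this application (a left-swipe to open the second user interface, a pull-down to open the switch page), a simple classifier is shown below; the 100-pixel thresholds are arbitrary illustration values.

```kotlin
import android.view.MotionEvent
import kotlin.math.abs

class SwipeClassifier(private val onLeftSwipe: () -> Unit, private val onPullDown: () -> Unit) {
    private var downX = 0f
    private var downY = 0f

    fun onTouchEvent(event: MotionEvent) {
        when (event.action) {
            MotionEvent.ACTION_DOWN -> { downX = event.x; downY = event.y }
            MotionEvent.ACTION_UP -> {
                val dx = event.x - downX
                val dy = event.y - downY
                when {
                    dx < -100 && abs(dy) < abs(dx) -> onLeftSwipe() // mostly-horizontal swipe to the left
                    dy > 100 && abs(dx) < dy -> onPullDown()        // mostly-vertical pull-down
                }
            }
        }
    }
}
```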
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as communication and data communication. In some embodiments, the electronic device 100 employs esims, namely: an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, message manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100, for example, management of call states (including connected, hung up, and the like).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which may disappear automatically after a short stay without requiring user interaction, for example, notifications of download completion, message alerts, and the like. The notification manager may also present notifications in the top status bar of the system in the form of a chart or scrolling text, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog window. For example, it may prompt text information in the status bar, sound a prompt tone, vibrate the electronic device, or flash the indicator light.
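As an illustration only, the following sketch shows how an application-layer APP might post a status-bar notification through the notification manager. It uses the standard Android notification API rather than any interface defined in this application, and the channel id, texts, and class name are hypothetical.

    import android.app.Notification;
    import android.app.NotificationChannel;
    import android.app.NotificationManager;
    import android.content.Context;

    public class DownloadNotifier {
        private static final String CHANNEL_ID = "downloads"; // hypothetical channel id

        // Post a "download complete" notification in the status bar.
        public static void notifyDownloadComplete(Context context) {
            NotificationManager nm =
                    (NotificationManager) context.getSystemService(Context.NOTIFICATION_SERVICE);
            // Channels are required on Android 8.0+; re-creating an existing channel is a no-op.
            nm.createNotificationChannel(new NotificationChannel(
                    CHANNEL_ID, "Downloads", NotificationManager.IMPORTANCE_DEFAULT));
            Notification n = new Notification.Builder(context, CHANNEL_ID)
                    .setSmallIcon(android.R.drawable.stat_sys_download_done)
                    .setContentTitle("Download complete")
                    .setContentText("The file has been saved.")
                    .setAutoCancel(true) // dismissed when tapped, without further interaction
                    .build();
            nm.notify(1, n);
        }
    }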
The message manager can be used to store the data of messages reported by APPs and to process the reported data. Optionally, the message manager may also be used to record which APPs each of the different red dot elimination policies applies to. The different red dot elimination policies will be described in detail in the following embodiments and are not detailed here.
Specifically, the data of a message may include an id of the message (message id), an id of the APP (APP id), a processing state of the message (status), a generation time (happen time), a message type (msg type), and a message description (description). The processing state of a message may be one of two values: unprocessed or processed. When the processing state of the message is unprocessed, the status field is 0; when the processing state of the message is processed, the status field is 1.
For example, if the smart home APP receives a message in which Xiaoming shares a Huawei AI speaker, and the message needs to be reported to the message manager, the data of the message reported by the smart home APP to the message manager may be as follows:
[Image BDA0003349133920000121: data of the message as reported by the smart home APP]
after receiving the message reported by the smart home APP, the message manager processes the message, which may specifically include generating a message ID corresponding to the message and filling in the other fields (such as APP ID, status, happen time, and the like). In addition, the message manager may return the message ID to the smart home APP for subsequent communication between the message manager and the smart home APP.
Illustratively, if the message ID of the message is 123, the APP ID is smart home123456, the processing state of the message is unprocessed, and the generation time of the message is 15:02:27 on November 21, 2018, the data of the message processed by the message manager is as follows:
[Image BDA0003349133920000122: data of the message after processing by the message manager]
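Because the message data above appears only as figures, the following Java sketch restates the described fields for illustration; the class name and field types are assumptions, and the example values follow the text above (the msg type and description values are paraphrased).

    // Illustrative sketch of the message data described above; field names follow the
    // description: message id, APP id, status, happen time, msg type, description.
    public class ReportedMessage {
        public String messageId;   // generated by the message manager, e.g. "123"
        public String appId;       // e.g. "smart home123456"
        public int    status;      // 0 = unprocessed, 1 = processed
        public String happenTime;  // generation time, e.g. "2018-11-21 15:02:27"
        public String msgType;     // message type reported by the APP (assumed value below)
        public String description; // message description reported by the APP

        // Example corresponding to the processed message in the text above.
        public static ReportedMessage example() {
            ReportedMessage m = new ReportedMessage();
            m.messageId = "123";
            m.appId = "smart home123456";
            m.status = 0;                              // unprocessed
            m.happenTime = "2018-11-21 15:02:27";
            m.msgType = "share";                       // hypothetical value
            m.description = "Xiaoming shared a Huawei AI speaker"; // paraphrased
            return m;
        }
    }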
in one possible implementation, the message manager may be part of the notification manager.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part consists of the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as touch coordinates and a time stamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a tap and the control corresponding to the tap being the camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or a video through the camera 193.
An application scenario and an embodiment of a User Interface (UI) in the scenario related to the embodiment of the present application are described below.
Fig. 3 illustrates a desktop (desktop) for use on the electronic device 100.
A in fig. 3 shows a home page in the desktop of the electronic device 100, which is referred to as a user interface 3a, the user interface 3a may include a status bar 301, a time component icon 302 and a weather component icon 303, icons of a plurality of applications such as an album icon, a microblog icon, a camera icon, a WeChat icon, a setting icon, a computer icon, a mailbox icon, a memo icon, a music icon, etc., and the user interface 3a may further include a page indicator 304, a telephone icon 305, a short message icon 306, a contact icon 307, a navigation bar 308, etc. Wherein:
status bar 301 may include: an operator indicator (e.g., the operator's name "china mobile"), one or more signal strength indicators for wireless fidelity (Wi-Fi) signals, one or more signal strength indicators for mobile communication signals (which may also be referred to as cellular signals), a time indicator, and a battery status indicator.
The time component icon 302 can be used to indicate the current time, such as the date, day of the week, time division information, and the like.
The weather component icon 303 may be used to indicate a weather type, such as cloudy sunny, light rain, etc., and may also be used to indicate information such as temperature, etc.
Page indicator 304 may be used to indicate which page of applications the user is currently browsing. The user may slide left or right in the area of the application icons to browse the application icons on other pages.
Navigation bar 308 may include: a return key 3051, a main interface key 3052, a recent tasks key 3053, and other system navigation keys. The main interface is the interface displayed by the electronic device 100 after a user operation on the main interface key 3052 is detected in any user interface. When it is detected that the user clicks the return key 3051, the electronic device 100 may display the user interface previous to the current user interface. When it is detected that the user clicks the main interface key 3052, the electronic device 100 may display the main interface. When it is detected that the user clicks the recent tasks key 3053, the electronic device 100 may display the tasks that the user has recently opened. The navigation keys may also have other names; for example, 3051 may be called Back Button, 3052 may be called Home Button, and 3053 may be called Menu Button, which is not limited in this application. The navigation keys in the navigation bar 308 are not limited to virtual keys and may also be implemented as physical keys.
Page 2 of the desktop of the electronic device 100 is shown in b in fig. 3, where page 2 is referred to as user interface 3b. The user interface 3b may include a status bar and icons of a plurality of applications, such as a cloud sharing icon, an Alipay icon, a QQ icon, a map icon, a Huawei AR icon, a Facebook icon, a Twitter icon, a Taobao icon, a smart home icon, and the like. The user interface 3b may further include a page indicator, a telephone icon, a short message icon, a contact icon, a navigation bar, and the like. Wherein:
the status bar, the page indicator, the phone icon, the short message icon and the contact icon, and the navigation bar are respectively consistent with the status bar 301, the page indicator 304, the phone icon 305, the short message icon 306, the contact icon 307, and the navigation bar 308 shown in the user interface 3a, and are not described herein.
Based on the scenario provided in FIG. 3, some UI embodiments implemented on electronic device 100 are described below.
As shown in a of fig. 4, the electronic device 100 may detect a first user operation (the first user operation may be, for example, a left-slide operation by the user in the user interface 3a) acting on the user interface 3a, and in response to the first user operation, the electronic device 100 may display the user interface 40 shown in b of fig. 4, that is, the negative one screen, which is a screen to the left of the system's desktop screens used for presenting information. In another possible embodiment, no red dots are shown on the APP icons in the user interface 3a and the user interface 3b; the electronic device 100 displays red dots on the APP icons only in the user interface 40.
B in fig. 4 exemplarily shows the user interface 40. The user interface 40 may be used to display which APPs the different red dot elimination policies apply to. The red dot elimination policies provided by the embodiments of this application may include, but are not limited to, the following: view messages and eliminate red dots, do not view messages and do not eliminate red dots, and do not view messages and eliminate red dots. Icons of the APPs to which the different red dot elimination policies apply may be displayed in different areas.
Messages of APPs to which the "view messages and eliminate red dots" policy applies can be displayed collectively by the electronic device 100 for the user to view, and the red dots on their icons can be eliminated; messages of APPs to which the "do not view messages and do not eliminate red dots" policy applies are not displayed collectively by the electronic device 100, and the red dots on their icons are not eliminated; messages of APPs to which the "do not view messages and eliminate red dots" policy applies are not displayed collectively by the electronic device 100, but the red dots on their icons are eliminated.
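For illustration, the three red dot elimination policies and the behaviors described above could be modeled as follows; the enum and method names are assumptions of this sketch, not identifiers used in this application.

    // Illustrative model of the three red dot elimination policies described above.
    public enum RedDotPolicy {
        VIEW_AND_ELIMINATE,     // view messages and eliminate red dots
        NO_VIEW_AND_KEEP,       // do not view messages and do not eliminate red dots
        NO_VIEW_AND_ELIMINATE;  // do not view messages and eliminate red dots

        // Whether messages of APPs with this policy are displayed collectively for the user.
        public boolean showMessagesCollectively() {
            return this == VIEW_AND_ELIMINATE;
        }

        // Whether the red dot on the APP icon is eliminated.
        public boolean eliminateRedDot() {
            return this != NO_VIEW_AND_KEEP;
        }
    }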
In particular, the user interface 40 may include: a first display area 401, a second display area 402, a third display area 403, and an all-cancel control 404. Wherein:
the first display area 401 may be used to display icons of APPs whose red dot elimination policy is to view messages and eliminate red dots, and may include, for example, a WeChat icon 4011, a microblog icon, a mailbox icon, and a QQ icon. The icons of the APPs displayed in the first display area 401 may be used to collectively display the unviewed messages of the APP represented by the icon. For example, the electronic device 100 may detect a click operation on the WeChat icon 4011, and in response to the click operation, the electronic device 100 may display the user interface 7a shown in a in fig. 7, and the user interface 7a may be used to collectively display the unviewed messages of the WeChat APP. The user interface 7a is introduced in the following embodiments and is not detailed here. In another possible embodiment, the electronic device 100 may detect a click operation on the WeChat icon 4011, and in response to the click operation, the electronic device 100 may display a user interface of WeChat.
The second display area 402 may be used to display icons of APPs whose red dot elimination policy is not to view messages and not to eliminate red dots, and may include, for example, an Alipay icon, a map icon, and a short message icon.
The third display area 403 may be used to display icons of APPs whose red dot elimination policy is not to view messages but to eliminate red dots, and may include, for example, a settings icon, a Taobao icon, a Huawei AR icon, and a smart home icon.
In some possible embodiments, the first display area 401 may be referred to as a first area, or the third display area 403 may be referred to as a first area; the second display area 402 may be referred to as a second area. In the embodiments of this application, the names of the first display area 401, the second display area 402, and the third display area 403 are not limited.
APPs to which the same red dot elimination policy applies may be considered APPs of the same application type. For example, APPs whose red dot elimination policy is to view messages and eliminate red dots all belong to a first application type, or APPs whose red dot elimination policy is not to view messages but to eliminate red dots all belong to the first application type; APPs whose red dot elimination policy is not to view messages and not to eliminate red dots all belong to a second application type.
The all-cancel control 404 may be used to cancel the red dots on the APP icons included in the first display area 401 and the third display area 403. The electronic device 100 may detect a user operation (e.g., a click operation on the all-cancel control 404) acting on the all-cancel control 404, and in response to the user operation, the electronic device 100 may cancel the red dots on the APP icons in the first display area 401 and the third display area 403 in the user interface 40.
In another possible embodiment, the electronic device 100 detects a user operation (e.g., a click operation on the all-cancel control 404) acting on the all-cancel control 404, and in response to the user operation, the electronic device 100 may display the user interface 3a, and no red dots are displayed on APP icons (e.g., the WeChat icon 4011, the microblog icon, and the mailbox icon) belonging to the first display area 401 and APP icons (e.g., the setting icon) in the third display area 403 in the user interface 3 a.
In another possible embodiment, the electronic device 100 detects a user operation (e.g., a click operation on the all cancel control 404) acting on the all cancel control 404, and in response to the user operation, the electronic device 100 may display the user interface 7a shown as a in fig. 7.
In some possible embodiments, the all-cancel control 404 described above may be referred to as a first control. The embodiments of this application do not limit the naming of the all-cancel control 404.
Not limited to the icons of APPs shown in b in fig. 4, the icons of APPs contained in the above-described first display area 401, second display area 402, and third display area 403 may also be icons of other APPs. The user can set icons of APPs contained in the first display area 401, the second display area 402, and the third display area 403.
As shown in a in fig. 5, the electronic device 100 may detect a second user operation (the second user operation may be, for example, a drag operation on the short message icon) that acts in the second display area 402 of the user interface 40 to move the short message icon 4021 to the first display area 401, and in response to the second user operation, the electronic device 100 may cancel displaying the short message icon 4021 in the second display area 402 and display the short message icon in the first display area 401.
Specifically, as shown in b in fig. 5, the electronic device 100 may detect a left-sliding operation applied to the first display area 401, and in response to the left-sliding operation, the electronic device 100 may display a short message icon in the first display area 401, which may be referred to as c in fig. 5. That is, the user may input a left-slide operation in the first display area 401 to view icons of other APPs.
The display area to which the short message icon belongs is not limited to be changed, and the display area to which the icons of other APPs belong can also be changed.
The display area to which the icon of the APP belongs is not limited to be changed from the second display area 402 to the first display area 401, and the display area to which the icon of the APP belongs may also be changed arbitrarily, which is not described herein.
Not limited to a left-slide operation, a slide operation such as a right-slide, up-slide, or down-slide may also be input in the first display area 401 to view icons of other APPs in the first display area 401.
Without being limited to inputting a slide operation in the first display area 401, icons of other APPs can also be viewed by expanding the first display area 401.
As shown in fig. 6, user interface 40 may also include a control 405. The control 405 is used to expand or collapse the first display area 401. As shown in a in fig. 6, when the electronic device 100 detects a user operation (e.g., a click operation on the control 405) acting on the control 405, in response to the user operation, the electronic device 100 may expand the originally collapsed first display area 401 to display icons of all APPs. As shown in b in fig. 6, when the electronic device 100 detects a user operation (e.g., a click operation on the control 405) acting on the control 405 again, the electronic device 100 may collapse the expanded first display area 401 in response to the user operation.
Not limited to the first display area 401, the user can also view icons of other APPs in the second display area 402 by expanding the second display area 402, and can view icons of other APPs in the third display area 403 by expanding the third display area 403.
The display area to which the icon of each APP shown in a in fig. 5 belongs may be consistent with the result last set by the user. The initial default display area of each APP icon may be the first display area 401; that is, in the initial state, each APP icon bearing a red dot may be displayed in the first display area 401, and the user may manually move APP icons to other display areas.
The user interface 7a is described next. The electronic apparatus 100 may detect a click operation on the wechat icon 4011, and in response to the click operation, the electronic apparatus 100 may display a user interface 7a shown by a in fig. 7.
As shown in a in fig. 7, the user interface 7a may include an APP icon display area 701, a message display area 702, an APP indicator 703, and a red dot elimination control 704. Wherein:
the APP icon display area 701 may be used to display icons of APPs contained in the first display area 401. The electronic device 100 may detect a user operation (e.g., a click operation on a microblog icon) acting on an icon of an APP in the APP icon display area 701, and in response to the user operation, display a list of unviewed messages of the microblog APP in the message display area 702.
The electronic device 100 may also detect a user operation acting on the APP icon display area 701 (e.g., a slide operation in the APP icon display area 701), and in response to the user operation, display icons of other APPs in the APP icon display area 701. In some possible embodiments, the APP icon display area 701 may be referred to as a fourth display area. The embodiments of this application do not limit the name of the APP icon display area 701.
The message display area 702 may be used to display a list of unviewed messages of the WeChat APP. One or more messages that are not being viewed may be included in the list, and electronic device 100 may detect a swipe operation applied to message display area 702, in response to which electronic device 100 may display other messages in message display area 702. One or more messages in the list may be arranged in sequence, one by one, in a chronological order, and particularly, the latest message may be displayed at the top of the list. In some possible embodiments, the message display area 702 may be referred to as a third display area. The naming of the message display area 702 is not limited in the embodiments of the present application.
Specifically, the content of the message displayed in the list may include a content part in the data of the message reported by the APP to the message manager, or a title part.
The electronic apparatus 100 may detect a user operation acting on a certain message, and in response to the user operation, the electronic apparatus 100 may display the message in the user interface of the APP to which the message belongs.
Illustratively, the electronic device 100 may detect a user operation (e.g., a click operation on the message 7021) on the message 7021 belonging to the WeChat, in response to which the electronic device 100 displays a user interface 7b for the WeChat, as shown in b in FIG. 7, the user interface 7b being used to display the specific content of the message 7021. It can be seen that since the message 7021 and the message 7022 originate from the same source, i.e. both originate from a message sent by mom, the content of the message 7021 and the message 7022 can be displayed simultaneously in the user interface 7b, meaning that the message 7021 and the message 7022 have been viewed. At this time, the number of messages in the WeChat which are not viewed is reduced by 2, and the number is changed from 88 to 86.
If the electronic device 100 detects a user operation for returning to the user interface 7a (e.g., a click operation on the return key 3051), the electronic device 100 displays the user interface 7a in response to the user operation, which may be seen in c in fig. 7. At this time, the message content in the message display area 702 of the user interface 7a has changed: the viewed messages 7021 and 7022 are no longer displayed, and other messages (such as a voice message sent by autumn cloud at 11:57 and a payment voucher sent by WeChat Payment at 11:50) can be displayed in the message display area 702. At the same time, the number in the red dot on the WeChat icon has also changed from 88 to 86, indicating that 86 messages in WeChat have not been viewed at this time. In one possible embodiment, the electronic device 100 may detect a left-slide operation on the message display area 702, and in response to the operation, the electronic device 100 may display the list of unviewed messages of the next APP with a red dot in the message display area 702. The next APP with a red dot after the current APP is the APP represented by the icon to the right of the current APP's icon in the APP icon display area. As shown in a in fig. 7, the next APP with a red dot after the WeChat APP is the microblog APP.
The APP indicator 703 is used to indicate to which APP the message displayed in the message display area 702 belongs. The APP indicator 703 in a in fig. 7 indicates that the message displayed in the message display area 702 belongs to WeChat.
The red dot elimination control 704 may be used to eliminate the red dot on the WeChat icon. The electronic device 100 may detect a user operation (e.g., a click operation on the red dot elimination control 704) acting on the red dot elimination control 704, and in response to the user operation, eliminate the red dot on the WeChat icon. When the list of unviewed messages of another APP is displayed in the message display area 702, the red dot elimination control 704 may be used to eliminate the red dot on that APP's icon.
In some possible embodiments, the red dot elimination control 704 described above may be referred to as a second control. The embodiments of this application do not limit the naming of the red dot elimination control 704.
In one possible embodiment, the APP icon display area 701 and the message display area 702 may be displayed in a split screen manner, the message display area 702 may be a WeChat interface, and the APP icon display area 701 may be the first display area 401 in the user interface 40.
In another possible embodiment, electronic device 100 may invoke one or more unviewed messages displayed by the notification manager in message display area 702.
As mentioned in the foregoing embodiment of fig. 4, the electronic apparatus 100 may detect a user operation acting on the all-cancel control 404, and in response to the user operation, the electronic apparatus 100 may cancel the red dots on the APP icons included in the first display area 401 and the third display area 403.
Further, in one possible embodiment, in response to the user operation on the all-cancel control 404 described above, the electronic device 100 may also display the user interface 3a shown in a in fig. 8. As shown in fig. 8, in the user interface 3a, the red dots on the badges of the application icons have been eliminated; in the user interface 3b, the red dots on the badges of the application icons other than the Alipay icon and the map icon have been eliminated.
As can be seen from comparing fig. 8 with fig. 3, through the message processing procedure provided by the embodiments of fig. 4 to fig. 7, the icons belonging to the first display area 401 that originally had red dots, such as the WeChat icon, the microblog icon, the mailbox icon, the QQ icon, and the short message icon, no longer have red dots, and during the processing the user could collectively view the unviewed messages of these APPs in the message display area 702. The icons belonging to the third display area 403 that originally had red dots, such as the settings icon, the Taobao icon, the Huawei AR icon, and the smart home icon, no longer have red dots either.
In another possible embodiment, in response to the user operation on the all cancel control 404 described above, the electronic apparatus 100 may not display icons of APPs in the first display area 401 and the third display area 403.
Then, the electronic apparatus 100 may detect a click operation on the main interface key 3052, and in response to the click operation, the electronic apparatus 100 may display the user interface 3 a. Alternatively, the electronic apparatus 100 may detect a right-slide operation acting on the user interface 40, and in response to the right-slide operation, the electronic apparatus 100 may display the user interface 3 a. The user interface 3a may refer to the user interface 3a shown in a in fig. 8.
With the embodiments shown in fig. 3 to fig. 8, the user can selectively view the messages of some APPs, and can selectively view some messages of an APP, without having to view all messages of all APPs in order to remove the red dots on the APP icons. Compared with the prior art, the method and device can process messages quickly without the user missing important information, eliminate the red dots on the APPs, reduce user operations, and improve the operation efficiency of the electronic device.
In some possible embodiments, the first user operation may be clicking a third control.
In a specific embodiment, the third control element may be a control element 309 in the user interface 3c shown in b in fig. 9.
Specifically, as shown in fig. 9, the electronic device 100 may detect a slide-down operation acting on the user interface 3a, and in response to the operation, the electronic device 100 may display the user interface 3c. The user interface 3c may be referred to as a switch page. The user interface 3c may contain various controls and a notification message list 310. The various controls may include, but are not limited to, a control for turning WLAN on or off, a control for turning Bluetooth on or off, a control for turning the flashlight on or off, a control for turning the ring mode on or off, a control for turning automatic screen rotation on or off, a control 309 for entering the message center, a control for turning airplane mode on or off, a control for turning mobile data on or off, a control for turning location information on or off, a control for taking a screenshot, a control for adjusting the screen display brightness, and the like. The notification message list 310 may include the latest unread notification messages sent by the respective APPs.
Since the electronic device 100 generally displays the notification message list in the switch page, the user can associate the need to read the message with the notification message list, and therefore, the use of the control 309 in the switch page as an entry for viewing the message is in accordance with the user's usage habit.
In another specific embodiment, the third control element may be the control element 311 in the user interface 3a shown in fig. 10. By adding the control for viewing the message in the user interface 3a, namely the desktop homepage, the message processing process is more convenient and faster, and the operation efficiency of the electronic equipment is improved.
In another specific embodiment, the third control element may be a floating button 312 shown in fig. 11. In particular, the hover button may be displayed in either user interface. By adding the floating button in any user interface, a user can click the floating button on any user interface to process a message. According to the embodiment of the application, user operation can be further reduced, and the operation efficiency of the electronic equipment is improved.
In some possible embodiments, the first user operation may be a preset gesture.
In a specific implementation, the preset gesture may be a finger joint (knuckle) drawing a preset shape, and the preset shape may be, for example, a Z shape. The preset gesture can be input in any user interface. Inputting the preset gesture in the user interface 3a is used as an example for description.
By inputting the user operation in the user interface 3a with a knuckle, the operation can be distinguished from user operations input in the user interface 3a with the finger pad. This prevents the electronic device 100 from mistaking the user operation for entering the message center as another operation directed at interface elements in the user interface 3a, and improves the accuracy and efficiency with which the electronic device 100 enters the message center. The preset shape may be a shape set by the user.
In a specific implementation, when a finger contacts the display screen 194 of the electronic device 100, different finger parts, such as the finger pad, the fingertip, and the knuckle, may produce different vibration and/or acoustic effects and generate corresponding signals (including the pressure value, capacitance value, acceleration value, and the like produced by the touch operation on the display screen 194). The signals generated by the different finger parts may be captured by sensors of the electronic device 100 (e.g., a capacitive touch screen, a pressure touch screen, an acceleration sensor, a shock sensor, a vibration sensor, an acoustic sensor, a displacement sensor, a velocity sensor), so the electronic device 100 can distinguish which finger part the user used to touch the display screen 194 from the captured signals. Further, the electronic device 100 may detect in real time the coordinates of the touch points at which the user's knuckle touches the display screen 194 during sliding, and determine whether the knuckle slides along the preset shape according to the change of the touch point positions during sliding, so as to recognize the input touch operation.
In another specific implementation, the preset gesture may be a hover gesture.
As shown in fig. 12 b, the user may also trigger electronic device 100 to display user interface 40 by a hover gesture of a finger over display screen 194 of electronic device 100. Here, the floating posture may mean that the finger floating above the display screen 194 is in a straightened state, a bent state, or the like. In a specific implementation, the electronic device 100 may sense an object in a three-dimensional space above the display screen and a motion of the object through the hover detector, so as to detect a hover gesture of a finger of the user.
Further, the first user operation described above may also be a voice instruction, eyeball rotation, shaking of the electronic apparatus 100, pressing of the key 190 shown in a in fig. 12, or the like. The keys 190 may include any one or more keys of a volume up key, a volume down key, and a power key.
The first user operation listed above is only an exemplary operation, and other modes may be available in specific implementations, which are not limited in the embodiments of the present application.
In some possible embodiments, the second user action described above may be clicking on the red dot elimination policy option in c in fig. 13 ("view message and eliminate red dot", or "do not view message and do not eliminate red dot", or "do not view message and eliminate red dot").
Specifically, the electronic apparatus 100 may detect a click operation acting on a setting icon in the user interface 3a, and in response to the click operation, the electronic apparatus may display the user interface 13a shown by a in fig. 13. The user interface 13a may include a plurality of APP setting entries, such as a microblog setting entry 1301, a WeChat setting entry, a music setting entry, a mailbox setting entry, and the like. The electronic apparatus 100 may detect a sliding operation (e.g., a slide-up operation) acting on the user interface 13a, and in response to the sliding operation, the electronic apparatus 100 may display a setting entry of the other APP in the user interface 13 a.
Next, a red dot elimination policy for microblogs is set as an example.
As shown in b in fig. 13, the electronic device 100 may detect a user operation (e.g., a click operation on the setting entry 1301 of the microblog) acting on the setting entry 1301 of the microblog, and in response to the user operation, the electronic device 100 may display the user interface 13 b. The user interface 13b may include an entry 1302 for setting a red dot elimination policy. Electronic device 100 may detect a user operation (e.g., a click operation on portal 1302) acting on portal 1302, in response to which electronic device 100 may display user interface 13 c. Options for the red dot elimination policy may be included in the user interface 13c, such as view message and eliminate red dots, not view message and not eliminate red dots, and not view message and eliminate red dots. The electronic device 100 may detect a second user operation (e.g., a click operation on the option "view message and eliminate red dots") applied to an option, and in response to the second user operation, the electronic device 100 may set the red dot elimination policy represented by the option as the red dot elimination policy of the microblog.
Not limited to the entry 1302 for setting the red dot removal policy, the user interface 13b may further include other entries, such as an entry for checking the APP data storage condition, an entry for setting a notification mode, and the like. The content included in the user interface 13b is not limited in the embodiment of the present application.
In other possible embodiments, the second user action may be clicking on the red dot elimination policy option in fig. 14 ("view message and eliminate red dot", or "do not view message and do not eliminate red dot", or "do not view message and eliminate red dot").
Next, a red dot erasure strategy for setting WeChat will be described as an example.
As shown in fig. 14, the electronic device 100 may detect a long-press operation acting on the WeChat icon in the user interface 3a, and in response to the long-press operation, the electronic device 100 may display a WeChat red dot elimination policy setting panel 312 in the user interface 3a. The red dot elimination policy setting panel 312 may include a plurality of selectable red dot elimination policy options, such as view messages and eliminate red dots, do not view messages and do not eliminate red dots, and do not view messages and eliminate red dots. The electronic device 100 may detect a second user operation (e.g., a click operation on the option "view messages and eliminate red dots") applied to an option, and in response to the second user operation, the electronic device 100 may set the red dot elimination policy represented by that option as the red dot elimination policy of WeChat.
In other possible embodiments, the electronic device 100 may automatically set the red dot elimination policy of the APP according to the usage habit of the user.
In a specific implementation, the electronic device 100 may record the number of times the user opens an APP within a week, and determine the red dot elimination policy according to that number.
For example, if the user opens an APP more than 50 times in a week, the red dot elimination policy of that APP is determined to be viewing messages and eliminating red dots. If the user opens an APP more than 20 times but no more than 50 times in a week, the red dot elimination policy of that APP is determined to be not viewing messages and not eliminating red dots. If the user opens an APP 20 times or fewer in a week, the red dot elimination policy of that APP is determined to be not viewing messages and eliminating red dots.
Not limited to one week, the electronic device 100 may also record the frequency of opening an APP by the user within 1 day, 3 days, 10 days, or 15 days, and the number of recorded days is not limited in the present application.
The threshold for determining the red dot elimination policy of APP is not limited to 50 times or 20 times, and may be other values, which is not limited in the embodiment of the present application.
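The frequency-based rule in the example above could be sketched as follows, reusing the illustrative RedDotPolicy enum from the earlier sketch; the thresholds of 50 and 20 opens per week are the example values and are not fixed by this application.

    public final class PolicyByFrequency {
        // Map the number of times the user opened an APP in a week to a policy.
        public static RedDotPolicy policyFromWeeklyOpens(int opensPerWeek) {
            if (opensPerWeek > 50) {
                return RedDotPolicy.VIEW_AND_ELIMINATE;      // view messages and eliminate red dots
            } else if (opensPerWeek > 20) {
                return RedDotPolicy.NO_VIEW_AND_KEEP;        // do not view, do not eliminate
            } else {
                return RedDotPolicy.NO_VIEW_AND_ELIMINATE;   // do not view, eliminate
            }
        }
    }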
In this embodiment of the application, the red dot elimination policy of an APP is determined by counting the frequency of interaction between the user and the APP over a period of time. By automatically setting an APP's red dot elimination policy according to the frequency of interaction between the user and the APP, the electronic device can match the user's expectations, reduce the tedious manual setting of red dot elimination policies, improve the operation efficiency of the electronic device, and improve user experience.
In another specific implementation, the electronic device 100 may record the time difference between the moment an APP generates a new message and the moment the message is viewed, and determine the red dot elimination policy of the APP according to the time differences of its messages.
For example, if 80% of the messages generated by an APP are viewed within 1 hour, the red dot elimination policy of that APP may be determined to be viewing messages and eliminating red dots. If 80% of the messages are viewed after 1 hour but within 12 hours, the red dot elimination policy of the APP may be determined to be not viewing messages and not eliminating red dots. If 80% of the messages have not been viewed after more than 12 hours, the red dot elimination policy of the APP may be determined to be not viewing messages and eliminating red dots.
The proportion of viewed messages to the total number of messages is not limited to 80% and may be another value, which is not limited in the embodiments of this application.
The time differences corresponding to the different categories are not limited to within 1 hour, after 1 hour but within 12 hours, and after 12 hours; they may be other values, such as within 2 hours, after 2 hours but within 10 hours, and after 10 hours, which is not limited in the embodiments of this application.
In the embodiments of this application, the time difference between when a message is generated and when it is viewed indicates how timely the user views the message; a shorter time difference indicates more timely viewing. If most messages of an APP are viewed promptly, the red dot elimination policy of the APP can be determined to be viewing messages and eliminating red dots. If most messages of an APP are not viewed promptly enough, the red dot elimination policy of the APP can be determined to be not viewing messages and not eliminating red dots. If most messages of an APP are not viewed within a long period, the red dot elimination policy of the APP can be determined to be not viewing messages and eliminating red dots. Automatically setting an APP's red dot elimination policy based on the timeliness with which its messages are viewed can accurately match the user's expectations, reduce the tedious manual setting by the user, improve the operation efficiency of the electronic device, and improve user experience.
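One reasonable reading of the timeliness-based rule above can be sketched as follows, again reusing the illustrative RedDotPolicy enum; the 80% share and the 1-hour/12-hour boundaries are the example values, and the treatment of messages that were never viewed, as well as the default when no category reaches 80%, are assumptions of this sketch.

    import java.time.Duration;
    import java.util.List;

    public final class PolicyByTimeliness {
        // viewDelays: time difference between generation and viewing for each message;
        // messages never viewed are assumed to be represented by a very large duration.
        public static RedDotPolicy policyFromViewDelays(List<Duration> viewDelays) {
            long total = viewDelays.size();
            if (total == 0) {
                return RedDotPolicy.NO_VIEW_AND_ELIMINATE; // assumption for the no-data case
            }
            long within1h = countAtMost(viewDelays, Duration.ofHours(1));
            long within12h = countAtMost(viewDelays, Duration.ofHours(12));
            long after12h = total - within12h;
            if (within1h >= 0.8 * total) {
                return RedDotPolicy.VIEW_AND_ELIMINATE;
            } else if (within12h - within1h >= 0.8 * total) {
                return RedDotPolicy.NO_VIEW_AND_KEEP;
            } else if (after12h >= 0.8 * total) {
                return RedDotPolicy.NO_VIEW_AND_ELIMINATE;
            }
            return RedDotPolicy.NO_VIEW_AND_KEEP; // assumed default
        }

        private static long countAtMost(List<Duration> delays, Duration limit) {
            return delays.stream().filter(d -> d.compareTo(limit) <= 0).count();
        }
    }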
The above-listed manner of setting the red dot elimination policy of the APP is merely an exemplary illustration, and other manners may be used in a specific implementation, which is not limited in this embodiment of the application.
In another specific implementation, the electronic device 100 may periodically count the usage habits of the user. The usage habits of the user may include the frequency of user interaction with the APP over a period of time, or the timeliness with which the user views messages. The above statistical period may be, for example, 6:01-10:00, 10:01-14:00, 14:01-18:00, 18:01-22:00, 22:01-6:00 (the next day).
The red dot elimination policies of an APP corresponding to different time periods are determined according to the user's usage habits counted in the above 5 time periods. For the correspondence between the user's usage habits and the red dot elimination policy of an APP, reference may be made to the related descriptions in the foregoing two embodiments, which are not repeated here.
For example, if it is determined that the red dot elimination policy of the map corresponding to 6:01-10:00 is to view messages and eliminate red dots, the icon of the map may be displayed in the first display area 401 when the user opens the user interface 40 within the period 6:01-10:00. If it is determined that the red dot elimination policy of the mailbox corresponding to 10:01-14:00 is to view messages and eliminate red dots, the icon of the mailbox may be displayed in the first display area 401 when the user opens the user interface 40 within the period 10:01-14:00. If it is determined that the red dot elimination policy of Alipay corresponding to 14:01-18:00 is not to view messages and not to eliminate red dots, the icon of Alipay may be displayed in the second display area 402 when the user opens the user interface 40 within the period 14:01-18:00. If it is determined that the red dot elimination policy of Taobao corresponding to 18:01-22:00 is not to view messages but to eliminate red dots, the icon of Taobao may be displayed in the third display area 403 when the user opens the user interface 40 within the period 18:01-22:00. If it is determined that the red dot elimination policy of Alipay corresponding to 22:01-6:00 (the next day) is to view messages and eliminate red dots, the icon of Alipay may be displayed in the first display area 401 when the user opens the user interface 40 within the period 22:01-6:00 (the next day).
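The per-period behavior in the example above could be sketched as follows; the period boundaries come from the example, while the class, method names, and string keys for the periods are assumptions of this sketch.

    import java.time.LocalTime;
    import java.util.Map;

    public final class PolicyByTimeOfDay {
        // Look up the policy of one APP for the current time of day, given the policies
        // determined per statistical period (as in the example above).
        public static RedDotPolicy policyAt(LocalTime now, Map<String, RedDotPolicy> perPeriod) {
            return perPeriod.get(periodOf(now));
        }

        // Map a time of day to one of the five statistical periods listed above.
        // Each period is treated as [start, nextStart) at minute granularity.
        static String periodOf(LocalTime t) {
            if (!t.isBefore(LocalTime.of(6, 1)) && t.isBefore(LocalTime.of(10, 1))) return "6:01-10:00";
            if (!t.isBefore(LocalTime.of(10, 1)) && t.isBefore(LocalTime.of(14, 1))) return "10:01-14:00";
            if (!t.isBefore(LocalTime.of(14, 1)) && t.isBefore(LocalTime.of(18, 1))) return "14:01-18:00";
            if (!t.isBefore(LocalTime.of(18, 1)) && t.isBefore(LocalTime.of(22, 1))) return "18:01-22:00";
            return "22:01-6:00"; // wraps past midnight
        }
    }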
The statistical time interval is not limited to the above-mentioned list, and other statistical time intervals may be available in a specific implementation, which is not limited in the embodiment of the present application.
When the red dot elimination policy of an APP is set according to the user's usage habits, taking different time periods of a day into account can further improve the accuracy of the setting, so that the result better matches the user's expectations.
The above listed manners for setting the APP red dot elimination policy are only exemplary, and other manners may be used in specific implementations, which are not limited in this application embodiment.
The method is not limited to setting corresponding red dot elimination strategies for each APP, and in specific implementation, all APPs with messages which are not checked can be processed in a unified mode.
As shown in fig. 15, the electronic device 100 may display all APP icons bearing red dots in the user interface 40. The electronic device 100 may detect a click operation acting on any one of the icons, such as the WeChat icon, and in response to the click operation, the electronic device 100 may display the user interface 7a shown in a in fig. 7. At this time, all APP icons bearing red dots may be displayed in the APP icon display area 701. The user may input a slide operation in the APP icon display area 701 to view more APP icons bearing red dots.
Not limited to all red-dotted APP icons, electronic device 100 may display all APP icons in user interface 40. Specifically, the electronic device 100 may preferentially display APP icons with red dots and then display APP icons without red dots. Further, the electronic device 100 may display an APP icon without a red dot in grayscale, prompting the user that the APP does not currently have an unviewed message. The method is not limited to grayscale display, and in a specific implementation, the APP icons without red dots and the APP icons with red dots can be displayed in different manners.
Not limited to all red-dotted APP icons, electronic device 100 may display a portion of red-dotted APP icons in user interface 40. The APP icon displayed in the user interface 40 with red dots may be set by the user, or may be set by the electronic device 100 according to the user's usage habit.
In one possible embodiment, after the electronic device 100 detects a click operation on the all-cancel control 404, in response to the click operation, the electronic device 100 may cancel the red dots on the displayed portion of the APP icons. That portion of APPs may be APPs determined to be unimportant by the electronic device 100 based on the user's usage habits.
The usage habit may refer to the frequency with which the user opens an APP within a period of time, or to the timeliness with which the user views its unread messages. The lower the frequency, the less important the electronic device 100 may determine the APP to be; likewise, the less timely the user views an APP's unread messages, the less important the electronic device 100 may determine that APP to be.
As can be seen from the user interface 7a shown in a of fig. 7, the messages in the message display area 702 are arranged in order one by one in chronological order. Without limitation, the messages in the message display area 702 may be displayed collectively according to the source, and then arranged in sequence according to the generation time of the latest message from different sources.
Illustratively, as shown in fig. 16, there are 88 messages in WeChat: 2 from Mom, 39 from the company group, 10 from WeChat Payment, 1 from autumn cloud, and 8 from the Tencent enterprise mailbox, and the user can view more messages by sliding in the message display area 702. The electronic device 100 may display messages from the same source collectively; for example, for the two messages from Mom, it may display the latest message and add the number of messages from that source in front of the content of the latest message to prompt the user. For example, the two messages 7021 and 7022 from Mom in a in fig. 7 may be gathered together, the latest message 7021 is displayed, and the count [2] is added in front of the content of 7021, which may be seen as 1601 in fig. 16. When the electronic device 100 detects a click operation on 1601, in response to the click operation, the electronic device 100 may display the user interface 7b, as shown in b in fig. 7.
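The source-based aggregation described above (keep the latest message of each source and prefix it with the count, e.g. [2]) could be sketched as follows; the Msg record and the row format are assumptions of this sketch, not the data model of this application.

    import java.util.ArrayList;
    import java.util.Comparator;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public final class MessageGrouping {
        // Minimal record of an unviewed message (illustrative only).
        public static final class Msg {
            public final String source;   // e.g. "Mom", "WeChat Payment"
            public final String content;
            public final long happenTime; // generation time, epoch milliseconds
            public Msg(String source, String content, long happenTime) {
                this.source = source; this.content = content; this.happenTime = happenTime;
            }
        }

        // Group messages by source, keep the latest message of each source, and prefix it
        // with the number of messages from that source; groups are ordered newest first.
        public static List<String> displayRows(List<Msg> unviewed) {
            Map<String, List<Msg>> bySource = new LinkedHashMap<>();
            for (Msg m : unviewed) {
                bySource.computeIfAbsent(m.source, k -> new ArrayList<>()).add(m);
            }
            List<List<Msg>> groups = new ArrayList<>(bySource.values());
            groups.sort(Comparator.comparingLong(
                    (List<Msg> g) -> g.stream().mapToLong(m -> m.happenTime).max().getAsLong())
                    .reversed());
            List<String> rows = new ArrayList<>();
            for (List<Msg> g : groups) {
                Msg latest = g.stream().max(Comparator.comparingLong(m -> m.happenTime)).get();
                rows.add("[" + g.size() + "] " + latest.source + ": " + latest.content);
            }
            return rows;
        }
    }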
As can be seen from the embodiment shown in FIG. 7, the user may click on a desired message in the message display area 702, and enter the APP to view the message. Similarly, when the messages are collectively presented by source, the user can also enter the APP to view the messages under a certain source (e.g., 1601 in fig. 16) that the user wants to view by clicking on the messages under the source (e.g., b in fig. 7).
Further, the user may eliminate the red dots of messages, or of messages from a certain source, that the user does not want to view. Next, taking messages displayed grouped by source as an example, how the user eliminates the red dots of messages from a source the user does not want to view is described in detail.
In one possible embodiment, as shown in a in fig. 17, each aggregated message from a source may further include a control 1701 for eliminating the red dots of all messages from that source. The electronic device 100 may detect a user operation on the control of a message from a source (e.g., a click operation on the control 1701 of the message from WeChat Payment), and in response to the user operation, the electronic device 100 may eliminate the red dots of all messages from the source corresponding to that control and cancel displaying the messages from that source in the message display area 702.
As shown in b in fig. 17, when the user clicks the control 1701 of the message from WeChat Payment, the number in the red dot of the WeChat icon in the APP icon display area 701 changes from 88 to 78; the decrease of 10 corresponds to the red dots of the 10 messages from WeChat Payment, and the messages from WeChat Payment are no longer displayed in the message display area 702. With the method provided in this embodiment of the application, the user can eliminate the red dots of messages the user does not want to view so that those messages are not displayed in the message display area, while the messages the user wants to view are retained. This prevents the user from missing important messages and also reduces the number of sliding operations the user needs to input in the message display area, improving the operation efficiency of the electronic device. Further, when messages are displayed grouped by source, the user only needs to click once so that multiple messages the user does not want to view are not displayed in the message display area while the messages the user wants to view are retained. When the number of APP messages is large, the number of sliding operations the user inputs in the message display area can be greatly reduced, further improving the operation efficiency of the electronic device.
In another possible embodiment, as shown in a in fig. 18, the electronic device 100 may detect a left-slide operation on the message from WeChat Payment, and in response to the operation, the electronic device 100 may display a control 1801 for eliminating the red dots of the messages from WeChat Payment, which may be seen in b in fig. 18.
Electronic device 100 may detect a click operation on control 1801, and in response to the click operation, electronic device 100 may dismiss the red dots of 10 messages originating from the WeChat payment and dismiss the messages from that source in message display area 702.
In one possible embodiment, the electronic device 100 may detect a left-slide operation on the aggregated message from WeChat Payment, and in response to the operation, the electronic device 100 may eliminate the red dots of the 10 messages from WeChat Payment and cancel displaying the messages from that source in the message display area 702.
Instead of eliminating the red dot of a message so that the message is not displayed in the message display area, the user may also delete messages that the user does not want to view or messages from a source by clicking on control 1701 or control 1801.
Further, the electronic device 100 may record the number of times that the user eliminates the red dot of the message from a source and the number of times that the message from the source is displayed, and if the ratio of the number of times that the red dot is eliminated to the number of times that the message is displayed exceeds a threshold, such as 90%, it is determined that the user does not want the message from the source to be displayed in the message display area 702.
In one possible embodiment, when the user opens user interface 7a again to view a message, the message under that source is not displayed in message display area 702.
In another possible embodiment, when the user opens the user interface 7a again to view the message, the message under that source is displayed at the very end of the list of messages.
The threshold is not limited to 90% and may be another value in a specific implementation; its value is not limited in this embodiment of the application.
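As an illustration of the counting logic described above, the following Kotlin sketch shows one way a message manager could track, per source, how often messages are displayed and how often the user eliminates their red dot. The class and function names (SourceStats, SourceSuppressionTracker, shouldHideSource) and the default 0.9 threshold are assumptions for illustration only, not part of this application.

    // Hypothetical per-source statistics kept by the message manager.
    data class SourceStats(
        var displayCount: Int = 0,  // times messages from this source were displayed
        var dismissCount: Int = 0   // times the user eliminated their red dot
    )

    class SourceSuppressionTracker(private val threshold: Double = 0.9) {
        private val stats = mutableMapOf<String, SourceStats>()

        fun onSourceDisplayed(source: String) {
            stats.getOrPut(source) { SourceStats() }.displayCount++
        }

        fun onRedDotEliminated(source: String) {
            stats.getOrPut(source) { SourceStats() }.dismissCount++
        }

        // True when the elimination/display ratio exceeds the threshold,
        // i.e. the user presumably does not want this source shown in area 702.
        fun shouldHideSource(source: String): Boolean {
            val s = stats[source] ?: return false
            return s.displayCount > 0 &&
                s.dismissCount.toDouble() / s.displayCount > threshold
        }
    }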
As can be seen from the embodiment shown in a in fig. 7 and the embodiment shown in fig. 18, the messages in the message display area 702 may be arranged in chronological order.
The display is not limited to chronological order; the messages may also be displayed according to the degree of interaction between the user and messages from different sources: messages from sources with a high degree of interaction are displayed near the front of the message list, and messages from sources with a low degree of interaction are displayed near the back.
Specifically, the electronic device 100 may record the number of times the user views messages from a source and the number of times messages from that source are displayed. A higher ratio of the number of views to the number of displays indicates a higher degree of interaction between the user and messages from that source, and a higher degree of interaction indicates that the user values messages from that source more.
The criterion is not limited to the ratio of views to displays. In a specific implementation, whether the user pays attention to messages from a source may also be determined from the time difference between when a message from that source is viewed and when it was generated; the smaller the time difference, the more attention the user pays to messages from that source.
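As a sketch of the ratio-based ordering described above, the following Kotlin snippet sorts messages by the view/display ratio of their source, newest first within equal ratios. The type and property names are assumptions for illustration, and this is only one plausible reading of the text.

    data class SourceMessage(val source: String, val title: String, val timestampMs: Long)

    data class InteractionStats(val viewCount: Int, val displayCount: Int) {
        // Higher view/display ratio = higher degree of interaction with this source.
        val interactionDegree: Double
            get() = if (displayCount == 0) 0.0 else viewCount.toDouble() / displayCount
    }

    fun orderByInteraction(
        messages: List<SourceMessage>,
        stats: Map<String, InteractionStats>
    ): List<SourceMessage> =
        messages.sortedWith(
            compareByDescending<SourceMessage> { stats[it.source]?.interactionDegree ?: 0.0 }
                .thenByDescending { it.timestampMs }  // newer messages first within a source
        )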
In this way, messages that are more important to the user can be preferentially displayed according to the user's usage habits, the messages displayed by the electronic device better match the user's expectations, and the user can quickly and accurately view the messages that need to be viewed, which meets the user's requirements and provides a good user experience.
The mark involved in this embodiment of the application is not limited to a red dot on a desktop APP icon; it may also be a mark inside an APP, such as a blue mark in front of an unread short message in the message list of the short message APP.
Next, with reference to the foregoing embodiments and taking a scenario in which the short message APP receives a new short message as an example, the workflow of software and hardware when the electronic device displays a red dot on the icon of the short message APP is described.
In a possible embodiment, as shown in fig. 19, the process of displaying the red dot on the icon of the short message APP by the electronic device may include the following steps:
s11: and the short message APP of the application program layer receives the new short message.
Specifically, the new short message may be sent to the short message APP by the short message server.
S12: and sending a red dot display instruction to a view system of the application program framework layer by the short message APP of the application program layer.
The instruction can be used for displaying a red dot on an icon of the short message APP. S13: the view system of the application framework layer calls the display driver of the kernel layer, and the red dots are displayed on the icons of the short message APP through the display screen 194.
S14: and the short message APP of the application program layer sends the data of the new short message to a message manager of the application program framework layer.
Specifically, the content of the data of the new short message may refer to the data of the message reported by the smart home APP to the message manager in the embodiment of fig. 2, which is not described herein again.
S15: and the message manager of the application program framework layer processes and stores the data of the new short message.
Specifically, the processed data of the message may refer to the processed data of the message reported by the smart home APP by the message manager mentioned in the foregoing embodiment of fig. 2, which is not described herein again.
S16: and the message manager of the application program framework layer returns the message ID of the new short message to the short message APP of the application program layer.
Specifically, the sequence of the occurrences of S12-S13 and S14-S15 is not limited.
Possibly, the above process may further include:
s17: and sending a command for displaying the notification message to a notification manager by a short message APP of the application program layer.
S18: the notification manager of the application framework layer calls the display driver of the kernel layer, and the content of the new short message is displayed in the notification bar through the display screen 194.
In another possible embodiment, as shown in fig. 20, the process of displaying the red dot on the icon of the short message APP by the electronic device may include the following steps:
s21: and the short message APP of the application program layer receives the new short message.
Specifically, S21 is identical to S11 and will not be described in detail.
S22: and the short message APP of the application program layer sends the data of the new short message to a message manager of the application program framework layer.
Specifically, S22 is identical to S14 and will not be described in detail.
S23: and the message manager of the application program framework layer processes and stores the data of the new short message.
Specifically, S23 is identical to S15 and will not be described in detail.
S24: and the message manager of the application program framework layer returns the message ID of the new short message to the short message APP of the application program layer.
Specifically, S24 is identical to S16 and will not be described in detail.
S25: and sending a red dot display instruction to a view system of the application program framework layer by the short message APP of the application program layer.
Specifically, S25 is identical to S12 and will not be described in detail.
S26: the view system of the application framework layer calls the display driver of the kernel layer, and the red dots are displayed on the icons of the short message APP through the display screen 194.
Specifically, S26 is identical to S13 and will not be described in detail.
Specifically, the order in which S23-S24 and S25-S26 occur is not limited.
Possibly, the above process may further include:
s27: and sending a command for displaying the notification message to a notification manager by a short message APP of the application program layer.
S28: the notification manager of the application framework layer calls the display driver of the kernel layer, and the content of the new short message is displayed in the notification bar through the display screen 194.
The sender of the red dot display instruction is not limited to the short message APP of the application layer as in fig. 19 or fig. 20. In another possible embodiment, the instruction for displaying the red dot may instead be sent to the view system of the application framework layer by a window manager (not shown in fig. 19 and fig. 20). That is, after the short message APP receives a new short message, whether to display the red dot on the short message icon may be determined by the short message APP or by the window manager, which is not limited in this embodiment of the application.
The workflow of the software and hardware when the electronic device 100 displays the user interface 40 will be described next. The left-slide operation input by the user on the user interface 3a will be described as an example.
As shown in fig. 21, the process of the electronic device 100 displaying the user interface 40 may include the following steps:
s31: the touch sensor 180K detects a touch operation of left swipe.
S32: the touch sensor 180K sends a hardware interrupt to the kernel layer.
S33: the kernel layer processes the touch operation into an original input event.
Specifically, the original input event includes information such as touch coordinates, a time stamp of the touch operation, and the like. The raw input event may be stored at the kernel layer.
S34: the application framework layer obtains the original input event from the kernel layer, and identifies that the interface element corresponding to the input event belongs to the user interface 3 a.
S35: and a message manager in the application program framework layer determines all APPs with messages which are not viewed and determines a red dot elimination strategy corresponding to each APP.
Specifically, the message manager may be configured to store data of a message reported by each APP and a red dot elimination policy corresponding to each APP. The APPs with messages and the red dot elimination policy corresponding to each APP can be referred to b in fig. 4.
S36: the message manager invokes the kernel layer launch display driver to display the user interface 40 in the display screen 194.
Next, the workflow of software and hardware when the electronic device 100 collectively displays messages is described. An example in which the user clicks the WeChat icon 4011 on the user interface 40 is used for description.
As shown in fig. 22, the process of collectively displaying the messages by the electronic device 100 may include the following steps:
s41: the touch sensor 180K detects a single-click operation.
S42: the touch sensor 180K sends a hardware interrupt to the kernel layer.
S43: the kernel layer processes the touch operation into an original input event.
Specifically, the original input event includes information such as touch coordinates, a time stamp of the touch operation, and the like. The raw input event may be stored at the kernel layer.
S44: the application framework layer obtains an original input event from the kernel layer, and identifies an icon corresponding to the input event as a WeChat icon 4011.
S45: the message manager determines message data of the WeChat APP.
S46: the message manager invokes the kernel layer launch display driver to display the user interface 7a in the display screen 194.
Specifically, the message displayed in the user interface 7a may show the content in the message data, or may show the title in the message data.
Specifically, the message display area of the user interface 7a is not limited to displaying the messages of the WeChat APP.
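For S45-S46, the lookup and the choice of what to render can be sketched as follows. This is a hypothetical accessor in the spirit of the message manager described above, and the rule of preferring the title over the content is an assumed presentation choice.

    data class StoredMessage(val title: String, val content: String)

    // Hypothetical accessor: fetch the stored messages of one APP (S45) and pick
    // the text to render in the message display area of user interface 7a (S46):
    // the title if present, otherwise the content.
    class AppMessageStore(private val messagesByApp: Map<String, List<StoredMessage>>) {
        fun displayRowsFor(app: String): List<String> =
            messagesByApp[app].orEmpty().map { m -> m.title.ifBlank { m.content } }
    }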
The following describes the workflow of software and hardware when the electronic device 100 eliminates the red dot on the APP icon. The explanation will be given taking an example in which the user clicks the all cancel control 404 in the user interface 40 to cancel the red dot on the APP icons contained in the first display area 401 and the third display area 403.
As shown in FIG. 23, the process of electronic device 100 eliminating the red dot on the displayed APP icon may include the following steps:
s51: the touch sensor 180K detects a single-click operation.
S52: the touch sensor 180K sends a hardware interrupt to the kernel layer.
S53: the kernel layer processes the touch operation into an original input event.
Specifically, the original input event includes information such as touch coordinates, a time stamp of the touch operation, and the like. The raw input event may be stored at the kernel layer.
S54: the application framework layer obtains the original input event from the kernel layer, and identifies the control corresponding to the input event as the all-elimination control 404.
S55: the message manager determines that the unviewed messages of the APPs corresponding to the icons in the first display area 401 and the third display area 403 have changed the processing state of the unviewed messages from unprocessed to processed.
S56: the message manager calls the kernel layer to start the display driver, and cancels the display of the red dots on the APP icons contained in the first display area 401 and the third display area 403.
Next, a message processing method provided in an embodiment of the present application is described.
As shown in fig. 24, the message processing method may include at least the following steps:
s101: the electronic device displays a first user interface; the first user interface comprises an icon of a first application, an icon of a second application, an icon of a third application, an icon of a fourth application and an icon of a fifth application; marks of unread messages are arranged on the icon of the first application, the icon of the second application, the icon of the third application and the icon of the fourth application, and marks of the unread messages are not arranged on the icon of the fifth application; the first application and the second application belong to a first application type and the third application and the fourth application belong to a second application type.
S102: When the electronic device detects a first user operation, the electronic device displays a second user interface; the second user interface includes the icon of the first application, the icon of the second application, the icon of the third application, the icon of the fourth application, and a first control.
S103: the electronic device detects a user operation acting on the first control.
S104: the electronic device displays a first user interface; the icons of the first application and the second application are not marked with unread messages, and the icons of the third application and the fourth application are marked with unread messages.
In a possible embodiment, the first user interface may be the user interface 3a in the embodiment of fig. 3, the first user operation may be a left-swipe gesture in the user interface 3a, and the second user interface may be the user interface 4b shown in b in fig. 4.
In another possible embodiment, the first user interface may be the user interface 3a in the embodiment of fig. 3, and the first user operation may be clicking a third control.
Possibly, this third control may be control 311 in the embodiment of fig. 10.
Possibly, this third control may also be a float button 312 in the embodiment of fig. 11.
Possibly, this third control may also be control 309 in the embodiment of fig. 9. The user interface 3c may be referred to as a fourth user interface. The electronic device may display a fourth user interface upon detecting the drop-down gesture in the first user interface. The fourth user interface may also be referred to as a switch page.
In another possible embodiment, the first user operation may be a preset gesture.
Possibly, the preset gesture may be drawing a preset shape with a knuckle, as shown in a in fig. 12; the preset shape may be, for example, a Z shape. The preset gesture can be input in any user interface.
Possibly, the preset gesture may also be a hover gesture shown in b of fig. 12.
In another possible embodiment, the first user operation may be a voice command, an eye rotation, a shaking of the electronic device, a pressing of a key, or the like.
In another possible embodiment, the above-mentioned mark with unread messages may be a red dot.
Further, a number may be included in the red dot to indicate the number of unread messages for the application.
In particular, the first control may be the all cancel control 404 shown in b of fig. 4.
In particular, the user operation acting on the first control may be a click operation.
Specifically, the second user operation is an operation that causes the icon of the third application to be displayed in the first area.
Possibly, as shown in a in fig. 5, the second operation may be dragging an icon (e.g., short message icon 4021) of the third application from the second area (e.g., second display area 402) to the first area (e.g., first display area 401).
Possibly, as shown in FIG. 13, the second operation may be clicking any of the red dot elimination policy options (e.g., the option "View message and eliminate Red dot") in the user interface 13 c.
Possibly, as shown in fig. 14, the second operation may be clicking any of the red dot elimination strategy options in the red dot elimination strategy setup panel 312.
In another possible embodiment, before the electronic device detects the user operation acting on the first control, the method further includes: when the electronic device detects a user operation acting on the icon of the first application in the second user interface, displaying a third user interface. The user operation may be a click operation.
Possibly, the third user interface is an interface of the first application.
Possibly, the third user interface comprises a third area, and the third area comprises one or more unread messages of the first application. The third area may be the message display area 702 shown in a of fig. 7.
Further, the electronic apparatus 100 may detect a user operation applied to the third area, and display an interface of the first application. For example, as shown in fig. 7, the electronic apparatus 100 may detect a click operation acting on a message 7021 in the message display area 702 shown in a in fig. 7, and in response to the operation, the electronic apparatus 100 may display an interface 7b of the WeChat shown in b in fig. 7.
Further, the third user interface may further include a fourth area in which the icon of the first application and the icon of the second application are displayed.
The method further comprises the following steps: and when the electronic equipment detects a user operation acting on the icon of the second application in the fourth area, displaying one or more unread messages of the second application in the third area. The user operation may be a click operation. Wherein the fourth area may be an APP icon display area 701 shown in a of fig. 7.
Further, the electronic device may also detect a slide operation acting in the third area and switch the content displayed in the third area accordingly.
In another possible embodiment, when the electronic device 100 detects a user operation acting on any one of the one or more unread messages, the display of that unread message is cancelled in the third area.
Possibly, the user operation may be clicking on the control 1701 shown as a in fig. 17.
Possibly, the user operation may also be clicking on a control 1801 shown as b in fig. 18.
Furthermore, the electronic device can record the user's usage habits over a period of time, for example, a record of the user deleting unread messages, and selectively display unread messages in the third area according to these habits. In this way, the displayed messages match the user's expectations, the user experience is improved, the operations required for the user to check unread messages are reduced, and the operation efficiency is improved.
In another possible embodiment, the third user interface further includes a second control. After the one or more unread messages of the second application are displayed in the third area, the method further includes: when the electronic device detects a user operation acting on the second control, cancelling display of the one or more unread messages of the second application in the third area. The user operation may be a click operation. The second control may be the eliminate red dots control 704 shown in a of fig. 7.
Further, after canceling the display of the one or more unread messages of the second application in the third area, the method further includes: displaying the first user interface; wherein the icon of the second application does not have the mark with the unread message.
According to this embodiment of the application, the marked application icons can be displayed together in the second user interface, and different elimination policies are applied to the marks on application icons of different application types, so that the user can eliminate the marks on some of the application icons as needed, which improves the efficiency of processing the marks. The user does not need to check each application whose icon carries a mark, which reduces user operations and improves operation efficiency.
Embodiments of the present application also provide a computer-readable storage medium having instructions stored therein which, when executed on a computer or processor, cause the computer or processor to perform one or more steps of any one of the methods described above. If the constituent modules of the above apparatus are implemented in the form of software functional units and sold or used as independent products, they may be stored in the computer-readable storage medium.
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When software is used, the implementation may take the form, wholly or partially, of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the application are produced wholly or partially. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless (for example, infrared, radio, or microwave) means. The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)), among others.
A person of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.

Claims (17)

1. A message processing method is applied to electronic equipment and is characterized by comprising the following steps:
the electronic device displays a first user interface, the first user interface including an icon of a first application, an icon of a second application, an icon of a third application, an icon of a fourth application, and an icon of a fifth application; the icons of the first application, the second application, the third application and the fourth application are respectively provided with marks of unread messages, and the icon of the fifth application is not provided with the marks of the unread messages;
when the electronic device detects a first user operation, the electronic device displays a second user interface; wherein the second user interface comprises an icon of the first application, an icon of the second application, an icon of the third application, an icon of the fourth application, and a first control;
the electronic equipment detects user operation acting on the first control;
the electronic device displaying the first user interface; the icon of the first application and the icon of the second application do not have the mark with the unread message, and the icon of the third application and the icon of the fourth application respectively have the mark with the unread message.
2. The method of claim 1, wherein the second user interface comprises a first area and a second area; the icon of the first application and the icon of the second application are displayed in the first area, and the icon of the third application and the icon of the fourth application are displayed in the second area.
3. The method of claim 2, wherein the first application and the second application are of a first application type, and the third application and the fourth application are of a second application type; after the displaying the second user interface, the method further comprises: when the electronic equipment detects a second user operation acting on the icon of the third application in the second user interface, the application type of the third application is changed from the second application type to the first application type.
4. The method according to claim 3, wherein the second user operation is an operation of causing an icon of the third application to be displayed in the first area.
5. The method of any of claims 1-4, wherein after the electronic device displays the second user interface, before the electronic device detects the user operation acting on the first control, the method further comprises:
and when the electronic equipment detects the user operation acting on the icon of the first application in the second user interface, displaying a third user interface.
6. The method of claim 5, wherein the third user interface is an interface of the first application.
7. The method of claim 5, wherein the third user interface comprises a third area, and the third area comprises one or more unread messages of the first application.
8. The method of claim 7, wherein the electronic device displays an interface of the first application when detecting a user operation acting on the third area.
9. The method of claim 7 or 8, wherein the third user interface further comprises a fourth area in which the icon of the first application and the icon of the second application are displayed;
the method further comprises the following steps:
when the electronic equipment detects user operation acting on the icon of the second application in the fourth area, one or more unread messages of the second application are displayed in the third area.
10. The method according to claim 7 or 9, wherein when the electronic device detects a user operation acting on any one of the one or more unread messages, the display of the any one of the unread messages is cancelled in the third area.
11. The method of claim 9 or 10, further comprising a second control in the third user interface;
after the displaying of the one or more unread messages of the second application in the third region, the method further comprises: and when the electronic equipment detects user operation acting on the second control, canceling display of one or more unread messages of the second application in the third area.
12. The method of claim 11, wherein after canceling the display of the one or more unread messages of the second application in the third region, the method further comprises: displaying the first user interface; wherein the icon of the second application is not marked with the unread message.
13. The method of any of claims 1-12, wherein the indicia with unread messages is a red dot.
14. The method of claim 13, wherein the red dot comprises a number.
15. The method of any of claims 1-14, wherein the first user manipulation is a user's left-swipe gesture in the first user interface, or the first user manipulation is a user clicking a third control; the third control is an icon in a first user interface, or the third control is a floating button, or the third control is a control in a fourth user interface, and the fourth user interface is a user interface displayed after the electronic device detects a pull-down gesture in the first user interface.
16. An electronic device, comprising: one or more processors, memory, display screens, wireless communication modules, and mobile communication modules;
the memory, the display screen, the wireless communication module, and the mobile communication module are coupled with the one or more processors, the memory for storing computer program code, the computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the message processing method of any of claims 1-15.
17. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the message processing method of any of claims 1-15.
CN202111331750.3A 2018-12-24 2018-12-24 Message processing method and electronic equipment Pending CN114153356A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111331750.3A CN114153356A (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811584684.9A CN109766036B (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment
CN202111331750.3A CN114153356A (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201811584684.9A Division CN109766036B (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment

Publications (1)

Publication Number Publication Date
CN114153356A true CN114153356A (en) 2022-03-08

Family

ID=66450880

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202111331750.3A Pending CN114153356A (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment
CN201811584684.9A Active CN109766036B (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201811584684.9A Active CN109766036B (en) 2018-12-24 2018-12-24 Message processing method and electronic equipment

Country Status (1)

Country Link
CN (2) CN114153356A (en)


Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110231900A (en) * 2019-05-29 2019-09-13 维沃移动通信有限公司 A kind of application icon display methods and terminal
CN110266888B (en) * 2019-06-25 2021-08-03 努比亚技术有限公司 Method for acquiring number of corner marks by image recognition, mobile device and storage medium
CN110737371A (en) * 2019-09-04 2020-01-31 珠海格力电器股份有限公司 information display method, device and computer readable medium
CN110855830A (en) * 2019-10-30 2020-02-28 维沃移动通信有限公司 Information processing method and electronic equipment
CN110989877B (en) * 2019-10-30 2021-09-17 重庆小雨点小额贷款有限公司 Message management method, related equipment and computer readable storage medium
CN110912807A (en) * 2019-11-22 2020-03-24 北京奇艺世纪科技有限公司 Information prompting method and device, electronic equipment and computer readable storage medium
CN111158539A (en) * 2019-11-27 2020-05-15 华为技术有限公司 Method for processing unread message and terminal equipment
CN113207111B (en) * 2020-01-16 2022-09-16 华为技术有限公司 Data sending method and mobile equipment
CN113220112B (en) * 2020-01-21 2023-07-18 华为技术有限公司 Gesture recognition method, gesture recognition device, terminal equipment and computer storage medium
CN113849090B (en) * 2020-02-11 2022-10-25 荣耀终端有限公司 Card display method, electronic device and computer readable storage medium
CN111475223B (en) * 2020-03-31 2023-02-14 深圳市富途网络科技有限公司 Management method and device for information reminding
CN111506248A (en) * 2020-04-21 2020-08-07 深圳市Tcl云创科技有限公司 Notification information processing method and device, storage medium and mobile terminal
CN111596819A (en) * 2020-04-27 2020-08-28 维沃移动通信有限公司 Unread message processing method and electronic equipment
CN114500728B (en) * 2020-11-13 2023-07-18 华为终端有限公司 Incoming call bell setting method, incoming call prompting method and electronic equipment
CN114640741A (en) * 2020-12-15 2022-06-17 华为技术有限公司 Unread message management method and unread message management equipment
CN113821132B (en) * 2021-07-27 2023-08-15 荣耀终端有限公司 Message processing method and device
CN113727287A (en) * 2021-08-27 2021-11-30 展讯通信(天津)有限公司 Short message notification method and electronic terminal equipment
CN113485857B (en) * 2021-09-07 2021-12-21 天津中新智冠信息技术有限公司 Message processing method and device, electronic equipment and storage medium
CN114020197B (en) * 2021-09-30 2022-11-22 荣耀终端有限公司 Cross-application message processing method, electronic device and readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3792974B2 (en) * 2000-01-20 2006-07-05 富士通株式会社 Communication support method and conversation apparatus
CN104965639B (en) * 2015-06-30 2019-06-28 努比亚技术有限公司 The footmark management method and device of application icon
CN106533928B (en) * 2016-12-29 2021-01-05 努比亚技术有限公司 Method and device for updating unread message reminding identification
CN107623787B (en) * 2017-09-19 2019-03-01 维沃移动通信有限公司 A kind of footmark information processing method and terminal
CN108846135A (en) * 2018-07-06 2018-11-20 佛山市灏金赢科技有限公司 The method and system of prompting message are eliminated in a kind of social application client

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116032876A (en) * 2022-08-10 2023-04-28 荣耀终端有限公司 Unread message notification processing method, electronic equipment and storage medium
WO2024032037A1 (en) * 2022-08-10 2024-02-15 荣耀终端有限公司 Method for processing unread-message notification, and electronic device and storage medium

Also Published As

Publication number Publication date
CN109766036A (en) 2019-05-17
CN109766036B (en) 2021-11-19

Similar Documents

Publication Publication Date Title
CN109766036B (en) Message processing method and electronic equipment
CN112130742B (en) Full screen display method and device of mobile terminal
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
CN112217923B (en) Display method of flexible screen and terminal
WO2020134869A1 (en) Electronic device operating method and electronic device
CN111669459B (en) Keyboard display method, electronic device and computer readable storage medium
CN111078091A (en) Split screen display processing method and device and electronic equipment
CN111258700B (en) Icon management method and intelligent terminal
CN111913750B (en) Application program management method, device and equipment
CN114556294A (en) Theme switching method and theme switching device
CN114363462B (en) Interface display method, electronic equipment and computer readable medium
CN114461111A (en) Function starting method and electronic equipment
CN110633043A (en) Split screen processing method and terminal equipment
CN114077365A (en) Split screen display method and electronic equipment
CN113805797A (en) Network resource processing method, electronic device and computer readable storage medium
CN113641271A (en) Application window management method, terminal device and computer readable storage medium
CN113986070A (en) Quick viewing method for application card and electronic equipment
CN113746961A (en) Display control method, electronic device, and computer-readable storage medium
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
CN115129410A (en) Desktop wallpaper configuration method and device, electronic equipment and readable storage medium
CN112449101A (en) Shooting method and electronic equipment
WO2020192716A1 (en) System language switching method and related apparatus
CN114064160A (en) Application icon layout method and related device
CN113542574A (en) Shooting preview method under zooming, terminal, storage medium and electronic equipment
CN113438366A (en) Information notification interaction method, electronic device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination