CN113645349B - Message processing method of terminal, medium and electronic device

Info

Publication number
CN113645349B
CN113645349B
Authority
CN
China
Prior art keywords
message
user
terminal
display mode
progress
Prior art date
Legal status
Active
Application number
CN202010343422.4A
Other languages
Chinese (zh)
Other versions
CN113645349A (en)
Inventor
熊刘冬
李春东
白锦华
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010343422.4A priority Critical patent/CN113645349B/en
Priority to PCT/CN2021/088922 priority patent/WO2021218764A1/en
Publication of CN113645349A publication Critical patent/CN113645349A/en
Application granted granted Critical
Publication of CN113645349B publication Critical patent/CN113645349B/en

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/725 Cordless telephones

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to a message processing method for a terminal, a terminal, a medium and an electronic device. The method comprises the following steps: the terminal determines the position at which a user of the terminal is gazing on the terminal screen, where at least one message is displayed on the screen; when the terminal determines that the gaze position falls within the message range of a message displayed on the screen, it determines the user's progress in browsing that message; and, according to that progress, the display mode of at least a part of the display elements within the message range of the message is changed on the screen from a first display mode to a second display mode, where messages not yet browsed by the user are shown in the first display mode. The application tracks, by means of eye-tracking technology, how the user views the messages appearing on the terminal screen and, according to the viewing progress, displays viewed and unviewed messages differently on the screen, so that the user can easily distinguish viewed messages from unviewed ones and important messages are not missed.

Description

Message processing method of terminal, medium and electronic device
Technical Field
The present application relates to the field of information processing, and in particular, to a method, a terminal, a medium, and an electronic device for processing a message of a terminal.
Background
With the development of communication technology and intelligent terminals, smart devices such as smartphones and tablet computers play an increasingly important role in people's daily work and life. Terminal devices run more and more applications, and these applications continuously push notification messages (such as missed calls, unread messages, memos and other reminders) for the user to check.
When a user cannot check these notifications one by one, a backlog of unprocessed messages builds up while new messages keep popping up. Important messages among the unread ones are easily buried by the continuously arriving messages, so the user may overlook important notifications or to-do items, which causes trouble in the user's work and life.
Disclosure of Invention
The embodiments of the application provide a message processing method for a terminal, a terminal, a medium and an electronic device. In this technical scheme, eye-tracking technology is used to track how the user views the messages appearing on the screen of the electronic device, and viewed and unviewed messages are displayed differently on the screen according to the user's progress in viewing the notification messages. The user can therefore quickly tell which messages have already been seen and which still require attention, which avoids the problem of important messages being missed because there are too many notifications.
In a first aspect, an embodiment of the present application provides a method for processing a message of a terminal, including:
the terminal determines the position at which a user of the terminal is gazing on the terminal screen, where at least one message is displayed on the screen; when the terminal determines that the gaze position falls within the message range of one of the messages displayed on the screen, it determines the user's progress in browsing that message; and, according to that progress, the display mode of at least a part of the display elements within the message range of the message is changed on the screen from a first display mode to a second display mode, where messages not yet browsed by the user are shown in the first display mode. In this way the user can quickly distinguish which messages have already been seen and which still require attention, avoiding the problem of missing important messages, such as important notifications, calls, to-do items or reminders, because there are too many notifications.
In a possible implementation of the first aspect, the method further includes: the message range of the message is the area occupied by the message frame of the message on the screen of the terminal.
In a possible implementation of the first aspect, changing the display mode of at least a part of the display elements within the message range from the first display mode to the second display mode according to the user's browsing progress includes: when the user has finished browsing the message, changing the background color of the message range from a first color to a second color, where the background color of messages not yet browsed by the user is the first color. Here the distinguishing display element is the background color of the message, so the user can easily tell from the background color which messages have been seen and which have not.
In a possible implementation of the first aspect, changing the display mode according to the user's browsing progress includes: changing, according to the progress, the background color of the area corresponding to the already-browsed portion of the message from the first color to the second color (for example, if half of the message has been browsed, the background of that half is recolored), where the background of the portion not yet browsed by the user remains the first color. Alternatively, in some embodiments, the background color of the corresponding region within the message range may be changed according to the percentage of the user's browsing progress.
In a possible implementation of the first aspect, determining the user's progress in browsing the message when the terminal determines that the gaze position falls within the message range of one of the displayed messages includes:
when the terminal determines that the gaze position falls within the message frame of one of the displayed messages, calculating the time during which the gaze position stays within that message frame, and determining the browsing progress either as the ratio of the calculated time to a set time threshold, or by comparing the calculated time with the set time threshold and deriving the progress from the comparison result.
In a possible implementation of the first aspect, the value of the set time threshold is related to at least one of the word count of the message, the number of lines of the message, and the cumulative display length of the message. The number of lines of the message is the number of lines its displayed content occupies on the terminal screen. For example, if the full content of a message would occupy 5 lines but the terminal is locked and only 2 lines are shown, the number of lines of the message is taken as 2. The cumulative display length of a message is the cumulative number of lines actually shown when only part of the message is displayed; for example, if the message has 5 lines in total but only the first line and half of the second line are fully shown, the cumulative display length is 1.5 lines. A minimal sketch of deriving such a threshold is given below.
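The sketch below illustrates one way such a threshold could be derived. It is an assumption for illustration only, not the patent's formula: the per-word reading time and the minimum dwell time are made-up constants, and the threshold is scaled by the fraction of the message actually visible on screen.

```kotlin
// Illustrative sketch (assumed constants): dwell-time threshold from word count
// and the number of lines actually displayed.
data class DisplayedMessage(
    val totalWords: Int,        // words in the full message
    val totalLines: Int,        // lines the full message would occupy on screen
    val displayedLines: Double  // lines actually visible, possibly fractional, e.g. 1.5
)

fun readTimeThresholdMs(
    msg: DisplayedMessage,
    msPerWord: Long = 250L,     // assumed average reading time per word
    floorMs: Long = 80L         // minimum dwell so very short messages still need a glance
): Long {
    // Only the visible fraction of the message has to be read for it to count as viewed.
    val visibleFraction =
        (msg.displayedLines / msg.totalLines.coerceAtLeast(1)).coerceIn(0.0, 1.0)
    val estimate = (msg.totalWords * visibleFraction * msPerWord).toLong()
    return maxOf(floorMs, estimate)
}
```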
In a possible implementation of the first aspect, determining the user's progress in browsing the message when the terminal determines that the gaze position falls within the message range of one of the displayed messages includes:
when the terminal determines that the gaze position falls within the message frame of one of the displayed messages, calculating, from the successive gaze positions, the cumulative distance over which the user's gaze has swept the message frame, and determining the browsing progress either as the ratio of that cumulative distance to a set distance threshold, or by comparing the cumulative distance with the set distance threshold and deriving the progress from the comparison result. A sketch of this distance-based calculation follows.
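A minimal sketch of the cumulative-distance idea, under the assumption that gaze samples arrive as screen coordinates and that the distance threshold (for example the total on-screen text length of the message) is supplied by the caller; the types and names are illustrative, not the patent's.

```kotlin
import kotlin.math.hypot

data class GazePoint(val x: Float, val y: Float)
data class MessageFrame(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: GazePoint) = p.x in left..right && p.y in top..bottom
}

class DistanceProgressTracker(
    private val frame: MessageFrame,
    private val distanceThresholdPx: Float
) {
    private var last: GazePoint? = null
    private var accumulatedPx = 0f

    /** Feed one gaze sample; returns browsing progress in [0, 1]. */
    fun onGazeSample(p: GazePoint): Float {
        if (frame.contains(p)) {
            // Add the path length from the previous in-frame sample to this one.
            last?.let { accumulatedPx += hypot(p.x - it.x, p.y - it.y) }
            last = p
        } else {
            // Gaze left the frame: keep the accumulated distance, restart the path.
            last = null
        }
        return (accumulatedPx / distanceThresholdPx).coerceAtMost(1f)
    }
}
```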
In one possible implementation of the first aspect, the method further includes: the value of the set distance threshold is related to at least one of the number of words, the number of lines of the message, and the cumulative display length of the message.
In a possible implementation of the first aspect, the method further includes: the message range includes the area occupied by the message itself on the screen of the terminal.
In a possible implementation of the first aspect, changing the display mode according to the user's browsing progress includes: changing the color of the already-browsed portion of the message on the terminal screen from a first color to a second color according to the progress (for example, when half of the message has been browsed, the background of that half is changed), where messages not yet browsed by the user keep the first color. In some embodiments, the background color of the corresponding region within the message range may be changed according to the percentage of the user's browsing progress.
In a possible implementation of the first aspect, changing the display mode according to the user's browsing progress includes: when the message is judged to have been viewed, displaying it in collapsed form on the terminal screen. By collapsing viewed messages (for example, showing only the number of recently viewed messages rather than their content), the unviewed messages become more prominent to the user.
In a possible implementation of the first aspect, the classification of at least a part of the display elements within the message range is changed from a first classification to a second classification according to the user's browsing progress; for example, browsed messages are classified as read and shown at the top of the screen, while unviewed messages are classified as unread and listed one by one in the middle of the screen.
In a possible implementation of the first aspect, changing the display mode according to the user's browsing progress includes:
changing the display mode of at least a part of the display elements within the message range from the first display mode to the second display mode through an intermediate transition state, where the intermediate transition state includes at least one display mode different from both the first and the second display mode. For example, as the message is viewed, its background color transitions from red through orange to gray; a small color-interpolation sketch is given below.
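The following sketch shows one way to realise such an intermediate transition state by interpolating the background colour along the browsing progress. The red/orange/gray palette and the ARGB representation are assumptions for illustration, not the patent's values.

```kotlin
// Map browsing progress in [0, 1] to a colour that passes through an
// intermediate state: red -> orange -> gray.
data class Argb(val a: Int, val r: Int, val g: Int, val b: Int)

private fun lerp(from: Int, to: Int, t: Float) = (from + (to - from) * t).toInt()

fun blend(from: Argb, to: Argb, t: Float) = Argb(
    lerp(from.a, to.a, t), lerp(from.r, to.r, t),
    lerp(from.g, to.g, t), lerp(from.b, to.b, t)
)

fun transitionBackground(progress: Float): Argb {
    val red = Argb(255, 244, 67, 54)      // not yet browsed
    val orange = Argb(255, 255, 152, 0)   // intermediate transition state
    val gray = Argb(255, 158, 158, 158)   // fully viewed
    return when {
        progress <= 0.5f -> blend(red, orange, progress / 0.5f)
        else -> blend(orange, gray, (progress - 0.5f) / 0.5f)
    }
}
```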
In a possible implementation of the first aspect, changing the display mode according to the user's browsing progress includes: when the message is judged to have been viewed, displaying within its message range an icon indicating that it has been viewed, so that the user can tell from the icon whether the message has been seen.
In a second aspect, an embodiment of the present application provides a terminal, including:
an eye-tracking module, configured to determine the position at which a user of the terminal is gazing on the terminal screen, where at least one message is displayed on the screen;
a progress calculation module, configured to determine the user's progress in browsing a message when the terminal determines that the gaze position falls within the message range of one of the messages displayed on the screen;
and a message management module, configured to change, according to the user's browsing progress, the display mode of at least a part of the display elements within the message range of the message from a first display mode to a second display mode, where messages not yet browsed by the user are shown in the first display mode.
In a third aspect, an embodiment of the present application provides a computer-readable medium storing instructions which, when executed on a computer, cause the computer to perform the message processing method of the terminal according to the first aspect or any of its possible implementations.
In a fourth aspect, an embodiment of the present application provides an electronic device, including:
a memory for storing instructions to be executed by one or more processors of the system, and
a processor, being one of the processors of the system, configured to perform the message processing method of the terminal according to the first aspect or any of its possible implementations.
Drawings
Fig. 1 (a) is a diagram illustrating a display interface for a handset to receive multiple notification messages in a locked screen state, according to some embodiments of the present application;
fig. 1 (b) illustrates a display interface diagram showing different background colors for a read message and an unread message notification box displayed on a cell phone screen, according to some embodiments of the present application;
FIG. 2 illustrates a block diagram of a hardware configuration of a handset, according to some embodiments of the present application;
FIG. 3 illustrates a scene graph for eye tracking using a cell phone, according to some embodiments of the present application;
FIG. 4 illustrates a block diagram of hardware and software associated with the message processing techniques of the present application in a handset, according to some embodiments of the present application;
FIG. 5 illustrates a message processing flow diagram for the software system shown in FIG. 4, according to some embodiments of the present application;
FIG. 6 (a) illustrates a display interface with a cell phone in a locked state with messages on a home screen not viewed, according to some embodiments of the present application;
fig. 6 (b) shows a display interface where part of the message on the main screen has been viewed with the handset in the locked state, according to some embodiments of the present application;
FIG. 7 illustrates a flow diagram of a message processing method for a handset, according to some embodiments of the present application;
fig. 8 (a) illustrates a display interface with different background colors for viewed and unviewed messages on a main screen of a cell phone in a locked state, according to some embodiments of the present application;
FIG. 8 (b) is a display interface showing categorized display of viewed and unviewed messages on a home screen of a cell phone in a locked screen state, according to some embodiments of the present application;
fig. 8 (c) illustrates a display interface where messages that have been viewed on a home screen of a cell phone are displayed with icons in a locked screen state, according to some embodiments of the present application;
FIG. 9 illustrates a block diagram of a message processing apparatus, according to some embodiments of the present application;
FIG. 10 illustrates a block diagram of a system, according to some embodiments of the present application;
fig. 11 illustrates a block diagram of a system on a chip (SoC), according to some embodiments of the disclosure.
Detailed Description
The illustrative embodiments of the present application include, but are not limited to, a terminal message processing method, apparatus, medium, and electronic device.
Embodiments of the present application are described in further detail below with reference to the accompanying drawings.
The embodiments of the application disclose a message processing method for an electronic device. Eye-tracking technology is used to track how the user views the messages appearing on the screen of the electronic device, and viewed and unviewed messages are displayed differently on the screen according to the user's progress in viewing the notification messages, so that the user can quickly tell which messages have been seen and which still require attention, avoiding the problem of missing important messages such as notifications, calls, to-do items or reminders because there are too many of them. It is to be appreciated that the electronic device provided herein can be any device whose display can show message notifications and which is capable of detecting eye movement, including but not limited to tablet computers, smartphones, laptop computers, desktop computers, wearable electronic devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices and the like, where wearable electronic devices include, without limitation, smart watches, smart bracelets, smart glasses, smart helmets and smart headbands. For ease of explanation, the technical solution of the present application is described below taking the mobile phone 10 as an example.
Specifically, fig. 1 (a) shows a display interface in which the mobile phone 10 receives several notification messages in the lock-screen state. As shown in fig. 1 (a), the main screen of the mobile phone 10 is locked and its display shows four notification messages: two short messages (a bill message from 10086 and a short message from a client), a missed call from Li, and a WeChat message. As shown in fig. 1 (b), when the user looks at the locked main screen, the mobile phone 10 tracks the position at which the user's eyes (hereinafter simply "the eyes") gaze on the main screen to determine whether a notification message has been viewed, and distinguishes the background colors of the notification boxes of read and unread messages. For example, in fig. 1 (b) it is detected that the user has viewed the short message from the client and the missed-call message from Li, so the background colors of those two notification boxes are changed and differ from the background color of the unviewed messages. The solution also applies to other scenarios in which the display of the mobile phone 10 shows notification messages, for example the message list that appears after the user pulls down the notification bar. In the prior art, when the user does not manually unlock the main screen and check each notification in the notification center one by one, or checks only some of them, it is often difficult to tell which messages have been viewed and which have not (important notifications, calls, to-do or reminder items, and so on) as new notifications keep arriving; if an important message is missed, the user's work and life may be seriously affected. The technical scheme of the application solves this problem: without any special operation, the user learns which messages have been viewed from the change in their background, and the operation process is simple and fast.
It will be appreciated that the notification messages displayed on the screen of the mobile phone 10 may include call logs, notes, third-party application notifications and the like. Moreover, the messages on the screen are not necessarily the most recently pushed notifications; several historical notifications may have accumulated. When read and unread notifications are distinguished by background color, either the background of the notification box or the background of the message text itself may be distinguished. In addition to changing the background color of the message box, read and unread notifications may be distinguished in other ways, such as bolding the font of read messages, displaying read messages in a different font from unread ones, adding an icon to read messages indicating that they have been seen, or sorting and grouping read and unread messages.
Fig. 2 shows a schematic structural diagram of a mobile phone 10 according to an embodiment of the present application. The mobile phone 10 can execute the message processing method disclosed in the embodiment of the present application. In fig. 2, like parts have the same reference numerals. As shown in fig. 2, the mobile phone 10 may include a processor 110, a power supply module 140, a memory 180, a mobile communication module 130, a wireless communication module 120, an infrared light emitting module 101, a sensor module 190, an audio module 150, an infrared camera 170, an interface module 160, a display screen 102, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the mobile phone 10. In other embodiments of the present application, the mobile phone 10 may include more or fewer components than shown, some components may be combined or split, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, for example a processing module or processing circuit that may include a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a micro-programmed control unit (MCU), an artificial-intelligence (AI) processor, a programmable logic device such as a field-programmable gate array (FPGA), and so on. The different processing units may be separate devices or may be integrated into one or more processors. The processor 110 may be configured to process the eye video or images captured by the infrared camera 170 to obtain the position at which the eye gazes on the screen of the mobile phone 10. A storage unit may be provided in the processor 110 for storing instructions and data; in some embodiments, this storage unit is a cache. The memory 180 may store an operating system and the application programs required by at least one function (for example, an application for capturing video), and may also store the eye video or images captured by the infrared camera 170 and the gaze-position information that the processor 110 obtains by processing them.
The power module 140 may include a power supply, power management components, and the like. The power source may be a battery. The power management component is used for managing the charging of the power supply and the power supply of the power supply to other modules. The charging management module is used for receiving charging input from the charger; the power management module is used to connect a power source, the charging management module and the processor 110.
The mobile communication module 130 may include, but is not limited to, an antenna, a power amplifier, a filter, a low-noise amplifier (LNA) and the like. The mobile communication module 130 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the mobile phone 10. It can receive electromagnetic waves from the antenna, filter and amplify the received signals and pass them to the modem processor for demodulation; it can also amplify signals modulated by the modem processor and convert them into electromagnetic waves radiated through the antenna. In some embodiments, at least some functional modules of the mobile communication module 130 may be disposed in the processor 110, or in the same device as at least some modules of the processor 110. The wireless communication technologies may include the global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth (BT), global navigation satellite systems (GNSS), wireless local area networks (WLAN), near field communication (NFC), frequency modulation (FM), infrared (IR) and the like. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite-based augmentation systems (SBAS).
The wireless communication module 120 may include an antenna and implement transmission and reception of electromagnetic waves via the antenna. The wireless communication module 120 may provide solutions for wireless communication applied to the mobile phone 10, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite systems (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and the like. The mobile phone 10 may communicate with networks and other devices via these wireless communication technologies.
In some embodiments, the mobile communication module 130 and the wireless communication module 120 of the handset 10 may also be located in the same module.
The display screen 102 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum-dot light-emitting diode (QLED) or the like. The display screen 102 is used, for example, to display the notification messages (such as missed calls, unread messages, memos and other reminders) of the various applications of the mobile phone 10.
The infrared light emitting module 101 includes an infrared light emitter, which can emit infrared light to the human eye to determine the position of the human eye gazing at the display screen 102 according to the position of the infrared light on the cornea of the human eye. Infrared light emitters include, but are not limited to, infrared light emitting diodes, and the like.
The sensor module 190 may include a proximity light sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The audio module 150 may convert digital audio information into an analog audio signal output or convert an analog audio input into a digital audio signal. The audio module 150 may also be used to encode and decode audio signals. In some embodiments, the audio module 150 may be disposed in the processor 110, or some functional modules of the audio module 150 may be disposed in the processor 110. In some embodiments, audio module 150 may include speakers, earphones, a microphone, and a headphone interface.
In some embodiments, the infrared camera 170 may capture a video or an image including a human eye, so as to obtain a position of the infrared light emitted by the infrared light emitting module 101 in a cornea of the user's eye by processing the video or the image including the human eye, thereby determining a position where the user's eye gazes at the display screen 102 of the mobile phone 10. It will be appreciated that the handset 10 may also include a conventional camera (not shown) for capturing color images or video.
The interface module 160 includes an external memory interface, a universal serial bus (USB) interface, a subscriber identity module (SIM) card interface and the like. The external memory interface may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 10; the external memory card communicates with the processor 110 through the external memory interface to implement data storage. The USB interface is used for the mobile phone 10 to communicate with other devices. The SIM card interface is used to communicate with a SIM card installed in the mobile phone 10, for example to read a telephone number stored on the SIM card or to write a telephone number to it.
In some embodiments, the handset 10 also includes keys, motors, indicators, and the like. The keys may include a volume key, an on/off key, and the like. The motor is used to cause a vibration effect to the handset 10, for example when the user's handset 10 is being called, to prompt the user to answer an incoming call to the handset 10. The indicators may include laser indicators, radio frequency indicators, LED indicators, and the like.
Fig. 3 illustrates a scenario in which the mobile phone 10 performs eye tracking, according to some embodiments of the present application. In the embodiment shown in fig. 3, the infrared light emitting module 101 has three infrared light emitters that emit infrared light towards the human eye; after receiving this light, the eye forms three infrared reflection points on its cornea (reflection points A, B and C in fig. 3). The infrared camera 170 captures video or images of the eye, from which the positions of the corneal reflection points and the pupil centre are obtained, and from these the position at which the eye gazes on the screen of the mobile phone 10 (i.e. where the line of sight falls on the screen) is determined. Specifically, in some embodiments, the gaze position on the mobile phone 10 may be determined as follows:
the eye image is processed to obtain grey-level gradients along a set direction; the locations of maximum gradient are taken as the pupil-edge features of the eye image; these edge features are fitted, and the centre of the fitted shape is taken as the pupil centre in the eye image. From the grey-level image obtained after processing the eye image, regions of higher grey value are identified with a spot-recognition algorithm as Purkinje spots, whose positions are the light-spot (or "reflection point") positions. The position at which the eye gazes on the screen of the mobile phone 10 (i.e. where the line of sight falls on the screen) is then determined from the pupil centre, the centre of corneal curvature and the preset angle between the optical axis and the visual axis of the eye. A heavily simplified sketch of this pupil-centre/corneal-reflection idea is given below.
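The sketch below is a heavily simplified illustration of the pupil-centre/corneal-reflection idea. Instead of the full 3-D model (centre of corneal curvature and optical-axis/visual-axis angle) described above, it maps the 2-D pupil-to-glint vector to screen coordinates with a linear calibration obtained from known fixation points; this simplification and all names are assumptions, not the patent's formula.

```kotlin
data class Vec2(val x: Float, val y: Float) {
    operator fun minus(o: Vec2) = Vec2(x - o.x, y - o.y)
}

/** Coefficients obtained by having the user fixate known screen points (assumed). */
data class GazeCalibration(
    val xScaleX: Float, val xScaleY: Float, val xOffset: Float, // screenX = xScaleX*vx + xScaleY*vy + xOffset
    val yScaleX: Float, val yScaleY: Float, val yOffset: Float  // screenY = yScaleX*vx + yScaleY*vy + yOffset
)

fun estimateGaze(pupilCentre: Vec2, glintCentre: Vec2, cal: GazeCalibration): Vec2 {
    // The pupil-to-corneal-reflection vector changes as the eye rotates.
    val v = pupilCentre - glintCentre
    return Vec2(
        cal.xScaleX * v.x + cal.xScaleY * v.y + cal.xOffset,
        cal.yScaleX * v.x + cal.yScaleY * v.y + cal.yOffset
    )
}
```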
It will be appreciated that in other embodiments, the location of the eye looking at the screen of the handset 10 may be confirmed in other ways, and is not limited to the above method.
Furthermore, it is understood that in other embodiments, infrared light emitting module 101 may further include more than three infrared light emitters arranged in a predetermined manner, such as a delta shape, a straight shape, and the like. In other embodiments, the number of the infrared cameras 170 may also be two or more, and is not limited herein.
Fig. 4 illustrates the hardware and software associated with the message processing technique of the present application in the mobile phone 10 of fig. 3, according to some embodiments. As shown in fig. 4, the mobile phone 10 includes a hardware system 103 and a software system 104. The hardware system 103 comprises the infrared camera 170, the infrared light emitting module 101 and the processor 110; the software system 104 includes an operating system 107, which contains a message management module 105 and an eye tracking module 106. The infrared light emitting module 101 includes three infrared light emitters serving as the infrared light source. The infrared camera 170 captures video or images of the human eye and sends them to the processor 110, which obtains the positions of the corneal reflection points and the pupil centre and from them determines the position at which the eye gazes on the screen of the mobile phone 10, as described with reference to fig. 3. The eye tracking module 106 continuously follows the movement of the eyes and feeds back the gaze position on the screen in real time. The message management module 105 is the module in the mobile phone 10 that sets and dynamically changes the display characteristics of notification messages; for example, the color of the message notification bar and the font size of the text can be changed through the message management module 105. In the embodiments provided by the application, the message management module 105 is further configured to determine, from the gaze position provided by the eye tracking module 106, whether the eyes have viewed a notification message, and to distinguish the background color of read messages from that of unread ones, so that the user can quickly tell which messages have been seen and which still require attention. The user only needs to move their eyes, the operation process is simple and fast, and the user experience is improved.
It is understood that the message management module 105 and the eye tracking module 106 provided in the embodiment of the present application are modules divided according to functions, and in other embodiments, the message management module 105 and the eye tracking module 106 may be combined into one module or divided into more modules. The hardware and software system architecture of the mobile phone 10 illustrated in the embodiment of the present application does not specifically limit the mobile phone 10. In other embodiments of the present application, the handset 10 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used.
Fig. 5 illustrates a message processing flow of the handset 10 shown in fig. 4, according to some embodiments of the present application. Specifically, as shown in fig. 5, the method includes:
1) The message management module 105 and the eye tracking module 106 establish a connection (500). In some embodiments, the message management module 105 triggers the eye tracking module 106 to turn on eye tracking when a new notification message is received (for example, while the main screen of the mobile phone 10 is locked). In some embodiments, the eye tracking module 106 is triggered when the message management module 105 receives an operation of pulling down the notification bar by the user (for example, when the screen is unlocked). In some embodiments, the eye tracking module 106 is triggered when the message management module 105 receives a notification alert from an application of the mobile phone 10 (for example, a pop-up tip from a chat application appears on the display while the user is browsing a web page with a search engine).
When the eye tracking module 106 receives the registration of the message management module 105, the message management module 105 establishes a connection with the eye tracking module 106.
2) The eyeball tracking module 106 sends the position coordinates of the human eye gazing at the screen of the mobile phone 10 to the message management module 105 at preset time intervals (502). For example, in some embodiments, the eye tracking module 106 sends the location coordinates of the human eye's gaze at the screen of the handset 10 to the message management module 105 every 10 ms.
In some embodiments, the position at which the human eye gazes on the screen of the mobile phone 10 may be determined by optical recording: the movement of the eye is recorded with a camera or video camera, i.e. eye images reflecting the eye movement are acquired, and eye features are extracted from them to build a model for gaze/fixation-point estimation. The eye features may include the pupil centre position, pupil shape, iris position, iris shape, eyelid position, eye-corner position, spot (also known as Purkinje spot) position and so on.
Specifically, the principle by which the eye tracking module 106 determines the gaze position on the screen of the mobile phone 10 may be as follows: the infrared light emitting module 101 emits infrared light towards the eye, and the infrared camera 170 records a video of the eye movement while also capturing the reflection point (also called a spot) of the infrared light on the cornea, yielding eye images containing the spot. The eye tracking module 106 analyses these images to obtain the pupil centre and the position of the corneal reflection point, and from them determines the position at which the eye gazes on the screen. As the eye rotates, the relative position of the pupil centre and the corneal reflection point changes; the spot images collected by the infrared camera 170 reflect this change, and the eye tracking module 106 uses it to track the gaze position on the screen as the eye moves.
It will be appreciated that in other embodiments, the location of the eye looking at the handset 10 may be ascertained in other ways, and is not limited to the above method.
3) The message management module 105 determines, from the received gaze coordinates and the layout information of the message notification boxes of the mobile phone 10, whether the gaze position falls within a message notification box (504). In some embodiments, when the mobile phone 10 receives a new message, the message management module 105 obtains the layout information of the corresponding message box on the screen (i.e. the coordinate range the box occupies) and checks whether the gaze coordinates fall within that coordinate range. It will be appreciated that when the eye is reading a notification message, the gaze position on the screen necessarily lies within the message box corresponding to that message. A sketch of this hit test is shown below.
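A minimal sketch of the hit test, assuming the notification-box layout is available as axis-aligned rectangles in screen coordinates; the data model is illustrative only.

```kotlin
data class Gaze(val x: Float, val y: Float)
data class NotificationBox(
    val messageId: Long,
    val left: Float, val top: Float, val right: Float, val bottom: Float
)

// Returns the notification box the gaze point falls into, or null if none.
fun hitTest(gaze: Gaze, boxes: List<NotificationBox>): NotificationBox? =
    boxes.firstOrNull { gaze.x in it.left..it.right && gaze.y in it.top..it.bottom }
```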
4) If the gaze position falls within a message notification box, the message management module 105 calculates the progress of the eye in browsing the message, based either on how long the eye has been gazing at the message or on the continuously received gaze positions (506). In some embodiments, the progress is calculated from the length of time during which the gaze position stays within the corresponding notification box; specifically, the progress may be the ratio of that time to a set time threshold. Timing starts as soon as the gaze position is determined to fall within the notification box. For example, if the eye tracking module 106 sends a gaze position to the message management module 105 every 10 ms from the moment the gaze is determined to fall within the box, and the gaze stays in the box for 100 consecutive ms, the eye has browsed the message for at least 100 ms. If the eye needs 200 ms to browse the whole message (i.e. the set time threshold is 200 ms), the progress after 100 ms of browsing is 50%; if the eye needs 120 ms (i.e. the threshold is 120 ms), the progress after 100 ms is 83.33%. The set time threshold of a message may be related to at least one of its word count, its number of lines and its cumulative display length: for the same font size, the more words, the more lines, or the longer the cumulative display length, the larger the threshold. A sketch of this dwell-time calculation follows.
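The sketch below follows the time-based calculation described above, assuming gaze samples arrive at a fixed interval (10 ms in the example) and that the per-message time threshold is supplied by the caller; the class and parameter names are illustrative.

```kotlin
class DwellProgressTracker(
    private val thresholdMs: Long,          // e.g. 200 ms to fully read this message
    private val sampleIntervalMs: Long = 10L
) {
    private var dwellMs = 0L

    /** Call once per gaze sample; `insideBox` is the hit-test result for this message. */
    fun onSample(insideBox: Boolean): Float {
        if (insideBox) dwellMs += sampleIntervalMs
        // Progress is the ratio of accumulated dwell time to the message's threshold.
        return (dwellMs.toFloat() / thresholdMs).coerceAtMost(1f)
    }
}

// Example from the text: threshold 200 ms with 100 ms of dwell -> progress 0.5;
// threshold 120 ms with 100 ms of dwell -> progress ~0.83.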
It can be understood that, in some embodiments, the browsing progress may instead be determined by comparing the browsing time with a set time threshold. For example, for one or more messages displayed on the screen of the mobile phone 10 with a set threshold of 80 ms, a message whose calculated browsing time is at least 80 ms is considered viewed, and one browsed for less than 80 ms is considered not viewed.
In some embodiments, the browsing progress may be calculated from the positions at which the eye gazes on the screen of the mobile phone 10. In one embodiment, the progress is the ratio of the cumulative distance over which the eye has browsed the message inside the notification box to a set distance threshold. For example, if the message occupies two lines, the first 5 cm long and the second 3 cm long, and the cumulative distance browsed by the eye is determined to be 6.4 cm, the browsing progress is 80%. In another embodiment, the progress is calculated from the height of the message already browsed: if the message has five lines of 0.5 cm each and the eye is determined to have browsed four of them, the progress is 80%; three lines give 60%. The set distance threshold of a message may be related to at least one of its word count, its number of lines and its cumulative display length: for the same font size, the more words, the more lines, or the longer the cumulative display length, the larger the threshold. A small sketch of the per-line variant is shown below.
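A tiny sketch of the per-line variant, assuming the on-screen length of each line is known; it simply reproduces the 6.4 cm / 8 cm = 80% example above.

```kotlin
// Progress from the cumulative length of text the gaze has covered.
fun lineProgress(lineLengthsCm: List<Float>, browsedCm: Float): Float {
    val total = lineLengthsCm.sum()               // e.g. 5 cm + 3 cm = 8 cm
    return (browsedCm / total).coerceIn(0f, 1f)   // lineProgress(listOf(5f, 3f), 6.4f) == 0.8f
}
```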
It can be understood that, in some embodiments, the browsing progress may instead be determined by comparing the browsed distance with a set distance threshold. For example, for one or more messages displayed on the screen of the mobile phone 10 with a distance threshold of 3 lines, a message for which the calculated browsed distance is at least 3 lines is considered viewed, and one browsed for less than 3 lines is considered not viewed.
It will be appreciated that in some embodiments the message notification boxes may all have the same size (length and height), while in other embodiments their sizes may differ: for example, if notification box 1 contains two lines of text and notification box 2 contains ten lines of equal height, the box whose content occupies more lines is taller.
It can be understood that, in some embodiments, if during the progress calculation the gaze position is found to be outside the message notification box, the calculation is interrupted and the current progress is saved, so that it can be resumed the next time the gaze position is determined to be inside that box.
In some embodiments, besides starting the progress calculation when the gaze falls within a message notification box, in the case where a message displayed on the screen of the mobile phone 10 has no notification box (i.e. only the message content is shown), the calculation may start when the gaze is determined to fall within the area occupied by the message itself on the screen.
5) The message management module 105 determines, from the browsing progress, whether the eye has viewed the message (508). In some embodiments this is decided with a preset progress threshold. For example, with a threshold of 70%, if the eye needs 200 ms to browse the whole message, then after 100 ms the progress is 50% and the message is considered not viewed, whereas after 160 ms the progress is 80% and the message is considered viewed. As another example, if the message occupies two lines of 5 cm and 3 cm and the eye is determined to have browsed 6.4 cm in total, the progress is 80% and the message is considered viewed. The above threshold values are merely examples; in a specific application the threshold may be set as needed.
It will be appreciated that in some embodiments, for a message that has been partially browsed but is not yet determined to be viewed (i.e., its browsing progress has not reached the threshold), the current progress may be recorded so that attention to the message can continue to accumulate from that point, or the progress may be recalculated from the beginning the next time.
6) If the human eye has viewed the message, the message management module 105 changes the color of the message box of that message (510). In some embodiments, if it is determined that the human eye has viewed the message in the message notification box, the message management module 105 distinguishes the message box by its background color (e.g., the background color of an unviewed message box is white, and the message management module 105 changes it to rose after determining that the message has been viewed) to indicate that the message has been viewed. In some embodiments, the message management module 105 may also change the color of the message box gradually as the browsing progress advances. For example, in the embodiment shown in fig. 6 (a), the main screen of the mobile phone 10 is in the lock-screen state and displays four messages, namely two messages from "Guo", one missed-call notification and one WeChat message, none of which has been viewed by the user. In the embodiment shown in fig. 6 (b), the missed-call notification has been viewed and the background color of its message box is completely changed (i.e., different from the background color of unviewed messages); the message from "Guo" regarding the issue discussed in the meeting has been viewed to 70%, and the background color of the portion corresponding to the viewed 70% is changed.
It will be appreciated that in some embodiments, the messages that are viewed by the human eye may be distinguished from the messages that are not viewed in other ways, such as by bolding the font of the read message, displaying the read message in a different font than the unread message, and so forth.
FIG. 7 illustrates a message processing method of an electronic device, according to some embodiments of the present application. The following takes the mobile phone 10 shown in fig. 3 as an example to describe the message processing method provided in the embodiment of the present application in detail. Specifically, as shown in fig. 7, the method includes:
1) The position at which the human eye gazes on the mobile phone screen is determined (702). For example, the mobile phone 10 determines this position via the eye tracking module 106. Specifically, the infrared light emitting module 101 emits infrared light toward the human eye, and the infrared camera 170 captures video of the eye movement together with the reflection point of the infrared light on the cornea, thereby obtaining eye images containing the reflection point. The eyeball tracking module 106 analyzes these images to obtain the center position of the pupil and the position of the corneal reflection point, and determines from them the position at which the human eye gazes on the screen of the mobile phone 10. Moreover, as the eye rotates, the relative position of the pupil center and the corneal reflection point changes; the sequence of eye images collected by the infrared camera 170 reflects this change, and the eyeball tracking module 106 uses it to track the gaze position on the screen of the mobile phone 10 as the eye rotates.
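The text does not spell out how the pupil-center and corneal-reflection positions are mapped to a screen coordinate; a common simplification is a pre-calibrated affine mapping of the pupil-minus-glint vector, as in the hedged sketch below (Python with NumPy; the matrix A, the offset b and all numeric values are made-up calibration results, not taken from the patent):

import numpy as np

def gaze_point(pupil_center, glint_center, A, b):
    # Map the pupil-glint offset measured in the eye image (pixels)
    # to a screen coordinate; A (2x2) and b (2,) would be estimated in a
    # calibration step where the user fixates known points on the screen.
    v = np.asarray(pupil_center, dtype=float) - np.asarray(glint_center, dtype=float)
    return A @ v + b

A = np.array([[30.0, 0.0], [0.0, 28.0]])   # assumed calibration gain
b = np.array([540.0, 1170.0])              # assumed screen-centre offset

print(gaze_point((410.0, 305.0), (400.0, 300.0), A, b))  # -> [ 840. 1310.]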
2) Whether the position at which the human eye gazes on the mobile phone screen falls within a message frame is determined based on that gaze position and the layout information of the message frames displayed on the mobile phone screen (704). The message management module 105 may obtain the layout information of the message frame corresponding to a message on the screen of the mobile phone 10 (i.e., the coordinate range of the message frame on the screen) and determine whether the gaze position falls within the message notification frame by checking whether it falls within that coordinate range. Namely: if the gaze coordinate falls within the coordinate range of the message frame on the screen of the mobile phone 10, it is determined that the gaze position falls within the message notification frame; if the gaze coordinate is outside that coordinate range, it is determined that the gaze position does not fall within the message notification frame.
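The coordinate-range test itself amounts to a rectangle hit-test, sketched below (Python; the MessageBox fields and the example coordinates are illustrative assumptions):

from dataclasses import dataclass

@dataclass
class MessageBox:
    left: float
    top: float
    right: float
    bottom: float

def gaze_in_message_box(gaze_x: float, gaze_y: float, box: MessageBox) -> bool:
    # True if the gaze coordinate falls inside the box's coordinate range on the screen.
    return box.left <= gaze_x <= box.right and box.top <= gaze_y <= box.bottom

box = MessageBox(left=40, top=200, right=1040, bottom=360)
print(gaze_in_message_box(500, 250, box))  # True: counts toward this message's progress
print(gaze_in_message_box(500, 500, box))  # False: outside, progress is not advanced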
3) In the case that the position where the human eye gazes at the mobile phone screen falls within the message frame, the progress of the human eye in browsing the message within the message frame is calculated (706). In some embodiments, the progress of the human eye in browsing the message may be calculated by calculating the length of time that the location of the human eye's gaze on the screen of the handset 10 falls within the corresponding message notification box. In some embodiments, the progress of the human eye in browsing the message may be calculated by the position information of the human eye looking at the screen of the handset 10. The specific calculation method is similar to the calculation method shown in fig. 5, and for detailed description, please refer to the above, which is not repeated herein.
4) It is determined whether the human eye viewed the message within the message box based on the progress of the human eye viewing the message within the message box (708).
In some embodiments, whether the human eye has viewed the message may be determined based on a preset progress threshold. For example, the preset progress threshold is 80%, if the progress of the human eye for browsing the message in the message frame is greater than or equal to 80%, it is determined that the human eye has watched the message in the message frame, otherwise, it is determined that the human eye has not watched the message in the message frame.
5) In the event that the human eye has viewed the message within the message box, the color of the message box is changed from a first color to a second color (710). For example, in the embodiment shown in fig. 8 (a), four messages are displayed on the main screen of the mobile phone 10: two short messages, one missed-call notification and one WeChat message. When the mobile phone user has viewed the missed-call notification message, the message management module 105 gives its message frame a background color different from that of unread message frames, so that the user can quickly distinguish which messages have been viewed and which have not. The user only needs eye movement, the operation is simple and fast, and the user experience is improved.
In some embodiments, as the human eye continues to browse a message, the viewed part and the not-yet-viewed part of the message may further be distinguished by color according to the browsing progress; for example, the content of the unviewed part of the message is displayed in white and the content of the viewed part is displayed in green.
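One simple way to obtain the two differently coloured parts is to split the message text according to the progress value, as in the following illustrative sketch (Python; the rendering call that actually applies the colours is platform-specific and omitted, and the sample sentence is invented):

def split_by_progress(text: str, progress: float):
    # Return (viewed_part, unviewed_part) for a progress value clamped to [0, 1].
    cut = round(len(text) * max(0.0, min(1.0, progress)))
    return text[:cut], text[cut:]

viewed_part, remaining = split_by_progress("Are you free for the meeting at 3 pm?", 0.7)
print(viewed_part)  # would be rendered in the "viewed" colour, e.g. green
print(remaining)    # would be rendered in the "not viewed" colour, e.g. white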
In some embodiments, viewed and unviewed messages may also be grouped, with unviewed messages presented at the top of the screen of the mobile phone 10 and viewed messages presented at the bottom. It is also possible to display the unviewed messages one by one on the screen of the mobile phone 10 while displaying the viewed messages collapsed, showing only the number of read messages, or to display only the unviewed messages on the main screen of the mobile phone 10. For example, in the embodiment shown in fig. 8 (b), the main screen of the mobile phone 10 shows one missed-call notification and one WeChat notification that have not been viewed, while 20 read notifications are displayed stacked together. This makes it easier for the user to focus on notification messages that have not been viewed, reducing the likelihood of overlooking important notifications.
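A sketch of this grouping behaviour is shown below (Python; the Notification type, its fields and the sample titles are assumptions for illustration, matching the counts in the fig. 8 (b) example):

from dataclasses import dataclass

@dataclass
class Notification:
    title: str
    viewed: bool

def layout_notifications(notifications):
    # Unviewed messages stay expanded; viewed messages collapse into a single count.
    unviewed = [n for n in notifications if not n.viewed]
    viewed = [n for n in notifications if n.viewed]
    collapsed = f"{len(viewed)} read notifications" if viewed else None
    return unviewed, collapsed

items = [Notification("Missed call", False), Notification("WeChat", False)]
items += [Notification(f"Read notification {i}", True) for i in range(20)]
unviewed, collapsed = layout_notifications(items)
print([n.title for n in unviewed])  # shown one by one at the top of the screen
print(collapsed)                    # "20 read notifications", shown collapsed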
In other embodiments, for a viewed message, an icon indicating that the message has been viewed may be displayed on the screen. For example, in the embodiment shown in fig. 8 (c), the main screen of the mobile phone 10 displays four messages: two short messages, one missed-call notification and one WeChat message. After the user has viewed the short message from "Guo" and the missed-call notification from Zhao, the message management module 105 displays an eye icon at the upper right of the message boxes of these two messages to indicate that they have been viewed. By checking whether a message frame shows the eye icon, the user can conveniently and quickly distinguish viewed messages from unviewed ones.
It is to be understood that the eye icon in fig. 8 (c) representing a viewed message may be any icon; the present solution does not limit the shape, size, color, etc. of the icon, nor its position within the message frame.
It will be appreciated that in other embodiments, for messages that are not viewed, icons may also be displayed on the screen indicating that the message is not viewed.
Fig. 9 provides a message processing apparatus 900 according to some embodiments of the present application, specifically, including:
an eyeball tracking module 902, configured to determine a position where a user of the terminal gazes at a terminal screen, where at least one message is displayed on the terminal screen;
the progress calculation module 904 is used for determining the progress of the user for browsing the messages under the condition that the terminal determines that the position of the user watching the terminal screen falls into the message range of one message in the messages displayed on the terminal screen;
the message management module 906 is configured to change, according to the progress of the user browsing the message, a display manner of at least a part of display elements in a message range of the message on the terminal screen from a first display manner to a second display manner, where the display manner of the message that is not browsed by the terminal user is the first display manner.
It can be understood that the message processing apparatus 900 shown in fig. 9 corresponds to the message processing method provided in the present application, and the technical details in the above specific description about the message processing method provided in the present application are still applicable to the message processing apparatus 900 shown in fig. 9, and for the specific description, refer to the above, and are not described again here.
Referring now to FIG. 10, shown is a block diagram of an example system 100 in accordance with various embodiments of the present application. In one embodiment, the system 100 may include one or more processors 1004, system control logic 1008 coupled to at least one of the processors 1004, system memory 1012 coupled to the system control logic 1008, non-volatile memory (NVM) 1016 coupled to the system control logic 1008, and a network interface 1020 coupled to the system control logic 1008. In some embodiments, the system 100 also includes an infrared light emitting module (not shown), an infrared camera (not shown), and the like. The infrared light emitting module comprises an infrared light emitter and can emit infrared light toward the human eye. Infrared light emitters include, but are not limited to, infrared light emitting diodes, infrared lasers, and the like. The infrared camera may capture video or images containing the human eye.
In some embodiments, processor 1004 may include one or more single-core or multi-core processors. In some embodiments, the processor 1004 may include any combination of general-purpose processors and special-purpose processors (e.g., graphics processors, application processors, baseband processors, etc.). In an embodiment where the system 100 employs an eNB (enhanced Node B) 101 or a RAN (Radio Access Network) controller, the processor 1004 may be configured to perform the various embodiments, for example, those shown in fig. 5 and fig. 7: the processor 1004 processes a video or image containing the human eye collected by the infrared camera, obtains the position of the infrared light emitted by the infrared light emitting module on the cornea of the user's eye, and thereby determines the position at which the user's eye gazes on the display screen of the mobile phone.
In some embodiments, system control logic 1008 may include any suitable interface controllers to provide any suitable interface to at least one of processors 1004 and/or any suitable device or component in communication with system control logic 1008.
In some embodiments, system control logic 1008 may include one or more memory controllers to provide an interface to system memory 1012. System memory 1012 may be used to load and store data and/or instructions. For example, an application program corresponding to the photographing function of the infrared camera is loaded, a video or an image including the human eye acquired by the infrared camera is stored, and the processor processes the video or the image including the human eye acquired by the infrared camera to obtain the position information of the screen of the mobile phone watched by the human eye. Memory 1012 of system 100 may include any suitable volatile memory, such as suitable Dynamic Random Access Memory (DRAM), in some embodiments.
The NVM/storage 1016 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions. In some embodiments, the NVM/storage 1016 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device, such as at least one of an HDD (Hard Disk Drive), a CD (Compact Disc) drive, and a DVD (Digital Versatile Disc) drive.
The NVM/storage 1016 may comprise a portion of a storage resource on the device on which the system 100 is installed, or it may be accessible by, but not necessarily part of, the device. For example, the NVM/storage 1016 may be accessed over a network via the network interface 1020.
In particular, system memory 1012 and NVM/storage 1016 may include: a temporary copy and a permanent copy of instructions 1024. The instructions 1024 may include: instructions that when executed by at least one of the processors 1004 cause the system 100 to perform the method illustrated in fig. 5 and 7. In some embodiments, the instructions 1024, hardware, firmware, and/or software components thereof may additionally/alternatively be located in the system control logic 1008, the network interface 1020, and/or the processor 1004.
Network interface 1020 may include a transceiver to provide a radio interface for system 100 to communicate with any other suitable devices (e.g., front end modules, antennas, etc.) over one or more networks. In some embodiments, the network interface 1020 may be integrated with other components of the system 100. For example, the network interface 1020 may be integrated with at least one of the processor 1004, the system memory 1012, the NVM/storage 1016, and a firmware device (not shown) having instructions that, when executed by at least one of the processors 1004, cause the system 100 to implement the methods shown in fig. 5 and fig. 7.
The network interface 1020 may further include any suitable hardware and/or firmware to provide a multiple-input multiple-output radio interface. For example, network interface 1020 may be a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
In one embodiment, at least one of the processors 1004 may be packaged together with logic for one or more controllers of system control logic 1008 to form a System In Package (SiP). In one embodiment, at least one of the processors 1004 may be integrated on the same die with logic for one or more controllers of System control logic 1008 to form a System on Chip (SoC).
The system 100 may further include: input/output (I/O) devices 1032. The I/O devices 1032 may include a user interface to enable a user to interact with the system 100, and a peripheral component interface designed so that peripheral components can also interact with the system 100. In some embodiments, the system 100 further comprises sensors for determining at least one of environmental conditions and location information associated with the system 100.
In some embodiments, the user interface may include, but is not limited to, a display (e.g., a liquid crystal display, a touch screen display, etc.), a speaker, a microphone, one or more cameras (e.g., still image cameras and/or video cameras), a flashlight (e.g., a light emitting diode flash), and a keyboard.
In some embodiments, the peripheral component interfaces may include, but are not limited to, a non-volatile memory port, an audio jack, and a power interface.
In some embodiments, the sensors may include, but are not limited to, a gyroscope sensor, an accelerometer, a proximity sensor, an ambient light sensor, and a positioning unit. The positioning unit may also be part of the network interface 1020 or interact with the network interface 1020 to communicate with components of a positioning network, such as Global Positioning System (GPS) satellites.
Fig. 11 shows a block diagram of a System on Chip (SoC) 110, according to an embodiment of the present application. In fig. 11, like parts have the same reference numerals, and the dashed boxes represent optional features of more advanced SoCs. In fig. 11, the SoC 110 includes: an interconnection unit 1150; a system agent unit 1180; a bus controller unit 1190; an integrated memory controller unit 1140; a set of one or more coprocessors 1120, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; a Static Random Access Memory (SRAM) unit 1130; and a Direct Memory Access (DMA) unit 1160. In one embodiment, the coprocessor 1120 includes a special-purpose processor, such as, for example, a network or communication processor, a compression engine, a general-purpose computing graphics processing unit (GPGPU), a high-throughput MIC processor, or an embedded processor.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable storage used to transmit information over the Internet in the form of electrical, optical, acoustical, or other propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the apparatuses in the present application, each unit/module is a logical unit/module, and physically, one logical unit/module may be one physical unit/module, or may be a part of one physical unit/module, and may also be implemented by a combination of multiple physical units/modules, where the physical implementation manner of the logical unit/module itself is not the most important, and the combination of the functions implemented by the logical unit/module is the key to solve the technical problem provided by the present application. Furthermore, in order to highlight the innovative part of the present application, the above-mentioned device embodiments of the present application do not introduce units/modules which are not so closely related to solve the technical problems presented in the present application, which does not indicate that no other units/modules exist in the above-mentioned device embodiments.
It is noted that, in the examples and description of the present patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (17)

1. A message processing method of a terminal is characterized by comprising the following steps:
the terminal determines the position of a terminal screen watched by a user of the terminal, wherein at least one message is displayed on the terminal screen;
the terminal determines the progress of the user in browsing the messages under the condition that the position of the user watching the terminal screen is determined to fall into the message range of one of the messages displayed on the terminal screen;
and changing the display mode of at least one part of display elements in the message range of the message on a terminal screen from a first display mode to a second display mode according to the progress of the user in browsing the message, wherein in the message range of the same message, the display mode corresponding to the part which is not browsed by the terminal user is the first display mode, and the display mode corresponding to the part which is browsed by the terminal user is the second display mode.
2. The method according to claim 1, wherein the message range of the message is an area occupied by a message box of the message on a screen of the terminal.
3. The method according to claim 2, wherein the changing the display mode of at least a part of display elements in the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and changing the background color of the message in the message range from a first color to a second color when the user finishes browsing the message, wherein the color of the message which is not browsed by the terminal user is the first color.
4. The method of claim 2, wherein changing the display mode of at least a part of display elements in the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and changing the background color of the area corresponding to the browsed part of the message in the message range of the message from a first color to a second color according to the progress of the user in browsing the message, wherein the background color of the part of the message which is not browsed by the terminal user is the first color.
5. The method according to claim 2, wherein the terminal determining the progress of the user in browsing the messages in the case that the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen comprises:
under the condition that the terminal determines that the position of the user gazing at the terminal screen falls into a message frame of one message in the messages displayed on the terminal screen, calculating the time of the position of the user gazing at the terminal screen falling into the message frame;
determining the progress of the user in browsing the message according to the ratio of the calculated time to a set time threshold; or,
and comparing the calculated time with a set time threshold, and determining the progress of the user in browsing the message according to the comparison result.
6. The method of claim 5, wherein the value of the set time threshold is associated with at least one of a number of message words, a number of rows, and a cumulative display length of the message.
7. The method according to claim 2, wherein the terminal determining the progress of the user in browsing the messages in the case that the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen comprises:
under the condition that the terminal determines that the position where the user gazes at the terminal screen falls into the message frame of one message in the messages displayed on the terminal screen, calculating the accumulated distance of the user browsing the message frame based on the position where the user gazes at the terminal screen;
determining the progress of the user in browsing the message according to the ratio of the calculated accumulated distance to a set distance threshold; or,
and comparing the calculated accumulated distance with a set distance threshold, and determining the progress of the user in browsing the message according to the comparison result.
8. The method of claim 7, wherein the value of the set distance threshold is associated with at least one of a number of message words, a number of rows, and a cumulative display length of the message.
9. The method according to claim 1, wherein the message range includes an area occupied by the message itself on a screen of the terminal.
10. The method of claim 9, wherein changing the display mode of at least a portion of display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and changing the color of the part of the message which is browsed on the terminal screen from a first color to a second color according to the progress of the user in browsing the message, wherein the color of the message which is not browsed by the terminal user in the messages displayed on the terminal screen is the first color.
11. The method according to claim 1, wherein the changing the display mode of at least a part of display elements in the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and under the condition that the message is judged to be seen, folding and displaying the message on the terminal screen.
12. The method according to claim 1, wherein the changing the display mode of at least a part of display elements in the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and under the condition that the message is judged to be viewed, displaying an icon representing that the message is viewed in the message range of the message on the terminal screen.
13. The method of claim 1, further comprising: and changing the classification of at least one part of display elements in the message range of the message on a terminal screen from a first classification to a second classification according to the progress of the user in browsing the message.
14. The method according to claim 1, wherein the changing the display mode of at least a part of display elements in the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user in browsing the message comprises:
and changing the display mode of at least a part of display elements in the message range of the message on a terminal screen from a first display mode to a second display mode through an intermediate transition state according to the progress of the user in browsing the message, wherein the intermediate transition state at least comprises a display mode different from the first display mode and the second display mode.
15. A terminal, comprising:
the eyeball tracking module is used for determining the position of a user of the terminal for watching a terminal screen, and at least one message is displayed on the terminal screen;
the progress calculation module is used for determining the progress of the user in browsing the messages under the condition that the terminal determines that the position where the user gazes at the terminal screen falls into the message range of one message in the messages displayed on the terminal screen;
and the message management module is used for changing the display mode of at least one part of display elements in the message range of the message on a terminal screen from a first display mode to a second display mode according to the progress of the user in browsing the message, wherein the display mode corresponding to the part which is not browsed by the terminal user in the message range of the same message is the first display mode, and the display mode corresponding to the part which is browsed by the terminal user is the second display mode.
16. A computer-readable medium having instructions stored thereon, which when executed on a computer, cause the computer to perform the method of any one of claims 1-14.
17. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the system, an
A processor, being one of the processors of a system, for performing the method of any one of claims 1-14.
CN202010343422.4A 2020-04-27 2020-04-27 Message processing method of terminal, medium and electronic device Active CN113645349B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010343422.4A CN113645349B (en) 2020-04-27 2020-04-27 Message processing method of terminal, medium and electronic device
PCT/CN2021/088922 WO2021218764A1 (en) 2020-04-27 2021-04-22 Message processing method for terminal, terminal, medium, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010343422.4A CN113645349B (en) 2020-04-27 2020-04-27 Message processing method of terminal, medium and electronic device

Publications (2)

Publication Number Publication Date
CN113645349A CN113645349A (en) 2021-11-12
CN113645349B true CN113645349B (en) 2023-01-13

Family

ID=78332130

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010343422.4A Active CN113645349B (en) 2020-04-27 2020-04-27 Message processing method of terminal, medium and electronic device

Country Status (2)

Country Link
CN (1) CN113645349B (en)
WO (1) WO2021218764A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114154958A (en) * 2021-12-03 2022-03-08 北京字跳网络技术有限公司 Information processing method, device, electronic equipment and storage medium
CN114553805A (en) * 2022-02-18 2022-05-27 维沃移动通信有限公司 Message display method and device
CN116027887B (en) * 2022-05-20 2024-03-29 荣耀终端有限公司 Display method and electronic equipment
CN114900803A (en) * 2022-05-31 2022-08-12 深圳市智信科技有限公司 Distributed short message verification message sending method based on computer cloud platform

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104850221A (en) * 2014-02-14 2015-08-19 欧姆龙株式会社 Gesture recognition device and method of controlling gesture recognition device
CN105824403A (en) * 2015-09-21 2016-08-03 维沃移动通信有限公司 Method and terminal for operating terminal
CN106354380A (en) * 2015-07-17 2017-01-25 阿里巴巴集团控股有限公司 Reading prompting method and device
CN110825226A (en) * 2019-10-30 2020-02-21 维沃移动通信有限公司 Message viewing method and terminal

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5785015B2 (en) * 2011-07-25 2015-09-24 京セラ株式会社 Electronic device, electronic document control program, and electronic document control method
KR20140091322A (en) * 2013-01-11 2014-07-21 삼성전자주식회사 Method and apparatus thereof for displaying screen with eye tracking in portable terminal
US20160094705A1 (en) * 2014-09-30 2016-03-31 Ringcentral, Inc. Message Read Confirmation Using Eye Tracking
CN106791034A (en) * 2016-11-30 2017-05-31 宇龙计算机通信科技(深圳)有限公司 A kind of message display method and device
CN110554768A (en) * 2018-05-31 2019-12-10 努比亚技术有限公司 intelligent wearable device control method and device and computer readable storage medium

Also Published As

Publication number Publication date
CN113645349A (en) 2021-11-12
WO2021218764A1 (en) 2021-11-04

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant