WO2021218764A1 - Message processing method for terminal, terminal, medium and electronic device - Google Patents

Message processing method for terminal, terminal, medium and electronic device

Info

Publication number
WO2021218764A1
WO2021218764A1 · PCT/CN2021/088922 · CN2021088922W
Authority
WO
WIPO (PCT)
Prior art keywords
message
terminal
user
display mode
progress
Prior art date
Application number
PCT/CN2021/088922
Other languages
English (en)
Chinese (zh)
Inventor
熊刘冬
李春东
白锦华
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2021218764A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/7243 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages
    • H04M1/72439 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality with interactive means for internal management of messages for image or video messaging
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/725 Cordless telephones

Definitions

  • This application relates to the field of information processing, and in particular to a terminal message processing method, terminal, medium, and electronic equipment.
  • On smart terminals such as smart phones and tablet computers, notification messages from various applications are continuously pushed to remind users to check them.
  • In view of this, the embodiments of the present application provide a message processing method for a terminal, as well as a terminal, a medium, and an electronic device.
  • The technical solution of this application uses eye tracking technology to track the user's viewing of the messages that appear on the screen of the electronic device, and displays viewed messages and unviewed messages differently on the screen according to the user's progress in viewing these notification messages, thereby helping the user quickly distinguish which messages have been read and which have not been read and still need attention, so as to avoid the problem of users missing important messages because there are too many notification messages.
  • According to a first aspect, an embodiment of the present application provides a message processing method for a terminal, including:
  • the terminal determines the position where the user of the terminal is gazing at the terminal screen, where at least one message is displayed on the terminal screen; when the terminal determines that the position at which the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen, the terminal determines the progress of the user browsing that message; and, according to the progress of the user browsing the message, the display mode of at least a part of the display elements within the message range of the message on the terminal screen is changed from a first display mode to a second display mode, wherein the display mode of the message before it has been browsed by the terminal user is the first display mode.
  • the foregoing method further includes: the message range of the message is the area occupied by the message box of the message on the screen of the terminal.
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: changing the background color of the message within the message range from a first color to a second color when the user has finished browsing the message, wherein the background color of a message that has not been browsed by the terminal user is the first color.
  • the different display mode here means that the background color of the message is different. This allows the user to easily distinguish which messages have been viewed and which have not been viewed based on the background color.
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: changing, according to the progress of the user browsing the message, the background color of the area corresponding to the browsed part of the message within the message range from the first color to the second color (for example, when half of the message has been browsed, the background color of the area corresponding to that half is changed), wherein the background color of the part of the message that has not been viewed by the terminal user is the first color.
  • the background color of the corresponding area within the message range of the message can be changed according to the percentage of the user's progress in browsing the message.
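  • As an illustration of the progress-proportional color change described above, the following Python sketch computes which sub-region of a message box to repaint for a given browsing progress; the rectangle layout, colors, and function names are assumptions made for illustration only, not the implementation of this application.

```python
# Sketch (not the application's actual implementation): recolor the browsed
# portion of a message box in proportion to the browsing progress.
from dataclasses import dataclass

@dataclass
class MessageBox:
    x: int       # left edge of the message box on screen, in pixels
    y: int       # top edge
    width: int
    height: int

FIRST_COLOR = "#FFFFFF"   # background of unviewed content (assumed value)
SECOND_COLOR = "#D3D3D3"  # background of viewed content (assumed value)

def browsed_region(box: MessageBox, progress: float) -> MessageBox:
    """Return the top part of the box covering `progress` (0.0-1.0) of its height."""
    progress = max(0.0, min(1.0, progress))
    return MessageBox(box.x, box.y, box.width, int(box.height * progress))

def repaint(box: MessageBox, progress: float) -> list[tuple[MessageBox, str]]:
    """Paint the browsed part in the second color and the remainder in the first color."""
    done = browsed_region(box, progress)
    rest = MessageBox(box.x, box.y + done.height, box.width, box.height - done.height)
    return [(done, SECOND_COLOR), (rest, FIRST_COLOR)]
```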
  • In the foregoing method, determining the progress of the user browsing the message, in the case that the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen, includes: the terminal calculates the time during which the user gazes at the terminal screen within the message box, and the progress of the user browsing the message is determined from the ratio of the calculated time to a set time threshold; or, the calculated time is compared with the set time threshold, and the progress of the user browsing the message is determined according to the comparison result.
  • The foregoing method further includes: setting the value of the time threshold to be related to at least one of the number of words of the message, the number of lines of the message, and the cumulative display length of the message.
  • The number of lines of a message is the number of lines of the message content displayed on the terminal screen. For example, if all the content of the message would occupy 5 lines on the screen of the terminal, but the terminal is locked and the message displays only 2 lines of content on the terminal screen, the number of lines of the message can be considered to be 2.
  • The cumulative display length of the message can be the cumulative number of lines displayed when only a part of the message is displayed. For example, if the message has a total of 5 lines but only the complete first line and half of the second line are displayed, the cumulative display length of the message is 1.5 lines.
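  • A minimal sketch of a time threshold that grows with the amount of displayed text is given below; the base rates and the weighting of words versus lines are illustrative assumptions, not values taken from this application.

```python
# Sketch under assumptions: the time threshold scales with the message's word
# count and its cumulative display length (possibly a fractional number of lines).
def time_threshold_ms(word_count: int, displayed_lines: float,
                      ms_per_word: float = 30.0, ms_per_line: float = 400.0) -> float:
    """Estimate how long a user needs to gaze at a message to have read all of it.

    `displayed_lines` may be fractional, e.g. 1.5 when one full line and half of
    a second line are shown (the "cumulative display length").
    """
    return max(word_count * ms_per_word, displayed_lines * ms_per_line)

# Example: a 2-line excerpt of a longer message with 20 visible words.
threshold = time_threshold_ms(word_count=20, displayed_lines=2.0)
```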
  • In the foregoing method, determining the progress of the user browsing the message, in the case that the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen, includes: the terminal calculates the cumulative distance over which the user's gaze has traversed the message box, based on the positions at which the user gazes at the terminal screen, and the progress of the user browsing the message is determined from the ratio of the calculated cumulative distance to a set distance threshold; or, the calculated cumulative distance is compared with the set distance threshold, and the progress of the user browsing the message is determined according to the comparison result.
  • The foregoing method further includes: setting the value of the distance threshold to be related to at least one of the number of words of the message, the number of lines of the message, and the cumulative display length of the message.
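  • The distance-based variant can be sketched as follows; the function names are illustrative, and the line lengths and 80% figure simply reuse the worked example that appears later in the description.

```python
# Sketch under assumptions: distance-based browsing progress as the ratio of the
# accumulated gaze distance to the set distance threshold.
def distance_progress(cumulative_gaze_cm: float, distance_threshold_cm: float) -> float:
    """Progress as the ratio of gaze distance travelled to the distance threshold."""
    if distance_threshold_cm <= 0:
        return 0.0
    return min(1.0, cumulative_gaze_cm / distance_threshold_cm)

# Example from the description: a two-line message of 5 cm + 3 cm = 8 cm;
# 6.4 cm of accumulated gaze distance gives 80% progress.
progress = distance_progress(6.4, 5.0 + 3.0)   # 0.8
is_read = progress >= 0.8                      # "read" once a preset progress threshold is met
```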
  • the foregoing method further includes: the message range includes the area occupied by the message itself on the screen of the terminal.
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: changing, according to the progress of the user browsing the message, the color of the part of the message that has been viewed on the terminal screen from the first color to the second color (for example, when half of the message has been viewed, the background color of that viewed half of the message itself is changed), wherein the color of the part of the message displayed on the terminal screen that has not been browsed by the terminal user is the first color.
  • the background color of the corresponding area within the message range of the message can be changed according to the percentage of the user's progress in browsing the message.
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: folding and displaying the message on the terminal screen when it is judged that the message has been viewed. In this way, by folding the messages that have been viewed (for example, displaying only the number of recently viewed messages without displaying their specific content), the user can more easily notice the unviewed messages.
  • The foregoing method further includes: changing the classification of at least a part of the display elements within the message range of the message on the terminal screen from a first classification to a second classification according to the progress of the user browsing the message; for example, browsed messages are classified as read messages and displayed at the top of the terminal screen, while messages that have not been browsed are classified as unread messages and displayed one by one in the middle of the terminal screen.
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode via an intermediate transition state, wherein the intermediate transition state includes at least one display mode different from both the first display mode and the second display mode. For example, as the message is browsed by the user, the background color of the message is changed from red to orange and then to gray.
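  • The red-to-orange-to-gray transition mentioned above could, for instance, be driven by the browsing progress as in the sketch below; the concrete RGB values and breakpoints are assumptions for illustration, not part of this application.

```python
# Sketch: interpolate the message background through an intermediate transition
# state (red -> orange -> gray) as browsing progress increases.
def lerp(a: tuple[int, int, int], b: tuple[int, int, int], t: float) -> tuple[int, ...]:
    """Linear interpolation between two RGB colors, t in [0, 1]."""
    return tuple(round(x + (y - x) * t) for x, y in zip(a, b))

RED, ORANGE, GRAY = (255, 0, 0), (255, 165, 0), (128, 128, 128)

def background_for_progress(progress: float) -> tuple[int, ...]:
    """First display mode at 0%, intermediate transition around 50%, second mode at 100%."""
    progress = max(0.0, min(1.0, progress))
    if progress < 0.5:
        return lerp(RED, ORANGE, progress / 0.5)
    return lerp(ORANGE, GRAY, (progress - 0.5) / 0.5)
```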
  • In the foregoing method, changing the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message includes: displaying, within the message range of the message on the terminal screen, an icon indicating that the message has been viewed when it is judged that the message has been viewed, so that the user can determine from the icon whether the message has been viewed.
  • In a second aspect, an embodiment of the present application provides a terminal, including:
  • the eye tracking module is used to determine the position where the user of the terminal gazes at the terminal screen, where at least one message is displayed on the terminal screen;
  • the progress calculation module is used to determine the progress of the user browsing messages when the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen;
  • the message management module is used to change the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message, wherein the display mode of a message that has not been browsed by the terminal user is the first display mode.
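  • Purely as a structural sketch of how the three modules named above might fit together, the following Python skeleton is given; all class and method names are assumptions for illustration, not the application's API.

```python
# Structural sketch only; not the application's implementation.
class EyeTrackingModule:
    def gaze_position(self) -> tuple[int, int]:
        """Return the (x, y) screen coordinate the user is currently gazing at."""
        raise NotImplementedError  # supplied by the device's eye-tracking pipeline

class ProgressCalculationModule:
    def __init__(self) -> None:
        self.progress: dict[str, float] = {}  # message id -> browsing progress (0.0-1.0)

    def update(self, message_id: str, gaze_in_box: bool,
               dt_ms: float, threshold_ms: float) -> float:
        """Accumulate progress while the gaze stays within the message's range."""
        if gaze_in_box and threshold_ms > 0:
            p = self.progress.get(message_id, 0.0) + dt_ms / threshold_ms
            self.progress[message_id] = min(1.0, p)
        return self.progress.get(message_id, 0.0)

class MessageManagementModule:
    def apply_display_mode(self, message_id: str, progress: float) -> str:
        """Switch from the first display mode to the second once the message is browsed."""
        return "second_display_mode" if progress >= 1.0 else "first_display_mode"
```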
  • In a third aspect, the embodiments of the present application provide a computer-readable medium with instructions stored on it; when the instructions are executed on a computer, the computer is caused to execute the terminal message processing method of the first aspect or of any of the possible implementations of the first aspect.
  • In a fourth aspect, an embodiment of the present application provides an electronic device, including:
  • a memory, used to store instructions executed by one or more processors of the system; and
  • a processor, which is one of the processors of the system and is used to execute the terminal message processing method of the first aspect or of any of the possible implementations of the first aspect.
  • Figure 1(a) shows a display interface diagram of a mobile phone receiving multiple notification messages in a locked screen state according to some embodiments of the present application
  • Figure 1(b) shows a display interface diagram with different background colors of the notification boxes of read messages and unread messages displayed on the screen of the mobile phone according to some embodiments of the present application;
  • Figure 2 shows a block diagram of the hardware structure of a mobile phone according to some embodiments of the present application
  • Fig. 3 shows a scene diagram of eye tracking using a mobile phone according to some embodiments of the present application
  • Fig. 4 shows a hardware and software structure diagram related to the message processing technology of the present application in a mobile phone according to some embodiments of the present application;
  • Fig. 5 shows a message processing flowchart of the software system shown in Fig. 4 according to some embodiments of the present application;
  • Fig. 6(a) shows a display interface in which messages on the main screen of the mobile phone are not viewed when the mobile phone is in a locked state according to some embodiments of the present application;
  • Fig. 6(b) shows a display interface in which some messages on the main screen of the mobile phone have been viewed when the mobile phone is in a locked state according to some embodiments of the present application;
  • Fig. 7 shows a flow chart of a method for processing a message in a mobile phone according to some embodiments of the present application
  • FIG. 8(a) shows a display interface with different background colors for the messages that have been viewed and that have not been viewed on the main screen of the mobile phone in a locked state according to some embodiments of the present application;
  • FIG. 8(b) shows a display interface in which messages that have been viewed and that have not been viewed on the main screen of the mobile phone are categorized and displayed according to some embodiments of the present application;
  • FIG. 8(c) shows a display interface with icons displayed on messages that have been viewed on the main screen of the mobile phone in a locked state according to some embodiments of the present application
  • Fig. 9 shows a structural block diagram of a message processing apparatus according to some embodiments of the present application.
  • Fig. 10 shows a block diagram of a system according to some embodiments of the present application.
  • FIG. 11 shows a block diagram of a system on chip (SoC) according to some embodiments of the present application.
  • the illustrative embodiments of the present application include, but are not limited to, a terminal message processing method, device, medium, and electronic equipment.
  • The embodiment of the present application discloses a message processing method for an electronic device, which can track, through eye tracking technology, the user's viewing of messages that appear on the screen of the electronic device, and display the viewed messages and the unviewed messages on the screen differently according to the user's progress in viewing these notification messages, so that users can quickly distinguish which messages have been read and which messages have not been read and need to be focused on, thereby avoiding the problem of users missing important messages, such as important notifications, phone calls, to-dos or reminders, because there are too many notification messages.
  • The electronic devices provided in this application can be various electronic devices whose display screens can display message notifications and that can detect eye movement, including but not limited to tablet computers, smart phones, laptop computers, desktop computers, wearable electronic devices, head-mounted displays, mobile email devices, portable game consoles, portable music players, reader devices, etc., where wearable electronic devices include but are not limited to smart watches, smart bracelets, smart glasses, smart helmets, smart headbands, etc.
  • the following takes the mobile phone 10 as an example to illustrate the technical solution of the present application.
  • FIG. 1(a) shows a display interface diagram of the mobile phone 10 receiving multiple notification messages when the screen is locked.
  • The main screen of the mobile phone 10 is in a locked state, and the display interface of the main screen displays four notification messages: two SMS messages (one billing message from 10086 and one SMS from customer Zhang), a missed call from Li Lei, and a WeChat message.
  • The mobile phone 10 tracks the position at which the user's eyes (hereinafter referred to as human eyes) gaze at the home screen to determine whether the human eyes have seen the notification messages, and distinguishes the background colors of the notification boxes of read and unread messages.
  • the technical solution of the present application solves the above-mentioned problems.
  • the user can learn the status of the message being viewed through the background change of the message without special operation, and the above-mentioned problem can be solved, and the operation process is simple and fast.
  • the notification messages displayed on the screen of the mobile phone 10 may include call records, short messages, third-party application notifications, and so on.
  • the message displayed on the screen of the mobile phone 10 is not necessarily the latest push notification message, and multiple notification messages may be accumulated in history.
  • the background color is used to distinguish between read and unread notification messages
  • the background color of the notification frame of the message can be distinguished, or the background color of the message itself can be distinguished.
  • Other methods can also be used, such as bolding the font of read messages, displaying read messages and unread messages in different fonts, adding an icon that indicates that a message has been read, and categorizing and displaying read messages and unread messages in groups (for example, folding the read messages and displaying them as a group while expanding and displaying the unread messages one by one), and so on.
  • FIG. 2 shows a schematic structural diagram of a mobile phone 10.
  • the mobile phone 10 can execute the message processing method disclosed in the embodiment of the present application.
  • similar parts have the same reference numerals.
  • the mobile phone 10 may include a processor 110, a power module 140, a memory 180, a mobile communication module 130, a wireless communication module 120, an infrared light emitting module 101, a sensor module 190, an audio module 150, an infrared camera 170, and an interface Module 160 and display screen 102 and so on.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the mobile phone 10.
  • the mobile phone 10 may include more or fewer components than shown, or combine certain components, or disassemble certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, may include a central processing unit (CPU), an image processor GPU (Graphics Processing Unit), a digital signal processor DSP, and a microprocessor MCU (Micro-programmed Control Unit), AI (Artificial Intelligence) processor, or programmable logic device FPGA (Field Programmable Gate Array) and other processing modules or processing circuits. Among them, the different processing units may be independent devices or integrated in one or more processors.
  • the processor 110 may be used to process the video or image containing human eyes collected by the infrared camera 170 to obtain the position where the human eye is looking at the screen of the mobile phone 10.
  • a storage unit may be provided in the processor 110 for storing instructions and data.
  • In some embodiments, the storage unit in the processor 110 is a cache memory.
  • the memory 180 may store an operating system and at least one application program (such as an application program for shooting video) required by the function.
  • The memory 180 may also store the video or image containing human eyes collected by the infrared camera 170, as well as information such as the position where the human eyes are looking at the screen of the mobile phone 10, obtained by the processor 110 by processing the video or image collected by the infrared camera 170.
  • the power supply module 140 may include a power supply, a power management component, and the like.
  • the power source can be a battery.
  • the power management component is used to manage the charging of the power supply and the power supply to other modules.
  • the charging management module is used to receive charging input from the charger; the power management module is used to connect to a power source, and the charging management module is connected to the processor 110.
  • the mobile communication module 130 may include, but is not limited to, an antenna, a power amplifier, a filter, a low noise amplifier (LNA), and the like.
  • the mobile communication module 130 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the mobile phone 10.
  • the mobile communication module 130 may receive electromagnetic waves by an antenna, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 130 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic waves for radiation by the antenna.
  • at least part of the functional modules of the mobile communication module 130 may be provided in the processor 110.
  • Wireless communication technologies can include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), Bluetooth (BT), global navigation satellite system (GNSS), wireless local area network (WLAN), near field communication (NFC), frequency modulation (FM), infrared (IR) technology, etc.
  • The GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS) and/or satellite-based augmentation systems (SBAS).
  • the wireless communication module 120 may include an antenna, and transmit and receive electromagnetic waves via the antenna.
  • The wireless communication module 120 can provide wireless communication solutions applied on the mobile phone 10, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the mobile phone 10 can communicate with the network and other devices through wireless communication technology.
  • the mobile communication module 130 and the wireless communication module 120 of the mobile phone 10 may also be located in the same module.
  • the display screen 102 includes a display panel.
  • The display panel can use a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the display screen 102 is used to display notification messages of various applications of the mobile phone 10 (such as missed calls, unread messages, memos and reminders of other matters, etc.).
  • the infrared light emitting module 101 includes an infrared light emitter, which can emit infrared light to the human eye, so as to determine the position of the human eye gazing at the display screen 102 by the position of the infrared light on the human eye's cornea.
  • Infrared light emitters include but are not limited to infrared light emitting diodes and so on.
  • the sensor module 190 may include a proximity light sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc.
  • the audio module 150 may convert digital audio information into an analog audio signal for output, or convert an analog audio input into a digital audio signal.
  • the audio module 150 can also be used to encode and decode audio signals.
  • the audio module 150 may be disposed in the processor 110, or part of the functional modules of the audio module 150 may be disposed in the processor 110.
  • the audio module 150 may include a speaker, a receiver, a microphone, and a headphone interface.
  • The infrared camera 170 may collect a video or image containing human eyes; by processing this video or image, the position of the infrared light emitted by the infrared light emitting module 101 on the cornea of the user's eye is obtained, and the position where the user's eyes are gazing at the display screen 102 of the mobile phone 10 is then determined. It can be understood that the mobile phone 10 may also include an ordinary camera (not shown) for shooting color images or videos.
  • the interface module 160 includes an external memory interface, a universal serial bus (USB) interface, a subscriber identification module (SIM) card interface, and the like.
  • the external memory interface can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 10.
  • the external memory card communicates with the processor 110 through an external memory interface to realize the data storage function.
  • the universal serial bus interface is used for communication between the mobile phone 10 and other mobile phones.
  • the user identification module card interface is used to communicate with the SIM card installed in the mobile phone 10, for example, to read the phone number stored in the SIM card, or to write the phone number into the SIM card.
  • the mobile phone 10 further includes buttons, motors, indicators, and the like.
  • the keys may include a volume key, an on/off key, and so on.
  • the motor is used to cause the mobile phone 10 to generate a vibration effect, for example, when the user's mobile phone 10 is called, it generates vibration to prompt the user to answer the incoming call of the mobile phone 10.
  • the indicator can include a laser indicator, a radio frequency indicator, an LED indicator, and so on.
  • FIG. 3 shows a scene diagram of eye tracking using the mobile phone 10 according to some embodiments of the present application.
  • the infrared light emitting module 101 has three infrared light emitters for illustration.
  • the infrared emitters can be used to emit infrared light to the human eye.
  • After the human eye receives the infrared light emitted by the three infrared light emitters, three infrared light reflection points (reflection point A, reflection point B, and reflection point C, as shown in FIG. 3) are formed on the cornea of the human eye.
  • The infrared camera 170 is used to take a video or image of the human eye, and the positions of the reflection points formed on the cornea of the human eye and the center position of the pupil are obtained from the captured video or image to determine the position where the human eye gazes at the screen of the mobile phone 10 (that is, the gaze point of the human eye on the screen of the mobile phone 10).
  • The position where the human eye is gazing at the mobile phone 10 can be determined in the following manner: from the positions of the reflection points on the cornea and the center position of the pupil obtained from the captured video or image, the position where the human eye is looking at the screen of the mobile phone 10 (that is, the gaze point of the human eye on the screen of the mobile phone 10) is determined.
  • the infrared light emitting module 101 may also include more than three infrared light emitters arranged in a predetermined manner, such as a square shape, a line shape, and the like.
  • the number of infrared cameras 170 may also be two or more, which is not limited here.
  • Fig. 4 shows hardware and software related to the message processing technology of the present application in the mobile phone 10 shown in Fig. 3 according to some embodiments of the present application.
  • the mobile phone 10 includes a hardware system 103 and a software system 104.
  • the hardware system 103 includes an infrared camera 170, an infrared light emitting module 101, and a processor 110;
  • the software system 104 includes an operating system 107, and the operating system 107 includes a message management module 105 and an eye tracking module 106.
  • the infrared light emitting module 101 includes three infrared light emitters, which are used as infrared light sources to emit infrared light.
  • The infrared camera 170 is used to capture a video or image of the human eye and send it to the processor 110, so that the processor 110 obtains the positions of the reflection points formed on the cornea of the human eye and the center position of the pupil from the captured video or image, and then, according to the positions of the reflection points on the cornea and the center position of the pupil, determines the position where the human eye looks at the screen of the mobile phone 10. For the specific process, refer to the description of FIG. 3.
  • the eye tracking module 106 continuously obtains the movement of the human eye to feed back the position of the human eye gazing at the screen of the mobile phone 10 in real time.
  • The message management module 105 is a module provided in the mobile phone 10 that dynamically changes the status display characteristics of the notification messages of the mobile phone 10. For example, the message management module 105 can change the color of the message notification bar and the font size of the text. In the embodiments provided in this application, the message management module 105 is also used to determine, according to the position where the human eye is gazing at the screen of the mobile phone 10 as determined by the eye tracking module 106, whether the human eye has seen the notification message, and to distinguish the background colors of read and unread messages, which makes it convenient for users to quickly distinguish which messages have been read and which have not been read and need attention. The user only needs eye movements; the operation process is simple and fast, and the user experience is improved.
  • The message management module 105 and the eye tracking module 106 provided in the embodiments of the present application are modules divided according to function. In other embodiments, the message management module 105 and the eye tracking module 106 can also be combined into one module, or be divided into more modules.
  • the hardware and software system architecture of the mobile phone 10 illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 10. In other embodiments of the present application, the mobile phone 10 may include more or fewer components than shown, or combine certain components, or disassemble certain components, or arrange different components.
  • Fig. 5 shows a message processing flow of the mobile phone 10 shown in Fig. 4 according to some embodiments of the present application. Specifically, as shown in Figure 5, it includes:
  • the message management module 105 and the eye tracking module 106 establish a connection (500).
  • the message management module 105 triggers the eye tracking module 106 to activate the eye tracking function when a new notification message is received (for example, when the home screen of the mobile phone 10 is in a locked state at this time).
  • the eye tracking module 106 is triggered to enable the eye tracking function.
  • When the message management module 105 receives a notification from an application of the mobile phone 10 or the like (for example, when the user is browsing a web page using a search engine and a prompt from a chat application pops up on the display interface of the mobile phone 10), it triggers the eye tracking module 106 to start the eye tracking function.
  • After the eye tracking module 106 receives the registration of the message management module 105, the message management module 105 and the eye tracking module 106 establish a connection.
  • the eye tracking module 106 sends to the message management module 105 the coordinates of the position where the human eye is looking at the screen of the mobile phone 10 every preset time (502). For example, in some embodiments, the eye tracking module 106 sends to the message management module 105 the coordinates of the position where the human eye is gazing at the screen of the mobile phone 10 every 10 ms.
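  • Steps 500 and 502 can be pictured with the small sketch below, in which the message manager registers a callback and the eye tracker pushes the current gaze coordinate at a fixed interval; the class and method names, and the placeholder gaze source, are assumptions for illustration.

```python
# Sketch of steps 500-502 under assumptions; not the application's implementation.
import threading
from typing import Callable

class EyeTrackerService:
    def __init__(self, interval_ms: float = 10.0):
        self.interval_ms = interval_ms
        self._subscribers: list[Callable[[tuple[int, int]], None]] = []

    def register(self, callback: Callable[[tuple[int, int]], None]) -> None:
        """Step 500: the message manager registers and the connection is established."""
        self._subscribers.append(callback)

    def _read_gaze(self) -> tuple[int, int]:
        # Placeholder: on a real device this comes from the gaze-estimation pipeline.
        return (0, 0)

    def start(self) -> None:
        """Step 502: push the gaze coordinate to every subscriber each interval (e.g. 10 ms)."""
        def tick() -> None:
            gaze = self._read_gaze()
            for callback in self._subscribers:
                callback(gaze)
            threading.Timer(self.interval_ms / 1000.0, tick).start()
        tick()
```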
  • The position of the human eye looking at the screen of the mobile phone 10 can be determined by an optical recording method, in which the movement of the human eye is recorded with a camera or video camera to obtain eye images reflecting that movement, and eye features are extracted from the acquired eye images to build a model for gaze / gaze-point estimation.
  • The eye features may include: the position of the center of the pupil, the shape of the pupil, the position of the iris, the shape of the iris, the position of the eyelid, the position of the corner of the eye, the position of the light spot (also referred to as the Purkinje spot), and the like.
  • The principle by which the eye tracking module 106 in the embodiment of the present application determines the position where the human eye is gazing at the screen of the mobile phone 10 may be: the infrared light emitting module 101 emits infrared light to the human eye, and the infrared camera 170 captures a video of the human eye movement and at the same time captures the reflection point of the infrared light on the cornea, that is, the light spot (also called the Purkinje spot), thereby obtaining an eye image with the light spot.
  • the eye tracking module 106 analyzes the eye image with light spots to obtain the center position of the pupil of the human eye and the position of the reflection point on the cornea, and then determine the position where the human eye looks at the screen of the mobile phone 10.
  • As the human eye rotates, the relative positional relationship between the center of the pupil and the corneal reflection point changes accordingly. Several eye images with light spots collected by the infrared camera 170 reflect this positional change relationship, and according to it the eye tracking module 106 determines the position where the human eye looks at the screen of the mobile phone 10 as the human eye rotates.
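  • A common way to realize the pupil-center / corneal-reflection principle just described is to map the pupil-minus-glint vector to a screen coordinate through a calibrated linear model, as sketched below; the calibration model and function names are assumptions given for illustration, since this application does not specify a particular mapping.

```python
# Sketch of a pupil-center / corneal-reflection (PCCR) mapping; illustrative only.
import numpy as np

def gaze_vector(pupil_center: np.ndarray, glint_center: np.ndarray) -> np.ndarray:
    """Pupil-minus-glint vector; it changes as the eye rotates."""
    return pupil_center - glint_center

def fit_calibration(gaze_vectors: np.ndarray, screen_points: np.ndarray) -> np.ndarray:
    """Fit screen = [vx, vy, 1] @ W by least squares from a few calibration samples."""
    features = np.hstack([gaze_vectors, np.ones((len(gaze_vectors), 1))])
    weights, *_ = np.linalg.lstsq(features, screen_points, rcond=None)
    return weights

def gaze_on_screen(pupil_center: np.ndarray, glint_center: np.ndarray,
                   weights: np.ndarray) -> np.ndarray:
    """Estimate the on-screen gaze point from one eye image's pupil and glint positions."""
    v = gaze_vector(pupil_center, glint_center)
    return np.array([v[0], v[1], 1.0]) @ weights
```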
  • the message management module 105 determines whether the position of the human eye gazing at the screen of the mobile phone 10 falls into the message notification frame according to the received position coordinates of the human eye gazing at the screen of the mobile phone 10 and the layout information of the message notification frame of the mobile phone 10 (504). In some embodiments, when the mobile phone 10 receives a new message, the message management module 105 can obtain the layout information of the message box corresponding to the message on the screen of the mobile phone 10 (that is, the coordinate range of the message box on the screen of the mobile phone 10).
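  • Step 504 amounts to a hit test of the received gaze coordinate against the coordinate ranges of the message boxes; the sketch below shows one way to do this, with an assumed rectangle representation and illustrative names.

```python
# Sketch of step 504 under assumptions; not the application's implementation.
from typing import NamedTuple, Optional

class Rect(NamedTuple):
    left: float
    top: float
    right: float
    bottom: float

def hit_test(gaze: tuple[float, float], boxes: dict[str, Rect]) -> Optional[str]:
    """Return the id of the message box that the gaze point falls into, if any."""
    x, y = gaze
    for message_id, r in boxes.items():
        if r.left <= x <= r.right and r.top <= y <= r.bottom:
            return message_id
    return None
```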
  • The message management module 105 calculates the progress of the human eye viewing the message according to the length of time the human eye gazes at the message or according to the continuously received position information of the human eye gazing at the screen of the mobile phone 10 (506).
  • In some embodiments, the progress of the human eye browsing the message can be calculated from the length of time during which the position of the human eye gazing at the screen of the mobile phone 10 falls into the corresponding message notification frame.
  • For example, the progress of the human eye browsing the message may be the ratio of the time during which the position of the human eye gazing at the screen of the mobile phone 10 falls into the corresponding message notification frame to the set time threshold.
  • The timing should start from the moment it is determined that the position where the human eye is gazing at the screen of the mobile phone 10 falls within the message notification frame. For example, starting from that moment, the eye tracking module 106 sends the position at which the eye is looking at the screen of the mobile phone 10 to the message management module 105 every 10 ms; if for a continuous period of 100 ms the reported position stays within the message notification frame, it can be considered that the duration of the human eye browsing the message in that message notification frame is at least 100 ms.
  • If it takes 200 ms for the human eye to completely browse the message (that is, the set time threshold of the message is 200 ms), the browsing progress when the human eye has browsed the message for 100 ms is 50%; if it takes 120 ms for the human eye to completely browse the message (that is, the set time threshold of the message is 120 ms), then when the human eye has browsed the message for 100 ms, the browsing progress is 83.33%.
  • the set time threshold of the message may be related to at least one of the number of words and lines of the message, and the cumulative display length of the message. That is, when the font size of the message is the same, the greater the number of words, the greater the number of lines, or the longer the cumulative display length, the greater the set time threshold corresponding to the message.
  • The progress of the human eye browsing the message can also be determined according to the relationship between the time the human eye has browsed the message and the set time threshold. For example, for one or more messages displayed on the screen of the mobile phone 10, the set time threshold is 80 ms. If the calculated time for which the human eye has browsed a message is greater than or equal to 80 ms, the message can be considered to have been viewed; if the time for which the human eye has browsed the message is less than 80 ms, the message can be considered not to have been viewed.
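  • The time-based accumulation in step 506 can be sketched as follows, assuming gaze samples arrive every 10 ms as in the example above and that progress is the ratio of accumulated in-box gaze time to the message's time threshold; the names and sampling scheme are illustrative.

```python
# Sketch of the time-based progress calculation under assumptions.
def accumulate_time_progress(in_box_samples: list[bool],
                             sample_interval_ms: float = 10.0,
                             threshold_ms: float = 120.0) -> float:
    """`in_box_samples[i]` is True when the i-th gaze sample fell inside the message box."""
    gaze_time_ms = sum(sample_interval_ms for hit in in_box_samples if hit)
    return min(1.0, gaze_time_ms / threshold_ms)

# 10 consecutive in-box samples = 100 ms of gazing; with a 120 ms threshold the
# browsing progress is 100/120 ≈ 83.33%, matching the worked example above.
progress = accumulate_time_progress([True] * 10)
```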
  • The progress of the human eye browsing the message can also be calculated from the position information of the human eye gazing at the screen of the mobile phone 10. In one embodiment, the progress can be calculated from the cumulative distance over which the human eye has browsed the message in the message notification box. Specifically, the progress of the human eye browsing the message may be the ratio of the cumulative distance of the human eye browsing the message in the message notification frame to the set distance threshold. For example, the message in the message notification box has a total of two lines: the message length of the first line is 5 cm and the message length of the second line is 3 cm; if the cumulative browsing distance is calculated to be 6.4 cm, the progress of viewing the message is 80%. In another embodiment, the progress can be calculated from the height of the part of the message in the message notification box that has been viewed by the human eye.
  • For example, the message in the message notification box has a total of five lines, and the average height of each line is 0.5 cm. If the human eye has browsed through four of the lines, the browsing progress is 80%; if it is determined that the human eye has browsed three lines, the browsing progress of the message is 60%.
  • the set distance threshold of the message may be related to at least one of the number of words and lines of the message, and the cumulative display length of the message. That is, when the font size of the message is the same, the greater the number of words, the greater the number of lines, or the longer the cumulative display length, the greater the set distance threshold corresponding to the message.
  • The progress of the human eye browsing the message can also be determined according to the relationship between the distance over which the human eye has browsed the message and the set distance threshold. For example, for one or more messages displayed on the screen of the mobile phone 10, the set distance threshold is 3 lines. If the calculated distance over which the human eye has browsed a message is greater than or equal to 3 lines, the message can be considered to have been seen; if the distance is less than 3 lines, the message can be considered not to have been read.
  • The size of each message notification frame may be the same.
  • The size of each message notification box can also be different. For example, there are two lines of text in message notification box 1 and ten lines of text in message notification box 2. If each line of text has the same height, it is not difficult to understand that the more lines the message content occupies, the greater the height of the message notification box.
  • If the position where the human eye gazes at the screen of the mobile phone 10 leaves the message notification frame, the calculation of the browsing progress of the message is interrupted and the current progress is saved, so that the next time it is judged that the position where the human eye is looking at the screen of the mobile phone 10 falls within the message notification frame, the progress calculation can be continued.
  • the message displayed on the screen of the mobile phone 10 has no message notification box (that is, only the message content is displayed).
  • In this case, the calculation of the progress of the human eye browsing the message can be started when it is judged that the position at which the human eye gazes falls within the area occupied by the message itself on the screen of the mobile phone 10.
  • The message management module 105 judges whether the human eye has read the message according to the progress of the human eye browsing the message (508). In some embodiments, whether the human eye has seen the message can be judged according to a preset progress threshold. For example, in one embodiment, the preset progress threshold is 70%. If it takes 200 ms for the human eye to completely browse the message, then when the human eye has browsed the message for 100 ms the browsing progress is 50% and the human eye is considered not to have seen the message, whereas when the browsing time reaches 160 ms the browsing progress is 80% and the human eye is considered to have read the message. For another example, in one embodiment, the message in the message notification box has a total of two lines: the message length of the first line is 5 cm and the message length of the second line is 3 cm. If it is determined that the total length of the message viewed by the human eye is 6.4 cm, so that the progress of viewing the message is 80%, the human eye is considered to have seen the message. It can be understood that the values of the aforementioned progress thresholds are only exemplary and not restrictive; in a specific application, the value of the threshold may be set as required.
  • If it is judged that the human eye has not yet read the message, the current browsing progress can be recorded, so that the next time the message is browsed the progress either continues to accumulate on this basis or is recalculated.
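  • Steps 506-508 together can be sketched as a small tracker that keeps per-message progress across interruptions and declares a message read once a preset progress threshold is reached; the 70% threshold follows the example above, and the class and method names are assumptions.

```python
# Sketch under assumptions; not the application's implementation.
class BrowseProgressTracker:
    def __init__(self, read_threshold: float = 0.70):
        self.read_threshold = read_threshold
        self._progress: dict[str, float] = {}  # message id -> saved browsing progress

    def add_gaze_time(self, message_id: str, dt_ms: float, time_threshold_ms: float) -> float:
        """Accumulate progress while the gaze stays in the message box; values persist
        when the gaze leaves, so the next visit continues from the saved progress."""
        p = self._progress.get(message_id, 0.0) + dt_ms / time_threshold_ms
        self._progress[message_id] = min(1.0, p)
        return self._progress[message_id]

    def is_read(self, message_id: str) -> bool:
        return self._progress.get(message_id, 0.0) >= self.read_threshold
```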
  • The message management module 105 changes the color of the message box of the message (510). In some embodiments, if it is determined that the human eye has seen the message in the message notification box, the message management module 105 distinguishes the background color of the message box (for example, the background color of the message box is white when the message has not been seen; after judging that the message has been viewed, the message management module 105 changes the background color of the message box to rose red to indicate that the message has been viewed). In some embodiments, the message management module 105 can also change the color of the message box being viewed by the human eye according to the progress of the human eye browsing the message.
  • The home screen of the mobile phone 10 is in a locked state, and four messages are displayed on the home screen of the mobile phone 10: two messages from "Guo" and two messages from Mr. Zhang.
  • the missed call message and one WeChat message were not viewed by the user.
  • The missed call message from Mr. Zhang has been checked, and the background color of the corresponding message box is completely changed (that is, it differs from the background color of the messages that have not been checked);
  • a message from "Guo" about a problem discussed in a meeting was viewed 70%, and the background color of the 70% part of the message was changed.
  • Fig. 7 shows a message processing method of an electronic device according to some embodiments of the present application.
  • the following takes the mobile phone 10 shown in FIG. 3 as an example to introduce the message processing method provided in the embodiment of the present application in detail. Specifically, as shown in Figure 7, it includes:
  • the mobile phone 10 uses the eye tracking module 106 to determine where the human eye is looking at the screen of the mobile phone 10.
  • the infrared light emitting module 101 emits infrared light to the human eye
  • the infrared camera 170 captures a video of human eye movement, and at the same time, the reflection point of the infrared light on the cornea can be captured, thereby obtaining an eye image with the reflection point.
  • The eye tracking module 106 analyzes the eye images with reflection points to obtain the center position of the pupil of the human eye and the positions of the reflection points on the cornea, and determines, according to the center position of the pupil and the positions of the reflection points on the cornea, the position where the human eye is looking at the screen of the mobile phone 10. Moreover, as the human eye rotates, the relative positional relationship between the pupil center and the corneal reflection points changes accordingly; several eye images with reflection points collected by the infrared camera 170 reflect this positional change relationship, and according to it the eye tracking module 106 determines the position where the human eye looks at the screen of the mobile phone 10 as the human eye rotates.
  • The message management module 105 can obtain the layout information of the message box corresponding to the message on the screen of the mobile phone 10 (that is, the coordinate range of the message box on the screen of the mobile phone 10), and determine, from whether the position coordinates of the human eye gazing at the screen of the mobile phone 10 fall into that coordinate range, whether the position where the human eye is looking at the screen of the mobile phone 10 falls within the message notification frame.
  • the progress of the human eye browsing the messages can be calculated by calculating the length of time that the position of the human eye gazing at the screen of the mobile phone 10 falls into the corresponding message notification frame. In some embodiments, the progress of the human eye browsing the messages can be calculated by the human eye gazing at the position information of the screen of the mobile phone 10.
  • the specific calculation method is similar to the calculation method shown in FIG. 5, please refer to the above for detailed description, and will not be repeated here.
  • The preset progress threshold is 80%. If the progress of the human eye browsing the message in the message box is greater than or equal to 80%, it is judged that the human eye has seen the message in the message box; otherwise, it is judged that the human eye has not seen the message in the message box.
  • The color of the part of the message that has been seen and the part that has not been seen can be distinguished; for example, the unseen part of the message content is displayed in white, and the part of the content that has been read is displayed in green.
  • The unseen messages are displayed at the top of the screen of the mobile phone 10, and the messages that have been seen are displayed below them. It is also possible to display the unseen messages on the screen of the mobile phone 10 one by one while folding the messages that have been read and displaying only their number, or to display only the unseen messages on the main screen of the mobile phone 10. For example, in the embodiment shown in FIG. 8(b), the main screen of the mobile phone 10 shows 1 missed call and 1 WeChat notification that have not been viewed, and 20 read notifications that are folded together and displayed as a group. This makes it easier for users to notice notification messages that have not been viewed, so as to reduce the possibility of missing important notification messages.
  • an icon indicating that the message has been checked may be displayed on the screen.
  • The main screen of the mobile phone 10 displays four messages, including two short messages, one missed call from Mr. Zhang, and one WeChat message.
  • The message management module 105 displays an eye icon on the upper right of the message boxes of two of the messages, which indicates that both of these messages have been viewed. The user can tell which messages have been viewed and which have not by whether the eye icon is displayed on the message box, which makes it convenient for the user to quickly distinguish viewed messages from unviewed ones.
  • the eye icon in FIG. 8(c) that indicates that the message has been viewed can be any icon, and this solution does not limit the shape, size, color, etc. of the icon that indicates that the message has been viewed. In addition, this solution does not limit the position of the icon that indicates that a certain message has been viewed in the message box.
  • an icon indicating that the message has not been seen may also be displayed on the screen.
  • Fig. 9 provides a message processing apparatus 900 according to some embodiments of the present application, specifically, including:
  • The eye tracking module 902 is used to determine the position where the user of the terminal is gazing at the terminal screen, where at least one message is displayed on the terminal screen;
  • the progress calculation module 904 is configured to determine the progress of the user browsing messages when the terminal determines that the position where the user gazes at the terminal screen falls within the message range of one of the messages displayed on the terminal screen;
  • The message management module 906 is configured to change the display mode of at least a part of the display elements within the message range of the message on the terminal screen from the first display mode to the second display mode according to the progress of the user browsing the message, wherein the display mode of a message that has not been browsed by the terminal user is the first display mode.
  • the message processing apparatus 900 shown in FIG. 9 corresponds to the message processing method provided in this application, and the technical details in the specific description of that method still apply to the message processing apparatus 900 shown in FIG. 9; please refer to the above for the specific description, which will not be repeated here.
  • the system 100 may include one or more processors 1004, system control logic 1008 connected to at least one of the processors 1004, system memory 1012 connected to the system control logic 1008, non-volatile memory (NVM)/storage 1016 connected to the system control logic 1008, and a network interface 1020 connected to the system control logic 1008.
  • the system 100 further includes an infrared light emitting module (not shown), an infrared camera (not shown), and so on.
  • the infrared light emitting module includes an infrared light emitter, which can emit infrared light to human eyes.
  • Infrared light emitters include, but are not limited to, infrared light-emitting diodes, infrared lasers, and so on.
  • Infrared cameras can capture video or images containing human eyes.
  • the processor 1004 may include one or more single-core or multi-core processors. In some embodiments, the processor 1004 may include any combination of general-purpose processors and special-purpose processors (for example, a graphics processor, an application processor, a baseband processor, etc.). In an embodiment in which the system 100 adopts an eNB (Evolved Node B) 101 or a RAN (Radio Access Network) controller, the processor 1004 may be configured to execute the various disclosed embodiments, for example, the embodiments shown in FIGS. 5 and 7.
  • the processor 1004 processes the video or image containing the human eye collected by the infrared camera to obtain the position of the infrared light emitted by the infrared light emitting module on the cornea of the user's eye, and then determines the position on the display screen of the mobile phone at which the user's eyes are gazing (a simplified sketch follows).
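  • As a rough, hedged illustration of this kind of corneal-reflection gaze estimation, the sketch below maps the pupil-center-to-glint vector to screen coordinates with a pre-calibrated linear model; the coefficients and structure are assumptions for illustration only, not the method actually used by the processor 1004.

```kotlin
// Simplified corneal-reflection gaze sketch; the linear calibration model is an assumption.
data class Point(val x: Double, val y: Double)

// Calibration coefficients would normally be fitted from a short on-screen calibration routine.
class GazeMapper(
    private val ax: Double, private val bx: Double, private val cx: Double,
    private val ay: Double, private val by: Double, private val cy: Double
) {
    // pupil and glint are image-plane positions extracted from an infrared camera frame.
    fun toScreen(pupil: Point, glint: Point): Point {
        val dx = pupil.x - glint.x
        val dy = pupil.y - glint.y
        return Point(ax * dx + bx * dy + cx, ay * dx + by * dy + cy)
    }
}
```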
  • system control logic 1008 may include any suitable interface controller to provide any suitable interface to at least one of the processors 1004 and/or any suitable device or component in communication with the system control logic 1008.
  • the system control logic 1008 may include one or more memory controllers to provide an interface to the system memory 1012.
  • the system memory 1012 can be used to load and store data and/or instructions. For example, it can load the application program corresponding to the capture function of the infrared camera, store the video or image containing the human eye collected by the infrared camera, and store the screen position information obtained by the processor after processing that video or image, etc.
  • the memory 1012 of the system 100 may include any suitable volatile memory, such as a suitable dynamic random access memory (DRAM).
  • the NVM/memory 1016 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions.
  • the NVM/memory 1016 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device, such as at least one of an HDD (Hard Disk Drive), a CD (Compact Disc) drive, and a DVD (Digital Versatile Disc) drive.
  • the NVM/memory 1016 may include a part of the storage resources on the device on which the system 100 is installed, or storage resources that can be accessed by the device but are not necessarily a part of the device.
  • the NVM/storage 1016 can be accessed through the network via the network interface 1020.
  • the system memory 1012 and the NVM/memory 1016 may include a temporary copy and a permanent copy of the instructions 1024, respectively.
  • the instructions 1024 may include instructions that, when executed by at least one of the processors 1004, cause the system 100 to implement the methods shown in FIGS. 5 and 7.
  • the instructions 1024, hardware, firmware, and/or software components thereof may additionally/alternatively be placed in the system control logic 1008, the network interface 1020, and/or the processor 1004.
  • the network interface 1020 may include a transceiver to provide a radio interface for the system 100, and then communicate with any other suitable devices (such as a front-end module, an antenna, etc.) through one or more networks.
  • the network interface 1020 may be integrated with other components of the system 100.
  • the network interface 1020 may be integrated with at least one of the processor 1004, the system memory 1012, the NVM/storage 1016, and a firmware device having instructions (not shown); when at least one of the processors 1004 executes the instructions, the system 100 implements the methods shown in FIGS. 5 and 7.
  • the network interface 1020 may further include any suitable hardware and/or firmware to provide a multiple input multiple output radio interface.
  • the network interface 1020 may be a network adapter, a wireless network adapter, a telephone modem and/or a wireless modem.
  • At least one of the processors 1004 may be packaged with the logic of one or more controllers for the system control logic 1008 to form a system in package (SiP). In one embodiment, at least one of the processors 1004 may be integrated with the logic of one or more controllers used for the system control logic 1008 on the same die to form a system on chip (System on Chip, SoC).
  • the system 100 may further include: an input/output (I/O) device 1032.
  • the I/O device 1032 may include a user interface to enable a user to interact with the system 100; the design of the peripheral component interface enables the peripheral component to also interact with the system 100.
  • the system 100 further includes a sensor for determining at least one of environmental conditions and location information related to the system 100.
  • the user interface may include, but is not limited to, a display (e.g., a liquid crystal display or a touch screen display), speakers, a microphone, one or more cameras (e.g., still image cameras and/or video cameras), a flash (e.g., an LED flash), and a keyboard.
  • the peripheral component interface may include, but is not limited to, a non-volatile memory port, an audio jack, and a power interface.
  • the senor may include, but is not limited to, a gyroscope sensor, an accelerometer, a proximity sensor, an ambient light sensor, and a positioning unit.
  • the positioning unit may also be part of or interact with the network interface 1020 to communicate with components of the positioning network (eg, global positioning system (GPS) satellites).
  • FIG. 11 shows a block diagram of a system on chip (System on Chip, SoC) 110.
  • the SoC 110 includes: an interconnection unit 1150; a system agent unit 1180; a bus controller unit 1190; an integrated memory controller unit 1140; a set of one or more coprocessors 1120, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; a static random access memory (SRAM) unit 1130; and a direct memory access (DMA) unit 1160.
  • the coprocessor 1120 includes a dedicated processor, such as a network or communication processor, a compression engine, a GPGPU (General-Purpose computing on GPU), a high-throughput MIC processor, or an embedded processor, and so on.
  • a dedicated processor such as, for example, a network or communication processor, a compression engine, a graphics processor General Purpose Computing (General Purpose Computing on GPU, GPGPU), a high-throughput MIC processor, or embedded Type processor and so on.
  • the various embodiments of the mechanism disclosed in this application may be implemented in hardware, software, firmware, or a combination of these implementation methods.
  • the embodiments of the present application can be implemented as a computer program or program code executed on a programmable system.
  • the programmable system includes at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
  • Program codes can be applied to input instructions to perform the functions described in this application and generate output information.
  • the output information can be applied to one or more output devices in a known manner.
  • the processing system includes any system having a processor such as, for example, a digital signal processor (DSP), a microcontroller, an application-specific integrated circuit (ASIC), or a microprocessor.
  • the program code can be implemented in a high-level programming language or an object-oriented programming language to communicate with the processing system.
  • assembly language or machine language can also be used to implement the program code.
  • the mechanism described in this application is not limited to the scope of any particular programming language. In either case, the language can be a compiled language or an interpreted language.
  • the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof.
  • the disclosed embodiments can also be implemented as instructions carried by or stored on one or more transient or non-transitory machine-readable (e.g., computer-readable) storage media, which can be read and executed by one or more processors.
  • the instructions can be distributed through a network or through other computer-readable media.
  • a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (for example, a computer), including, but not limited to, floppy disks, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet in electrical, optical, acoustic, or other forms (for example, carrier waves, infrared signals, digital signals, etc.). Therefore, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (for example, a computer).
  • each unit/module mentioned in each device embodiment of this application is a logical unit/module. Physically, a logical unit/module can be a physical unit/module, a part of a physical unit/module, or a combination of multiple physical units/modules. The physical implementation of these logical units/modules is not the most important; rather, the combination of the functions implemented by these logical units/modules is what solves the technical problem proposed by this application. In addition, the above-mentioned device embodiments of this application do not introduce units/modules that are not closely related to solving the technical problem proposed by this application, which does not mean that other units/modules do not exist in the above-mentioned device embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application relates to a message processing method for a terminal, a terminal, a medium, and an electronic device. The method comprises: a terminal determining a position on a terminal screen at which a user's gaze falls, at least one message being displayed on the terminal screen; when it is determined that the position at which the user's gaze falls on the terminal screen is within the message range of a message displayed on the screen, the terminal determining the progress of the user browsing the message; and changing the display mode, on the terminal screen, of at least some of the display elements within the message range of the message from a first display mode to a second display mode according to the progress of the user browsing the message, a display mode of a message that has not been browsed by the user of the terminal being the first display mode. In the present application, a user's viewing of a message appearing on a terminal screen is tracked by means of eye-tracking technology, and a viewed message and an unviewed message are distinguished and displayed on the screen according to the progress of the user viewing the messages, so that the user can quickly distinguish the viewed message from the unviewed message and will not miss an important message.
PCT/CN2021/088922 2020-04-27 2021-04-22 Procédé de traitement de message pour terminal, terminal, support et dispositif électronique WO2021218764A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010343422.4A CN113645349B (zh) 2020-04-27 2020-04-27 终端的消息处理方法、终端、介质和电子设备
CN202010343422.4 2020-04-27

Publications (1)

Publication Number Publication Date
WO2021218764A1 true WO2021218764A1 (fr) 2021-11-04

Family

ID=78332130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/088922 WO2021218764A1 (fr) 2020-04-27 2021-04-22 Procédé de traitement de message pour terminal, terminal, support et dispositif électronique

Country Status (2)

Country Link
CN (1) CN113645349B (fr)
WO (1) WO2021218764A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114154958A (zh) * 2021-12-03 2022-03-08 北京字跳网络技术有限公司 信息处理方法、装置、电子设备和存储介质

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114553805A (zh) * 2022-02-18 2022-05-27 维沃移动通信有限公司 消息显示方法及其装置
CN116027887B (zh) * 2022-05-20 2024-03-29 荣耀终端有限公司 一种显示方法和电子设备
CN114900803A (zh) * 2022-05-31 2022-08-12 深圳市智信科技有限公司 一种基于计算机云平台分布式短信验证消息发送方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198032A1 (en) * 2013-01-11 2014-07-17 Samsung Electronics Co., Ltd. Method and apparatus for displaying screen with eye tracking in portable terminal
CN105824403A (zh) * 2015-09-21 2016-08-03 维沃移动通信有限公司 一种对终端进行操作的方法及终端
CN106791034A (zh) * 2016-11-30 2017-05-31 宇龙计算机通信科技(深圳)有限公司 一种消息显示方法及装置
CN110554768A (zh) * 2018-05-31 2019-12-10 努比亚技术有限公司 一种智能穿戴设备控制方法、设备和计算机可读存储介质
CN110825226A (zh) * 2019-10-30 2020-02-21 维沃移动通信有限公司 消息查看方法及终端

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5785015B2 (ja) * 2011-07-25 2015-09-24 京セラ株式会社 電子機器、電子文書制御プログラムおよび電子文書制御方法
JP2015153195A (ja) * 2014-02-14 2015-08-24 オムロン株式会社 ジェスチャ認識装置およびジェスチャ認識装置の制御方法
US20160094705A1 (en) * 2014-09-30 2016-03-31 Ringcentral, Inc. Message Read Confirmation Using Eye Tracking
CN106354380A (zh) * 2015-07-17 2017-01-25 阿里巴巴集团控股有限公司 阅读提示方法及装置

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140198032A1 (en) * 2013-01-11 2014-07-17 Samsung Electronics Co., Ltd. Method and apparatus for displaying screen with eye tracking in portable terminal
CN105824403A (zh) * 2015-09-21 2016-08-03 维沃移动通信有限公司 一种对终端进行操作的方法及终端
CN106791034A (zh) * 2016-11-30 2017-05-31 宇龙计算机通信科技(深圳)有限公司 一种消息显示方法及装置
CN110554768A (zh) * 2018-05-31 2019-12-10 努比亚技术有限公司 一种智能穿戴设备控制方法、设备和计算机可读存储介质
CN110825226A (zh) * 2019-10-30 2020-02-21 维沃移动通信有限公司 消息查看方法及终端

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114154958A (zh) * 2021-12-03 2022-03-08 北京字跳网络技术有限公司 信息处理方法、装置、电子设备和存储介质

Also Published As

Publication number Publication date
CN113645349B (zh) 2023-01-13
CN113645349A (zh) 2021-11-12

Similar Documents

Publication Publication Date Title
WO2021218764A1 (fr) Procédé de traitement de message pour terminal, terminal, support et dispositif électronique
CN113542485B (zh) 一种通知处理方法、电子设备及计算机可读存储介质
US11722449B2 (en) Notification message preview method and electronic device
CN114467297B (zh) 一种应用于电子设备的视频通话的显示方法及相关装置
WO2021013134A1 (fr) Procédé de commande de levage de caméra et dispositif électronique
CN111543042B (zh) 通知消息的处理方法及电子设备
CN106534562B (zh) 文件删除方法及装置
CN103414814A (zh) 一种图像的处理方法、装置和终端设备
US11272116B2 (en) Photographing method and electronic device
KR20160035859A (ko) 사용자 인증 방법 및 그 전자 장치
CN112615947B (zh) 快速进入应用的方法与折叠屏电子设备
WO2022042766A1 (fr) Procédé d'affichage d'informations, dispositif terminal et support de stockage lisible par ordinateur
EP4325338A1 (fr) Procédé de commande d'affichage, dispositif électronique, et support de stockage informatique
US20230041690A1 (en) Intelligent reminding method and device
WO2023222130A1 (fr) Procédé d'affichage et dispositif électronique
EP4180918A1 (fr) Procédé pour déplacer une commande et dispositif électronique
CN111049968B (zh) 一种控制方法和电子设备
CN113970965A (zh) 消息显示方法和电子设备
EP3855358A1 (fr) Procédé de reconnaissance d'objet et dispositif terminal
CN114205318B (zh) 头像显示方法及电子设备
CN116700556B (zh) 卡片生成方法及相关装置
CN116048831B (zh) 一种目标信号处理方法和电子设备
CN116048236B (zh) 通信方法及相关装置
WO2022217969A1 (fr) Procédé et appareil pour activer une fonction dans une application
EP4310725A1 (fr) Procédé de commande de dispositif et dispositif électronique

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21797308

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21797308

Country of ref document: EP

Kind code of ref document: A1