CN113970965A - Message display method and electronic equipment

Message display method and electronic equipment

Info

Publication number
CN113970965A
Authority
CN
China
Prior art keywords
preset
message window
electronic device
user
eyeball
Prior art date
Legal status
Pending
Application number
CN202010705738.3A
Other languages
Chinese (zh)
Inventor
熊刘冬
李春东
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202010705738.3A
Publication of CN113970965A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/013: Eye tracking input arrangements
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A message display method and an electronic device are provided. In the method, when there is a message window to be displayed, the message window is displayed directly around the user's eye gaze position in the display area. By implementing the technical solution provided by this application, the message window can be displayed according to the user's eye gaze position, which improves the efficiency of message notification.

Description

Message display method and electronic equipment
Technical Field
The present application relates to the field of terminal and communication technologies, and in particular, to a message display method and an electronic device.
Background
Message display is a very important function in electronic devices. It can provide the user with required information in time, prompt the user how to operate, or inform the user of the current progress, and so on.
Current message windows are typically displayed at a fixed position. For example, an information notification window is generally displayed in a fixed area at the top of the display screen of the electronic device, and a feedback prompt window is generally displayed in a fixed area at the bottom of the display screen.
However, when the message window is displayed in a fixed area or even at a fixed location, it often goes unnoticed by the user, or it has already disappeared by the time the user shifts the viewing angle to read it. This reduces the efficiency of message notification, and the user may have to open the message list to check the message, which adds extra steps to viewing the message.
Disclosure of Invention
This application provides a message display method and an electronic device, which are used to display a message window according to the eye gaze position of a user and to improve the efficiency of message notification.
In a first aspect, the present application provides a message display method, including: the electronic device determines that there is a message window that needs to be displayed; the electronic device determines, in real time, the user's current eye gaze position in the display area of the electronic device; the electronic device determines a first preset visual area according to the current eye gaze position, the first preset visual area being located around the current eye gaze position on the display area; and the electronic device displays the message window in the first preset visual area.
In the above embodiment, when the electronic device determines that there is a message window that needs to be displayed, it determines, according to the user's current eye gaze position, a first preset visual area around the eye gaze position in the display area and displays the message window there. The message window is thus shown directly around the user's current eye gaze position, the user can see it immediately, and the efficiency of message notification is improved.
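As a rough, non-limiting illustration of these four steps, the following Kotlin sketch strings them together. Every name, type, and value (GazePoint, Rect, the stubbed gaze position, the 80-pixel offset) is a hypothetical placeholder and is not taken from the application.

```kotlin
import kotlin.math.hypot

// Hypothetical types; the application does not prescribe any data model.
data class GazePoint(val x: Double, val y: Double)
data class Rect(val left: Double, val top: Double, val width: Double, val height: Double) {
    val centerX get() = left + width / 2
    val centerY get() = top + height / 2
}

// Step 2: real-time gaze position (stubbed; it would come from the eye tracking pipeline).
fun currentGazePosition(): GazePoint = GazePoint(540.0, 1200.0)

// Step 3: a window-sized area whose centre sits a small offset below the gaze position.
fun firstPresetVisualArea(gaze: GazePoint, window: Rect, offset: Double = 80.0): Rect =
    Rect(gaze.x - window.width / 2, gaze.y + offset, window.width, window.height)

// Step 4: display the window (stubbed).
fun showWindow(area: Rect) = println("show message window at $area")

fun main() {
    val pendingWindow = Rect(0.0, 0.0, 900.0, 240.0)       // step 1: a window needs displaying
    val gaze = currentGazePosition()                        // step 2
    val area = firstPresetVisualArea(gaze, pendingWindow)   // step 3
    showWindow(area)                                        // step 4
    println("centre-to-gaze distance: ${hypot(area.centerX - gaze.x, area.centerY - gaze.y)}")
}
```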
With reference to some embodiments of the first aspect, in some embodiments, the first preset visual area is on the display area, its center or starting position is within a first preset distance from the current eye gaze position, and its size is not smaller than the size of the message window.
In the above embodiment, the first preset visual area is on the display area and its center or starting position is within a first preset distance from the current eye gaze position, which ensures that the determined first preset visual area is around the user's current eye gaze position. The size of the first preset visual area is not smaller than the size of the message window, so that it can be used to display the message window.
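A minimal sketch of these two constraints, assuming pixel coordinates; the function name and the example distance are invented for illustration.

```kotlin
import kotlin.math.hypot

data class Rect(val left: Double, val top: Double, val width: Double, val height: Double) {
    val centerX get() = left + width / 2
    val centerY get() = top + height / 2
}

// Checks the two constraints stated above: the centre or starting position of the area lies
// within the first preset distance of the gaze, and the area is at least window-sized.
fun isValidFirstPresetVisualArea(
    area: Rect, windowW: Double, windowH: Double,
    gazeX: Double, gazeY: Double, firstPresetDistance: Double
): Boolean {
    val centerClose = hypot(area.centerX - gazeX, area.centerY - gazeY) <= firstPresetDistance
    val startClose = hypot(area.left - gazeX, area.top - gazeY) <= firstPresetDistance
    val bigEnough = area.width >= windowW && area.height >= windowH
    return (centerClose || startClose) && bigEnough
}

fun main() {
    val ok = isValidFirstPresetVisualArea(
        Rect(100.0, 300.0, 900.0, 240.0), 900.0, 240.0,
        gazeX = 500.0, gazeY = 350.0, firstPresetDistance = 540.0 // e.g. half a 1080 px display width
    )
    println(ok) // true
}
```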
With reference to some embodiments of the first aspect, in some embodiments, the determining, by the electronic device, of a first preset visual area according to the current eye gaze position specifically includes: the electronic device determines a preset display orientation and a preset display distance matched with the relative position of the current eye gaze position on the display area; the electronic device determines that the center or starting position of the first preset visual area is at the preset display distance, in the preset display orientation, from the current eye gaze position.
In the above embodiment, the preset display orientation and preset display distance matched with the relative position can be determined according to the relative position of the current eye gaze position on the display area, and the position of the first preset visual area is then determined, so that the first preset visual area better suits the user's current scene.
With reference to some embodiments of the first aspect, in some embodiments, the determining, by the electronic device, of a first preset visual area according to the current eye gaze position specifically includes: the electronic device determines whether, within a first preset distance from the current eye gaze position on the display area, there is an area that displays no content and is not smaller than the size of the message window; when it is determined that such an area exists, the electronic device determines that area as the first preset visual area.
In the above embodiment, the electronic device preferentially uses an area of the display area in which no content is displayed as the first preset visual area, which avoids occluding the content already displayed in the display area.
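The application does not say how such a blank area is found; the sketch below shows one purely illustrative search that expands outward from the gaze point and returns the first window-sized slot overlapping none of the drawn content regions. Names, the angular step, and the radial step are assumptions.

```kotlin
data class Rect(val left: Double, val top: Double, val width: Double, val height: Double)

private fun Rect.intersects(o: Rect) =
    left < o.left + o.width && o.left < left + width &&
    top < o.top + o.height && o.top < top + height

// Expanding search around the gaze point for a blank, window-sized slot.
fun findBlankAreaNearGaze(
    gazeX: Double, gazeY: Double, winW: Double, winH: Double,
    firstPresetDistance: Double, contentRects: List<Rect>, step: Double = 40.0
): Rect? {
    var r = 0.0
    while (r <= firstPresetDistance) {
        for (angleDeg in 0 until 360 step 45) {
            val a = Math.toRadians(angleDeg.toDouble())
            val candidate = Rect(gazeX + r * Math.cos(a) - winW / 2,
                                 gazeY + r * Math.sin(a) - winH / 2, winW, winH)
            if (contentRects.none { it.intersects(candidate) }) return candidate
        }
        r += step
    }
    return null // no blank slot nearby: fall back to another placement, as described above
}

fun main() {
    val content = listOf(Rect(0.0, 0.0, 1080.0, 300.0)) // content occupies the top strip
    println(findBlankAreaNearGaze(540.0, 200.0, 900.0, 240.0, 540.0, content))
}
```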
In combination with some embodiments of the first aspect, in some embodiments, the electronic device determines the moving speed of the user's current eye gaze position in the display area of the electronic device.
In the above embodiment, the electronic device may further determine the moving speed of the current eye gaze position, so that the display of the message window is decided according to both the current eye gaze position and its speed. This allows the current state of the user to be taken into account and makes the display of the message window more user-friendly.
In combination with some embodiments of the first aspect, in some embodiments, the method may further include: the electronic device determines whether the moving speed of the current eye gaze position is not greater than a preset first speed value. The displaying, by the electronic device, of the message window in the first preset visual area specifically includes: when the electronic device determines that the moving speed of the current eye gaze position is not greater than the preset first speed value, the electronic device displays the message window in the first preset visual area.
In the above embodiment, when the moving speed of the current eye gaze position is not greater than the preset first speed value, the electronic device directly displays the message window in the first preset visual area around the current eye gaze position. At this time the user's eye gaze position is not moving fast, and the displayed message window can easily be seen by the user.
In combination with some embodiments of the first aspect, in some embodiments, the method may further include: when the electronic device determines that the moving speed of the current eye gaze position is greater than the preset first speed value, the electronic device determines whether the average moving speed of the user's eye gaze position within a preset first duration is not greater than a preset second speed value; when the electronic device determines that the average moving speed of the user's eye gaze position within the preset first duration is not greater than the preset second speed value, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position; when the electronic device determines that the average moving speed of the user's eye gaze position within the preset first duration is greater than the preset second speed value, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position and moves the message window along with the movement of the user's current eye gaze position.
In the above embodiment, when the moving speed of the current eye gaze position is greater than the preset first speed value, the eye gaze position is moving fast at that moment. The electronic device may then determine whether the average moving speed of the user's eye gaze position over the subsequent preset first duration is not greater than the preset second speed value. If the average speed is not greater than the preset second speed value, the user will be able to see the window once it is shown, so the electronic device determines the first preset visual area according to the current eye gaze position and displays the message window there. If the average speed is greater than the preset second speed value, the user's eye gaze position is moving so fast that simply displaying the message window may not be enough for the user to see it. Therefore, after determining the first preset visual area according to the current eye gaze position and displaying the message window, the electronic device moves the message window along with the user's current eye gaze position, so that a user whose gaze is moving quickly can see the message window more easily.
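A sketch of this two-level speed decision, assuming gaze samples arrive as timestamped pixel coordinates; the threshold values, the duration, and all names are invented placeholders rather than values from the application.

```kotlin
import kotlin.math.hypot

data class GazeSample(val x: Double, val y: Double, val tMillis: Long)

enum class Placement { STATIC, FOLLOW_GAZE }

fun decidePlacement(
    samples: List<GazeSample>,           // ordered gaze samples, newest last
    firstSpeedPxPerMs: Double = 1.5,     // "preset first speed value"
    secondSpeedPxPerMs: Double = 0.8,    // "preset second speed value"
    firstDurationMs: Long = 500L         // "preset first duration"
): Placement {
    // instantaneous speed from the two most recent samples
    val (a, b) = samples.takeLast(2)
    val instSpeed = hypot(b.x - a.x, b.y - a.y) / (b.tMillis - a.tMillis)
    if (instSpeed <= firstSpeedPxPerMs) return Placement.STATIC

    // otherwise, average speed over the preset first duration
    val windowStart = samples.last().tMillis - firstDurationMs
    val recent = samples.filter { it.tMillis >= windowStart }
    val travelled = recent.zipWithNext().sumOf { (p, q) -> hypot(q.x - p.x, q.y - p.y) }
    val avgSpeed = travelled / firstDurationMs
    return if (avgSpeed <= secondSpeedPxPerMs) Placement.STATIC else Placement.FOLLOW_GAZE
}

fun main() {
    val fastSweep = (0..10).map { GazeSample(100.0 + it * 100, 400.0, it * 50L) }
    println(decidePlacement(fastSweep)) // FOLLOW_GAZE: the gaze is sweeping quickly
}
```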
In combination with some embodiments of the first aspect, in some embodiments, the method further includes: when the electronic device determines that the moving speed of the current eye gaze position is greater than the preset first speed value, the electronic device determines whether the cumulative distance moved by the user's eye gaze position within a preset fourth duration exceeds a preset cumulative distance; when the electronic device determines that the cumulative distance moved by the user's eye gaze position within the preset fourth duration does not exceed the preset cumulative distance, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position; when the electronic device determines that the cumulative distance moved by the user's eye gaze position within the preset fourth duration exceeds the preset cumulative distance, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position and moves the message window along with the movement of the user's current eye gaze position.
In the above embodiment, because the moving speed of the user's eye gaze position is unstable, whether the average moving speed of the eye gaze position exceeds a preset value may instead be judged by determining whether the cumulative distance moved by the eye gaze position within the preset fourth duration exceeds the preset cumulative distance. Compared with directly computing the average speed over that period, judging by the cumulative distance of the eye gaze position over the period reflects the average moving speed of the user's eye gaze position more accurately, so whether the message window needs to follow the movement of the user's eye gaze position can be determined more accurately.
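The cumulative-distance variant might look like the following sketch; the duration and distance values are placeholders only.

```kotlin
import kotlin.math.hypot

data class GazeSample(val x: Double, val y: Double, val tMillis: Long)

// Returns true when the gaze has moved more than the preset cumulative distance
// within the preset fourth duration, i.e. when the window should follow the gaze.
fun shouldFollowGaze(
    samples: List<GazeSample>,
    fourthDurationMs: Long = 600L,           // "preset fourth duration"
    presetCumulativeDistance: Double = 400.0
): Boolean {
    val start = samples.last().tMillis - fourthDurationMs
    val recent = samples.filter { it.tMillis >= start }
    val cumulative = recent.zipWithNext().sumOf { (p, q) -> hypot(q.x - p.x, q.y - p.y) }
    return cumulative > presetCumulativeDistance
}

fun main() {
    val slowDrift = (0..12).map { GazeSample(300.0 + it * 10, 500.0, it * 50L) }
    println(shouldFollowGaze(slowDrift)) // false: the gaze barely moves, a static window suffices
}
```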
In combination with some embodiments of the first aspect, in some embodiments, the method further includes: the electronic device determines whether the user's current eye gaze position moves into the region of the message window within a preset second duration after the message window is displayed; if the user's current eye gaze position moves into the region of the message window at any moment within the preset second duration, the electronic device stops moving the message window.
In the above embodiment, if the user's current eye gaze position moves into the region of the message window at some moment within the preset second duration after the message window is displayed, the user has seen the message window. The electronic device can then stop moving the message window, so that it does not interfere with the user's subsequent operations.
In combination with some embodiments of the first aspect, in some embodiments, the method further includes: if the user's eye gaze position does not move into the region of the message window within the preset second duration, the electronic device determines whether the distance moved by the user's eye gaze position within a preset third duration after the message window is displayed exceeds a preset second distance; if the distance moved by the user's eye gaze position within the preset third duration does not exceed the preset second distance, the electronic device stops moving the message window.
In the above embodiment, if the user does not look at the message window within the preset second duration and the distance moved by the user's eye gaze position within the preset third duration does not exceed the preset second distance, the user evidently does not want to look at the message window. The electronic device can stop moving the message window to prevent it from affecting the user's subsequent operations.
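The two stop conditions from the last two embodiments could be combined roughly as sketched below; the window geometry, durations, and distances are assumed values.

```kotlin
import kotlin.math.hypot

data class GazeSample(val x: Double, val y: Double, val tMillis: Long)
data class Rect(val left: Double, val top: Double, val width: Double, val height: Double) {
    fun contains(px: Double, py: Double) =
        px in left..(left + width) && py in top..(top + height)
}

fun shouldStopFollowing(
    window: Rect, shownAtMs: Long, samples: List<GazeSample>,
    secondDurationMs: Long = 1500L, thirdDurationMs: Long = 2500L, secondDistance: Double = 200.0
): Boolean {
    // (a) the gaze entered the window region within the preset second duration: the user saw it
    val withinSecond = samples.filter { it.tMillis in shownAtMs..(shownAtMs + secondDurationMs) }
    if (withinSecond.any { window.contains(it.x, it.y) }) return true

    // (b) it did not, and the gaze barely moved within the preset third duration,
    //     so the user is evidently not trying to look at the window
    val withinThird = samples.filter { it.tMillis in shownAtMs..(shownAtMs + thirdDurationMs) }
    val moved = withinThird.zipWithNext().sumOf { (p, q) -> hypot(q.x - p.x, q.y - p.y) }
    return moved <= secondDistance
}

fun main() {
    val win = Rect(100.0, 900.0, 880.0, 220.0)
    val trace = (0..20).map { GazeSample(500.0, 400.0 + it * 5, it * 100L) } // gaze stays near the top
    println(shouldStopFollowing(win, shownAtMs = 0L, samples = trace))       // true via condition (b)
}
```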
In combination with some embodiments of the first aspect, in some embodiments, the method further includes: the electronic device determines whether the user's eye gaze position moves into the region of the message window within the preset second duration after the message window is displayed; when the electronic device determines that the user's eye gaze position does not move into the region of the message window within the preset second duration and that the distance moved by the user's eye gaze position within the preset third duration after the message window is displayed exceeds the preset second distance, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position, or the electronic device performs the step of determining whether the average moving speed of the user's eye gaze position within the preset first duration is not greater than the preset second speed value.
In the above embodiment, if the user does not look at the message window within the preset second duration and the distance moved by the user's eye gaze position within the preset third duration exceeds the preset second distance, the user's eye gaze position has jumped a long way within the preset third duration and the user may not have noticed the message window at all. The electronic device may then jump the message window directly to the first preset visual area around the user's current eye gaze position, or it may run the average-speed judgment again and display the message window according to the result.
In combination with some embodiments of the first aspect, in some embodiments, the method further includes: the electronic device determines whether the user's eye gaze position moves into the region of the message window within the preset second duration after the message window is displayed; when the electronic device determines that the user's eye gaze position does not move into the region of the message window within the preset second duration and that the distance moved by the user's eye gaze position within the preset third duration after the message window is displayed exceeds the preset second distance, the electronic device displays the message window in a first preset visual area determined according to the current eye gaze position, or the electronic device performs the step of determining whether the cumulative distance moved by the user's eye gaze position within the preset fourth duration exceeds the preset cumulative distance.
In the above embodiment, if the user does not look at the message window within the preset second duration and the distance moved by the user's eye gaze position within the preset third duration exceeds the preset second distance, the user's eye gaze position has jumped a long way within the preset third duration and the user may not have noticed the message window at all. The electronic device may then jump the message window directly to the first preset visual area around the user's current eye gaze position, or it may run the cumulative-distance judgment again and display the message window according to the result.
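A sketch of the 're-anchor after a large gaze jump' branch; the alternative branch (re-running the earlier speed or cumulative-distance decision) is omitted for brevity, and all names and values are illustrative.

```kotlin
data class GazePoint(val x: Double, val y: Double)
data class Rect(val left: Double, val top: Double, val width: Double, val height: Double)

// If the gaze has jumped further than the preset second distance, re-anchor the window
// around the new gaze position; otherwise leave it where it is.
fun reanchorIfGazeJumped(
    window: Rect, gaze: GazePoint, movedDistance: Double,
    secondDistance: Double = 200.0, offsetBelowGaze: Double = 80.0
): Rect =
    if (movedDistance > secondDistance)
        Rect(gaze.x - window.width / 2, gaze.y + offsetBelowGaze, window.width, window.height)
    else window

fun main() {
    val old = Rect(100.0, 1800.0, 880.0, 220.0)
    println(reanchorIfGazeJumped(old, GazePoint(540.0, 300.0), movedDistance = 650.0))
}
```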
In a second aspect, an embodiment of the present application provides an electronic device, including one or more processors and a memory. The memory is coupled with the one or more processors and stores computer program code including computer instructions. The one or more processors invoke the computer instructions to cause the electronic device to perform: determining that there is a message window that needs to be displayed; determining, in real time, the user's current eye gaze position in the display area of the electronic device; determining a first preset visual area according to the current eye gaze position, where the first preset visual area is located around the current eye gaze position on the display area; and displaying the message window in the first preset visual area.
In the above embodiment, when the electronic device determines that there is a message window that needs to be displayed, it determines, according to the user's current eye gaze position, a first preset visual area around the eye gaze position in the display area and displays the message window there, so that the message window is shown directly around the user's current eye gaze position, the user can see it immediately, and the efficiency of message notification is improved.
With reference to some embodiments of the second aspect, in some embodiments, the first predetermined visual area is on the display area and a center or a starting position is within a first predetermined distance from the current eye gaze position, and a size of the first predetermined visual area is not smaller than a size of the message window.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: determining a preset display direction and a preset display distance matched with the relative position according to the relative position of the current eyeball fixation position on the display area; and determining the center or the starting position of the first preset visual area to be on the preset display distance of the preset display direction of the current eyeball fixation position.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: determining whether an area which has no display content and is not smaller than the size of the message window exists in the range of a first preset distance from the current eyeball watching position on the display area; when it is determined that there is an area that does not display content and is not smaller than the size of the message window, determining the area as the first preset visual area.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: determining the moving speed of the user's current eye gaze position in the display area of the electronic device.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: determining whether the moving speed of the current eyeball watching position is not greater than a preset first speed value; the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform: and when the moving speed of the current eyeball watching position is determined not to be larger than a preset first speed value, displaying the message window in the first preset visual area.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: when the moving speed of the current eyeball watching position is determined to be larger than a preset first speed value, determining whether the average moving speed of the eyeball watching position of the user in a preset first time length is not larger than a preset second speed value or not; when the average moving speed of the eyeball watching position of the user is determined to be not more than the preset second speed value within the preset first time period, displaying the message window in a first preset visual area determined according to the current eyeball watching position; and when the average moving speed of the eyeball watching position of the user is determined to be larger than the preset second speed value within the preset first time period, displaying the message window in a first preset visual area determined according to the current eyeball watching position, and moving the message window along with the movement of the current eyeball watching position of the user.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: when the moving speed of the current eyeball watching position is determined to be larger than a preset first speed value, determining whether the accumulated distance of the movement of the eyeball watching position of the user exceeds a preset accumulated distance within a preset fourth time period; when the accumulated distance of the movement of the eyeball watching position of the user does not exceed the preset accumulated distance within the preset fourth time, displaying the message window in a first preset visual area determined according to the current eyeball watching position; and when the accumulated distance of the movement of the eyeball watching position of the user exceeds the preset accumulated distance within the preset fourth time, displaying the message window in a first preset visual area determined according to the current eyeball watching position, and moving the message window along with the movement of the current eyeball watching position of the user.
With reference to some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: determining whether the user's current eye gaze position moves into the region of the message window within a preset second duration after the message window is displayed; and if the user's current eye gaze position moves into the region of the message window at any moment within the preset second duration, stopping moving the message window.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: if the eyeball watching position of the user does not move to the region of the message window within the preset second time length, determining whether the moving distance of the eyeball watching position of the user exceeds the preset second distance within the preset third time length after the message window is displayed; and if the moving distance of the eyeball watching position of the user does not exceed the preset second distance within the preset third time length, stopping moving the message window.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: determining whether the eyeball watching position of the user moves into the region of the message window within a preset second time length after the message window is displayed; and when it is determined that the eyeball watching positions of the user do not move into the region of the message window within the preset second time period and the moving distance of the eyeball watching positions of the user exceeds the preset second distance within the preset third time period after the message window is displayed, displaying the message window in a first preset visual region determined according to the current eyeball watching positions, or executing the step of determining whether the average moving speed of the eyeball watching positions of the user within the preset first time period is not greater than the preset second speed value.
In some embodiments combined with some embodiments of the second aspect, in some embodiments, the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform: determining whether the eyeball watching position of the user moves into the region of the message window within a preset second time length after the message window is displayed; and when it is determined that the eyeball watching positions of the user do not move into the region of the message window within the preset second time period and the moving distance of the eyeball watching positions of the user exceeds the preset second distance within the preset third time period after the message window is displayed, displaying the message window in a first preset visual region determined according to the current eyeball watching positions, or executing the step of determining whether the accumulated distance of the movement of the eyeball watching positions of the user exceeds the preset accumulated distance within the preset fourth time period.
In a third aspect, an embodiment of the present application provides a chip system. The chip system is applied to an electronic device and includes one or more processors, and the processors are configured to invoke computer instructions to cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
It is understood that the chip system may include one processor 110 of the electronic device 100 shown in fig. 17, or a plurality of processors 110 of the electronic device 100 shown in fig. 17, and may also include one or more other chips, which is not limited herein.
In a fourth aspect, embodiments of the present application provide a computer program product including instructions, which, when run on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium, which includes instructions that, when executed on an electronic device, cause the electronic device to perform the method described in the first aspect and any possible implementation manner of the first aspect.
It is understood that the electronic device provided by the second aspect, the chip system provided by the third aspect, the computer program product provided by the fourth aspect, and the computer storage medium provided by the fifth aspect are all used to execute the method provided by the embodiments of the present application. Therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding method, which are not described herein again.
Drawings
FIG. 1 is a schematic diagram of a scenario of an eye tracking technique according to an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an effect of infrared light reflected by an eyeball in the embodiment of the present application;
FIG. 3 is a schematic diagram illustrating another effect of infrared light reflected by an eyeball in the embodiment of the present application;
FIG. 4 is a flowchart illustrating an eye tracking technique according to an embodiment of the present application;
FIG. 5 is an exemplary diagram of a message window in an embodiment of the present application;
FIG. 6 is another exemplary diagram of a message window in an embodiment of the present application;
FIG. 7 is another exemplary diagram of a message window in an embodiment of the present application;
FIG. 8 is another exemplary diagram of a message window in an embodiment of the present application;
FIG. 9 is another exemplary diagram of a message window in an embodiment of the present application;
FIG. 10 is an exemplary diagram of a display area of an electronic device in an embodiment of the present application;
FIG. 11 is another exemplary diagram of a display area of an electronic device in an embodiment of the present application;
FIG. 12 is another exemplary diagram of a display area of an electronic device in an embodiment of the present application;
FIG. 13 is another exemplary diagram of a display area of an electronic device in an embodiment of the present application;
FIG. 14 is another exemplary diagram of a display area of an electronic device in an embodiment of the present application;
FIG. 15 is an exemplary illustration of a first predetermined area in an embodiment of the present application;
FIG. 16 is a diagram illustrating an exemplary scenario of a message display method according to an embodiment of the present application;
fig. 17 is a schematic structural diagram of an exemplary electronic device 100 provided in an embodiment of the present application;
fig. 18 is a block diagram of a software structure of an exemplary electronic device 100 provided in an embodiment of the present application;
FIG. 19 is a flowchart illustrating a message display method according to an embodiment of the present application;
FIG. 20 is an exemplary diagram of a user interface in an embodiment of the present application;
FIG. 21 is another exemplary diagram of a user interface in an embodiment of the present application;
FIG. 22a is another exemplary schematic diagram of a user interface in an embodiment of the present application;
FIG. 22b is another exemplary diagram of a user interface in an embodiment of the present application;
FIG. 23 is an exemplary set of diagrams of a user interface in an embodiment of the present application;
fig. 24 is a schematic structural diagram of an exemplary electronic device 200 provided in an embodiment of the present application.
Detailed Description
The terminology used in the following embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the present application. As used in the specification of the present application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the listed items.
In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the application, unless stated otherwise, "plurality" means two or more.
For ease of understanding, the related terms and concepts related to the embodiments of the present application will be described below.
(1) Eyeball tracking technology:
There has been much research on eye tracking, and current implementations of eye tracking technology commonly rely on the following principles:
Principle 1: tracking according to changes of features of the eyeball and the area around the eyeball;
Principle 2: tracking according to changes of the iris angle;
Principle 3: actively projecting light beams such as infrared rays onto the iris and extracting features.
These implementations are based on the fact that the eye changes slightly when the human eye looks in different directions. These changes can produce extractable features that the electronic device can extract by image capture or scanning to track changes in the eye in real time to determine the eye gaze location.
The main hardware for eye tracking may be an infrared device or an image acquisition device; even the camera on an ordinary electronic device can realize eye tracking with software support.
Generally, eye tracking based on principle 1 or principle 2 is inferior in accuracy to eye tracking based on principle 3. The following briefly describes an eye tracking technique based on principle 3:
Exemplarily, fig. 1 is a schematic view of a scenario of the eye tracking technology in an embodiment of the present application. Eye tracking based on principle 3 requires an infrared light emitter and an infrared camera in hardware. As shown in fig. 1, the electronic device is equipped with 3 infrared light emitters and 1 infrared camera. As shown by the dashed arrows in fig. 1, infrared light is emitted from the infrared light emitters and reflected by the human eye, and the reflected infrared light is captured by the infrared camera.
Fig. 2 is a schematic diagram illustrating an effect of infrared light being reflected by an eyeball in an embodiment of the present application. When the user gazes at a position on the display area, the infrared light emitted by the emitters at different positions on the electronic device is reflected by the eyeball at different reflection points, so that for each eye the electronic device can obtain the positions of the reflection points at which the user's pupil and cornea reflect the infrared light.
Fig. 3 is a schematic view illustrating another effect of infrared light being reflected by an eyeball in an embodiment of the present application. When the user's eyeball gazes at another location on the display area, the electronic device may obtain, for each eye, the locations of another set of reflection points at which the user's pupil and cornea reflect the infrared light. Since the user is now looking at another location, the angle of the eyeball changes slightly from before, and the reflection points at which the user's pupil and cornea reflect the infrared light are shifted accordingly.
From the infrared light reflected by the human eye, the electronic device can perform the following information processing to obtain the eye gaze position on its display area; a minimal sketch of this processing follows the steps below. Fig. 4 is a schematic flowchart of an eye tracking technique according to an embodiment of the present application:
S401, an infrared light emitter in the electronic device emits infrared light;
S402, an infrared camera in the electronic device captures a video of the human eye;
S403, the electronic device processes the human eye video to obtain the positions of the user's pupils and corneal reflection points;
S404, the electronic device determines the gaze direction according to the offset between the pupil and the corneal reflection points, and finally determines the eye gaze position on the display area.
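The toy computation below condenses S403 and S404: the pupil-centre-to-glint offset is mapped to display coordinates through a per-user calibration. The affine mapping and every coefficient are assumptions made for illustration; real gaze estimation is considerably more involved.

```kotlin
data class Point2(val x: Double, val y: Double)

// Hypothetical per-user calibration: screen = coefficient * (pupil - glint offset) + bias.
data class Calibration(val ax: Double, val bx: Double, val ay: Double, val by: Double)

fun estimateGaze(pupilCentre: Point2, glint: Point2, c: Calibration): Point2 {
    val dx = pupilCentre.x - glint.x
    val dy = pupilCentre.y - glint.y
    return Point2(c.ax * dx + c.bx, c.ay * dy + c.by)
}

fun main() {
    val cal = Calibration(ax = 90.0, bx = 540.0, ay = 110.0, by = 1200.0) // made-up coefficients
    println(estimateGaze(Point2(312.0, 240.0), Point2(310.0, 243.0), cal))
}
```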
Since how the eye tracking technology is implemented is not the focus of the embodiments of the present application, the specific algorithms are not described in detail herein.
The embodiments of the present application do not limit which principle the eye tracking technology adopts; the adopted eye tracking technology only needs to be able to determine the user's eye gaze position on the display area, which is not limited herein.
Generally, after determining the eye gaze position through eye tracking, the electronic device may output the coordinates of the eye gaze position on the display area. It is understood that the reference coordinate system of these coordinates may be the default coordinate system of the electronic device or another coordinate system. If the output coordinates use another coordinate system, the electronic device may convert them into its default coordinate system, which is not limited herein.
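If the tracker reports coordinates in its own frame, the conversion mentioned above could be as simple as the sketch below; the offsets, scales, and the normalised input are assumptions.

```kotlin
data class Point2(val x: Double, val y: Double)

// Maps a gaze coordinate from the tracker's frame into the display's default frame via an
// assumed offset and per-axis scale (a real device might also need a rotation).
fun toDisplayCoordinates(p: Point2, offsetX: Double, offsetY: Double,
                         scaleX: Double, scaleY: Double): Point2 =
    Point2((p.x - offsetX) * scaleX, (p.y - offsetY) * scaleY)

fun main() {
    // e.g. the tracker outputs normalised [0, 1] coordinates and the display is 1080 x 2340 px
    println(toDisplayCoordinates(Point2(0.42, 0.65), 0.0, 0.0, 1080.0, 2340.0))
}
```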
(2) Message window:
The message window in the embodiments of the present application may include various windows in the electronic device for providing messages to the user, such as a message window for information notification, a message window for a feedback prompt, and the like, which are not limited herein.
For example, for a message window for information notification, notification messages for various applications or systems may be included. The message window for information notification may appear when the display area of the electronic device displays a window of an application program, or may appear when the display area of the electronic device displays a main interface window of a system, which is not limited herein.
Fig. 5 is an exemplary diagram of a message window in the embodiment of the present application. The display area of the electronic device displays a window 501 for an application. At this time, a message window 502 may be displayed at the top of the display area.
The message window 502 may include an application identification area 502A and a content area 502B.
The application identification area 502A can display an indicator of which application and/or which contact the message window comes from. For example, the indicator may include "Mom" and "Short message". The indicator is not limited to text and may also be an icon, which is not limited herein.
The content area 502B may display the content of the information notification. For example, "Hurry home to eat." may be displayed.
Fig. 6 is another exemplary diagram of a message window in the embodiment of the present application. The display area of the electronic device displays a main interface window 601 of the system. At this time, a message window 602 may also be displayed at the top of the display area. An application identification area 602A and a content area 602B may be included in the message window 602. Specifically, reference may be made to the application identification area 502A and the content area 502B, which are not described herein again.
For example, a message window for a feedback prompt may take at least the following forms:
1. Toast:
A Toast is a lightweight feedback prompt, usually in the form of a small caption alert, that typically appears for 1 to 2 seconds and then disappears automatically.
Fig. 7 is another exemplary diagram of a message window in the embodiment of the present application. Suppose that after startup the electronic device has a poor network signal and is not connected to a Wi-Fi network. At this time, the main interface window 701 of the system is displayed in the display area of the electronic device, and a message window 702 may pop up at the bottom of the display area. The message window 702 shows a line of caption text: "Network connection is unavailable, please retry later". In this way, the user can learn of the network connection status of the electronic device in time and handle it promptly.
2. Snackbar:
A Snackbar inherits the main characteristics of a Toast: it appears as a small pop-up window and disappears automatically.
However, there are also several differences:
First, the message window may carry 0 to 1 action and does not include a dismiss button;
Second, tapping the area outside the message window makes the Snackbar disappear;
Third, by default it appears at the bottom of the display area of the electronic device.
Fig. 8 is another exemplary diagram of a message window in the embodiment of the present application. The display area of the electronic device displays a management window 801 for a photo application. When the user deletes a photo, the photo application may pop up a Snackbar message window 802 in the display area of the electronic device. The message window 802 may display the content "Photo was successfully deleted" to inform the user that the photo has been deleted. A clickable "Cancel" button may also be displayed; tapping it cancels the deletion just performed and restores the deleted photo.
3. Dialog:
A Dialog is a feedback prompt in the form of a dialog box. A Dialog generally requires a user action (tapping confirm or close) before it disappears.
Fig. 9 is another exemplary diagram of a message window in the embodiment of the present application. The display area of the electronic device displays the main interface window 901 of the system. When a user logs in to the system account of the electronic device on a web page on the PC side, a Dialog message window 902 may pop up in the display area of the electronic device. The message window 902 may display the content "Log in to the system account on the web page" to prompt the user about the login on the web page side. A clickable "Cancel" button can be displayed; tapping it makes the web-page login fail. A "Confirm" button is also displayed; tapping it makes the web-page login succeed.
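For orientation only, the Android-flavoured Kotlin sketch below shows the standard ways such Toast, Snackbar, and Dialog windows are usually produced; it assumes an Android project with the Material Components library and is unrelated to the placement method claimed in this application.

```kotlin
import android.app.Activity
import android.app.AlertDialog
import android.view.View
import android.widget.Toast
import com.google.android.material.snackbar.Snackbar

// Three common feedback forms (illustrative; this application is about where such windows
// appear, not about these APIs).
fun demoFeedbackWindows(activity: Activity) {
    // Toast: auto-dismissing caption, no user action required
    Toast.makeText(activity, "Network connection is unavailable, please retry later",
                   Toast.LENGTH_SHORT).show()

    // Snackbar: auto-dismissing, may carry at most one action
    Snackbar.make(activity.findViewById<View>(android.R.id.content),
                  "Photo was successfully deleted", Snackbar.LENGTH_LONG)
        .setAction("Cancel") { /* restore the deleted photo here */ }
        .show()

    // Dialog: stays on screen until the user confirms or cancels
    AlertDialog.Builder(activity)
        .setMessage("Log in to the system account on the web page?")
        .setNegativeButton("Cancel") { dialog, _ -> dialog.dismiss() }
        .setPositiveButton("Confirm") { _, _ -> /* approve the web-page login here */ }
        .show()
}
```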
It is to be understood that the message window in the embodiment of the present application may include various other types of message windows for sending a message or a prompt to a user, besides the above-described exemplary various types of message windows, which are not limited herein.
(3) Display area of electronic device:
the electronic device in the embodiment of the present application may be any electronic device supporting an eye tracking technology and having a display area, for example, may be a mobile electronic device, a personal computer electronic device, a vehicle-mounted electronic device, a Virtual Reality (VR) electronic device, an Augmented Reality (AR) electronic device, and the like, and is not limited herein.
For example, fig. 5 may be a display area of a mobile electronic device.
For example, fig. 10 is another exemplary schematic diagram of a display area of an electronic device in an embodiment of the present application, which may be the display area of a foldable-screen mobile electronic device. As shown in fig. 10, the display area of the electronic device may be composed of two foldable display areas, screen A and screen B. When folded, screen A and screen B can each be used as an independent display area; when unfolded, they together form one larger display area.
For example, fig. 11 is another exemplary schematic diagram of a display area of an electronic device in an embodiment of the present application. The display area 1101 in fig. 11 may be the display area of a personal computer.
For example, fig. 12 is another exemplary schematic diagram of a display area of an electronic device in an embodiment of the present application. The display area 1201 in fig. 12 may be the display area of an in-vehicle electronic device.
For example, fig. 13 is another exemplary schematic diagram of a display area of an electronic device in an embodiment of the present application. The display area 1301 in fig. 13 may be the display area of a VR electronic device. The display area of a VR electronic device shows only the picture rendered by the electronic device, not the real scene. As shown in fig. 13, although the VR electronic device worn by the user 1303 faces a scene 1304 in which two people are moving, that real scene 1304 does not appear in the display area 1301. It is understood that although the picture of the VR electronic device is presented on its two lenses 1302A and 1302B, what the user 1303 ultimately sees is the display area 1301. Therefore, in this example, the display area of the electronic device in the embodiment of the present application is the display area 1301.
For example, fig. 14 is another exemplary schematic diagram of a display area of an electronic device in an embodiment of the present application. The display area 1401 in fig. 14 may be the display area of an AR electronic device. The display area of an AR electronic device shows not only the picture rendered by the electronic device but also the real scene. As shown in fig. 14, the AR electronic device worn by the user 1403 faces a scene 1404 in which two people are moving, so the real scene 1404 appears in the display area 1401. It is understood that although the picture of the AR electronic device is presented on its two lenses 1402A and 1402B, what the user 1403 ultimately sees is the display area 1401. Therefore, in this example, the display area of the electronic device in the embodiment of the present application is the display area 1401.
It will be appreciated that the display area of the electronic device may take many different forms, which are not limited here. For convenience of description and understanding, the embodiments of the present application use the display area of the electronic device shown in fig. 5 as an example to describe the message display method; the process of executing the message display method on electronic devices with other types of display areas is similar and is not described in detail in the embodiments of the present application.
(4) First preset visual area:
In the embodiments of the present application, the message window to be displayed is preferentially displayed in the first preset visual area. The first preset visual area is located around the current eye gaze position on the display area of the electronic device.
Specifically, the first preset visual area is on the display area of the electronic device, and its center or starting position is within a first preset distance from the current eye gaze position. The size of the first preset visual area is not smaller than the size of the message window to be displayed.
Preferably, the first preset distance is less than half of the width of the display area. Optionally, the first preset distance is less than 2/3 of the display area width. The first preset distance may also take other values, may be preset at the factory, or may be customized by the user, which is not limited herein.
The first preset visual area may be varied according to the size of the message window to be displayed. In some embodiments, the size and shape of the first preset visual area may be the same as the size and shape of the message window that needs to be displayed. In some embodiments, the size of the first preset visual area may be larger than the size of the message window to be displayed. And is not limited herein.
The position of the first preset visual area can be changed according to the change of the current eyeball fixation position.
In some embodiments, the center of the first predetermined visual area and the current eye gaze location may coincide.
In some embodiments, the center of the first predetermined visual area is not coincident with the current eye gaze location. According to the position of the current eyeball fixation position on the display area of the electronic equipment, the first preset visual area can be in different directions relative to the eyeball fixation position.
Specifically, the electronic device may determine a preset display orientation and a preset display distance matched with the relative position of the current eye gaze position on the display area; the electronic device may then place the center or starting position of the first preset visual area at the preset display distance, in the preset display orientation, from the eye gaze position.
For example, when the relative position is within a preset first left-side distance from the left edge of the display area of the electronic device, the matched preset display orientation may be to the right of the current eye gaze position, and the matched preset display distance may be a preset second left-side distance. The preset second left-side distance may be greater than the preset first left-side distance and less than the first preset distance.
For example, when the relative position is within a preset first upper-side distance from the upper edge of the display area of the electronic device, the matched preset display orientation may be below the current eye gaze position, and the matched preset display distance may be a preset second upper-side distance. The preset second upper-side distance may be greater than the preset first upper-side distance and less than the first preset distance.
For example, when the relative position is within a preset first center distance from the center of the display area of the electronic device, the matched preset display orientation may be the direction opposite to the moving direction of the current eye gaze position, and the matched preset display distance may be a preset second center distance. The preset second center distance may be greater than the preset first center distance and less than the first preset distance.
It is understood that there may be many other relative positions of the current eye gaze position on the display area, each with its own matched preset display orientation and preset display distance for the first preset visual area, for example, the eye gaze position being close to the right side of the display area, close to the lower side of the display area, or close to the upper-left corner of the display area. These may be set according to practical situations and requirements and are not limited herein.
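One purely illustrative way to encode such edge-aware placement rules is sketched below; the edge distances are invented placeholders, and the 'opposite to the moving direction' rule for the centre case is simplified to 'below the gaze'.

```kotlin
data class GazePoint(val x: Double, val y: Double)
data class Rect(val left: Double, val top: Double, val width: Double, val height: Double)

// Pushes the area toward the interior when the gaze sits close to an edge of the display;
// otherwise places it just below the gaze point. All distances are assumptions.
fun placeByRelativePosition(
    gaze: GazePoint, displayW: Double, displayH: Double, winW: Double, winH: Double,
    firstEdgeDistance: Double = 120.0, secondEdgeDistance: Double = 200.0
): Rect {
    val cx = when {
        gaze.x < firstEdgeDistance            -> gaze.x + secondEdgeDistance // near left edge: to the right
        gaze.x > displayW - firstEdgeDistance -> gaze.x - secondEdgeDistance // near right edge: to the left
        else                                  -> gaze.x
    }
    val cy = when {
        gaze.y < firstEdgeDistance            -> gaze.y + secondEdgeDistance // near top edge: below
        gaze.y > displayH - firstEdgeDistance -> gaze.y - secondEdgeDistance // near bottom edge: above
        else                                  -> gaze.y + secondEdgeDistance // default: below the gaze
    }
    return Rect(cx - winW / 2, cy - winH / 2, winW, winH)
}

fun main() {
    println(placeByRelativePosition(GazePoint(60.0, 1200.0), 1080.0, 2340.0, 880.0, 220.0))
}
```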
In some embodiments, the first preset visual area may preferentially be determined as an area of the display area of the electronic device in which no content is displayed. If there is no such area in the display area, the first preset visual area may be determined in other manners, which is not limited herein.
Fig. 15 shows an exemplary schematic diagram of the first preset visual area in the embodiment of the present application. The first preset visual area 1502 is on the display area 1505, around the eyeball gaze position 1501, and the distance 1503 between the center of the first preset visual area and the eyeball gaze position 1501 is smaller than the first preset distance 1504.
It can be understood that there are many other preset ways to determine the orientation of the first preset visual area relative to the current eyeball gaze position, as long as the center or starting position of the first preset visual area is within the first preset distance from the current eyeball gaze position; this is not limited herein.
In the prior art, when an electronic device has a message window to be displayed, the message window is generally displayed at a fixed position. As shown in fig. 5, the message window 502 is fixedly displayed at the topmost portion of the display area 501. As shown in fig. 8, a message window 802 is fixedly displayed at the lowermost portion of the display area 801.
When the message window is displayed at a fixed position in this way, the user is likely to be looking at another location in the display area and may not notice the message window, so the user does not get the message in time. Alternatively, the user has to shift the gaze position to see the message window, which interrupts the item the user is currently handling. If the message window is missed, the user needs to call up the message list again to see the message, which adds an extra step to viewing the message.
In the embodiment of the present application, when the electronic device has a message window that needs to be displayed, the message window is displayed directly around the eyeball gaze position of the user in the display area. Fig. 16 is a schematic view of an exemplary scenario of the message display method in the embodiment of the present application. As shown in fig. 16 (a), the current eyeball gaze position of the user is the first eyeball gaze position 1601. If the electronic device needs to display a message window at this time, the electronic device directly displays the message window 1602 around the first eyeball gaze position 1601.
Since the message window is displayed directly around the current eyeball gaze position of the user, the user can see the message window without shifting the viewing angle. This enables the user to obtain the notification message quickly and conveniently, does not affect the item the user is currently handling, and improves the efficiency of message notification.
An exemplary electronic device 100 provided by embodiments of the present application is first described below.
Fig. 17 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application.
The following describes an embodiment specifically by taking the electronic device 100 as an example. It should be understood that electronic device 100 may have more or fewer components than shown, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
The electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, an infrared light emitter 196, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The SIM interface may be used to communicate with the SIM card interface 195, implementing functions to transfer data to or read data from the SIM card.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
It is appreciated that the display screen 194 of the exemplary electronic device 100 provides a display area in embodiments of the present application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
In some embodiments, electronic device 100 may include 1 or N infrared cameras that may capture video containing infrared light.
The infrared light emitter 196 is for emitting infrared light. In some embodiments, infrared emitter 196 emits infrared light that is reflected off of the user's eye. Electronic device 100 may include 1 or N infrared light emitters 196, N being a positive integer greater than 1.
In the embodiment of the present application, the infrared camera and the infrared light emitter 196 in the electronic device 100 may form the hardware basis for implementing the eyeball tracking technology according to principle 3 of the eyeball tracking technology described in term (1) above.
In practical applications, if the eyeball tracking technology is implemented according to principle 1 or principle 2 of the eyeball tracking technology described in term (1) above, the infrared camera and the infrared light emitter 196 may or may not be present in the electronic device 100, and this is not limited herein.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application (such as a face recognition function, a fingerprint recognition function, a mobile payment function, and the like) required by at least one function, and the like. The storage data area may store data (such as face information template data, fingerprint information template, etc.) created during the use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal into the microphone 170C by speaking close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening may then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations applied to different areas of the display screen 194 may also correspond to different vibration feedback effects of the motor 191. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as communication and data communication.
Fig. 18 is a block diagram of a software configuration of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the system is divided into four layers, an application layer, an application framework layer, a Runtime (Runtime) and system library, and a kernel layer, from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 18, the application package may include applications (also referred to as apps) such as a message notification module, a camera, a gallery, a calendar, a call, a map, a navigation, a WLAN, bluetooth, music, a video, a short message, and the like.
The message notification module is configured to obtain the eyeball gaze position of the user from the eyeball gaze position calculation module, determine the message window that needs to be displayed from the notification manager, and execute the message display method in the embodiment of the present application according to the eyeball gaze position of the user.
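As a reading aid, the following Java sketch shows one possible shape of such a message notification module: it reads the current eyeball gaze position through an interface, receives a message whose window needs to be displayed, and chooses where the window is shown. The interface and method names (GazePositionSource, MessageWindowSink, onMessageWindowRequested, and so on) are assumptions made for illustration, not the actual module API.

```java
/** Hypothetical outline of the message notification module; all names are illustrative only. */
interface GazePositionSource {
    /** Returns the current eyeball gaze position on the display as {x, y}, or null if it is off-screen. */
    float[] currentGazePosition();
}

interface MessageWindowSink {
    void showAt(String message, float x, float y);   // show the window around (x, y)
    void showAtDefaultPosition(String message);      // fall back to the fixed default position
}

class MessageNotificationModule {
    private final GazePositionSource gazeSource;
    private final MessageWindowSink display;

    MessageNotificationModule(GazePositionSource gazeSource, MessageWindowSink display) {
        this.gazeSource = gazeSource;
        this.display = display;
    }

    /** Called when the notification manager reports a message window that needs to be displayed. */
    void onMessageWindowRequested(String message) {
        float[] gaze = gazeSource.currentGazePosition();
        if (gaze == null) {
            // The user is not looking at the display area: use the default position.
            display.showAtDefaultPosition(message);
        } else {
            // Display the message window around the current eyeball gaze position.
            display.showAt(message, gaze[0], gaze[1]);
        }
    }
}
```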
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 18, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, an eye gaze location calculation module, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which can disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify of download completion, message alerts, and the like. The notification manager may also present notifications that appear in the form of a chart or scroll-bar text in the top status bar of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog interface. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
In some embodiments, when the message notification module is running, the notification manager no longer displays the message window for a message itself, but notifies the message notification module of the message whose message window needs to be displayed.
In some embodiments, the message notification module may be located not at the application layer, but at the application framework layer, and is not limited herein.
In some embodiments, the message notification module is a sub-module in the notification manager, and is not limited herein.
The eyeball gaze position calculation module is configured to acquire a real-time video of the user's eyeball movement from the camera driver of the kernel layer, and to calculate, in real time, the eyeball gaze position of the user on the display screen according to the eyeball tracking technology.
The eyeball gaze position calculation module may provide an eyeball gaze position interface, and the message notification module of the application layer may acquire the eyeball gaze position of the user on the display screen in real time by calling the eyeball gaze position interface.
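A minimal sketch of what the eyeball gaze position interface might look like to its caller is given below, assuming a simple polling style; the method names are illustrative and the real interface provided by the eyeball gaze position calculation module may differ.

```java
/**
 * Hypothetical sketch of the interface exposed by the eyeball gaze position
 * calculation module. The real module computes the position from the camera
 * driver's real-time video using an eyeball tracking technique; here that
 * computation is hidden behind the interface.
 */
interface EyeGazePositionInterface {
    /** True if a gaze position inside the display area is currently available. */
    boolean isGazeOnDisplay();

    /** Current gaze x coordinate in display pixels (valid only if isGazeOnDisplay() is true). */
    float gazeX();

    /** Current gaze y coordinate in display pixels (valid only if isGazeOnDisplay() is true). */
    float gazeY();
}

/** Example caller: a module of the application layer polling the interface. */
class GazeInterfaceUsageExample {
    static void printGaze(EyeGazePositionInterface gaze) {
        if (gaze.isGazeOnDisplay()) {
            System.out.println("gaze at (" + gaze.gazeX() + ", " + gaze.gazeY() + ")");
        } else {
            System.out.println("user is not looking at the display area");
        }
    }
}
```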
The Runtime (Runtime) includes a core library and a virtual machine. Runtime is responsible for scheduling and management of the system.
The core library comprises two parts: one part is the function that the programming language (e.g. java language) needs to call, and the other part is the core library of the system.
The application layer and the application framework layer run in a virtual machine. The virtual machine executes programming files (e.g., java files) of the application layer and the application framework layer as binary files. The virtual machine is used for performing the functions of object life cycle management, stack management, thread management, safety and exception management, garbage collection and the like.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), two-dimensional graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provides a fusion of two-Dimensional (2-Dimensional, 2D) and three-Dimensional (3-Dimensional, 3D) layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing 3D graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, a sensor driver and a virtual card driver. In some embodiments, the kernel layer may also contain an infrared light emitter driver.
The following describes exemplary workflow of the software and hardware of the electronic device 100 in connection with capturing a photo scene.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into an original input event (including information such as touch coordinates and a time stamp of the touch operation). The original input event is stored at the kernel layer. The application framework layer acquires the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation and the control corresponding to the click operation as the control of a camera application icon as an example, the camera application calls an interface of the application framework layer to start the camera application, further starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
In some embodiments, the eye gaze location calculation module may also be at other layers of the software architecture. For example, it may belong to a kernel layer, or may belong to a system library, or may belong to an application layer, which is not limited herein. When the eyeball fixation position calculation module and the message notification module are located in the same layer, they may be a module independent from the message notification module, or may be a sub-module in the message notification module, and this is not limited here.
The method in the embodiments of the present application will be described in detail below with reference to the accompanying drawings and the software and hardware structure of the exemplary electronic device 100 described above:
Fig. 19 is a flowchart illustrating a message display method according to an embodiment of the present application.
S1901, the electronic device determines that there is a message window that needs to be displayed;
In the embodiment of the present application, the electronic device may determine that there is a message window to be displayed in multiple ways:
1. when the electronic equipment determines that the message to be displayed in the notification manager is available, determining that a message window needing to be displayed exists;
2. when the electronic equipment determines that the system or the application program has the prompt message which needs to be displayed, the electronic equipment can determine that a message window which needs to be displayed exists;
3. when the electronic device determines entities such as controls, windows, and the like to be displayed by a system or an application, it may be determined that there is a message window that needs to be displayed.
It is to be understood that the electronic device may also determine that there is a message window to be displayed in other ways, which is not limited herein.
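For illustration, the sketch below expresses the three ways listed above as a single check; the supplier names standing in for the notification manager, the system or application prompt state, and pending controls or windows are hypothetical.

```java
/**
 * Hypothetical sketch of step S1901. The boolean checks stand in for queries to
 * the notification manager, the system/application prompt state, and pending
 * controls or windows; none of these names come from the embodiment.
 */
class MessageWindowTrigger {
    interface Check { boolean pending(); }

    private final Check notificationManagerHasMessage;   // way 1
    private final Check systemOrAppHasPrompt;             // way 2
    private final Check entityNeedsDisplay;                // way 3 (control, window, ...)

    MessageWindowTrigger(Check c1, Check c2, Check c3) {
        this.notificationManagerHasMessage = c1;
        this.systemOrAppHasPrompt = c2;
        this.entityNeedsDisplay = c3;
    }

    /** Returns true if any of the three conditions indicates a message window is needed. */
    boolean messageWindowNeeded() {
        return notificationManagerHasMessage.pending()
                || systemOrAppHasPrompt.pending()
                || entityNeedsDisplay.pending();
    }
}
```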
S1902, the electronic device determines whether the current eyeball fixation position of the user is in a display area;
It can be understood that the electronic device may determine the eyeball gaze position of the user in real time through the eyeball tracking technology; for details, reference may be made to (1) the eyeball tracking technology described above, which is not repeated here.
When the electronic device determines that the current eyeball fixation position of the user is in the display area of the electronic device, the electronic device may perform step S1904;
when the electronic device determines that the current eyeball fixation position of the user is not in the display area of the electronic device, the electronic device may perform step S1903.
It can be understood that the electronic device may determine that the current eye gaze position of the user is not in the display area by determining that the eye gaze position of the user is outside the display area through the eye tracking technology; it may also be determined that the current eye gaze position of the user is not in the display area by not detecting the eye gaze position of the user in the display area of the electronic device, which is not limited herein.
S1903, the electronic equipment displays the message window at a default position;
When the electronic device determines that the current eyeball gaze position of the user is not in the display area, it indicates that the user is not currently looking at the electronic device. In this case, the electronic device may display the message window at the default position of the message window.
Optionally, after the electronic device displays the message window at the default position, an additional reminding mode may be adopted to remind the user that there is a new message. For example, the electronic device may add a message alert sound; for another example, the electronic device may enhance message alert vibration, etc., and is not limited herein.
In some embodiments, step S1903 may not be performed, and is not limited herein.
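The following Java sketch summarizes steps S1902 and S1903 under the assumption of a simple boolean gaze-in-area signal; the method names for the default-position display and the extra reminders are illustrative only.

```java
/**
 * Hypothetical sketch of steps S1902/S1903: if the current gaze position is not
 * in the display area, the window is shown at its default position and an
 * additional reminder (sound or stronger vibration) may be triggered.
 */
class DefaultPositionFallback {
    interface Display {
        void showMessageWindowAtDefaultPosition(String message);
        void playAlertSound();
        void vibrateStrongly();
    }

    static void handle(String message, boolean gazeInDisplayArea, Display display) {
        if (gazeInDisplayArea) {
            // Continue with S1904: track the gaze position in the display area.
            return;
        }
        // S1903: the user is not looking at the screen, fall back to the default position.
        display.showMessageWindowAtDefaultPosition(message);
        // Optional additional reminders described above.
        display.playAlertSound();
        display.vibrateStrongly();
    }
}
```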
S1904, the electronic equipment determines the current eyeball fixation position of the user in a display area of the electronic equipment in real time;
When the electronic device determines that the current eyeball gaze position of the user is in the display area of the electronic device, the electronic device determines, in real time, the current eyeball gaze position of the user in the display area.
Specifically, determining the current eyeball gaze position of the user in the display area of the electronic device in real time may be a continuous process. The electronic device may determine the current eyeball gaze position of the user in the display area through real-time or periodic continuous tracking.
It can be understood that, if the current eyeball gaze position of the user in the display area has already been obtained in step S1902, step S1904 need not be executed again. If it is determined in step S1902 only that the eyeball gaze position is in the display area, without determining the specific eyeball gaze position, the electronic device may determine the specific current eyeball gaze position of the user in the display area of the electronic device through step S1904, which is not limited herein.
S1905, the electronic equipment determines the moving speed of the current eyeball fixation position of the user in the display area of the electronic equipment;
when the electronic device determines that the current eye gaze position of the user is in the display area of the electronic device, the electronic device may determine a movement speed of the user in the current eye gaze position of the display area of the electronic device.
It is to be understood that the execution order of steps S1904 and S1905 is not limited.
In some embodiments, the step S1905 may not be executed, and the display position of the message window may be determined according to the current eyeball fixation position, which is not limited herein.
Fig. 20 is an exemplary schematic diagram of a user interface in an embodiment of the present application. On the display area 2003, the user is reading in the direction of the arrow 2001; the electronic device may determine that the user's current eyeball gaze position is in the display area and that the current eyeball gaze position is the second eyeball gaze position 2002. Through the eyeball tracking technology, the electronic device may determine that the speed value of the movement of the user's current eyeball gaze position on the display area 2003 is V2002.
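The embodiment does not prescribe how the moving speed of the eyeball gaze position is computed; one plausible implementation, sketched below in Java, divides the displacement between two consecutive gaze samples by the sampling interval. The class and field names are assumptions.

```java
/**
 * Hypothetical sketch for steps S1904/S1905: estimating the moving speed of the
 * current eyeball gaze position from two consecutive gaze samples.
 */
class GazeSpeedEstimator {
    private float lastX, lastY;
    private long lastTimeMs = -1;

    /**
     * Feeds a new gaze sample and returns the estimated speed in pixels per second,
     * or 0 for the very first sample.
     */
    float onGazeSample(float x, float y, long timeMs) {
        float speed = 0f;
        if (lastTimeMs >= 0 && timeMs > lastTimeMs) {
            double distance = Math.hypot(x - lastX, y - lastY);   // displacement since the last sample
            speed = (float) (distance * 1000.0 / (timeMs - lastTimeMs));
        }
        lastX = x;
        lastY = y;
        lastTimeMs = timeMs;
        return speed;
    }
}
```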
S1906, the electronic device determines whether the moving speed of the current eyeball gaze position is not greater than a preset first speed value;
After the electronic device determines the moving speed of the current eyeball gaze position of the user in the display area of the electronic device, it may determine whether the moving speed is not greater than a preset first speed value.
If the moving speed is greater than the preset first speed value, it indicates that the current eyeball gaze position of the user is moving quickly, and the electronic device may execute step S1907;
If the moving speed is not greater than the preset first speed value, it indicates that the current eyeball gaze position of the user is moving slowly, and the electronic device may execute step S1908.
S1907, the electronic device determines whether an average moving speed of an eyeball gaze position of the user is not greater than a preset second speed value within a preset first duration;
When the electronic device determines that the moving speed of the current eyeball gaze position is greater than the preset first speed value, the electronic device may determine the average moving speed of the eyeball gaze position of the user within the following preset first duration, and then determine whether the average moving speed is not greater than a preset second speed value;
It can be understood that the preset second speed value may be greater than, less than, or equal to the preset first speed value, and this is not limited herein.
Because the current eyeball gaze position of the user is moving quickly in the display area of the electronic device, the electronic device continues to monitor the moving speed of the eyeball gaze position within the preset first duration and determines whether the average moving speed of the eyeball gaze position meets the display requirement.
When the electronic device determines that the average moving speed of the eyeball gaze position of the user within the preset first duration is not greater than the preset second speed value, it indicates that the average moving speed of the eyeball gaze position of the user meets the display requirement, and the electronic device may perform step S1908;
When the electronic device determines that the average moving speed of the eyeball gaze position of the user within the preset first duration is greater than the preset second speed value, it indicates that the average moving speed of the eyeball gaze position of the user is high, and the electronic device may perform step S1909.
It can be understood that the average moving speed of the eyeball gaze position of the user within the preset first duration is the average of the moving speeds of the current eyeball gaze position monitored in real time by the electronic device at each moment within the first duration.
Alternatively, in practical applications, the movement of the eyeball gaze position of the user over a period of time may be judged not only by the average moving speed but also by the cumulative distance that the eyeball gaze position of the user moves within that period.
Specifically, when the electronic device determines that the moving speed of the current eyeball gaze position is greater than the preset first speed value, the electronic device may determine the cumulative distance that the eyeball gaze position of the user moves within the following preset fourth duration, and then determine whether the cumulative distance exceeds a preset cumulative distance;
When the electronic device determines that the cumulative distance that the eyeball gaze position of the user moves within the preset fourth duration is not greater than the preset cumulative distance, the electronic device may perform step S1908;
When the electronic device determines that the cumulative distance that the eyeball gaze position of the user moves within the preset fourth duration is greater than the preset cumulative distance, the electronic device may perform step S1909.
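The decision made in steps S1906 and S1907 can be summarized by the following Java sketch, which shows both the average-speed variant and the cumulative-distance variant. The threshold constants are assumed example values, not values given by the embodiment.

```java
/**
 * Hypothetical sketch of the decision in steps S1906/S1907. Threshold values,
 * durations, and method names are illustrative assumptions only.
 */
class GazeStabilityDecision {
    static final float FIRST_SPEED_VALUE   = 300f;   // preset first speed value (px/s, assumed)
    static final float SECOND_SPEED_VALUE  = 200f;   // preset second speed value (px/s, assumed)
    static final float CUMULATIVE_DISTANCE = 500f;   // preset cumulative distance (px, assumed)

    /** Outcome corresponding to steps S1908 and S1909. */
    enum Outcome { DISPLAY_IN_FIRST_VISUAL_AREA, DISPLAY_AND_FOLLOW_GAZE }

    /** Variant 1: judge by the average moving speed within the preset first duration. */
    static Outcome byAverageSpeed(float currentSpeed, float averageSpeedOverFirstDuration) {
        if (currentSpeed <= FIRST_SPEED_VALUE) {
            return Outcome.DISPLAY_IN_FIRST_VISUAL_AREA;          // S1906 -> S1908
        }
        return averageSpeedOverFirstDuration <= SECOND_SPEED_VALUE
                ? Outcome.DISPLAY_IN_FIRST_VISUAL_AREA            // S1907 -> S1908
                : Outcome.DISPLAY_AND_FOLLOW_GAZE;                // S1907 -> S1909
    }

    /** Variant 2: judge by the cumulative moving distance within the preset fourth duration. */
    static Outcome byCumulativeDistance(float currentSpeed, float cumulativeDistanceOverFourthDuration) {
        if (currentSpeed <= FIRST_SPEED_VALUE) {
            return Outcome.DISPLAY_IN_FIRST_VISUAL_AREA;
        }
        return cumulativeDistanceOverFourthDuration <= CUMULATIVE_DISTANCE
                ? Outcome.DISPLAY_IN_FIRST_VISUAL_AREA
                : Outcome.DISPLAY_AND_FOLLOW_GAZE;
    }
}
```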
S1908, the electronic equipment displays a message window in a first preset visual area;
the first predetermined visual area is around a current eye gaze location of the user. For the description of the first preset visual region, reference may be made to (4) the first preset visual region in the above description, and details are not repeated here.
When the electronic device determines that the moving speed of the current eyeball gaze position of the user in the display area of the electronic device is not greater than the preset first speed value, the electronic device displays the message window in the first preset visual area. Alternatively, after the electronic device determines that the moving speed of the current eyeball gaze position of the user in the display area of the electronic device is greater than the preset first speed value, when the electronic device determines that the average moving speed of the eyeball gaze position of the user within the preset first duration is not greater than the preset second speed value, the electronic device displays the message window in the first preset visual area. In this case, the first preset visual area is determined according to the current eyeball gaze position at that moment.
In some embodiments, after the electronic device determines that there is a message window to be displayed in step S1901, the electronic device may directly perform step S1908.
In some embodiments, before the electronic device displays the message window in the first preset visual area, the electronic device may determine the first preset visual area according to the current eyeball gaze position of the user in the display area, and then display the message window in the first preset visual area.
It is understood that the electronic device determining the first predetermined visual area according to the current eye gaze position of the user may be a real-time determination, continuously updating process. That is, as the current eyeball gaze position of the user moves, the electronic device may update and determine the first preset visual area corresponding to the new current eyeball gaze position each time the electronic device determines a new current eyeball gaze position.
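A minimal Java sketch of step S1908 is given below, assuming the case in which the first preset visual area has the same size as the message window and is centered at the placement point chosen for the current eyeball gaze position; the rectangle type and renderer interface are illustrative assumptions.

```java
/**
 * Hypothetical sketch of step S1908: deriving the first preset visual area from the
 * current eyeball gaze position and displaying the message window there.
 */
class FirstVisualAreaDisplay {
    /** Simple rectangle type used instead of a platform-specific class. */
    static class Rect {
        final float left, top, width, height;
        Rect(float left, float top, float width, float height) {
            this.left = left; this.top = top; this.width = width; this.height = height;
        }
    }

    interface WindowRenderer { void showMessageWindow(String message, Rect area); }

    /**
     * Builds a first preset visual area of the same size as the message window,
     * centered at the placement point chosen for the current gaze position.
     */
    static Rect firstVisualArea(float centerX, float centerY, float windowWidth, float windowHeight) {
        return new Rect(centerX - windowWidth / 2f, centerY - windowHeight / 2f,
                windowWidth, windowHeight);
    }

    /** Called again whenever a new current eyeball gaze position is determined (see above). */
    static void display(String message, float centerX, float centerY,
                        float windowWidth, float windowHeight, WindowRenderer renderer) {
        renderer.showMessageWindow(message,
                firstVisualArea(centerX, centerY, windowWidth, windowHeight));
    }
}
```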
Fig. 21 is an exemplary schematic diagram of a user interface in an embodiment of the present application. Assume that the preset first speed value is YV1 and the preset second speed value is YV2 in the electronic device. When the electronic apparatus determines that the velocity value V2002 of the moving velocity of the current eyeball gaze position in fig. 20 is not greater than the preset first velocity value YV1, a first preset visual region 2101 may be determined around the second eyeball gaze position 2002, and then a message window 2102 that needs to be displayed is displayed in the first preset visual region 2101. In this example, the first preset visual field 2101 is larger in size than the message window 2102 that needs to be displayed.
Fig. 22a is an exemplary schematic diagram of a user interface in an embodiment of the present application. Assume that the preset first speed value is YV1 and the preset second speed value is YV2 in the electronic device. The electronic apparatus determines that the velocity value V2002 of the moving velocity of the eyeball focus position in fig. 20 is greater than the preset first velocity value YV1, and determines whether the average moving velocity of the eyeball focus position of the user within the preset first period of time is not greater than the preset second velocity value YV 2. If the user's eye gaze position moves to the third eye gaze position 2201 after the preset first time period, the electronic device determines that the speed value of the average moving speed of the user's eye gaze position within the preset first time period is V2201 and is less than the preset second speed YV2, a first preset visual area 2202 can be determined around the third eye gaze position 2201, and then a message window 2203 to be displayed is displayed in the first visual area 2202. In this example, the first predetermined visual area 2202 is the same size and shape as the message window 2203 that needs to be displayed.
Fig. 22b is another exemplary schematic diagram of a user interface in an embodiment of the present application. Assume that the preset first speed value in the electronic device is YV1 and the preset cumulative distance is SL. The electronic device determines that the speed value V2002 of the moving speed of the eyeball gaze position in fig. 20 is greater than the preset first speed value YV1, and then determines whether the cumulative distance that the user's eyeball gaze position moves within a preset fourth duration is greater than the preset cumulative distance. Suppose that within the preset fourth duration the user's eyeball gaze position moves by the distance S1 in the arrow direction 2201 and then by the distance S2 in the arrow direction 2202, reaching the current eyeball gaze position 2203. The cumulative distance that the user's eyeball gaze position moves within the preset fourth duration is then S1 + S2. When the electronic device determines that the cumulative distance S1 + S2 is less than the preset cumulative distance SL, the electronic device determines a first preset visual area 2204 around the current eyeball gaze position 2203 and then displays the message window 2205 that needs to be displayed in the first preset visual area 2204.
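The cumulative-distance test of fig. 22b amounts to summing the lengths of the individual gaze movements. The sketch below assumes gaze samples are available as (x, y) coordinates and that SL is expressed in the same units; both names and the default value are hypothetical.

```python
# Sketch of the cumulative-distance check (fig. 22b); all names are assumptions.
import math

def cumulative_distance(samples):
    """samples: (x, y) gaze positions collected during the preset fourth
    duration, in order. Returns the path length S1 + S2 + ... travelled."""
    return sum(math.dist(samples[i - 1], samples[i]) for i in range(1, len(samples)))

def within_preset_cumulative_distance(samples, sl: float = 300.0) -> bool:
    # True: display the window in a fixed first preset visual area;
    # False: the gaze wandered farther than SL, so let the window follow it.
    return cumulative_distance(samples) <= sl
```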
S1909, the electronic device displays the message window in the first preset visual area and moves the message window following the movement of the current eyeball gaze position;
when the electronic device determines that the moving speed of the user's current eyeball gaze position in the display area of the electronic device is greater than the preset first speed value, and further determines either that the average moving speed of the user's eyeball gaze position within the subsequent preset first duration is greater than the preset second speed value, or that the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration is greater than the preset cumulative distance, the electronic device displays the message window in the first preset visual area and moves the message window following the movement of the eyeball gaze position.
The electronic device may move the message window following the movement of the eyeball gaze position in a number of different ways:
Optionally, the electronic device may move the message window synchronously according to the moving direction and distance of the eyeball gaze position.
Illustratively, fig. 23 is a schematic view of a set of interfaces in an embodiment of the present application. Assume that the preset first speed value in the electronic device is YV1 and the preset second speed value is YV2. The electronic device determines that the speed value V2002 of the moving speed of the eyeball gaze position in fig. 20 is greater than the preset first speed value YV1, and determines that the average moving speed of the user's eyeball gaze position within the subsequent preset first duration is greater than the preset second speed value YV2. After the preset first duration has elapsed, the electronic device determines that the user's eyeball gaze position has moved in the arrow direction 2301 to a fourth eyeball gaze position 2302, as shown in fig. 23 (a). The electronic device determines a first preset visual area 2303 around the fourth eyeball gaze position 2302 and displays the message window 2304 in the first preset visual area 2303. As shown in fig. 23 (b), the user's eyeball gaze position continues to move; while it moves from the fourth eyeball gaze position 2302 to the fifth eyeball gaze position 2306 along the direction vector 2305, the message window 2304 also moves to the position of the message window 2308 along the direction vector 2307, which is the same as the direction vector 2305.
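In this first option the window is simply translated by the same displacement vector as the gaze, as in fig. 23 (a)-(b). A minimal sketch, with purely illustrative names:

```python
# Move the message window by the gaze displacement vector (illustrative only).

def follow_gaze_synchronously(window_pos, prev_gaze, curr_gaze):
    """window_pos, prev_gaze, curr_gaze are (x, y) tuples; the returned window
    position is translated by the same direction and distance as the gaze."""
    dx = curr_gaze[0] - prev_gaze[0]
    dy = curr_gaze[1] - prev_gaze[1]
    return (window_pos[0] + dx, window_pos[1] + dy)
```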
Optionally, as the eyeball gaze position moves, the electronic device may update the first preset visual area in real time according to the current eyeball gaze position of the user, and move the message window to the first preset visual area updated in real time.
For example, as shown in fig. 23 (c), the user's eyeball gaze position continues to move, and the electronic device may update the first preset visual area in real time according to the current eyeball gaze position while the gaze position moves from the fifth eyeball gaze position 2306 to the sixth eyeball gaze position 2310 along the direction vector 2309. When the eyeball gaze position reaches the sixth eyeball gaze position 2310, the electronic device may update the first preset visual area to the first preset visual area 2311 and move the message window 2312 into it.
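The second option re-anchors the window to a freshly determined first preset visual area on every gaze update, as in fig. 23 (c). The loop below is a sketch only; the sampling interface and geometry are assumptions.

```python
# Hypothetical update loop for the re-anchoring option in fig. 23 (c).

def follow_by_reanchoring(gaze_samples, win_w, win_h, disp_w, disp_h, offset=40.0):
    """gaze_samples: iterable of (x, y) gaze positions. Yields the window's
    top-left corner for each sample, recomputed around the newest gaze point
    and clamped to the display bounds."""
    for gx, gy in gaze_samples:
        left = min(max(gx + offset, 0.0), disp_w - win_w)
        top = min(max(gy + offset, 0.0), disp_h - win_h)
        yield (left, top)
```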
It is to be understood that the electronic device may also follow the movement of the eyeball gaze position by moving the message window in various other ways, which is not limited herein.
Preferably, in some embodiments, while the electronic device moves the message window following the movement of the eyeball gaze position, the electronic device may determine whether the moving speed of the current eyeball gaze position is not greater than a preset third speed value;
when it is determined that the moving speed of the current eyeball gaze position is not greater than the preset third speed value, the electronic device may stop moving the message window.
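In other words, following stops as soon as the gaze slows below the preset third speed value; a one-line check, with an assumed threshold, could look like this:

```python
# Assumed stop condition while the window is following the gaze.

def should_stop_following(current_gaze_speed: float, yv3: float = 80.0) -> bool:
    """yv3 stands in for the preset third speed value (px/s, assumed)."""
    return current_gaze_speed <= yv3
```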
In some embodiments, this step S1909 may not be performed, and is not limited herein.
S1910, the electronic device determines whether the current eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
after the electronic device displays the message window in the first preset visual area in step S1908 or step S1909, the electronic device may determine whether the eyeball gaze position of the user moves into the area of the message window within a preset second duration after the message window is displayed;
the preset second time length may be factory preset, may be different according to different message windows, and may also be set by a user in a self-defined manner, which is not limited herein.
When the electronic device determines that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration after the message window is displayed, it indicates that the user has not seen the message window yet, and the electronic device may execute step S1911;
when the electronic device determines that the current eye gaze position of the user moves into the region of the message window at a time within the preset second time period after the message window is displayed, indicating that the user is looking at the message window, the electronic device may perform step S1912.
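Step S1910 amounts to a dwell check over the preset second duration: did any gaze sample fall inside the window's rectangle? A sketch under assumed names:

```python
# Hypothetical dwell check for step S1910.

def gaze_entered_window(samples, window_rect) -> bool:
    """samples: (x, y) gaze positions taken during the preset second duration
    after the window was shown; window_rect: (left, top, width, height)."""
    left, top, w, h = window_rect
    return any(left <= x <= left + w and top <= y <= top + h for x, y in samples)
```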
In some embodiments, step S1910 may not be performed, and is not limited herein.
S1911, the electronic device determines whether the moving distance of the eyeball gaze position exceeds a preset second distance within a preset third duration after the message window is displayed;
when the electronic device determines that the current eyeball gaze position of the user does not move into the region of the message window within the preset second duration after the message window is displayed, the electronic device determines whether the moving distance of the eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, that is, whether the user's eyeball gaze position has undergone a large displacement;
when the electronic device determines that the moving distance of the eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, that is, the user's eyeball gaze position has undergone a large displacement, the electronic device may execute step S1907;
when the electronic device determines that the moving distance of the eyeball gaze position does not exceed the preset second distance within the preset third duration after the message window is displayed, that is, the user's eyeball gaze position has not undergone a large displacement, the electronic device may execute step S1912.
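Step S1911 can likewise be viewed as a displacement test over the preset third duration. The sketch below measures straight-line displacement from the position at display time; that reading of "moving distance", like the names and default threshold, is an assumption rather than the mandated metric.

```python
# Hypothetical displacement test for step S1911.
import math

def gaze_displacement_exceeds(start_gaze, samples, second_distance: float = 250.0) -> bool:
    """start_gaze: (x, y) when the window was shown; samples: (x, y) positions
    observed during the preset third duration; second_distance stands in for
    the preset second distance (px, assumed)."""
    return any(math.dist(start_gaze, p) > second_distance for p in samples)
```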
In some embodiments, this step S1911 may not be performed, and is not limited herein.
S1912, the electronic device stops moving the message window.
When the electronic device determines that the current eyeball gaze position of the user moves into the region of the message window at some moment within the preset second duration after the message window is displayed, that is, the user is looking at the message window, the electronic device may stop moving the message window;
or, when the electronic device determines that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration after the message window is displayed, and the electronic device determines that the movement distance of the eyeball gaze position does not exceed the preset second distance within the preset third duration after the message window is displayed, that is, when the user does not see the message window, but the eyeball gaze position of the user does not have a large displacement, the electronic device may stop moving the message window.
It will be appreciated that stopping the movement of the message window need not be an active processing operation performed by the electronic device; it may simply be a state. That is, when performing step S1912, the electronic device may leave the message window stationary without performing any operation, which is not limited herein.
In the embodiment of the present application, when the electronic device has a message window that needs to be displayed, the message window is displayed directly around the current eyeball gaze position of the user in the display area. Because the message window is displayed directly around the current eyeball gaze position, the user can see it without shifting the viewing angle, and can therefore obtain the notification message quickly and conveniently, which improves message notification efficiency. Furthermore, according to the moving speed of the current eyeball gaze position of the user, the electronic device can intelligently move the message window following the current eyeball gaze position or stop moving it, which greatly improves the convenience with which the user views the message window.
It is understood that the message display method in the embodiment of the present application may be applied not only to the electronic device 100 described above, but also to other types of electronic devices. The electronic device in the embodiment of the present application may also adopt other hardware structures, which are not limited herein.
For example, fig. 24 is a schematic structural diagram of an exemplary electronic device 200 provided in an embodiment of the present application.
The electronic device 200 may include: an input device 2401, an output device 2402, a processor 2403 and a memory 2404 (wherein the number of the processors 2403 in the electronic device 200 may be one or more, and one processor 2403 is taken as an example in fig. 24). In some embodiments of the present application, the input device 2401, the output device 2402, the processor 2403 and the memory 2404 may be connected by a bus or other means, wherein the bus connection is exemplified in fig. 24.
The input device 2401 may include hardware required for implementing an eye tracking technology, such as a high definition camera, an infrared camera, or the like. The output device 2402 may also include hardware necessary to implement eye tracking techniques, such as a display area, an infrared light emitter, etc. The specific implementation may be determined according to the implementation manner of the eyeball tracking technology, and is not limited herein.
The processor 2403 may cause the electronic device 200 to execute the message display method in the embodiment of the present application by calling the operating instructions stored in the memory 2404. The specific process is similar to the above-mentioned electronic device 100 executing the message display method in the embodiment of the present application, and is not described here again.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
As used in the above embodiments, the term "when …" may be interpreted to mean "if …" or "after …" or "in response to a determination of …" or "in response to a detection of …", depending on the context. Similarly, depending on the context, the phrase "at the time of determination …" or "if (a stated condition or event) is detected" may be interpreted to mean "if the determination …" or "in response to the determination …" or "upon detection (a stated condition or event)" or "in response to detection (a stated condition or event)".
In the above embodiments, the implementation may be realized wholly or partially by software, hardware, firmware, or any combination thereof. When implemented in software, the implementation may be realized wholly or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk), among others.
One of ordinary skill in the art will appreciate that all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware. The program may be stored in a computer-readable storage medium and, when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM or RAM, a magnetic disk, or an optical disc.

Claims (27)

1. A message display method, comprising:
the electronic device determines that there is a message window that needs to be displayed;
the electronic device determines, in real time, a current eyeball gaze position of a user in a display area of the electronic device;
the electronic device determines a first preset visual area according to the current eyeball gaze position, wherein the first preset visual area is located around the current eyeball gaze position on the display area;
the electronic device displays the message window in the first preset visual area.
2. The method of claim 1, wherein the first preset visual area is on the display area, a center or starting position of the first preset visual area is within a first preset distance from the current eyeball gaze position, and the size of the first preset visual area is not smaller than the size of the message window.
3. The method according to claim 1 or 2, wherein the determining, by the electronic device, the first preset visual area according to the current eyeball gaze position specifically includes:
the electronic device determines, according to the relative position of the current eyeball gaze position on the display area, a preset display direction and a preset display distance matched with the relative position;
the electronic device determines that the center or the starting position of the first preset visual area is located at the preset display distance from the current eyeball gaze position in the preset display direction.
4. The method according to claim 1 or 2, wherein the determining, by the electronic device, the first preset visual area according to the current eyeball gaze position specifically includes:
the electronic device determines whether there is, on the display area and within a first preset distance from the current eyeball gaze position, an area that has no displayed content and is not smaller than the size of the message window;
when it is determined that there is an area that has no displayed content and is not smaller than the size of the message window, the electronic device determines that area to be the first preset visual area.
5. The method according to any one of claims 1 to 4, further comprising:
the electronic device determines a moving speed of the current eyeball gaze position of the user in the display area of the electronic device.
6. The method of claim 5, further comprising:
the electronic device determines whether the moving speed of the current eyeball gaze position is not greater than a preset first speed value;
wherein the displaying, by the electronic device, of the message window in the first preset visual area specifically includes:
when the electronic device determines that the moving speed of the current eyeball gaze position is not greater than the preset first speed value, the electronic device displays the message window in the first preset visual area.
7. The method of claim 6, further comprising:
when the electronic device determines that the moving speed of the current eyeball gaze position is greater than the preset first speed value, the electronic device determines whether an average moving speed of the user's eyeball gaze position within a preset first duration is not greater than a preset second speed value;
when the electronic device determines that the average moving speed of the user's eyeball gaze position within the preset first duration is not greater than the preset second speed value, the electronic device displays the message window in a first preset visual area determined according to the current eyeball gaze position;
when the electronic device determines that the average moving speed of the user's eyeball gaze position within the preset first duration is greater than the preset second speed value, the electronic device displays the message window in a first preset visual area determined according to the current eyeball gaze position and moves the message window following the movement of the current eyeball gaze position of the user.
8. The method of claim 6, further comprising:
when the electronic device determines that the moving speed of the current eyeball gaze position is greater than the preset first speed value, the electronic device determines whether a cumulative distance moved by the user's eyeball gaze position within a preset fourth duration exceeds a preset cumulative distance;
when the electronic device determines that the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration does not exceed the preset cumulative distance, the electronic device displays the message window in a first preset visual area determined according to the current eyeball gaze position;
when the electronic device determines that the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration exceeds the preset cumulative distance, the electronic device displays the message window in a first preset visual area determined according to the current eyeball gaze position and moves the message window following the movement of the current eyeball gaze position of the user.
9. The method according to any one of claims 1 to 8, further comprising:
the electronic device determines whether the current eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
if the current eyeball gaze position of the user moves into the region of the message window at some moment within the preset second duration, the electronic device stops moving the message window.
10. The method of claim 9, further comprising:
if the eyeball gaze position of the user does not move into the region of the message window within the preset second duration, the electronic device determines whether the moving distance of the user's eyeball gaze position exceeds a preset second distance within a preset third duration after the message window is displayed;
if the moving distance of the user's eyeball gaze position does not exceed the preset second distance within the preset third duration, the electronic device stops moving the message window.
11. The method of claim 7, further comprising:
the electronic device determines whether the eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
when the electronic device determines that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration and that the moving distance of the user's eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, the electronic device displays the message window in a first preset visual area determined according to the current eyeball gaze position, or the electronic device performs the step of determining whether the average moving speed of the user's eyeball gaze position within the preset first duration is not greater than the preset second speed value.
12. The method of claim 8, further comprising:
the electronic device determines whether the eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
when the electronic device determines that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration and that the moving distance of the user's eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, the electronic device displays the message window in the first preset visual area determined according to the current eyeball gaze position, or the electronic device performs the step of determining whether the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration exceeds the preset cumulative distance.
13. An electronic device, characterized in that the electronic device comprises: one or more processors and memory;
the memory coupled with the one or more processors, the memory to store computer program code, the computer program code including computer instructions, the one or more processors to invoke the computer instructions to cause the electronic device to perform:
determining that a message window needing to be displayed exists;
determining, in real time, a current eyeball gaze position of a user in a display area of the electronic device;
determining a first preset visual area according to the current eyeball gaze position, wherein the first preset visual area is located around the current eyeball gaze position on the display area;
displaying the message window in the first preset visual area.
14. The electronic device of claim 13, wherein the first preset visual area is on the display area, a center or starting position of the first preset visual area is within a first preset distance from the current eyeball gaze position, and the size of the first preset visual area is not smaller than the size of the message window.
15. The electronic device of claim 13 or 14, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining, according to the relative position of the current eyeball gaze position on the display area, a preset display direction and a preset display distance matched with the relative position;
determining that the center or the starting position of the first preset visual area is located at the preset display distance from the current eyeball gaze position in the preset display direction.
16. The electronic device of claim 13 or 14, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining whether there is, on the display area and within a first preset distance from the current eyeball gaze position, an area that has no displayed content and is not smaller than the size of the message window;
when it is determined that there is an area that has no displayed content and is not smaller than the size of the message window, determining that area to be the first preset visual area.
17. The electronic device of any of claims 13-16, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining a moving speed of the current eyeball gaze position of the user in the display area of the electronic device.
18. The electronic device of claim 17, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining whether the moving speed of the current eyeball gaze position is not greater than a preset first speed value;
the one or more processors are specifically configured to invoke the computer instructions to cause the electronic device to perform:
when it is determined that the moving speed of the current eyeball gaze position is not greater than the preset first speed value, displaying the message window in the first preset visual area.
19. The electronic device of claim 18, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
when it is determined that the moving speed of the current eyeball gaze position is greater than the preset first speed value, determining whether an average moving speed of the user's eyeball gaze position within a preset first duration is not greater than a preset second speed value;
when it is determined that the average moving speed of the user's eyeball gaze position within the preset first duration is not greater than the preset second speed value, displaying the message window in a first preset visual area determined according to the current eyeball gaze position;
when it is determined that the average moving speed of the user's eyeball gaze position within the preset first duration is greater than the preset second speed value, displaying the message window in a first preset visual area determined according to the current eyeball gaze position and moving the message window following the movement of the current eyeball gaze position of the user.
20. The electronic device of claim 18, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
when it is determined that the moving speed of the current eyeball gaze position is greater than the preset first speed value, determining whether a cumulative distance moved by the user's eyeball gaze position within a preset fourth duration exceeds a preset cumulative distance;
when it is determined that the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration does not exceed the preset cumulative distance, displaying the message window in a first preset visual area determined according to the current eyeball gaze position;
when it is determined that the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration exceeds the preset cumulative distance, displaying the message window in a first preset visual area determined according to the current eyeball gaze position and moving the message window following the movement of the current eyeball gaze position of the user.
21. The electronic device of any of claims 13-20, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining whether the current eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
if the current eyeball gaze position of the user moves into the region of the message window at some moment within the preset second duration, stopping moving the message window.
22. The electronic device of claim 19, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
if the eyeball gaze position of the user does not move into the region of the message window within the preset second duration, determining whether the moving distance of the user's eyeball gaze position exceeds a preset second distance within a preset third duration after the message window is displayed;
if the moving distance of the user's eyeball gaze position does not exceed the preset second distance within the preset third duration, stopping moving the message window.
23. The electronic device of claim 19, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining whether the eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
when it is determined that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration and that the moving distance of the user's eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, displaying the message window in a first preset visual area determined according to the current eyeball gaze position, or performing the step of determining whether the average moving speed of the user's eyeball gaze position within the preset first duration is not greater than the preset second speed value.
24. The electronic device of claim 20, wherein the one or more processors are further configured to invoke the computer instructions to cause the electronic device to perform:
determining whether the eyeball gaze position of the user moves into the region of the message window within a preset second duration after the message window is displayed;
when it is determined that the eyeball gaze position of the user does not move into the region of the message window within the preset second duration and that the moving distance of the user's eyeball gaze position exceeds the preset second distance within the preset third duration after the message window is displayed, displaying the message window in a first preset visual area determined according to the current eyeball gaze position, or performing the step of determining whether the cumulative distance moved by the user's eyeball gaze position within the preset fourth duration exceeds the preset cumulative distance.
25. A chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform the method of any of claims 1-12.
26. A computer program product comprising instructions for causing an electronic device to perform the method according to any of claims 1-12 when the computer program product is run on the electronic device.
27. A computer-readable storage medium comprising instructions that, when executed on an electronic device, cause the electronic device to perform the method of any of claims 1-12.
CN202010705738.3A 2020-07-21 2020-07-21 Message display method and electronic equipment Pending CN113970965A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010705738.3A CN113970965A (en) 2020-07-21 2020-07-21 Message display method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010705738.3A CN113970965A (en) 2020-07-21 2020-07-21 Message display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN113970965A true CN113970965A (en) 2022-01-25

Family

ID=79584661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010705738.3A Pending CN113970965A (en) 2020-07-21 2020-07-21 Message display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113970965A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116027887A (en) * 2022-05-20 2023-04-28 荣耀终端有限公司 Display method and electronic equipment
CN116027887B (en) * 2022-05-20 2024-03-29 荣耀终端有限公司 Display method and electronic equipment

Similar Documents

Publication Publication Date Title
CN114467297B (en) Video call display method and related device applied to electronic equipment
WO2020259452A1 (en) Full-screen display method for mobile terminal, and apparatus
CN110114747B (en) Notification processing method and electronic equipment
CN113645351B (en) Application interface interaction method, electronic device and computer-readable storage medium
WO2021082815A1 (en) Display element display method and electronic device
CN114125130A (en) Method for controlling communication service state, terminal device and readable storage medium
CN115967851A (en) Quick photographing method, electronic device and computer readable storage medium
WO2022143180A1 (en) Collaborative display method, terminal device, and computer readable storage medium
CN113746961A (en) Display control method, electronic device, and computer-readable storage medium
CN115914461B (en) Position relation identification method and electronic equipment
CN112532508B (en) Video communication method and video communication device
CN113970965A (en) Message display method and electronic equipment
WO2022022443A1 (en) Method for moving control and electronic device
CN111982037B (en) Height measuring method and electronic equipment
CN115268737A (en) Information processing method and device
CN116048236B (en) Communication method and related device
WO2024114212A1 (en) Cross-device focus switching method, electronic device and system
CN111475363B (en) Card death recognition method and electronic equipment
WO2022222702A1 (en) Screen unlocking method and electronic device
CN114077323B (en) Touch screen false touch prevention method of electronic equipment, electronic equipment and chip system
WO2024114493A1 (en) Human-machine interaction method and apparatus
CN116152814A (en) Image recognition method and related equipment
CN116301483A (en) Application card management method, electronic device and storage medium
CN115291780A (en) Auxiliary input method, electronic equipment and system
CN116700556A (en) Card generation method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination