CN110825220B - Eyeball tracking control method, device, intelligent projector and storage medium - Google Patents


Info

Publication number
CN110825220B
CN110825220B
Authority
CN
China
Prior art keywords
eyeball
human eye
target
control signal
intelligent projector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910939363.4A
Other languages
Chinese (zh)
Other versions
CN110825220A (en)
Inventor
庾波
胡震宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huole Science and Technology Development Co Ltd
Original Assignee
Shenzhen Huole Science and Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huole Science and Technology Development Co Ltd filed Critical Shenzhen Huole Science and Technology Development Co Ltd
Priority to CN201910939363.4A priority Critical patent/CN110825220B/en
Publication of CN110825220A publication Critical patent/CN110825220A/en
Application granted granted Critical
Publication of CN110825220B publication Critical patent/CN110825220B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an eyeball tracking control method, an eyeball tracking control device, an intelligent projector and a storage medium. The method is applied to an intelligent projector comprising an eyeball tracking module and comprises the following steps: capturing human eye changes through the eyeball tracking module, and extracting features of the human eye changes as target eyeball features; determining a target control signal corresponding to the target eyeball features according to a preset correspondence between eyeball features and control signals; and controlling a cursor in a projection picture of the intelligent projector to perform the corresponding operation according to the target control signal. The projection function can thus be controlled through human eye actions, which is convenient and quick.

Description

Eyeball tracking control method, device, intelligent projector and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an eyeball tracking control method, an eyeball tracking control device, an intelligent projector, and a storage medium.
Background
Projection technology is commonly used in daily life; projection systems can be seen everywhere, whether in meetings, teaching or entertainment venues.
Generally, control methods of projectors include remote controllers, voices, gestures, and the like.
Disclosure of Invention
The embodiment of the application provides an eyeball tracking control method, an eyeball tracking control device, an intelligent projector and a storage medium, by which the projection function can be controlled through human eye actions, conveniently and quickly.
In a first aspect, an embodiment of the present application provides an eye tracking control method, which is applied to an intelligent projector including an eye tracking module, and the method includes:
capturing human eye changes through an eyeball tracking module, and extracting characteristics of the human eye changes as target eyeball characteristics;
determining a target control signal corresponding to the target eyeball characteristic according to the corresponding relation between the preset eyeball characteristic and the control signal;
and controlling a cursor in a projection picture of the intelligent projector to execute corresponding operation according to the target control signal.
In an alternative embodiment, the capturing, by the eye tracking module, a change in a human eye, and extracting a feature of the change in the human eye as the target eye feature includes:
collecting human eye action pictures through the camera;
transmitting the human eye action picture to a processor for feature extraction, and obtaining the feature of human eye change in the human eye action picture as the target eyeball feature, wherein the human eye change comprises at least one of the following: eyeball movement direction and movement duration, blink times and eye closing duration;
The control signal comprises any one of the following: short press signal, long press signal, acknowledgement signal, return signal.
In an alternative embodiment, the target eyeball characteristics include the eyeball moving direction and a moving duration, wherein the moving duration is a duration from when the eyeball moves towards the first direction to when the eyeball is no longer in the first direction; the determining the target control signal corresponding to the target eyeball characteristic according to the corresponding relation between the preset eyeball characteristic and the control signal comprises the following steps:
judging whether the movement time of the eyeball is within a first time threshold range or not;
if it is, determining that the target control signal corresponding to the target eyeball characteristic is the short press signal in the eyeball moving direction;
the controlling, according to the target control signal, a cursor in a projection screen of the intelligent projector to execute a corresponding operation includes:
and responding to the short press signal of the eyeball moving direction, controlling a cursor in a projection picture of the intelligent projector to move by one option in the eyeball moving direction.
In an alternative embodiment, the method further comprises:
if the moving time length of the eyeball is greater than a second time length threshold, determining that a target control signal corresponding to the target eyeball characteristics is the long pressing signal of the eyeball moving direction;
The controlling, according to the target control signal, a cursor in a projection screen of the intelligent projector to execute a corresponding operation includes:
and responding to the long-press signal of the eyeball moving direction, controlling a cursor in a projection picture of the intelligent projector to move at least two options towards the eyeball moving direction, wherein the number of the options for moving the cursor is in direct proportion to the eyeball moving time.
In an optional implementation manner, before capturing the human eye change by the camera and extracting the characteristic of the human eye change as the target eyeball characteristic, the method further includes:
when an eyeball control start instruction is detected, an eyeball tracking control mode is entered.
In an alternative embodiment, the method further comprises:
acquiring a face image of a user through a camera, and extracting face features of the face image;
determining the user concentration degree according to the face characteristics;
and outputting prompt information or entering a standby mode under the condition that the concentration degree of the user is lower than the attention threshold value.
In an alternative embodiment, in the case of entering the standby mode, the method further includes:
determining a label corresponding to the face features of the face image, and randomly acquiring standby audio and/or a standby picture corresponding to the label;
and playing the standby audio and/or standby picture.
In a second aspect, the present application provides an eyeball tracking control device, including:
the eyeball tracking module is used for capturing human eye changes and extracting the characteristics of the human eye changes as target eyeball characteristics;
the processing module is used for determining a target control signal corresponding to the target eyeball characteristic according to the corresponding relation between the preset eyeball characteristic and the control signal;
and the control module is used for controlling a cursor in a projection picture of the intelligent projector to execute corresponding operation according to the target control signal.
In a third aspect, an embodiment of the present application provides an intelligent projector, including a processor, an input device, an output device, and a memory, where the processor, the input device, the output device, and the memory are connected to each other, and the memory is configured to store a computer program, where the computer program includes program instructions, and where the processor is configured to invoke the program instructions to perform a method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program comprising program instructions for execution by a processor to perform a method as in the first aspect described above.
In the embodiment of the application, the eyeball tracking module captures human eye changes, the feature of the human eye change is extracted as a target eyeball feature, a target control signal corresponding to the target eyeball feature is determined according to the preset correspondence between eyeball features and control signals, and a cursor in the projection picture of the intelligent projector is controlled to perform the corresponding operation according to the target control signal. The projection function can therefore be controlled through human eye actions, and the intelligent projector can be controlled without a remote controller.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1A is a schematic structural diagram of an intelligent projector according to an embodiment of the present application;
fig. 1B is a schematic flow chart of an eyeball tracking control method according to an embodiment of the present application;
fig. 2 is a flowchart of another eyeball tracking control method according to an embodiment of the present application;
FIG. 3 is a schematic block diagram of an eye tracking control device according to an embodiment of the present application;
fig. 4 is a block diagram of another intelligent projector according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application. It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. It is also to be understood that the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, as meaning "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
The eyeball tracking control device in the embodiment of the application can be an intelligent projector. As shown in fig. 1A, fig. 1A is a schematic structural diagram of an intelligent projector according to an embodiment of the present application. The intelligent projector may include a processor, a memory, a signal processor, a transceiver, a projection device, a speaker, a microphone, a random access memory (Random Access Memory, RAM), cameras (including a first camera and a second camera), sensors, a network module, and the like. The memory, the signal processor, the projection device, the speaker, the microphone, the RAM, the cameras, the sensors and the network module are connected with the processor, and the transceiver is connected with the signal processor.
The processor is the control center of the intelligent projector. It connects various parts of the whole intelligent projector through various interfaces and lines, and executes the various functions of the intelligent projector and processes data by running or executing software programs and/or modules stored in the memory and calling data stored in the memory, thereby monitoring the intelligent projector as a whole. The processor may be a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU) or a neural-network processing unit (Neural-network Processing Unit, NPU).
Further, the processor may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor.
The memory is used for storing software programs and/or modules, and the processor executes the software programs and/or modules stored in the memory so as to execute various functional applications and data processing of the intelligent projector. The memory may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, a software program required for at least one function, and the like; the data storage area may store data created according to the use of the intelligent projector, and the like. In addition, the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
Wherein the sensor comprises at least one of: light-sensitive sensors, gyroscopes, infrared proximity sensors, vibration detection sensors, pressure sensors, etc. Wherein a light sensor, also called ambient light sensor, is used to detect the ambient light level. The light sensor may comprise a photosensitive element and an analog-to-digital converter. The photosensitive element is used for converting the collected optical signals into electric signals, and the analog-to-digital converter is used for converting the electric signals into digital signals. Optionally, the optical sensor may further include a signal amplifier, where the signal amplifier may amplify the electrical signal converted by the photosensitive element and output the amplified electrical signal to the analog-to-digital converter. The photosensitive element may include at least one of a photodiode, a phototransistor, a photoresistor, and a silicon photocell.
The first camera may be a visible light camera (a standard-angle camera or a wide-angle camera), an infrared camera, or a dual camera (having a ranging function), which is not limited herein. It should be noted that the camera included in the eyeball tracking module used for eyeball tracking control in the embodiment of the present application is the second camera, which may be a camera different from the first camera; the position of the second camera may be adjustable and may face a user using the intelligent projector, so as to collect images of the user's eyes for analysis.
The network module may be at least one of the following: a Bluetooth module, a wireless fidelity (Wireless Fidelity, Wi-Fi) module, and the like, which is not limited herein. The above-described eyeball tracking control device can realize a projection function and an eyeball tracking control function.
Referring to fig. 1B, fig. 1B is a schematic flowchart of an eyeball tracking control method according to an embodiment of the present application, which can be applied to an intelligent projector, such as the intelligent projector shown in fig. 1A. The intelligent projector includes an eye tracking module. The method may include, as shown in fig. 1B:
101. Capturing human eye changes through the eyeball tracking module, and extracting the features of the human eye changes as target eyeball features.
The execution subject in the embodiment of the present application may be an eyeball tracking control device, which may be the above-mentioned intelligent projector. A projector, also called a projection machine, is a device that can project images or video onto a screen; by being connected to a computer, VCD, DVD, BD, game console, DV and the like through different interfaces, it plays the corresponding video signals. Projectors are currently widely used in homes, offices, schools and entertainment venues, and there are various types such as CRT, LCD and DLP according to different working modes.
In a specific implementation, the intelligent projector may be a laser projector, also called a laser light source projector, which has higher brightness and a longer service life; in particular, the laser light source is more power-saving and energy-saving, the later maintenance workload can be greatly reduced, and a more reliable use effect can be brought to users.
The eyeball tracking module in the embodiment of the present application includes a camera. The camera may be a fixed camera or a rotatable camera, and may be a single camera or multiple cameras. The single camera may be a visible light camera or an infrared camera, and the visible light camera may be a standard-angle camera or a wide-angle camera; the multiple cameras may be dual cameras, triple cameras or quad cameras, which is not limited herein.
The eye tracking module can be connected with the processor, can capture human eye changes of a user through the camera, and can specifically be human eye action pictures (various images or video forms) and send the human eye action pictures to the processor. The processor performs analysis processing according to the data collected by the eye tracking module, extracts the characteristic of human eye change as the target eye characteristic, and then can execute step 102.
In an alternative embodiment, the step 101 includes:
collecting human eye action pictures through a camera;
transmitting the human eye action picture to a processor for feature extraction, and obtaining the feature of human eye change in the human eye action picture as the target eyeball feature, wherein the human eye change comprises at least one of the following: eyeball movement direction and movement duration, blink times and eye closing duration;
the control signal may include any one of the following: short press signal, long press signal, acknowledgement signal, return signal.
Specifically, the camera can collect a plurality of human eye action pictures which are continuously shot, so that continuous human eye actions can be conveniently determined. The processor of the intelligent projector can process the image of the human eye action picture, before the processor performs the feature extraction, the image quality of the human eye action picture can be judged firstly, wherein the image quality comprises the integrity and the definition of human eyes, and if the image quality of the human eye action picture is lower than a quality threshold value, retry prompt information can be output for reminding a user to perform the human eye action again so as to perform corresponding control; if the image quality of the human eye motion picture is not lower than the quality threshold, the following steps can be performed.
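As an illustration of the quality gate described above, the following is a minimal sketch assuming an OpenCV-based pipeline; the Haar eye detector, the Laplacian-variance sharpness measure, the threshold values and all function names are assumptions of this sketch rather than details from the patent.

```python
# Minimal sketch (assumed implementation): a frame is usable only if an eye is
# detected (completeness) and the picture is sharp enough (clarity); otherwise
# a retry prompt is produced, mirroring the gate described above.
import cv2

EYE_CASCADE = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def frame_quality_ok(frame_bgr, sharpness_threshold=100.0):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = EYE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()  # variance of Laplacian as a clarity proxy
    return len(eyes) > 0 and sharpness >= sharpness_threshold

def filter_usable_frames(frames):
    usable = [f for f in frames if frame_quality_ok(f)]
    if not usable:
        print("Quality too low - please repeat the eye action")  # retry prompt
    return usable
```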
The processor may identify a human eye feature region of the human eye action picture. The human eye feature region is extracted by performing image segmentation processing on the human eye action picture, and feature extraction is then performed on the human eye feature region to obtain the human eye features of the picture, where the human eye features mainly include the eye shape, the eyeball position, the open or closed state of the eye, and the like. Integration analysis is performed according to the acquisition time sequence of the human eye action pictures so as to determine the features of the human eye change. It can be understood that, for continuous frames (video) of human eye action pictures, the above-mentioned human eye changes can be determined, such as the eyeball movement direction (leftward, rightward, upward, downward, etc.), the movement duration (the duration of movement in a certain direction until the homing action is detected), the number of blinks, the eye-closing duration, and the like.
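The integration of per-frame observations into eye-change features can be sketched as follows. This is a simplified illustration which assumes each frame has already been reduced to a timestamp, an open/closed flag and a coarse gaze direction; the field names and logic are assumptions, not the patent's implementation.

```python
# Minimal sketch: integrate per-frame eye states in acquisition order into the
# eye-change features named above (movement direction and duration, blink
# count, eye-closing duration).
from dataclasses import dataclass

@dataclass
class FrameState:
    t: float         # capture time in seconds
    eye_open: bool
    direction: str   # "left", "right", "up", "down" or "center"

def derive_eye_change(states):
    blink_count = 0
    longest_closed = 0.0
    move_direction, move_duration = None, 0.0
    closed_since = move_since = None

    for prev, cur in zip(states, states[1:]):
        # blink counting and eye-closing duration
        if prev.eye_open and not cur.eye_open:
            blink_count += 1
            closed_since = cur.t
        if not cur.eye_open and closed_since is not None:
            longest_closed = max(longest_closed, cur.t - closed_since)
        if cur.eye_open:
            closed_since = None
        # movement duration: from entering a non-center direction until homing
        if cur.direction != "center" and move_since is None:
            move_direction, move_since = cur.direction, cur.t
        elif move_since is not None and cur.direction != move_direction:
            move_duration = max(move_duration, cur.t - move_since)
            move_since = None
    if move_since is not None and states:   # movement still held at the last frame
        move_duration = max(move_duration, states[-1].t - move_since)

    return {"direction": move_direction, "move_duration": move_duration,
            "blink_count": blink_count, "eye_closed_duration": longest_closed}
```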
Optionally, before step 101, the method further includes: when an eyeball control start instruction is detected, an eyeball tracking control mode is entered.
The eyeball control start instruction may be triggered by a voice signal, a specific gesture or a key operation. For example, when it is detected that the user says "eye mode", makes a specific gesture, or presses a specific key, the intelligent projector enters the eyeball tracking control mode. In this way, the eyeball tracking control method in the embodiment of the application is executed only when the user requires it, which avoids misoperation caused by the eyeball tracking control method when no control operation is needed, and the eyeball tracking module does not need to be kept on, thereby saving overhead.
After determining the target eye characteristics of the user's human eye change, step 102 may be performed.
102. Determining a target control signal corresponding to the target eyeball feature according to the preset correspondence between eyeball features and control signals.
The correspondence between the eyeball features and the control signals can be preset to carry out operation mapping.
The control signal may include any one of the following: a short press signal, a long press signal, a confirmation signal, and a return signal. Optionally, the short press signal may control the cursor in the projection picture to move left or right by one cell; the long press signal may control the cursor in the projection picture to move left or right by multiple cells; the confirmation signal may control selection of the option where the cursor is currently located; and the return signal may control returning to the previous display screen. In the embodiment of the present application, correspondences between other eyeball features and control signals may also be set, which is not limited herein.
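A minimal sketch of how the control signals listed above could be dispatched to cursor operations follows; the Cursor class, method names and signal identifiers are illustrative placeholders, not a real projector API.

```python
# Minimal sketch: map a control signal to the corresponding cursor operation.
class Cursor:
    def __init__(self, options, index=0):
        self.options, self.index = options, index

    def move(self, step):  # step: signed number of cells
        self.index = max(0, min(len(self.options) - 1, self.index + step))

def apply_control_signal(cursor, signal, direction="right", cells=3):
    if signal == "short_press":      # move one cell in the given direction
        cursor.move(1 if direction == "right" else -1)
    elif signal == "long_press":     # move several cells at once
        cursor.move(cells if direction == "right" else -cells)
    elif signal == "confirm":        # select the option under the cursor
        return cursor.options[cursor.index]
    elif signal == "return":         # go back to the previous display screen
        return "previous_screen"
    return None

# Example: a short press to the left moves the cursor one option to the left.
menu = Cursor(["source", "brightness", "focus", "settings"], index=2)
apply_control_signal(menu, "short_press", direction="left")
```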
According to the corresponding relation between the preset eyeball characteristics and the control signals, the intelligent projector can determine the target control signals corresponding to the target eyeball characteristics, so that step 103 is executed, and corresponding control operation is realized.
103. Controlling a cursor in a projection picture of the intelligent projector to perform the corresponding operation according to the target control signal.
The intelligent projector may perform the corresponding operation in response to the control signal. Optionally, the intelligent projector in the embodiment of the application can also respond to control signals triggered by a remote controller, gestures or voice commands to perform corresponding operations; the eyeball tracking control method can likewise realize the corresponding control operations.
In one embodiment, the target eyeball characteristics include the eyeball moving direction and a moving duration, wherein the moving duration is a duration from when the eyeball moves to the first direction to when the eyeball is no longer in the first direction; the step 102 may specifically include:
judging whether the moving time of the eyeballs is within a first time threshold range or not;
if it is, determining that the target control signal corresponding to the target eyeball characteristic is the short press signal in the eyeball moving direction;
the step 103 may specifically include:
and responding to the short press signal of the eyeball moving direction, controlling a cursor in a projection picture of the intelligent projector to move by one option in the eyeball moving direction.
Further, in an alternative embodiment, the method further comprises:
If the moving time length of the eyeball is greater than a second time length threshold, determining that a target control signal corresponding to the target eyeball characteristics is the long-press signal of the eyeball moving direction;
the step 103 may specifically include:
and responding to the long-press signal of the eyeball moving direction, controlling a cursor in a projection picture of the intelligent projector to move at least two options towards the eyeball moving direction, wherein the number of the options for moving the cursor is in direct proportion to the eyeball moving time.
Specifically, the first direction may be any direction, for example, left or right. A direction range threshold may be preset for judging the first direction, for example, a range of offset distances relative to the center position of the eye socket and/or a range of offset angles relative to the horizontal and vertical lines, so as to distinguish different eyeball movement directions. For the eyeball movement features, only one eye may be identified and judged, or both eyes may be judged simultaneously to improve accuracy.
Similarly, the first time period threshold range and the second time period threshold may be set to distinguish the movement time period of the eyeball. Wherein, the value in the first time length threshold range is smaller than the second time length threshold, that is, when the moving time length of the eyeball is in the first time length threshold range, the corresponding target control signal can be determined to belong to the short press signal, and when the value is larger than the second time length threshold, the corresponding target control signal can be determined to belong to the long press signal. The first time duration threshold range and the second time duration threshold may be set and adjusted as needed.
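The short-press/long-press distinction described above can be expressed as a small classification function. The concrete threshold values below are assumptions and would be tuned in practice; the only constraint stated above is that the values inside the first range are smaller than the second threshold.

```python
# Minimal sketch: classify an eyeball movement duration into a short press or
# a long press, and derive how many options a long press should move.
SHORT_PRESS_RANGE = (0.2, 1.0)   # first duration threshold range, in seconds (assumed)
LONG_PRESS_MIN = 1.5             # second duration threshold, in seconds (assumed)

def classify_movement(direction, move_duration):
    low, high = SHORT_PRESS_RANGE
    if low <= move_duration <= high:
        return ("short_press", direction)
    if move_duration > LONG_PRESS_MIN:
        return ("long_press", direction)
    return None                   # movement matches neither threshold

def options_to_move(move_duration, per_second=2, minimum=2):
    """Number of options a long press moves: proportional to the movement duration."""
    return max(minimum, round(per_second * move_duration))
```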
It can be understood that in the embodiment of the application, the moving direction of the cursor of the intelligent projector and the moving direction of the eyeball have a corresponding relationship, and the number of moving options of the cursor and the moving duration of the eyeball have a positive correlation, so that the movement of the eyeball can be converted into a control signal for the cursor, thereby controlling the cursor to execute a corresponding operation.
For example, the correspondence between the preset eyeball characteristics and the control signals, and the functions of the corresponding control signals may include:
1) The eyeballs of both eyes move left briefly, corresponding to a "short left press" control signal, which controls the cursor in the projection picture to move left by one cell;
2) The eyeballs of both eyes move left for a long time, corresponding to a "long left press" control signal, which controls the cursor in the projection picture to move left by multiple cells;
3) The left eye blinks, corresponding to a "confirm" control signal, which selects the option where the cursor is currently located;
4) The right eye blinks, corresponding to a "return" control signal, which returns to the previous display screen.
In the embodiment of the application, the corresponding relation between the eyeball characteristics and the control signals is preset, and the functions of the corresponding control signals can be set in other ways, and can be added, deleted, edited and modified, without limitation.
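The example correspondence listed above can be represented as a simple lookup table; the tuple keys describing the eye actions are made-up identifiers for illustration only.

```python
# Minimal sketch: the example eyeball-feature-to-control-signal mapping as a table.
EYE_ACTION_TO_SIGNAL = {
    ("both_eyes", "left", "short"): ("short_press", "left"),  # move cursor one cell left
    ("both_eyes", "left", "long"):  ("long_press", "left"),   # move cursor several cells left
    ("left_eye", "blink", None):    ("confirm", None),        # select option under the cursor
    ("right_eye", "blink", None):   ("return", None),         # go back to the previous screen
}

def lookup_signal(eyes, action, press=None):
    return EYE_ACTION_TO_SIGNAL.get((eyes, action, press))
```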
In one implementation manner, the embodiment of the application can extract the characteristic of human eye change through a preset neural network model, and then convert the characteristic into the characteristic label which can be identified by the processor to map the control signal.
According to the embodiment of the application, the eye tracking module captures human eye changes, the characteristics of the human eye changes are extracted to serve as target eye characteristics, then the target control signal corresponding to the target eye characteristics is determined according to the corresponding relation between the preset eye characteristics and the control signal, then the cursor in the projection picture of the intelligent projector is controlled to execute corresponding operation according to the target control signal, the projection function can be controlled through human eye actions, and the control operation of the intelligent projector can be realized under the condition of no remote controller.
Referring to fig. 2, fig. 2 is a schematic flowchart of another eye tracking control method according to an embodiment of the present application, where the embodiment shown in fig. 2 may be performed based on the embodiment shown in fig. 1A and 1B, and the method shown in fig. 2 may include:
201. Acquiring a face image of the user through a camera, and extracting face features of the face image.
Specifically, the intelligent projector may collect face images through a camera, which may include one or more faces. The processor of the intelligent projector can process the face image, before the processor performs feature extraction, the image quality of the face image can be judged, wherein the image quality comprises the integrity and the definition of the face, and if the image quality of the face image is lower than a quality threshold value, the face image can be shot again; if the image quality of the face image is not lower than the quality threshold, the following steps may be performed.
The processor may identify a face feature region of the face image. The human face image is segmented, the human face feature area is extracted, and then the human face feature area is extracted to obtain the human face features, wherein the human face features can comprise human eye features mentioned in the previous embodiment, and the human eye change features can be determined through the human face images continuously collected through multiple frames. The face features may also include mouth status features, or face orientation features, etc., which may be used to determine the concentration level of the user.
202. Determining the user concentration degree according to the face features.
Specifically, the face image can be identified through the neural network, the face characteristics are extracted, and the user concentration degree is determined. The different face features obtained by the analysis may calculate corresponding user concentration evaluation values to reflect whether the user is concentrating on the projection screen.
For example, the offset between the face orientation and the projection screen can be calculated; the larger the offset, the lower the corresponding weight value in the calculation, which has a negative effect on the user concentration evaluation. Similarly, the offset of the gaze relative to the projection screen can be determined from the eye position, and a larger offset likewise lowers the corresponding weight value. Through the mouth state features, it can be judged whether the user has been speaking for a long time or is yawning in a fatigued state, and it can also be detected whether the user's eye-closing duration exceeds a specific duration threshold. Other evaluation rules for the user concentration degree may also be provided in the embodiment of the application, which can be set and modified as needed and are not repeated here.
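A minimal sketch of a concentration evaluation along the lines described above follows; the feature inputs, weights and attention threshold are assumptions rather than values from the patent.

```python
# Minimal sketch: combine weighted face-feature observations into a user
# concentration score and compare it against an attention threshold.
def concentration_score(face_yaw_deg, gaze_offset_deg, talking, yawning,
                        eye_closed_seconds):
    score = 1.0
    score -= min(abs(face_yaw_deg) / 90.0, 1.0) * 0.4     # face turned away from the screen
    score -= min(abs(gaze_offset_deg) / 45.0, 1.0) * 0.3  # gaze offset from the projection picture
    if talking:
        score -= 0.1                                       # long speaking state
    if yawning:
        score -= 0.1                                       # fatigue
    if eye_closed_seconds > 2.0:                           # eyes closed beyond a set threshold
        score -= 0.3
    return max(score, 0.0)

ATTENTION_THRESHOLD = 0.5

def handle_attention(score):
    if score < ATTENTION_THRESHOLD:
        return "prompt_or_standby"   # output prompt information or enter standby mode
    return "eye_tracking_mode"       # continue with eyeball tracking control
```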
The embodiment of the application can preset the attention threshold value, compare the calculated evaluation value of the concentration degree of the user with the attention threshold value, judge whether the attention of the user is too low, and further execute the step 203 or the step 204.
203. Outputting prompt information or entering a standby mode when the user concentration degree is lower than the attention threshold.
When the user concentration degree is lower than the attention threshold, it can be determined that the current user is paying little attention to the projected content, and prompt information can be output, where the prompt information is used to prompt the user to choose whether to pause playing or to switch to new projected content. For example, a voice message may be output or a pop-up window may be displayed asking "Do you want to play new content?", and the user can choose to perform the corresponding operation.
Alternatively, the intelligent projector may enter a standby mode. The duration of the standby mode can be preset, for example, one minute or 100 seconds. In the standby mode, the projection function can be turned off, or the projector can switch to playing standby audio and/or a standby picture, so that the user can rest and the overall viewing experience is improved. The intelligent projector in the standby mode can be actively woken up by the user at any time and switched back to the normal working mode to continue playing the projected content.
Optionally, if the intelligent projector detects that the user's viewing duration has reached a viewing duration threshold, step 203 may also be executed to remind the user to rest and ensure healthy viewing.
In one embodiment, after the step 202, the method further includes:
determining a label corresponding to the face feature of the face image, and randomly acquiring standby audio and/or standby pictures corresponding to the label;
and playing the standby audio and/or standby picture.
Personalized standby audio and/or standby pictures can be provided for the user through the face features, thereby improving the user experience. During the foregoing feature extraction and analysis, the corresponding tag may be determined at the same time. The mapping relationship between face feature tags and standby audio and/or standby pictures may be preset in the intelligent projector, so that the tag corresponding to the face features of the collected face image is determined, and the standby audio and/or standby picture under that tag is randomly selected from the standby data set and output.
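A minimal sketch of the tag-based standby content selection follows; the tag names, file names and the structure of the standby data set are made-up examples.

```python
# Minimal sketch: randomly pick standby audio and/or a standby picture under
# the tag derived from the user's face features.
import random

STANDBY_LIBRARY = {
    "child": {"audio": ["lullaby_1.mp3", "lullaby_2.mp3"], "image": ["cartoon_1.png"]},
    "adult": {"audio": ["light_jazz.mp3"], "image": ["landscape_1.png", "landscape_2.png"]},
}

def pick_standby_content(face_feature_tag):
    entry = STANDBY_LIBRARY.get(face_feature_tag)
    if entry is None:
        return None, None
    audio = random.choice(entry["audio"]) if entry["audio"] else None
    image = random.choice(entry["image"]) if entry["image"] else None
    return audio, image  # play the standby audio and/or standby picture
```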
204. When the user concentration degree is not lower than the attention threshold, entering the eyeball tracking control mode when an eyeball control start instruction is detected.
The above step 204 may refer to the description in the embodiment shown in fig. 1B, and the eye tracking control method in the embodiment shown in fig. 1B may be performed after the step 204, which is not described herein.
Alternatively, the above steps 201 and 202 may be periodically performed to detect the concentration of the user through the face features, and guide the user to take a rest if the user is not concentrated enough.
By implementing the embodiment of the application, a face image of the user can be collected through the camera and the face features of the face image can be extracted; the user concentration degree is determined according to the face features; prompt information is output or the standby mode is entered when the user concentration degree is lower than the attention threshold; and when the user concentration degree is not lower than the attention threshold, the eyeball tracking control mode is entered when an eyeball control start instruction is detected. In this way, the user concentration degree can be obtained through face recognition during use of the intelligent projector, the user can be reminded to rest, the projection function can be controlled through human eye actions, the intelligent projector can be controlled conveniently and quickly, and the user experience is improved.
Correspondingly, the embodiment of the application also provides an eyeball tracking control device.
Specifically, referring to fig. 3, a schematic block diagram of an eye tracking control device 300 according to an embodiment of the present application is shown. The eyeball tracking control device 300 of the present embodiment includes:
the eye tracking module 310 is configured to capture a change of a human eye, and extract a feature of the change of the human eye as a target eye feature;
the processing module 320 is configured to determine a target control signal corresponding to the target eyeball characteristic according to a corresponding relationship between a preset eyeball characteristic and the control signal;
and the control module 330 is configured to control a cursor in a projection screen of the intelligent projector to perform a corresponding operation according to the target control signal.
In one embodiment, eye tracking module 310 is specifically configured to:
collecting human eye action pictures through the camera;
transmitting the human eye action picture to a processor for feature extraction to obtain the feature of human eye change in the human eye action picture as the target eyeball feature, wherein the human eye change comprises at least one of the following: eyeball movement direction and movement duration, blink times and eye closing duration;
the control signal includes any one of the following: short press signal, long press signal, acknowledgement signal, return signal.
Optionally, the processing module 320 is specifically configured to:
judging whether the moving time of the eyeballs is within a first time threshold range or not;
if it is, determining that the target control signal corresponding to the target eyeball characteristic is the short press signal in the eyeball moving direction;
the control module 330 is specifically configured to control a cursor in a projection screen of the intelligent projector to move by one option in the eyeball moving direction in response to the short press signal of the eyeball moving direction.
Optionally, the processing module 320 is further configured to determine that the target control signal corresponding to the target eyeball characteristic is the long press signal in the eyeball moving direction if the moving time length of the eyeball is greater than a second time length threshold;
the control module 330 is specifically configured to control, in response to the long-press signal of the eyeball moving direction, a cursor in a projection screen of the intelligent projector to move at least two options in the eyeball moving direction, where the number of options for moving the cursor is proportional to the eyeball moving duration.
Optionally, the processing module 320 is further configured to: when the eye tracking module 310 detects an eye control start command, it enters an eye tracking control mode.
Optionally, the eye tracking module 310 is further configured to collect a face image of the user through a camera, and the eye tracking control device 300 further includes the output module 340; wherein, the processing module 320 is further configured to:
extracting face features of the face image;
determining the user concentration degree according to the face characteristics;
the output module 340 is further configured to output a prompt message or enter a standby mode if the user concentration is lower than a threshold of attention.
Optionally, the processing module 320 is further configured to determine a label corresponding to a face feature of the face image in the standby mode, and randomly acquire standby audio and/or standby screen corresponding to the label;
the output module 340 is further configured to play the standby audio and/or standby screen.
According to the eyeball tracking control device 300 in the embodiment of the application, the eyeball tracking module can capture the change of human eyes, the characteristic of the change of human eyes is extracted to serve as a target eyeball characteristic, then a target control signal corresponding to the target eyeball characteristic is determined according to the corresponding relation between the preset eyeball characteristic and the control signal, then a cursor in a projection picture of the intelligent projector is controlled to execute corresponding operation according to the target control signal, the projection function can be controlled through the action of human eyes, and the control operation of the intelligent projector can be realized under the condition of no remote controller.
Referring to fig. 4, a schematic block diagram of an intelligent projector according to another embodiment of the present application is provided. The intelligent projector in this embodiment may include: one or more processors 410; one or more input devices 420, one or more output devices 430, and a memory 440. The processor 410, input device 420, output device 430, and memory 440 are connected by bus 450. Memory 440 is used to store a computer program comprising program instructions, and processor 410 is used to execute the program instructions stored by memory 440. Wherein the processor 410 is configured to execute the above-described program instructions.
It should be appreciated that in embodiments of the present application, the processor 410 may be a central processing unit (Central Processing Unit, CPU); the processor 410 may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The input device 420 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of a fingerprint), a microphone, etc., and the output device 430 may include a display (LCD, etc.), a speaker, etc. At least one USB interface may also be included to enable data transfer.
The memory 440 may include read only memory and random access memory and provide instructions and data to the processor 410. A portion of memory 440 may also include non-volatile random access memory. For example, the memory 440 may also store information of a device type, or various types of voice data, image data, video data, and the like of a user.
In an embodiment of the present application, one or more instructions stored in a computer storage medium are loaded and executed by the processor 410 to implement the corresponding steps of the method flow shown in fig. 1B or fig. 2 described above.
According to the embodiment of the application, the intelligent projector 400 can capture human eye changes through the eye tracking module, extract the characteristics of the human eye changes as target eye characteristics, determine target control signals corresponding to the target eye characteristics according to the corresponding relation between the preset eye characteristics and the control signals, and control a cursor in a projection picture of the intelligent projector to execute corresponding operation according to the target control signals, so that the projection function can be controlled through human eye actions, and the control operation of the intelligent projector can be realized under the condition of no remote controller.
The embodiment of the application also provides a computer readable storage medium, which is characterized in that the computer storage medium stores a computer program, the computer program comprises program instructions, and the program instructions are executed by a processor to execute a part or all of the eyeball tracking control method according to the embodiment of the application.
The computer readable storage medium may be an internal storage unit of the eye tracking control device of any of the foregoing embodiments, such as a hard disk or a memory of the intelligent projector. The computer readable storage medium may also be an external storage device of the eye tracking control apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the eye tracking control apparatus. Further, the computer-readable storage medium may also include both an internal storage unit and an external storage device of the eye tracking control apparatus. The computer-readable storage medium is used to store a computer program and other programs and data required for the eye tracking control device. The computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
Those of ordinary skill in the art will appreciate that the elements and algorithm steps described in connection with the embodiments disclosed herein may be embodied in electronic hardware, in computer software, or in a combination of the two, and that the elements and steps of the examples have been generally described in terms of function in the foregoing description to clearly illustrate the interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working procedures of the server and the unit described above may refer to the corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in the present application, it should be understood that the disclosed server and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or elements, or may be an electrical, mechanical, or other form of connection.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the embodiment of the present application.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.

Claims (8)

1. An eye tracking control method, applied to an intelligent projector including an eye tracking module, comprising:
capturing human eye changes through an eyeball tracking module, extracting characteristics of the human eye changes as target eyeball characteristics, and comprising the following steps: collecting human eye action pictures through a camera; judging the image quality of the human eye action picture, and if the image quality of the human eye action picture is lower than a quality threshold, outputting retry prompt information, wherein the retry prompt information is used for reminding a user to perform human eye action again so as to perform corresponding control; if the image quality of the human eye action picture is not lower than the quality threshold, transmitting the human eye action picture to a processor for feature extraction, and obtaining the feature of human eye change in the human eye action picture as the target eyeball feature, wherein the human eye change comprises at least one of the following steps: eyeball movement direction and movement duration, blink times and eye closing duration; the target eyeball characteristics comprise the eyeball moving direction and the moving time length, wherein the moving time length is the time length from the moment when the eyeball moves to the first direction to the moment when the eyeball is not in the first direction any more;
determining a target control signal corresponding to the target eyeball characteristics according to the preset correspondence between eyeball characteristics and control signals, comprising the following steps: judging whether the movement duration of the eyeball is within a first duration threshold range; if it is, determining that the target control signal corresponding to the target eyeball characteristics is a short press signal in the eyeball moving direction; and if the movement duration of the eyeball is greater than a second duration threshold, determining that the target control signal corresponding to the target eyeball characteristics is a long press signal in the eyeball moving direction;
And controlling a cursor in a projection picture of the intelligent projector to execute corresponding operation according to the target control signal, wherein the method comprises the following steps: responding to the short pressing signal of the eyeball moving direction, and controlling a cursor in a projection picture of the intelligent projector to move an option towards the eyeball moving direction; and responding to the long-press signal of the eyeball moving direction, controlling a cursor in a projection picture of the intelligent projector to move at least two options towards the eyeball moving direction, wherein the number of the options for moving the cursor is in direct proportion to the eyeball moving time.
2. The method of claim 1, wherein the control signal comprises any one of: short press signal, long press signal, acknowledgement signal, return signal.
3. The method of claim 2, wherein the capturing of the human eye change by the camera further comprises, prior to extracting the feature of the human eye change as the target eye feature:
when an eyeball control start instruction is detected, an eyeball tracking control mode is entered.
4. A method according to claim 3, characterized in that the method further comprises:
acquiring a face image of a user through a camera, and extracting face features of the face image;
Determining the user concentration degree according to the face characteristics;
and outputting prompt information or entering a standby mode under the condition that the concentration degree of the user is lower than the attention threshold value.
5. The method of claim 4, wherein in the event of the entering a standby mode, the method further comprises:
determining a label corresponding to the face feature of the face image, and randomly acquiring standby audio and/or standby images corresponding to the label;
and playing the standby audio and/or standby screen.
6. An eyeball tracking control device, comprising:
the eyeball tracking module is used for capturing human eye changes and extracting the characteristics of the human eye changes as target eyeball characteristics, and comprises the following steps: collecting human eye action pictures through a camera; judging the image quality of the human eye action picture, and if the image quality of the human eye action picture is lower than a quality threshold, outputting retry prompt information, wherein the retry prompt information is used for reminding a user to perform human eye action again so as to perform corresponding control; if the image quality of the human eye action picture is not lower than the quality threshold, transmitting the human eye action picture to a processor for feature extraction, and obtaining the feature of human eye change in the human eye action picture as the target eyeball feature, wherein the human eye change comprises at least one of the following steps: eyeball movement direction and movement duration, blink times and eye closing duration; the target eyeball characteristics comprise the eyeball moving direction and the moving time length, wherein the moving time length is the time length from the moment when the eyeball moves to the first direction to the moment when the eyeball is not in the first direction any more;
the processing module is used for determining the target control signal corresponding to the target eyeball feature according to the corresponding relation between the preset eyeball features and the control signals, and specifically comprises: judging whether the movement duration of the eyeball is within a first duration threshold range; if so, determining that the target control signal corresponding to the target eyeball feature is a short-press signal in the eyeball movement direction; and if the movement duration of the eyeball is greater than a second duration threshold, determining that the target control signal corresponding to the target eyeball feature is a long-press signal in the eyeball movement direction;
and the control module is used for controlling the cursor in the projection picture of the intelligent projector to execute the corresponding operation according to the target control signal, and specifically comprises: in response to the short-press signal in the eyeball movement direction, controlling the cursor in the projection picture of the intelligent projector to move by one option in the eyeball movement direction; and in response to the long-press signal in the eyeball movement direction, controlling the cursor in the projection picture of the intelligent projector to move by at least two options in the eyeball movement direction, wherein the number of options by which the cursor moves is proportional to the eyeball movement duration.
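Claim 6's image-quality gate is not tied to any particular metric. The sketch below uses a variance-of-Laplacian sharpness score (via OpenCV) purely as a stand-in for the quality judgement, and the camera and processor interfaces are assumed wrappers rather than parts of the patent.

    import cv2    # OpenCV, used here only to illustrate one possible quality metric

    QUALITY_THRESHOLD = 100.0    # assumed value for the variance-of-Laplacian sharpness score

    def capture_eye_action_frame(camera, processor, notify_retry):
        """Capture an eye-action frame, gate it on image quality, then hand it to the processor."""
        frame = camera.read()    # assumed camera wrapper returning a BGR image array
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()    # one common image-quality proxy
        if sharpness < QUALITY_THRESHOLD:
            notify_retry("Image too blurry - please repeat the eye action.")    # retry prompt
            return None
        # assumed processor hook returning direction, duration, blink count, eye-closing duration, ...
        return processor.extract_eye_features(frame)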
7. An intelligent projector comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-5.
8. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program comprising program instructions which, when executed by a processor, cause the processor to perform the method of any one of claims 1-5.
CN201910939363.4A 2019-09-29 2019-09-29 Eyeball tracking control method, device, intelligent projector and storage medium Active CN110825220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910939363.4A CN110825220B (en) 2019-09-29 2019-09-29 Eyeball tracking control method, device, intelligent projector and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910939363.4A CN110825220B (en) 2019-09-29 2019-09-29 Eyeball tracking control method, device, intelligent projector and storage medium

Publications (2)

Publication Number Publication Date
CN110825220A CN110825220A (en) 2020-02-21
CN110825220B true CN110825220B (en) 2023-12-08

Family

ID=69548542

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910939363.4A Active CN110825220B (en) 2019-09-29 2019-09-29 Eyeball tracking control method, device, intelligent projector and storage medium

Country Status (1)

Country Link
CN (1) CN110825220B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111596760A (en) * 2020-04-30 2020-08-28 Vivo Mobile Communication Co., Ltd. Operation control method and device, electronic equipment and readable storage medium
CN111796752B (en) * 2020-05-15 2022-11-15 Sichuan Kehua Tianfu Technology Co., Ltd. Interactive teaching system based on PC
CN114449319A (en) * 2020-11-04 2022-05-06 Shenzhen TCL New Technology Co., Ltd. Video picture dynamic adjustment method and device, intelligent terminal and storage medium
CN113903078A (en) * 2021-10-29 2022-01-07 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Human eye gaze detection method, control method and related equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105373767A (en) * 2015-07-23 2016-03-02 Shenzhen Research Institute of Sun Yat-sen University Eye fatigue detection method for smart phones
CN107390874A (en) * 2017-07-27 2017-11-24 Shenzhen Taihengnuo Technology Co., Ltd. A kind of intelligent terminal control method and control device based on human eye
CN108399085A (en) * 2018-02-12 2018-08-14 Guangdong OPPO Mobile Telecommunications Corp., Ltd. Electronic device, application management method and related product
CN109542217A (en) * 2018-10-12 2019-03-29 Shenzhen Launch Technology Co., Ltd. A kind of eyeball tracking display methods and device


Also Published As

Publication number Publication date
CN110825220A (en) 2020-02-21

Similar Documents

Publication Publication Date Title
CN110825220B (en) Eyeball tracking control method, device, intelligent projector and storage medium
US11561621B2 (en) Multi media computing or entertainment system for responding to user presence and activity
WO2020192400A1 (en) Playback terminal playback control method, apparatus, and device, and computer readable storage medium
JP6143975B1 (en) System and method for providing haptic feedback to assist in image capture
JP4384240B2 (en) Image processing apparatus, image processing method, and image processing program
US9106821B1 (en) Cues for capturing images
CN108241434B (en) Man-machine interaction method, device and medium based on depth of field information and mobile terminal
KR101444103B1 (en) Media signal generating method and apparatus using state information
CN108712603B (en) Image processing method and mobile terminal
US20140047464A1 (en) Method and apparatus for measuring tv or other media delivery device viewer's attention
CN109032345B (en) Equipment control method, device, equipment, server and storage medium
CN102467668A (en) Emotion detecting and soothing system and method
US20200413138A1 (en) Adaptive Media Playback Based on User Behavior
US9591210B2 (en) Image processing face detection apparatus, method for controlling the same, and program
WO2023065849A1 (en) Screen brightness adjustment method and apparatus for electronic device, and electronic device
US20110228155A1 (en) Cosmetic mirror and adjusting method for the same
EP3328062A1 (en) Photo synthesizing method and device
WO2015078240A1 (en) Video control method and user terminal
WO2016177200A1 (en) Method and terminal for implementing screen control
WO2020108024A1 (en) Information interaction method and apparatus, electronic device, and storage medium
CN110971948A (en) Control method and device of smart television, smart television and medium
CN111447497A (en) Intelligent playing device and energy-saving control method thereof
CN111402096A (en) Online teaching quality management method, system, equipment and medium
KR20190101825A (en) Electronic device and method for recording thereof
CN111182280A (en) Projection method, projection device, sound box equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant