WO2016188258A1 - Eye control device, eye control method and eye control system - Google Patents

Eye control device, eye control method and eye control system

Info

Publication number
WO2016188258A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
operated
human eye
control
control signal
Prior art date
Application number
PCT/CN2016/079259
Other languages
English (en)
French (fr)
Inventor
李文波 (Li Wenbo)
杨添 (Yang Tian)
Original Assignee
京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority to US 15/511,861 (granted as US10372206B2)
Publication of WO2016188258A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013: Eye tracking input arrangements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00: Control of display operating conditions
    • G09G 2320/02: Improving the quality of display appearance
    • G09G 2320/0261: Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2354/00: Aspects of interface with display user

Definitions

  • the present invention relates to the field of display technologies, and in particular, to an eye control device, an eye control method, and an eye control system.
  • the invention provides an eye control device, an eye control method, and an eye control system, which can control a device to be operated based on the action of the human eye.
  • an eye control device including:
  • a gaze point obtaining unit configured to acquire position information of a gaze point of the human eye on the device to be operated
  • a human eye motion detecting unit configured to detect whether the human eye makes a preset motion and, when detecting that the human eye makes a preset motion, to control the gaze point acquiring unit to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generating unit;
  • a control signal generating unit configured to generate, according to a pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and to transmit the control signal to the device to be operated to control the device to be operated to perform a corresponding operation,
  • wherein the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to that position information.
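  As a sketch, the position control correspondence table described above can be modeled as a plain mapping from sampled positions to control signals; the coordinates and signal names below are illustrative assumptions, not the patent's actual table.

```python
# Hypothetical position control correspondence table: maps position
# information (sampling-point coordinates) to control signals.
# Entries are illustrative examples, not taken from the patent.
POSITION_CONTROL_TABLE = {
    (1, 1): "no_operation",
    (2, 2): "volume_up",
    (3, 3): "volume_up",          # one signal may map to several positions
    (4, 4): "character_A_input",
}

def generate_control_signal(position, table=POSITION_CONTROL_TABLE):
    """Return the control signal for the gaze point's current position;
    positions with no corresponding button default to no-operation."""
    return table.get(position, "no_operation")
```

  Because the lookup defaults to a no-operation signal, gazing at an empty part of the screen harmlessly does nothing, matching the behavior described later for the touch display screen.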
  • the gaze point obtaining unit includes:
  • an infrared emitting module disposed on the device to be operated for emitting infrared light toward the human eye and forming a light reflection point in each of the two pupils of the human eye;
  • a first human eye image acquisition module which is disposed on the device to be operated for acquiring a human eye image
  • an image processing module configured to establish an image coordinate system based on the human eye image acquired by the first human eye image acquisition module, and to process the human eye image to obtain the position coordinates of the centers of the two pupils and of the light reflection points in the image coordinate system;
  • a calculation module configured to obtain, according to the position coordinates of the centers of the two pupils and of the light reflection points in the image coordinate system obtained by the image processing module, the position information of the gaze point of the human eye on the device to be operated.
  • the infrared emitting module includes four infrared emitting sources, and the four infrared emitting sources are respectively disposed at four corners of the device to be operated.
  • the gaze point obtaining unit further includes a correction module, configured to correct position information of the gaze point of the human eye on the device to be operated according to the linear scaling algorithm.
  • the image processing module and the calculation module are integrated in the device to be operated.
  • the gaze point obtaining unit includes:
  • a scene acquisition module configured to acquire a scene image that is seen by a human eye through the glasses, where the scene image includes an image of the device to be operated;
  • a second human eye image acquisition module which is disposed on the glasses for acquiring a human eye image
  • a gaze direction determining module configured to determine a gaze direction of the human eye according to the human eye image acquired by the second human eye image acquiring module
  • a gaze point determining module configured to determine, according to the scene image acquired by the scene acquisition module and the gaze direction of the human eye determined by the gaze direction determining module, the position information of the gaze point of the human eye on the device to be operated.
  • the gaze direction determining module is configured to determine a gaze direction of the human eye according to the pupil position in the human eye image acquired by the second human eye image acquiring module.
  • the preset action includes keeping the position of the gaze point of the human eye on the device to be operated unchanged for 2 to 3 seconds, or blinking quickly 3 to 5 times within a preset time while keeping the position of the gaze point of the human eye on the device to be operated unchanged.
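  A minimal sketch of detecting the two preset actions just described (a 2-to-3-second dwell, or 3 to 5 quick blinks while the gaze point stays put); the sampling format, thresholds, and exact-equality fixation test are assumptions for illustration only.

```python
def preset_action_detected(gaze_samples, blink_count,
                           dwell_seconds=2.0, min_blinks=3, max_blinks=5):
    """Return True if either preset action occurred in the window.

    gaze_samples: list of (timestamp_seconds, (x, y)) pairs over the
    observation window; blink_count: quick blinks counted in that window.
    """
    if not gaze_samples:
        return False
    # Both preset actions require the gaze point position to be unchanged.
    gaze_fixed = len({pos for _, pos in gaze_samples}) == 1
    if not gaze_fixed:
        return False
    dwell = gaze_samples[-1][0] - gaze_samples[0][0]
    return dwell >= dwell_seconds or min_blinks <= blink_count <= max_blinks
```

  A production detector would tolerate small gaze jitter within a radius rather than requiring identical coordinates, but the structure of the check is the same.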
  • the human eye motion detecting unit is a device having an imaging function.
  • the human eye motion detecting unit acquires a real-time image of the human eye through the first human eye image acquiring module or the second human eye image acquiring module, and obtains an action of the human eye based on the real-time image of the human eye.
  • the present disclosure also provides an eye control system comprising: a plurality of devices to be operated and an eye control device as described above.
  • when the gaze point acquisition unit in the eye control device includes the infrared emission module, the first human eye image acquisition module, the image processing module, and the calculation module as described above, the number of gaze point acquisition units equals the number of devices to be operated, in one-to-one correspondence;
  • Each of the fixation point acquisition units is disposed on a corresponding device to be operated.
  • the number of human eye motion detecting units in the eye control device equals the number of devices to be operated, in one-to-one correspondence;
  • Each of the human eye motion detecting units is disposed on a corresponding device to be operated.
  • all the gaze point obtaining units are connected to a control signal generating unit, wherein the control signal generating unit stores a plurality of position control correspondence tables respectively corresponding to the devices to be operated, and is capable of sending control information to each device to be operated.
  • when the gaze point acquisition unit in the eye control device includes the glasses, the scene acquisition module, the second human eye image acquisition module, the gaze direction determination module, and the gaze point determination module as described above, the eye control device includes one such gaze point acquisition unit;
  • the eye control system further includes: a plurality of activation units in one-to-one correspondence with the devices to be operated, each of the activation units being disposed on a corresponding device to be operated;
  • each of the activation units is configured to activate the eye control device when the gaze point acquisition unit faces the device to be operated corresponding to that activation unit, and to control the control signal generating unit to call up the position control correspondence table corresponding to that device.
  • the eye control device includes one of the human eye motion detecting units, and the human eye motion detecting unit is disposed on the glasses of the gaze point acquiring unit.
  • the present invention further provides an eye control method, which is implemented based on an eye control device, wherein the eye control device is the above-described eye control device, and the eye control method includes:
  • generating, by the control signal generating unit, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, according to the pre-stored position control correspondence table corresponding to the device to be operated, and sending the control signal to the device to be operated to control the device to be operated to perform a corresponding operation,
  • the position control correspondence table stores each position information on the device to be operated and a control signal corresponding to each position information.
  • the eye control method further includes:
  • a position control correspondence table corresponding to the device to be operated is established.
  • the present disclosure provides an eye control device, an eye control method, and an eye control system
  • the eye control device includes: a fixation point acquisition unit, a human eye motion detection unit, and a control signal generation unit, wherein the fixation point acquisition unit is configured to acquire position information of the gaze point of the human eye on the device to be operated;
  • the human eye motion detecting unit is configured to detect whether the human eye makes a preset action and, when detecting that the human eye makes a preset action, to control the gaze point acquiring unit to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generating unit;
  • the control signal generating unit is configured to generate, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and to send the control signal to the device to be operated to control it to perform the corresponding operation.
  • the technical solution of the present invention can thus effectively realize control of the device to be operated by the human eye.
  • FIG. 1 is a schematic structural diagram of an eye control device according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a specific application of an eye control device according to an embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a gaze point acquiring unit in an eye control device according to an embodiment of the present invention.
  • FIG. 4 is still another schematic structural diagram of a gaze point acquiring unit in an eye control device according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of an eye control system according to an embodiment of the present invention.
  • FIG. 6 is still another schematic structural diagram of an eye control system according to an embodiment of the present invention.
  • FIG. 7 is a flowchart of an eye control method according to an embodiment of the present invention.
  • FIG. 1 is a schematic structural diagram of an eye control device according to an embodiment of the present invention.
  • the eye control device includes a fixation point acquisition unit 1, a human eye motion detection unit 2, and a control signal generation unit 3.
  • the fixation point acquisition unit 1 is connected to the human eye motion detection unit 2 and the control signal generation unit 3, respectively.
  • the gaze point obtaining unit 1 is configured to acquire position information of the gaze point of the human eye on the device to be operated; the human eye motion detecting unit 2 is configured to detect whether the human eye makes a preset action and, when detecting that the human eye makes a preset action, to control the gaze point acquisition unit 1 to transmit the current position information of the gaze point of the human eye on the device to be operated to the control signal generation unit 3; the control signal generation unit 3 is configured to generate, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and to transmit the generated control signal to the device to be operated to control it to perform the corresponding operation. It should be noted that the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to that position information.
  • in operation, the position information of the gaze point of the human eye on the device to be operated is first acquired by the fixation point acquisition unit 1; the human eye motion detecting unit 2 then detects whether the human eye makes a preset motion and, when it does, controls the gaze point acquiring unit 1 to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generating unit 3; finally, the control signal generating unit 3 generates, according to the position control correspondence table corresponding to the device to be operated, the control signal corresponding to the current position information of the gaze point on the device to be operated, so that the device to be operated performs the corresponding operation based on the control signal, thereby implementing eye control.
  • the device to be operated in the embodiments of the present invention may be an electronic device such as a mobile phone, a touch display screen, or a tablet computer (PAD).
  • the "connection" in the embodiments of the present invention may be either a wired connection or a wireless connection, which is not limited herein.
  • the specific process of implementing eye control by using the eye control device provided by the embodiment of the present invention will be described in detail below with reference to examples.
  • the device to be operated is assumed to be a touch display screen.
  • the display area of the touch display screen is provided with some fixed soft buttons, and some peripheral physical buttons are disposed in the peripheral area.
  • FIG. 2 is a schematic diagram of a specific application of an eye control device according to an embodiment of the present invention. It is easy to understand that a position control correspondence table corresponding to the touch display screen 4 needs to be established before the eye control operation is performed using the eye control device provided by the embodiment of the present invention. It should be noted that, in this embodiment, each piece of position information in the position control correspondence table is described as coordinate information, but those skilled in the art should understand that using coordinate information as the position information is merely exemplary and does not limit the technical solutions of the embodiments of the present invention. In the embodiments of the present invention, any information that can describe the specific location of the gaze point of the human eye on the device to be operated can serve as the position information of the gaze point of the human eye on the device to be operated.
  • specifically, the lower left corner of the touch display screen 4 may be used as the origin, with the direction extending from the lower left corner of the touch display screen 4 to its lower right corner as the X axis and the direction extending from the lower left corner to its upper left corner as the Y axis; a coordinate system is thus established over the entire touch display screen 4. Then, n evenly distributed sampling points at different positions are set on the touch display screen 4, the n sampling points forming a sampling point array of a rows and b columns (n = a × b).
  • the sampling points should cover the positions of all the buttons (soft buttons and physical buttons) on the touch display screen 4; that is, the area where each button on the touch display screen 4 is located includes at least one sampling point.
  • control signals corresponding to the positions of the sampling points are then set to generate a position control correspondence table corresponding to the touch display screen 4, and the position control correspondence table is stored in the control signal generating unit 3.
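  The a-row-by-b-column sampling grid described above can be sketched as follows; placing the points at the centers of uniform grid cells is an assumption, since the patent only requires that the points be evenly distributed.

```python
def build_sampling_grid(width, height, rows, cols):
    """Place rows x cols evenly distributed sampling points on the screen,
    with the origin at the lower-left corner as in the coordinate system
    above. Points sit at the centres of a uniform grid of cells."""
    return [(width * (j + 0.5) / cols, height * (i + 0.5) / rows)
            for i in range(rows) for j in range(cols)]
```

  For example, a 2 x 2 grid on a 100 x 100 screen yields the four points (25, 25), (75, 25), (25, 75), and (75, 75); in practice rows and cols are chosen so that every button area contains at least one point.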
  • An example of the generated position control correspondence table corresponding to the touch display screen 4 is shown in Table 1 below.
  • when the touch display screen 4 receives the no-operation control signal (that is, the current position of the gaze point of the human eye on the touch display screen 4 has no corresponding button, soft or physical), the touch display screen 4 does not perform any operation; when the touch display screen 4 receives the volume up control signal, it performs the volume up operation; when the touch display screen 4 receives the character A input control signal, it inputs the character "A" at a certain preset position of the display screen, and so on. Further examples are omitted here.
  • each piece of position information corresponds to only one control signal, but one control signal can correspond to two or more different pieces of position information; for example, both coordinates (X2, Y2) and (X3, Y3) correspond to the volume up control signal.
  • the reason is that some buttons on the touch display screen 4 are large, and when the number of sampling points is large, a larger button may cover two or more sampling points. In that case, when the gaze point of the human eye on the touch display screen 4 falls at the position of any one of those sampling points, it can be determined that the user wishes to operate that larger button.
  • the working process of the eye control device provided in this embodiment will now be described in detail with reference to FIG. 2, taking as an example the scenario in which the eye control device turns up the volume of the touch display screen 4 according to the motion of the human eye.
  • first, the fixation point acquisition unit 1 acquires the position information of the fixation point of the human eye on the touch display screen 4.
  • specifically, the gaze point acquisition unit 1 acquires the position coordinates of the gaze point of the human eye on the touch display screen 4 as (X2, Y2) or (X3, Y3).
  • the position coordinates acquired by the fixation point acquisition unit 1 are taken as (X2, Y2) in the description that follows.
  • FIG. 3 is a schematic structural diagram of a gaze point acquiring unit in an eye control device according to an embodiment of the present invention. FIG. 3 shows an optional structure of the gaze point acquisition unit in the eye control device provided in this embodiment.
  • the gaze point acquisition unit 1 includes an infrared emission module 6, a first human eye image acquisition module 7, an image processing module 8, and a calculation module 9.
  • the infrared emitting module 6 is disposed on the device to be operated 5 (for example, the touch display screen 4) for emitting infrared light toward the human eye and forming a light reflection point in each of the two pupils of the human eye. Specifically, the infrared emitting module 6 includes four infrared emitting sources, which are evenly distributed over the four corners of the device to be operated 5.
  • the first human eye image acquisition module 7 is also disposed on the device to be operated 5 for acquiring an image of a human eye (ie, an image of a user's eyes).
  • the image processing module 8 is connected to the first human eye image acquisition module 7 for establishing an image coordinate system based on the human eye image acquired by the first human eye image acquisition module 7, and for processing the human eye image to obtain the position coordinates of the centers of the two pupils and of the light reflection points in the image coordinate system.
  • the calculation module 9 is connected to the image processing module 8 and obtains, by a cross-ratio invariance algorithm, the position information of the gaze point of the human eye on the device to be operated 5, according to the position coordinates of the centers of the two pupils and of the light reflection points in the image coordinate system obtained by the image processing module 8.
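  As a highly simplified stand-in for that calculation, the sketch below treats the four corneal reflection points (glints) produced by the corner-mounted IR sources as an axis-aligned rectangle in the eye image and maps the pupil centre's normalised position inside it to screen coordinates. The real cross-ratio method also corrects for projective distortion of the glint pattern; this sketch assumes none.

```python
def gaze_from_glints(pupil, glints, screen_w, screen_h):
    """Map the pupil centre to a screen position using the four corneal
    reflection points from IR sources at the screen corners.

    Simplifying assumption: the glints form an axis-aligned rectangle in
    the image, so plain linear interpolation replaces the cross-ratio
    computation used in practice."""
    xs = [x for x, _ in glints]
    ys = [y for _, y in glints]
    u = (pupil[0] - min(xs)) / (max(xs) - min(xs))   # normalised horizontal
    v = (pupil[1] - min(ys)) / (max(ys) - min(ys))   # normalised vertical
    return (u * screen_w, v * screen_h)
```

  A pupil centre exactly midway between the glints maps to the centre of the screen, which is the intuition behind using the glint quadrilateral as a reference frame.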
  • the fixation point acquisition unit 1 in this embodiment may further include a correction module (not shown in the figure) for correcting the position information of the gaze point of the human eye on the device to be operated 5 according to a linear scaling algorithm, so as to obtain the precise position of the gaze point on the device to be operated 5.
  • the image processing module 8 and the calculation module 9 in the fixation point acquisition unit 1 in FIG. 3 can also be integrated into the corresponding device to be operated 5.
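  The patent names the linear scaling algorithm without specifying it; a minimal sketch of one plausible form is a per-axis linear fit from two calibration gaze points with known true screen positions, applied to every subsequent estimate. The calibration scheme here is an assumption for illustration.

```python
def fit_linear_scale(measured, truth):
    """Fit gain and offset so that gain * measured + offset == truth,
    from two calibration samples on one axis."""
    (m0, m1), (t0, t1) = measured, truth
    gain = (t1 - t0) / (m1 - m0)
    return gain, t0 - gain * m0

def correct_point(point, x_params, y_params):
    """Apply the fitted per-axis linear scaling to a raw gaze estimate."""
    (gx, ox), (gy, oy) = x_params, y_params
    return (gx * point[0] + ox, gy * point[1] + oy)
```

  With the fit done once at calibration time, correction is just one multiply-add per axis, cheap enough to run on every gaze sample.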
  • FIG. 4 is still another schematic structural diagram of a gaze point acquisition unit in the eye control device according to the embodiment of the present invention. FIG. 4 shows another optional structure of the gaze point acquisition unit in the eye control device provided in this embodiment.
  • the gaze point acquisition unit 1 includes: glasses 10, a scene acquisition module 11, a second human eye image acquisition module 12, a gaze direction determination module 13, and a gaze point determination module 14.
  • the scene acquisition module 11 is disposed on the glasses 10 for acquiring a scene image that is seen by the human eye through the glasses 10, and the scene image includes an image of the device to be operated 5 (eg, the touch display screen 4);
  • the second human eye image acquisition module 12 is disposed on the glasses 10 for acquiring a human eye image;
  • the gaze direction determining module 13 is connected to the second human eye image acquisition module 12 and determines the gaze direction of the human eye according to the human eye image acquired by the second human eye image acquisition module 12;
  • the gaze point determination module 14 is connected to the scene acquisition module 11 and the gaze direction determination module 13, respectively, and determines the position information of the gaze point of the human eye on the device to be operated 5 according to the scene image acquired by the scene acquisition module 11 and the gaze direction determined by the gaze direction determination module 13.
  • the gaze direction determining module 13 is configured to determine a gaze direction of the human eye according to the pupil position in the human eye image acquired by the second human eye image acquiring module 12 .
  • it should be noted that the gaze point acquisition unit 1 in the embodiment of the present invention may also adopt other devices capable of acquiring the position information of the gaze point of the human eye on the device to be operated, such as a naked-eye eye tracker or a wearable eye tracker.
  • next, the user controls the eye to make a preset action, for example, keeping the position of the gaze point of the human eye on the touch display screen 4 unchanged for 2 to 3 seconds, or blinking quickly 3 to 5 times within a preset time while keeping the position of the gaze point on the touch display screen 4 unchanged.
  • the preset action in this embodiment is not limited to the above two examples, but may be correspondingly set according to the needs of the user.
  • when the human eye motion detecting unit 2 detects that the human eye makes a preset motion, it controls the gaze point acquiring unit 1 to transmit the current position information of the gaze point of the human eye on the device to be operated 5 to the control signal generating unit 3. Specifically, the human eye motion detecting unit 2 controls the gaze point acquiring unit 1 to transmit the current position coordinates (X2, Y2) of the gaze point of the human eye on the device to be operated 5 to the control signal generating unit 3.
  • the human eye motion detecting unit 2 in this embodiment may be a device having an imaging function, such as a CCD (charge-coupled device) camera.
  • alternatively, the human eye motion detecting unit 2 in this embodiment may acquire a real-time image of the human eye through the first human eye image acquiring module 7 or the second human eye image acquiring module 12, and obtain the motion of the human eye based on the real-time image.
  • after receiving the current position information of the gaze point of the human eye on the device to be operated 5, the control signal generating unit 3 generates the control signal corresponding to the current position information based on the pre-stored position control correspondence table corresponding to the device to be operated 5. Specifically, the control signal generating unit 3 generates the volume up control signal corresponding to the current position coordinates (X2, Y2) based on the position control correspondence table corresponding to the touch display screen 4 (for example, as shown in Table 1), and sends the volume up control signal to the touch display screen 4.
  • after receiving the volume up control signal, the touch display screen 4 performs the volume up operation, and the eye control process is completed.
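  The volume-up walkthrough above can be condensed into a few lines; the unit name mirrors the figures, while the table contents and the (2, 2) stand-in for (X2, Y2) are illustrative assumptions.

```python
class ControlSignalGeneratingUnit:
    """Plays the role of control signal generating unit 3: holds the
    pre-stored position control correspondence table and resolves the
    current gaze position into a control signal (illustrative values)."""
    def __init__(self, table):
        self.table = table

    def on_position(self, position):
        # Called once the human eye motion detecting unit has confirmed
        # the preset action and the gaze point unit has sent the position.
        return self.table.get(position, "no_operation")

# Gaze point at (X2, Y2), represented here as (2, 2).
unit3 = ControlSignalGeneratingUnit({(2, 2): "volume_up"})
signal = unit3.on_position((2, 2))   # forwarded to the touch display screen
```

  The screen itself then interprets the signal, so the eye control device never needs device-specific logic beyond the correspondence table.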
  • the eye control device provided in this embodiment can effectively establish the relationship between the line of sight of the human eye and the device to be operated 5, thereby realizing control of the device to be operated 5 by the human eye and improving the experience of human-computer interaction.
  • the embodiment of the present invention further provides an eye control system, comprising an eye control device and a plurality of devices to be operated 5, wherein the eye control device adopts the eye control device in the above embodiment.
  • FIG. 5 is a schematic structural diagram of an eye control system according to an embodiment of the present invention.
  • FIG. 5 shows an optional embodiment of the eye control system provided by the embodiment of the present invention. In the eye control device of this eye control system, the gaze point acquisition unit 1 adopts the gaze point acquisition unit 1 shown in FIG. 3 to implement naked-eye type eye control.
  • the specific structure of the gaze point acquisition unit 1 refer to the description of the corresponding content in the foregoing embodiment, and details are not described herein again.
  • the eye control system includes a plurality of devices to be operated (for example, device A to be operated, device B to be operated, ..., device Z to be operated), and the number of gaze point acquisition units 1 in the eye control device equals the number of devices to be operated, in one-to-one correspondence.
  • each gaze point acquisition unit 1 is disposed on the corresponding device to be operated to acquire the position information of the gaze point of the human eye on that device when the user performs eye control.
  • all the gaze point acquisition units 1 are connected to the same control signal generating unit 3, and the control signal generating unit 3 stores a plurality of position control correspondence tables respectively corresponding to the devices to be operated (the number of position control correspondence tables equals the number of devices to be operated, in one-to-one correspondence); the control signal generating unit 3 can transmit control information to each device to be operated.
  • the number of human eye motion detecting units 2 in the eye control device also equals the number of devices to be operated, in one-to-one correspondence, and each human eye motion detecting unit 2 is disposed on a corresponding device to be operated; each unit acquires a human eye image in front of its corresponding device when the user performs eye control, and detects according to the human eye image whether the human eye makes a preset action.
  • for example, when the user gazes at device B to be operated, only the fixation point acquisition unit 1 disposed on device B can acquire the position information of the fixation point of the human eye on device B (the fixation point acquisition units 1 on the other devices to be operated cannot detect the human eye); when the human eye motion detecting unit 2 on device B detects the preset action, it controls the fixation point acquisition unit 1 on device B to transmit the current position information of the gaze point on device B to the control signal generating unit 3.
  • after receiving the current position information sent by the fixation point acquisition unit 1 on device B to be operated, the control signal generation unit 3 calls up the position control correspondence table corresponding to device B and, based on the received current position information and the called-up table, generates the control signal corresponding to the current position information for controlling device B, and transmits the control signal to device B to control it to perform the corresponding operation.
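  The shared control signal generating unit's behavior in this multi-device system can be sketched as a per-device table dictionary: the identity of the reporting device selects which correspondence table is called up. Device names and table entries below are illustrative assumptions.

```python
# One hypothetical position control correspondence table per device;
# the shared generating unit stores all of them.
DEVICE_TABLES = {
    "device_B": {(2, 2): "volume_up"},
    "device_C": {(2, 2): "character_A_input"},
}

def generate_for_device(device_id, position, tables=DEVICE_TABLES):
    """Call up the position control correspondence table for the device
    whose gaze point unit reported, then resolve the control signal."""
    table = tables[device_id]
    return table.get(position, "no_operation")
```

  Note that the same coordinates can mean different operations on different devices, which is exactly why the generating unit must select the table by device before looking up the position.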
  • the human eye motion detecting unit 2 can acquire a real-time image of the human eye through the first human eye image acquiring module 7, learn the motion of the human eye from the real-time image, and then detect whether the human eye makes a preset action on the device to be operated corresponding to that human eye motion detecting unit 2.
  • FIG. 6 is still another schematic structural diagram of an eye control system according to an embodiment of the present invention.
  • the eye control device of this eye control system uses the gaze point acquisition unit 1 shown in FIG. 4 to implement the spectacles-type eye control; for the specific structure of the gaze point acquisition unit 1, refer to the description of the corresponding content in the above embodiment, which is not repeated here.
  • the eye control system shown in FIG. 6 includes a plurality of devices to be operated (for example, the device A to be operated, the device B to be operated, ..., the device Z to be operated), and the eye control device includes only one fixation point acquisition unit 1;
  • the eye control system further includes a plurality of activation units 15 equal in number to, and in one-to-one correspondence with, the devices to be operated; each activation unit 15 is disposed on a corresponding device to be operated and is configured to activate the eye control device when the fixation point acquisition unit 1 faces the device to be operated corresponding to that activation unit 15, and to control the control signal generation unit 3 to call up the position control correspondence table corresponding to that device to be operated.
  • the eye control device includes a human eye motion detecting unit 2, and the human eye motion detecting unit 2 is disposed on the glasses 10 of the gaze point acquiring unit 1.
  • when the user desires to operate the device B to be operated, the user wears the glasses-type fixation point acquisition unit 1 in front of the eyes (the entire eye control device being in a non-working state at this time), moves to the position facing the device B to be operated, moves the gaze point of the eyes into the area of a button (soft or physical) to be operated on the device B to be operated, and controls the eyes to perform a preset action. At this time, the activation unit 15 disposed on the device B to be operated detects that the fixation point acquisition unit 1 is facing the device B to be operated, activates the eye control device (the eye control device starts to be in an operating state), and controls the control signal generation unit 3 to call up the position control correspondence table corresponding to the device B to be operated.
  • meanwhile, since the fixation point acquisition unit 1 and the human eye motion detection unit 2 in the eye control device start normal operation, the fixation point acquisition unit 1 can acquire the current position information of the gaze point of the human eye on the device B to be operated, and the human eye motion detecting unit 2 can detect that the human eye has made the preset motion and control the gaze point acquiring unit 1 to transmit the current position information of the gaze point on the device B to be operated to the control signal generation unit 3.
  • after receiving the current position information of the gaze point of the human eye on the device B to be operated sent by the gaze point acquisition unit 1, the control signal generating unit 3 generates, according to the received current position information and the previously called-up position control correspondence table corresponding to the device B to be operated, a control signal corresponding to the received current position information, and transmits the control signal to the device B to be operated to control the device B to be operated to perform the corresponding operation.
  • the human eye motion detecting unit 2 can be wirelessly connected to the second human eye image acquiring module 12, acquire a real-time image of the human eye through the second human eye image acquiring module 12, learn the action of the human eye from the real-time image, and then detect whether the human eye makes a preset action on the facing device to be operated.
  • the control signal generating unit 3 in this embodiment may also be wirelessly connected to the gaze point determining module 14 (see FIG. 4) in the gaze point acquiring unit 1 to receive the position information transmitted by the gaze point determining module 14.
  • the eye control system provided by the embodiment of the invention has a simple structure, and only one control signal generating unit is needed to control a plurality of devices to be operated, so the cost of the entire eye control system can be greatly reduced.
  • FIG. 7 is a flowchart of an eye control method according to an embodiment of the present invention. As shown in FIG. 7, the eye control method is based on the eye control device provided in the foregoing embodiment (for the specific structure of the eye control device, refer to the description in the foregoing embodiment) and may include the following steps.
  • Step 101 Acquire location information of a gaze point of the human eye on the device to be operated by the gaze point acquisition unit.
  • Step 102: Detect, by the human eye motion detecting unit, whether the human eye makes a preset action, and when the preset action is detected, control the gaze point acquiring unit to send the current position information of the gaze point on the device to be operated to the control signal generating unit.
  • Step 103: The control signal generating unit generates, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and sends the generated control signal to the device to be operated to control the device to be operated to perform the corresponding operation.
  • each piece of position information on the device to be operated and the control signal corresponding to each piece of position information are stored in the position control correspondence table.
  • the eye control method further includes the step of establishing a position control correspondence table corresponding to the device to be operated.
  • in the eye control method, the gaze point acquiring unit first acquires the position information of the gaze point of the human eye on the device to be operated; the human eye motion detecting unit then detects whether the human eye makes a preset action and, upon detecting the preset action, controls the gaze point acquisition unit to send the current position information of the gaze point on the device to be operated to the control signal generating unit; finally, the control signal generating unit generates, according to the position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information and transmits it to the device to be operated, so that the device to be operated performs the corresponding operation based on the control signal, thereby realizing the use of the human eye to control the device to be operated.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

An eye-control device, an eye-control method and an eye-control system. The eye-control device comprises a gaze point acquisition unit (1), a human eye motion detection unit (2) and a control signal generation unit (3). The gaze point acquisition unit (1) is configured to acquire position information of the gaze point of the human eye on a device to be operated (5); the human eye motion detection unit (2) is configured to detect whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, to control the gaze point acquisition unit (1) to send the current position information of the gaze point of the human eye on the device to be operated (5) to the control signal generation unit (3); the control signal generation unit (3) is configured to generate, according to a pre-stored position control correspondence table corresponding to the device to be operated (5), a control signal corresponding to said current position information, and to send the control signal to the device to be operated (5) to control the device to be operated (5) to perform a corresponding operation. The above eye-control device, eye-control method and eye-control system effectively enable the device to be operated (5) to be controlled with the human eye.

Description

Eye-control device, eye-control method and eye-control system  Technical Field
The present invention relates to the field of display technology, and in particular to an eye-control device, an eye-control method and an eye-control system.
Background Art
Exploring a natural and harmonious human-machine relationship has become an important field of computer research, and natural, efficient and intelligent human-computer interfaces are an important trend in computer development. However, for disabled users, or for users whose hands are temporarily occupied (for example, washing, cooking or eating), achieving human-computer interaction with input devices such as a mouse, a keyboard or a controller is very difficult.
In the field of human-computer interaction, the eyes are an important channel of information exchange, and the line of sight reflects the direction of a person's attention. Applying gaze to human-computer interaction is therefore natural, direct and interactive, and has attracted much attention. How to achieve human-computer interaction based on the actions of the human eye is an important research direction in this field.
Summary of the Invention
The present invention provides an eye-control device, an eye-control method and an eye-control system that can control a device to be operated based on the actions of the human eye.
To achieve the above object, the present disclosure provides an eye-control device, comprising:
a gaze point acquisition unit configured to acquire position information of the gaze point of the human eye on a device to be operated;
a human eye motion detection unit configured to detect whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, to control the gaze point acquisition unit to send the current position information of the gaze point of the human eye on the device to be operated to a control signal generation unit; and
a control signal generation unit configured to generate, according to a pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and to send the control signal to the device to be operated so as to control the device to be operated to perform a corresponding operation,
wherein the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
Optionally, the gaze point acquisition unit comprises:
an infrared emission module, disposed on the device to be operated, configured to emit infrared light toward the human eye and to form light reflection points in the two pupils of the human eye;
a first human eye image acquisition module, disposed on the device to be operated, configured to acquire an image of the human eye;
an image processing module configured to establish an image coordinate system based on the human eye image acquired by the first human eye image acquisition module, and to process the human eye image to obtain the position coordinates, in the image coordinate system, of the centers of the two pupils and of the light reflection points; and
a calculation module configured to obtain the position information of the gaze point of the human eye on the device to be operated through a cross-ratio invariance algorithm, according to the position coordinates, in the image coordinate system, of the centers of the two pupils and of the light reflection points obtained by the image processing module.
Optionally, the infrared emission module comprises four infrared emission sources, which are respectively disposed at the four corners of the device to be operated.
Optionally, the gaze point acquisition unit further comprises a correction module configured to correct the position information of the gaze point of the human eye on the device to be operated according to a linear calibration algorithm.
Optionally, the image processing module and the calculation module are integrated in the device to be operated.
Optionally, the gaze point acquisition unit comprises:
glasses;
a scene acquisition module, disposed on the glasses, configured to acquire an image of the scene seen by the human eye through the glasses, the scene image containing an image of the device to be operated;
a second human eye image acquisition module, disposed on the glasses, configured to acquire an image of the human eye;
a gaze direction determination module configured to determine the gaze direction of the human eye according to the human eye image acquired by the second human eye image acquisition module; and
a gaze point determination module configured to determine the position information of the gaze point of the human eye on the device to be operated according to the scene image acquired by the scene acquisition module and the gaze direction of the human eye determined by the gaze direction determination module.
Optionally, the gaze direction determination module is configured to determine the gaze direction of the human eye according to the pupil positions in the human eye image acquired by the second human eye image acquisition module.
Optionally, the preset action comprises keeping the position of the gaze point of the human eye on the device to be operated unchanged for 2 to 3 seconds, or blinking quickly 3 to 5 times while keeping the position of the gaze point of the human eye on the device to be operated unchanged within a preset time.
Optionally, the human eye motion detection unit is a device with an imaging function.
Optionally, the human eye motion detection unit acquires a real-time image of the human eye through the first human eye image acquisition module or the second human eye image acquisition module, and learns the action of the human eye based on the real-time image of the human eye.
To achieve the above object, the present disclosure further provides an eye-control system, comprising a plurality of devices to be operated and the eye-control device as described above.
Optionally, when the gaze point acquisition unit in the eye-control device comprises the infrared emission module, the first human eye image acquisition module, the image processing module and the calculation module as described above, the number of gaze point acquisition units in the eye-control device is equal to, and in one-to-one correspondence with, the number of devices to be operated; and
each gaze point acquisition unit is disposed on the corresponding device to be operated.
Optionally, the number of human eye motion detection units in the eye-control device is equal to, and in one-to-one correspondence with, the number of devices to be operated; and
each human eye motion detection unit is disposed on the corresponding device to be operated.
Optionally, all the gaze point acquisition units are connected to one control signal generation unit, which stores a plurality of position control correspondence tables respectively corresponding to the devices to be operated and can send control information to each device to be operated.
Optionally, when the gaze point acquisition unit in the eye-control device comprises the glasses, the scene acquisition module, the second human eye image acquisition module, the gaze direction determination module and the gaze point determination module as described above, the eye-control device comprises one gaze point acquisition unit;
the eye-control system further comprises a plurality of activation units in one-to-one correspondence with the devices to be operated, each activation unit being disposed on the corresponding device to be operated;
each activation unit is configured to activate the eye-control device when the gaze point acquisition unit faces the device to be operated corresponding to that activation unit, and to control the control signal generation unit to call up the position control correspondence table corresponding to that device to be operated.
Optionally, the eye-control device comprises one human eye motion detection unit, which is disposed on the glasses of the gaze point acquisition unit.
To achieve the above object, the present invention further provides an eye-control method implemented on the basis of an eye-control device, the eye-control device being the eye-control device described above, the eye-control method comprising:
acquiring, by the gaze point acquisition unit, position information of the gaze point of the human eye on the device to be operated;
detecting, by the human eye motion detection unit, whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, controlling the gaze point acquisition unit to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generation unit; and
generating, by the control signal generation unit, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and sending the control signal to the device to be operated so as to control the device to be operated to perform a corresponding operation,
wherein the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
Optionally, before acquiring, by the gaze point acquisition unit, the position information of the gaze point of the human eye on the device to be operated, the eye-control method further comprises:
establishing a position control correspondence table corresponding to the device to be operated.
The technical solutions of the present disclosure have the following beneficial effects:
The present disclosure provides an eye-control device, an eye-control method and an eye-control system, wherein the eye-control device comprises a gaze point acquisition unit, a human eye motion detection unit and a control signal generation unit. The gaze point acquisition unit acquires position information of the gaze point of the human eye on a device to be operated; the human eye motion detection unit detects whether the human eye performs a preset action and, upon detecting the preset action, controls the gaze point acquisition unit to send the current position information of the gaze point on the device to be operated to the control signal generation unit; the control signal generation unit generates, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information, and sends the control signal to the device to be operated to control the device to be operated to perform the corresponding operation. The technical solution of the present invention effectively enables a device to be operated to be controlled with the human eye.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of an eye-control device according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a specific application of the eye-control device according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of the gaze point acquisition unit in the eye-control device according to an embodiment of the present invention;
FIG. 4 is another schematic structural diagram of the gaze point acquisition unit in the eye-control device according to an embodiment of the present invention;
FIG. 5 is a schematic structural diagram of an eye-control system according to an embodiment of the present invention;
FIG. 6 is another schematic structural diagram of the eye-control system according to an embodiment of the present invention; and
FIG. 7 is a flowchart of an eye-control method according to an embodiment of the present invention.
Detailed Description
To enable those skilled in the art to better understand the technical solutions of the present invention, the eye-control device, eye-control system and eye-control method provided by the present invention are described in detail below with reference to the accompanying drawings.
FIG. 1 is a schematic structural diagram of an eye-control device according to an embodiment of the present invention. As shown in FIG. 1, the eye-control device comprises a gaze point acquisition unit 1, a human eye motion detection unit 2 and a control signal generation unit 3, where the gaze point acquisition unit 1 is connected to the human eye motion detection unit 2 and to the control signal generation unit 3, respectively. The gaze point acquisition unit 1 is configured to acquire position information of the gaze point of the human eye on the device to be operated; the human eye motion detection unit 2 is configured to detect whether the human eye performs a preset action and, upon detecting the preset action, to control the gaze point acquisition unit 1 to send the current position information of the gaze point on the device to be operated to the control signal generation unit 3; the control signal generation unit 3 is configured to generate, according to a pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information, and to send the generated control signal to the device to be operated to control the device to be operated to perform the corresponding operation. It should be noted that the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
In the eye-control device of this embodiment, the gaze point acquisition unit 1 first acquires position information of the gaze point of the human eye on the device to be operated; the human eye motion detection unit 2 then detects whether the human eye performs a preset action and, upon detecting the preset action, controls the gaze point acquisition unit 1 to send the current position information of the gaze point on the device to be operated to the control signal generation unit 3; finally, the control signal generation unit 3 generates, according to the position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information, so that the device to be operated performs the corresponding operation based on the control signal, thereby achieving eye control.
It should be noted that the device to be operated in the embodiments of the present invention may be an electronic device such as a mobile phone, a touch display screen or a PAD (tablet computer). A "connection" in the embodiments of the present invention may be either wired or wireless, which is not limited herein.
The specific process of achieving eye control with the eye-control device provided by the embodiment of the present invention is described in detail below with reference to an example, in which the device to be operated is assumed to be a touch display screen with some soft keys at fixed positions in its display area and some physical keys at fixed positions in its peripheral area.
FIG. 2 is a schematic diagram of a specific application of the eye-control device according to an embodiment of the present invention. It will be readily understood that, before performing an eye-control operation with the eye-control device provided by the embodiment of the present invention, a position control correspondence table corresponding to the touch display screen 4 needs to be established. It should be noted that this embodiment is described taking the case where each piece of position information in the position control correspondence table is coordinate information as an example; those skilled in the art will appreciate that this case is merely exemplary and does not limit the technical solutions of the embodiments of the present invention. In the embodiments of the present invention, any information capable of describing the specific position of the gaze point of the human eye on the device to be operated may serve as the position information of the gaze point of the human eye on the device to be operated.
In this embodiment, when establishing the position control correspondence table corresponding to the touch display screen 4, a coordinate system may be established with the lower left corner of the touch display screen 4 as the origin, the direction extending from the lower left corner to the lower right corner as the X axis, and the direction extending from the lower left corner to the upper left corner as the Y axis. The entire touch display screen 4 is then coordinatized, and n uniformly distributed sampling points at different positions are set on the entire touch display screen 4. Specifically, n sampling points at different positions may be set on the touch display screen 4, the n sampling points forming a sampling point array of a rows and b columns, in which the distance between any two sampling points adjacent in the row direction or the column direction is equal. It should be noted that the n sampling points should cover the positions of all keys (soft keys and physical keys) on the touch display screen 4, i.e., the region of each key (soft key and physical key) on the touch display screen 4 includes at least one sampling point. After the sampling points are set, control signals corresponding to the positions of these sampling points are set to generate the position control correspondence table corresponding to the touch display screen 4, and the position control correspondence table is stored in the control signal generation unit 3. Table 1 below shows an example of the generated position control correspondence table corresponding to the touch display screen 4.
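The a-row, b-column sampling-point construction described above can be sketched as follows; placing each point at the center of its grid cell is an assumption of this sketch (the patent only requires equal spacing between adjacent points), and the function name is illustrative.

```python
def sampling_grid(width, height, rows, cols):
    """Return an evenly spaced rows x cols grid of sampling points on a screen
    of the given size, with the origin at the lower-left corner as in the
    embodiment (X to the right, Y upward). Adjacent points in the same row or
    column are equidistant, as the correspondence table requires."""
    dx, dy = width / cols, height / rows
    # each point sits at the center of its grid cell (illustrative choice)
    return [((j + 0.5) * dx, (i + 0.5) * dy)
            for i in range(rows) for j in range(cols)]
```

For a 100 x 60 screen with 3 rows and 5 columns this yields 15 points spaced 20 units apart in both directions.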
Table 1
Position information    Control signal
(X1, Y1)                no-operation control signal
(X2, Y2)                volume-up control signal
(X3, Y3)                volume-up control signal
(X4, Y4)                volume-down control signal
(X5, Y5)                volume-down control signal
(X6, Y6)                power on/off control signal
......                  ......
(Xn-2, Yn-2)            character "A" input control signal
(Xn-1, Yn-1)            no-operation control signal
(Xn, Yn)                no-operation control signal
Specifically, when the touch display screen 4 receives a no-operation control signal (i.e., there is no key, soft or physical, corresponding to the current position of the gaze point of the human eye on the touch display screen 4), the touch display screen 4 performs no operation; when the touch display screen 4 receives a volume-up control signal, it performs a volume-up operation; when the touch display screen 4 receives a character-"A" input control signal, it inputs the character "A" at a preset position on the display screen; and so on. Further examples are not enumerated here one by one.
As can be seen from Table 1 above, one piece of position information corresponds to only one control signal, but one control signal may correspond to two or more different pieces of position information; for example, both the coordinates (X2, Y2) and the coordinates (X3, Y3) correspond to the volume-up control signal. The reason is that some keys on the touch display screen 4 are relatively large; when the number of sampling points is large, such a large key may cover two or more sampling points. In this case, when the gaze point of the human eye on the touch display screen 4 falls at the position of any of those sampling points, it can be concluded that the user wishes to operate that large key. For example, when the gaze point of the human eye on the touch display screen 4 falls at the sampling point with coordinates (X2, Y2) or (X3, Y3), it can be concluded that the user wishes to operate the volume-up key.
The working process of the eye-control device provided by this embodiment is described in detail below with reference to FIG. 2, taking as an example a scenario in which the eye-control device controls the touch display screen 4 to turn up the volume according to the actions of the human eye.
First, the user moves the gaze point on the touch display screen 4 into the region of the volume-up key on the touch display screen 4. At this time, the gaze point acquisition unit 1 acquires the position information of the gaze point of the human eye on the touch display screen 4. As an example, the gaze point acquisition unit 1 acquires the position coordinates of the gaze point on the touch display screen 4 as (X2, Y2) or (X3, Y3). In this embodiment, the case where the position coordinates acquired by the gaze point acquisition unit 1 are (X2, Y2) is taken as an example.
FIG. 3 is a schematic structural diagram of the gaze point acquisition unit in the eye-control device according to an embodiment of the present invention. As shown in FIG. 3, as an optional structure of the gaze point acquisition unit in the eye-control device provided by this embodiment (for the case where the user wears no glasses, i.e., the naked-eye case), the gaze point acquisition unit 1 comprises an infrared emission module 6, a first human eye image acquisition module 7, an image processing module 8 and a calculation module 9. The infrared emission module 6 is disposed on the device to be operated 5 (e.g., the touch display screen 4) and is configured to emit infrared light toward the human eye and to form light reflection points in the two pupils; specifically, the infrared emission module 6 comprises four infrared emission sources, which may be evenly distributed at the four corners of the device to be operated 5. The first human eye image acquisition module 7 is also disposed on the device to be operated 5 and is configured to acquire an image of the human eye (i.e., an image of the user's eyes). The image processing module 8 is connected to the first human eye image acquisition module 7 and is configured to establish an image coordinate system based on the human eye image acquired by the first human eye image acquisition module 7, and to process the human eye image to obtain the position coordinates, in the image coordinate system, of the centers of the two pupils and of the light reflection points. The calculation module 9 is connected to the image processing module 8 and is configured to obtain the position information of the gaze point of the human eye on the device to be operated 5 through a cross-ratio invariance algorithm, according to the position coordinates of the pupil centers and the light reflection points in the image coordinate system obtained by the image processing module 8. Further, the gaze point acquisition unit 1 in this embodiment may further comprise a correction module (not shown) configured to correct the position information of the gaze point on the device to be operated 5 according to a linear calibration algorithm, so as to obtain the precise position of the gaze point on the device to be operated 5.
It should be added that the image processing module 8 and the calculation module 9 in the gaze point acquisition unit 1 of FIG. 3 may also be integrated in the corresponding device to be operated 5.
FIG. 4 is another schematic structural diagram of the gaze point acquisition unit in the eye-control device according to an embodiment of the present invention. As shown in FIG. 4, as another optional structure of the gaze point acquisition unit in the eye-control device provided by this embodiment (for the case where the user wears glasses), the gaze point acquisition unit 1 comprises glasses 10, a scene acquisition module 11, a second human eye image acquisition module 12, a gaze direction determination module 13 and a gaze point determination module 14. The scene acquisition module 11 is disposed on the glasses 10 and is configured to acquire an image of the scene seen by the human eye through the glasses 10, the scene image containing an image of the device to be operated 5 (e.g., the touch display screen 4); the second human eye image acquisition module 12 is disposed on the glasses 10 and is configured to acquire an image of the human eye; the gaze direction determination module 13 is connected to the second human eye image acquisition module 12 and is configured to determine the gaze direction of the human eye according to the human eye image acquired by the second human eye image acquisition module 12; the gaze point determination module 14 is connected to the scene acquisition module 11 and the gaze direction determination module 13, respectively, and is configured to determine the position information of the gaze point of the human eye on the device to be operated 5 according to the scene image acquired by the scene acquisition module 11 and the gaze direction of the human eye determined by the gaze direction determination module 13.
Further optionally, the gaze direction determination module 13 is configured to determine the gaze direction of the human eye according to the pupil positions in the human eye image acquired by the second human eye image acquisition module 12.
It should be further noted that the above two structures of the gaze point acquisition unit 1 provided by the embodiments of the present invention are merely exemplary; the gaze point acquisition unit 1 in the embodiments of the present invention may also employ other devices capable of acquiring the position information of the gaze point of the human eye on the device to be operated, for example a naked-eye (remote) eye tracker or a wearable eye tracker.
After the user moves the gaze point on the touch display screen 4 into the region of the volume-up key on the touch display screen 4, the user controls the eyes to perform a preset action, for example keeping the position of the gaze point on the touch display screen 4 unchanged for 2 to 3 s, or blinking quickly 3 to 5 times while keeping the position of the gaze point on the touch display screen 4 unchanged within a preset time. It should be noted that the preset action in this embodiment is not limited to the above two examples but may be set according to the user's needs.
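A detector covering the two example preset actions (a 2 to 3 s dwell, or several quick blinks with the gaze held still) might look like the following sketch; the sample format, thresholds and function name are illustrative assumptions.

```python
def detect_preset_action(samples, dwell_s=2.0, blink_count=3, tol=0.0):
    """Decide whether the preset action occurred, given a chronological list
    of (time_s, gaze_xy, eye_open) samples. Variant (a): the gaze point stays
    at the same position for at least dwell_s seconds. Variant (b): at least
    blink_count blinks while the gaze position is unchanged."""
    if not samples:
        return False
    t0, p0, _ = samples[0]
    if any(abs(p[0] - p0[0]) > tol or abs(p[1] - p0[1]) > tol
           for _, p, _ in samples):
        return False  # gaze moved: neither variant can fire
    duration = samples[-1][0] - t0
    # count open -> closed transitions as blinks
    blinks = sum(1 for a, b in zip(samples, samples[1:]) if a[2] and not b[2])
    return duration >= dwell_s or blinks >= blink_count
```

In a real unit the samples would come from the first or second human eye image acquisition module; here they are plain tuples so the logic can be exercised directly.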
When the human eye motion detection unit 2 detects that the human eye has performed the preset action, the human eye motion detection unit 2 controls the gaze point acquisition unit 1 to send the current position information of the gaze point of the human eye on the device to be operated 5 to the control signal generation unit 3. Specifically, the human eye motion detection unit 2 controls the gaze point acquisition unit 1 to send the current position coordinates (X2, Y2) of the gaze point on the device to be operated 5 to the control signal generation unit 3.
It should be noted that the human eye motion detection unit 2 in this embodiment may be a device with an imaging function, for example a CCD (charge-coupled device). Optionally, the human eye motion detection unit 2 in this embodiment may also acquire a real-time image of the human eye through the above first human eye image acquisition module 7 or second human eye image acquisition module 12, and learn the action of the human eye based on the real-time image of the human eye.
After receiving the current position information of the gaze point of the human eye on the device to be operated 5, the control signal generation unit 3 generates a control signal corresponding to the current position information based on the pre-stored position control correspondence table corresponding to the device to be operated 5. Specifically, based on the position control correspondence table corresponding to the touch display screen 4 (for example, as shown in Table 1) and the current position coordinates (X2, Y2), the control signal generation unit 3 generates the volume-up control signal corresponding to the current position coordinates (X2, Y2) and sends the volume-up control signal to the touch display screen 4.
After receiving the volume-up control signal, the touch display screen 4 itself performs the volume-up operation, and the eye-control flow is completed.
The eye-control device provided by this embodiment effectively establishes a link between the line of sight of the human eye and the device to be operated 5, thereby enabling the device to be operated 5 to be controlled with the human eye and improving the human-computer interaction experience.
An embodiment of the present invention further provides an eye-control system, comprising an eye-control device and several devices to be operated 5, wherein the eye-control device is the eye-control device of the above embodiment; for its specific description, reference may be made to the above embodiment, which is not repeated here.
FIG. 5 is a schematic structural diagram of an eye-control system according to an embodiment of the present invention. As shown in FIG. 5, as an optional implementation of the eye-control system provided by the embodiment of the present invention, the gaze point acquisition unit 1 in the eye-control device of this system adopts the gaze point acquisition unit 1 shown in FIG. 3 to implement naked-eye eye control; for its specific structure, reference may be made to the description of the corresponding content in the above embodiment, which is not repeated here. The eye-control system shown in FIG. 5 includes several devices to be operated (e.g., device A to be operated, device B to be operated, ..., device Z to be operated); the number of devices to be operated in the eye-control system is equal to, and in one-to-one correspondence with, the number of gaze point acquisition units 1, and each gaze point acquisition unit 1 is disposed on the corresponding device to be operated so as to acquire, when the user performs eye control, the position information of the gaze point of the human eye on the corresponding device to be operated.
In this embodiment, all the gaze point acquisition units 1 are connected to the same control signal generation unit 3, which stores several position control correspondence tables respectively corresponding to the devices to be operated (the number of position control correspondence tables is equal to, and in one-to-one correspondence with, the number of devices to be operated), and the control signal generation unit 3 can send control information to each device to be operated.
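The shared control signal generation unit 3 of the FIG. 5 system, with one correspondence table per device to be operated, can be sketched as below; the class shape, device identifiers and method names are hypothetical.

```python
class ControlSignalGenerationUnit:
    """Sketch of unit 3 in the FIG. 5 system: one unit stores a position
    control correspondence table per device to be operated and dispatches the
    generated signal to the right device. All names are illustrative."""

    def __init__(self, tables):
        # tables: {device_id: {position: control_signal}}
        self.tables = tables
        self.sent = []  # stands in for the links to the individual devices

    def on_position(self, device_id, position):
        table = self.tables[device_id]            # call up that device's table
        signal = table.get(position, "NO_OPERATION")
        self.sent.append((device_id, signal))     # send to that device only
        return signal
```

One unit serving every device is what lets the system keep only a single control signal generation unit, the cost advantage noted at the end of this embodiment.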
Optionally, in this embodiment, the number of human eye motion detection units 2 in the eye-control device is also equal to, and in one-to-one correspondence with, the number of devices to be operated; each human eye motion detection unit 2 is disposed on the corresponding device to be operated so as to acquire, when the user performs eye control, an image of the human eye in front of the corresponding device to be operated, and to detect from the human eye image whether the human eye performs a preset action.
The working process of the eye-control system provided by the embodiment of the present invention is described in detail below, taking as an example the user controlling the device B to be operated in FIG. 5 in the naked-eye eye-control manner.
When the user wishes to operate the device B to be operated, the user moves to the position facing the device B to be operated, moves the gaze point of the eyes on the device B to be operated into the region of a key (soft key or physical key) to be operated on the device B to be operated, and then makes the eyes perform the preset action. At this time, the gaze point acquisition unit 1 disposed on the device B to be operated can acquire the position information of the gaze point of the human eye on the device B to be operated (the gaze point acquisition units 1 on the other devices to be operated cannot detect the position information of the gaze point of the human eye on them), and the human eye motion detection unit 2 corresponding to the device B to be operated can detect that the human eye has performed the preset action on the device B to be operated; it then controls the gaze point acquisition unit 1 on the device B to be operated to send the current position information of the gaze point of the human eye on the device B to be operated to the control signal generation unit 3. After receiving the current position information sent by the gaze point acquisition unit 1 on the device B to be operated, the control signal generation unit 3 calls up the position control correspondence table corresponding to the device B to be operated, generates, based on the received current position information and the called-up position control correspondence table corresponding to the device B to be operated, a control signal corresponding to the current position information for controlling the device B to be operated, and sends the control signal to the device B to be operated to control the device B to be operated to perform the corresponding operation.
It should be noted that, in the eye-control system shown in FIG. 5, the human eye motion detection unit 2 may acquire a real-time image of the human eye through the first human eye image acquisition module 7, learn the action of the human eye based on the real-time image, and then detect whether the human eye performs a preset action on the device to be operated corresponding to that human eye motion detection unit 2.
FIG. 6 is another schematic structural diagram of the eye-control system according to an embodiment of the present invention. As shown in FIG. 6, as another optional implementation of the eye-control system provided by the embodiment of the present invention, the gaze point acquisition unit 1 in the eye-control device of this system adopts the gaze point acquisition unit 1 shown in FIG. 4 to implement spectacles-type eye control; for its specific structure, reference may be made to the description of the corresponding content in the above embodiment, which is not repeated here. The eye-control system shown in FIG. 6 includes several devices to be operated (e.g., device A to be operated, device B to be operated, ..., device Z to be operated), and the eye-control device includes only one gaze point acquisition unit 1. The eye-control system further includes several activation units 15 equal in number to, and in one-to-one correspondence with, the devices to be operated; each activation unit 15 is disposed on the corresponding device to be operated and is configured to activate the eye-control device when the gaze point acquisition unit 1 faces the device to be operated corresponding to that activation unit 15, and to control the control signal generation unit 3 to call up the position control correspondence table corresponding to that device to be operated.
Optionally, in this embodiment, the eye-control device includes one human eye motion detection unit 2, which is disposed on the glasses 10 of the gaze point acquisition unit 1.
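The activation flow of the FIG. 6 system (the whole device idles until an activation unit 15 reports that the glasses face its device, which then selects that device's correspondence table) can be sketched as follows; all class, method and device names are illustrative.

```python
class EyeControlSystem:
    """Sketch of the FIG. 6 flow: a single glasses-type gaze point acquisition
    unit, one activation unit per device, one shared control signal generation
    unit. Names are hypothetical, not taken from the patent."""

    def __init__(self, tables):
        self.tables = tables        # {device_id: {position: control_signal}}
        self.active_device = None   # non-working state until activation
        self.current_table = None

    def activation_unit_triggered(self, device_id):
        # the activation unit on device_id detected the glasses facing it
        self.active_device = device_id
        self.current_table = self.tables[device_id]

    def generate(self, position):
        if self.active_device is None:
            return None             # device not activated: no control signal
        return self.current_table.get(position, "NO_OPERATION")
```

Gating signal generation on activation mirrors the text: before an activation unit fires, the eye-control device is in a non-working state and produces nothing.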
The working process of the eye-control system provided by the embodiment of the present invention is described in detail below, taking as an example the user controlling the device B to be operated in FIG. 6 in the spectacles-type eye-control manner.
When the user wishes to operate the device B to be operated, the user wears the glasses-type gaze point acquisition unit 1 in front of the eyes (at this time the entire eye-control device is in a non-working state), moves to the position facing the device B to be operated, moves the gaze point of the eyes on the device B to be operated into the region of a key (soft key or physical key) to be operated on the device B to be operated, and controls the eyes to perform the preset action. At this time, the activation unit 15 disposed on the device B to be operated detects that the gaze point acquisition unit 1 is facing the device B to be operated, thereupon activates the eye-control device (the eye-control device starts to be in a working state) and controls the control signal generation unit 3 to call up the position control correspondence table corresponding to the device B to be operated. Meanwhile, since the gaze point acquisition unit 1 and the human eye motion detection unit 2 in the eye-control device start to work normally, the gaze point acquisition unit 1 can acquire the current position information of the gaze point of the human eye on the device B to be operated, and the human eye motion detection unit 2 can detect that the human eye has performed the preset action and control the gaze point acquisition unit 1 to send the current position information of the gaze point of the human eye on the device B to be operated to the control signal generation unit 3. After receiving the current position information of the gaze point of the human eye on the device B to be operated sent by the gaze point acquisition unit 1, the control signal generation unit 3 generates, according to the received current position information and the previously called-up position control correspondence table corresponding to the device B to be operated, a control signal corresponding to the received current position information, and sends the control signal to the device B to be operated to control the device B to be operated to perform the corresponding operation.
It should be noted that, in the eye-control system shown in FIG. 6, the human eye motion detection unit 2 may be wirelessly connected to the second human eye image acquisition module 12, acquire a real-time image of the human eye through the second human eye image acquisition module 12, learn the action of the human eye based on the real-time image, and then detect whether the human eye performs a preset action on the facing device to be operated. Of course, in this embodiment the control signal generation unit 3 may also be wirelessly connected to the gaze point determination module 14 in the gaze point acquisition unit 1 (see FIG. 4) to receive the position information transmitted by the gaze point determination module 14.
The eye-control system provided by the embodiment of the present invention has a simple structure, and only one control signal generation unit is needed to control a plurality of devices to be operated separately, so the cost of the entire eye-control system can be greatly reduced.
FIG. 7 is a flowchart of an eye-control method according to an embodiment of the present invention. As shown in FIG. 7, the eye-control method is based on the eye-control device provided in the above embodiment (for its specific structure, reference may be made to the description in the above embodiment, which is not repeated here) and may include the following steps.
Step 101: acquiring, by the gaze point acquisition unit, position information of the gaze point of the human eye on the device to be operated.
Step 102: detecting, by the human eye motion detection unit, whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, controlling the gaze point acquisition unit to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generation unit.
Step 103: generating, by the control signal generation unit, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and sending the generated control signal to the device to be operated to control the device to be operated to perform the corresponding operation.
It should be noted that the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
Optionally, before step 101, the eye-control method further includes the step of establishing a position control correspondence table corresponding to the device to be operated.
In the eye-control method provided by the embodiment of the present invention, the gaze point acquisition unit first acquires position information of the gaze point of the human eye on the device to be operated; the human eye motion detection unit then detects whether the human eye performs a preset action and, upon detecting the preset action, controls the gaze point acquisition unit to send the current position information of the gaze point on the device to be operated to the control signal generation unit; finally, the control signal generation unit generates, according to the position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information and sends the generated control signal to the device to be operated, so that the device to be operated performs the corresponding operation based on the control signal, thereby controlling the device to be operated with the human eye.
It will be appreciated that the above embodiments are merely exemplary embodiments adopted to illustrate the principle of the present invention, and the present invention is not limited thereto. Various modifications and improvements can be made by those of ordinary skill in the art without departing from the spirit and essence of the present invention, and such modifications and improvements are also regarded as falling within the protection scope of the present invention.

Claims (18)

  1. An eye-control device, comprising:
    a gaze point acquisition unit configured to acquire position information of the gaze point of the human eye on a device to be operated;
    a human eye motion detection unit configured to detect whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, to control the gaze point acquisition unit to send the current position information of the gaze point of the human eye on the device to be operated to a control signal generation unit; and
    a control signal generation unit configured to generate, according to a pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and to send the control signal to the device to be operated so as to control the device to be operated to perform a corresponding operation,
    wherein the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
  2. The eye-control device according to claim 1, wherein the gaze point acquisition unit comprises:
    an infrared emission module, disposed on the device to be operated, configured to emit infrared light toward the human eye and to form light reflection points in the two pupils of the human eye;
    a first human eye image acquisition module, disposed on the device to be operated, configured to acquire an image of the human eye;
    an image processing module configured to establish an image coordinate system based on the human eye image acquired by the first human eye image acquisition module, and to process the human eye image to obtain the position coordinates, in the image coordinate system, of the centers of the two pupils and of the light reflection points; and
    a calculation module configured to obtain the position information of the gaze point of the human eye on the device to be operated through a cross-ratio invariance algorithm, according to the position coordinates, in the image coordinate system, of the centers of the two pupils and of the light reflection points obtained by the image processing module.
  3. The eye-control device according to claim 2, wherein the infrared emission module comprises four infrared emission sources, which are respectively disposed at the four corners of the device to be operated.
  4. The eye-control device according to claim 2, wherein the gaze point acquisition unit further comprises a correction module configured to correct the position information of the gaze point of the human eye on the device to be operated according to a linear calibration algorithm.
  5. The eye-control device according to claim 2, wherein the image processing module and the calculation module are integrated in the device to be operated.
  6. The eye-control device according to claim 1, wherein the gaze point acquisition unit comprises:
    glasses;
    a scene acquisition module, disposed on the glasses, configured to acquire an image of the scene seen by the human eye through the glasses, the scene image containing an image of the device to be operated;
    a second human eye image acquisition module, disposed on the glasses, configured to acquire an image of the human eye;
    a gaze direction determination module configured to determine the gaze direction of the human eye according to the human eye image acquired by the second human eye image acquisition module; and
    a gaze point determination module configured to determine the position information of the gaze point of the human eye on the device to be operated according to the scene image acquired by the scene acquisition module and the gaze direction of the human eye determined by the gaze direction determination module.
  7. The eye-control device according to claim 6, wherein the gaze direction determination module is configured to determine the gaze direction of the human eye according to the pupil positions in the human eye image acquired by the second human eye image acquisition module.
  8. The eye-control device according to claim 1, wherein the preset action comprises keeping the position of the gaze point of the human eye on the device to be operated unchanged for 2 to 3 seconds, or blinking quickly 3 to 5 times while keeping the position of the gaze point of the human eye on the device to be operated unchanged within a preset time.
  9. The eye-control device according to claim 1, wherein the human eye motion detection unit is a device with an imaging function.
  10. The eye-control device according to claim 2 or 6, wherein the human eye motion detection unit acquires a real-time image of the human eye through the first human eye image acquisition module or the second human eye image acquisition module, and learns the action of the human eye based on the real-time image of the human eye.
  11. An eye-control system, comprising a plurality of devices to be operated and the eye-control device according to any one of claims 1 to 10.
  12. The eye-control system according to claim 11, wherein, when the eye-control device is the eye-control device according to claim 2, the number of gaze point acquisition units in the eye-control device is equal to, and in one-to-one correspondence with, the number of devices to be operated; and
    each gaze point acquisition unit is disposed on the corresponding device to be operated.
  13. The eye-control system according to claim 12, wherein the number of human eye motion detection units in the eye-control device is equal to, and in one-to-one correspondence with, the number of devices to be operated; and
    each human eye motion detection unit is disposed on the corresponding device to be operated.
  14. The eye-control system according to claim 12, wherein all the gaze point acquisition units are connected to one control signal generation unit, which stores a plurality of position control correspondence tables respectively corresponding to the devices to be operated and can send control information to each device to be operated.
  15. The eye-control system according to claim 11, wherein, when the eye-control device is the eye-control device according to claim 6, the eye-control device comprises one gaze point acquisition unit;
    the eye-control system further comprises a plurality of activation units in one-to-one correspondence with the devices to be operated, each activation unit being disposed on the corresponding device to be operated; and
    each activation unit is configured to activate the eye-control device when the gaze point acquisition unit faces the device to be operated corresponding to that activation unit, and to control the control signal generation unit to call up the position control correspondence table corresponding to that device to be operated.
  16. The eye-control system according to claim 15, wherein the eye-control device comprises one human eye motion detection unit, which is disposed on the glasses of the gaze point acquisition unit.
  17. An eye-control method implemented on the basis of an eye-control device, the eye-control device being the eye-control device according to any one of claims 1 to 10, the eye-control method comprising:
    acquiring, by the gaze point acquisition unit, position information of the gaze point of the human eye on the device to be operated;
    detecting, by the human eye motion detection unit, whether the human eye performs a preset action and, upon detecting that the human eye performs the preset action, controlling the gaze point acquisition unit to send the current position information of the gaze point of the human eye on the device to be operated to the control signal generation unit; and
    generating, by the control signal generation unit, according to the pre-stored position control correspondence table corresponding to the device to be operated, a control signal corresponding to the current position information of the gaze point of the human eye on the device to be operated, and sending the control signal to the device to be operated so as to control the device to be operated to perform a corresponding operation,
    wherein the position control correspondence table stores each piece of position information on the device to be operated and the control signal corresponding to each piece of position information.
  18. The eye-control method according to claim 17, wherein, before acquiring, by the gaze point acquisition unit, the position information of the gaze point of the human eye on the device to be operated, the eye-control method further comprises:
    establishing a position control correspondence table corresponding to the device to be operated.
PCT/CN2016/079259 2015-05-27 2016-04-14 Eye-controlled apparatus, eye-controlled method and eye-controlled system WO2016188258A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/511,861 US10372206B2 (en) 2015-05-27 2016-04-14 Eye-controlled apparatus, eye-controlled method and eye-controlled system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510280294.2 2015-05-27
CN201510280294.2A CN104866100B (zh) 2015-05-27 2015-05-27 眼控装置及其眼控方法和眼控系统

Publications (1)

Publication Number Publication Date
WO2016188258A1 true WO2016188258A1 (zh) 2016-12-01

Family

ID=53911981

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/079259 2015-05-27 2016-04-14 Eye-controlled apparatus, eye-controlled method and eye-controlled system

Country Status (3)

Country Link
US (1) US10372206B2 (zh)
CN (1) CN104866100B (zh)
WO (1) WO2016188258A1 (zh)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104866100B (zh) 2015-05-27 2018-11-23 京东方科技集团股份有限公司 Eye-control device, eye-control method thereof and eye-control system
CN105046283A (zh) * 2015-08-31 2015-11-11 宇龙计算机通信科技(深圳)有限公司 Terminal operation method and terminal operation device
CN105373225B (zh) * 2015-10-26 2018-11-20 深圳中发明科技开发有限公司 Method and device for performing control with the eyes
CN105373720B (zh) * 2015-10-28 2018-04-20 广东欧珀移动通信有限公司 Module control method and device applied to a mobile terminal
CN105867603A (zh) * 2015-12-08 2016-08-17 乐视致新电子科技(天津)有限公司 Eye-control method and device
CN105892642A (zh) * 2015-12-31 2016-08-24 乐视移动智能信息技术(北京)有限公司 Method and device for controlling a terminal according to eye movements
CN105677040A (zh) * 2016-02-17 2016-06-15 深圳创维数字技术有限公司 Terminal control method and device, and wearable device
CN106476281B (zh) * 2016-09-14 2018-07-10 西安科技大学 3D printer control method based on blink recognition and visual evocation
CN107027014A (zh) * 2017-03-23 2017-08-08 广景视睿科技(深圳)有限公司 Dynamic intelligent projection system and method
CN107219921B (zh) * 2017-05-19 2019-09-27 京东方科技集团股份有限公司 Operation action execution method and system
CN107506030B (zh) * 2017-08-16 2021-03-30 陈乾 Vision-controlled instrument
CN107562208A (zh) * 2017-09-27 2018-01-09 上海展扬通信技术有限公司 Vision-based intelligent terminal control method and intelligent terminal control system
CN110069960A (zh) * 2018-01-22 2019-07-30 北京亮亮视野科技有限公司 Shooting control method and system based on gaze movement trajectory, and smart glasses
CN108491072B (zh) * 2018-03-05 2020-01-21 京东方科技集团股份有限公司 Virtual reality interaction method and device
US10747312B2 (en) * 2018-03-14 2020-08-18 Apple Inc. Image enhancement devices with gaze tracking
CN108904163A (zh) * 2018-06-22 2018-11-30 北京信息科技大学 Wheelchair control method and system
CN109101110A (zh) * 2018-08-10 2018-12-28 北京七鑫易维信息技术有限公司 Operation instruction execution method and device, user terminal and storage medium
CN109508092A (zh) * 2018-11-08 2019-03-22 北京七鑫易维信息技术有限公司 Method, device and terminal for controlling a terminal device based on eye tracking
CN111297209A (zh) * 2020-03-17 2020-06-19 南京航空航天大学 Automatic rice-cooking system based on eyeball driving and control
CN111399648A (zh) * 2020-03-17 2020-07-10 南京航空航天大学 Early-education picture interaction system based on eyeball driving and control
CN111336644A (zh) * 2020-03-17 2020-06-26 南京航空航天大学 Air-conditioning adjustment system based on eyeball-driven control
CN111459285B (zh) * 2020-04-10 2023-12-12 康佳集团股份有限公司 Display device control method based on eye-control technology, display device and storage medium
US11995774B2 (en) * 2020-06-29 2024-05-28 Snap Inc. Augmented reality experiences using speech and text captions
CN114578966B (zh) * 2022-03-07 2024-02-06 北京百度网讯科技有限公司 Interaction method and device, head-mounted display device, electronic device and medium
CN116795212B (zh) * 2023-06-19 2024-03-01 深圳市晚成辉电子有限公司 Control method for a plasma display panel

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102176755A (zh) * 2010-12-24 2011-09-07 青岛海信数字多媒体技术国家重点实验室有限公司 Control method and device based on eye-movement three-dimensional display angle
US20130176208A1 (en) * 2012-01-06 2013-07-11 Kyocera Corporation Electronic equipment
WO2013133618A1 (en) * 2012-03-06 2013-09-12 Samsung Electronics Co., Ltd. Method of controlling at least one function of device by using eye action and device for performing the method
CN103500061A (zh) * 2013-09-26 2014-01-08 三星电子(中国)研发中心 Method and device for controlling a display
CN103703438A (zh) * 2011-04-08 2014-04-02 亚马逊技术公司 Gaze-based content display
CN103838372A (zh) * 2013-11-22 2014-06-04 北京智谷睿拓技术服务有限公司 Method and system for turning smart functions of smart glasses on/off
CN104866100A (zh) * 2015-05-27 2015-08-26 京东方科技集团股份有限公司 Eye-control device, eye-control method thereof and eye-control system

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9053581B2 (en) * 2012-01-05 2015-06-09 Google Technology Holdings LLC Hidden line grids in a browser environment
CN102981620A (zh) * 2012-11-27 2013-03-20 中兴通讯股份有限公司 Terminal operation method and terminal
CN103324290A (zh) * 2013-07-04 2013-09-25 深圳市中兴移动通信有限公司 Terminal device and eye-control method thereof
CN104348969A (zh) * 2013-09-05 2015-02-11 陈英时 Method for operating a mobile phone by gaze fixation


Also Published As

Publication number Publication date
US10372206B2 (en) 2019-08-06
CN104866100A (zh) 2015-08-26
CN104866100B (zh) 2018-11-23
US20170293357A1 (en) 2017-10-12

Similar Documents

Publication Publication Date Title
WO2016188258A1 (zh) Eye-controlled apparatus, eye-controlled method and eye-controlled system
CN114341779B (zh) Systems, methods and interfaces for performing input based on neuromuscular control
JP7153096B2 (ja) System and method enabling communication through eye feedback
US20210103338A1 (en) User Interface Control of Responsive Devices
CN105361429B (zh) Intelligent learning platform based on multi-channel interaction and interaction method thereof
US10976808B2 (en) Body position sensitive virtual reality
US20120268580A1 (en) Portable computing device with intelligent robotic functions and method for operating the same
US10203760B2 (en) Display device and control method thereof, gesture recognition method, and head-mounted display device
US20180075736A1 (en) Method and Apparatus for Configuring Wireless Remote Control Terminal by Third-party Terminal
KR102297473B1 (ko) Apparatus and method for providing touch input using the body
WO2020108101A1 (zh) Virtual data processing method and device, storage medium, and terminal
CN109558061A (zh) Operation control method and terminal
CN109782968B (zh) Interface adjustment method and terminal device
US20190286255A1 (en) Electronic whiteboard, image display method, and recording medium
KR20070043469A (ko) Mouse motion recognition system for the disabled
CN106020487A (zh) Input device control method and device
WO2015067023A1 (zh) Somatosensory control method, terminal and system for video conferencing
CN103974107A (zh) Eye-movement control method and device for a television, and television
Lee et al. Mobile gesture interaction using wearable tactile displays
US20190041997A1 (en) Pointer control in a handheld computer by way of hid commands
JP2015052895A (ja) Information processing apparatus and information processing method
CN209803661U (zh) Wearable computing device
US20150205374A1 (en) Information processing method and electronic device
CN112445328A (zh) Mapping control method and device
WO2019167052A1 (en) A system for augmentative and alternative communication for people with severe speech and motor disabilities

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16799147

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15511861

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16799147

Country of ref document: EP

Kind code of ref document: A1