JP6308940B2 - System and method for identifying eye tracking scene reference position - Google Patents

Info

Publication number
JP6308940B2
JP6308940B2 (application JP2014512905A)
Authority
JP
Japan
Prior art keywords
object
scene
reference
wearer
known
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2014512905A
Other languages
Japanese (ja)
Other versions
JP2014520314A (en)
JP2014520314A5 (en)
Inventor
Nelson G. Publicover
William C. Torch
Gholamreza Amayeh
David LeBlanc
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US13/113,003 (US8885877B2)
Application filed by Google LLC
Priority to PCT/US2012/038743 (WO2012162204A2)
Publication of JP2014520314A
Publication of JP2014520314A5
Application granted
Publication of JP6308940B2
Status: Active
Anticipated expiration

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/0093Other optical systems; Other optical apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06KRECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00597Acquiring or recognising eyes, e.g. iris verification
    • G06K9/00604Acquisition
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type, eyeglass details G02C
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS, OR APPARATUS
    • G02B27/00Other optical systems; Other optical apparatus
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10024Color image
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Description

  The present invention relates to an apparatus, system, and method for unambiguously identifying a reference location within a device wearer's environment for eye tracking and other applications.

  The devices, systems, and methods herein use machine vision techniques to track the locations and objects being viewed by an observer. Gaze tracking algorithms can be thought of as requiring two continuous data streams to produce accurate tracking results: 1) eye tracking methods that detect the edge of the pupil or other identifiable reference points within the eye in order to compute pivot angles and viewing directions of the eye, and 2) head tracking methods that locate the position and orientation of the head within our three-dimensional world.

  In general, head tracking may involve identifying the position of a rigid object affixed to the head (as opposed to the head itself). In this case, headwear or eyewear attached to the head has a known geometry and a displacement relative to reference points on the head that can be calculated. More specifically, for accurate gaze tracking, the head tracking apparatus should have a known displacement from the pivot point of one or both of the observer's eyeballs. Furthermore, for most applications, gaze tracking positions are determined relative to reference locations or objects within the device wearer's environment, such as the corners of a display monitor, a mobile computing device, a switch, a light source, a window, and the like.

  Applications that involve machine vision are becoming increasingly commonplace. In part, this has arisen as a result of technological advances in the electronics and software development industries, along with decreases in the cost of cameras, information processing units, and other electronic components. Gaze tracking, in particular, is increasingly being used in a number of diagnostic, human performance, and control applications. A small number of examples include monitoring the degree of fatigue of an individual, assessing driver or pilot awareness, assessing the effects of drugs or alcohol, diagnosing post-traumatic stress disorder, tracking human performance with age, determining the effectiveness of training or exercise, assessing the effectiveness of advertising and web-page design by measuring eye dwell times, magnifying or changing the brightness of specific objects or images (including words) under observation, controlling various aspects of games, acquiring foundational clinical data to assess neurological or cognitive disorders, diagnosing and monitoring degenerative eye conditions, and allowing individuals with limited or no mobility below the neck to communicate by controlling a computer cursor using one or more eyes and eyelids. Sectors and industries that utilize gaze tracking include the military, medicine, security, human performance, sports medicine, rehabilitation engineering, police, research laboratories, and toys.

  In almost all cases, an increase in gaze tracking accuracy leads to an increase in performance and ease of use in most applications. For example, with increased accuracy, eye dwell times can be measured more precisely to quantify gaze durations on smaller objects or components of objects. Gaze tracking can be employed more effectively with portable devices that use smaller screens, including mobile phones and handheld displays. When gaze tracking is used to control a cursor involved in selecting among multiple virtual objects or icons on a screen, smaller virtual objects or icons can be used, allowing a larger number of selectable objects to be displayed simultaneously. Increasing the number of objects within each level of a selection process has a dramatic effect on the efficiency with which virtual objects and associated actions can be selected (i.e., reducing the number of selection levels and/or the time required). Similarly, magnifying or increasing the brightness of objects and words under observation can significantly increase the recognition and reading rates of the visually impaired.

  Many gaze tracking systems use cameras and eye illuminators that are located at a considerable distance from the eye (e.g., greater than ten centimeters (10 cm)). As the distance from the eye increases, an eye tracking apparatus generally becomes less obtrusive; however, it becomes increasingly difficult to measure the position of the eye accurately because of the higher spatial resolution required of the camera and because wide-ranging head movement can result in a complete loss of the ability to track the eye. Many gaze tracking systems also use bright (visible or invisible) light sources located at a distance from the head to produce glints, or bright spots, on the surface of the eye. These glints can be used to generate reference vectors from the position of the glint on the surface of the eye to known locations within the environment (i.e., the light sources). Here again, wide-ranging movement of the head may cause a loss of the ability to track glints and/or to associate a glint with a particular light source.

  With the advent of modern-day microelectronics and micro-optics, it is possible to unobtrusively mount the components for gaze tracking on eyewear (e.g., an eyeglass frame) or headwear (e.g., helmets, masks, goggles, virtual reality displays), including the devices disclosed in U.S. Patent Nos. 6,163,281, 6,542,081, 7,488,294, and 7,515,054. Using high-precision micro-optics within the eyewear or headwear, it is possible to more clearly resolve structures and reflections within the eye and nearby regions, as well as the scene viewed by the device wearer. The use of low-power miniature cameras and electronics permits the head-mounted system optionally to be untethered through the use of a battery power source. In addition, recent advances in wireless telecommunications allow gaze tracking results to be transmitted in real time to other computing, data storage, or control devices. As a consequence of these technological advances in a number of fields, an eyewear- or headwear-based gaze tracking system can be unobtrusive, lightweight, portable, and convenient to use.

  Gaze tracking involves substantially continuously identifying the locations and/or objects being viewed by an observer. Accurate gaze tracking results from a combination of eye tracking and head tracking relative to identified reference locations within our three-dimensional world. The devices, systems, and methods herein use an unobtrusive scene camera mounted on eyewear or headwear to identify naturally occurring or intentionally placed reference locations within the wearer's environment.

  More specifically, the devices, systems, and methods herein may facilitate unambiguously identifying reference locations within the device wearer's environment for gaze tracking and other applications. In one embodiment, a system and method for determining scene reference locations include a device configured to be worn on a person's head; a scene camera connected to the device and positioned to capture images of the wearer's environment; a scene processor operatively connected to the scene camera to determine scene reference locations within the scene camera images; an eye tracking camera connected to the device and positioned to capture eye tracking positions of at least one of the wearer's eyes; and a processor that uses the scene reference locations and the eye tracking positions to determine locations being viewed by the wearer.

  Reference locations within a scene may be identified using one or more attributes of objects, including an object's shape, size, or color. Spatial relationships among multiple geometric shapes, such as those found on one-dimensional and two-dimensional barcodes, QR (i.e., quick response) codes, matrix (i.e., two-dimensional) codes, and the like, may also be used for location and orientation. Objects that define reference locations, such as strips of colored paper or plastic, spots of pigment (e.g., paint or ink), colored (or black-and-white) regions within a display screen, light sources, and/or reflective surfaces, may be intentionally placed within the wearer's environment. Alternatively, reference locations may be extracted from the wearer's unaltered environment using object recognition techniques, such as the corners of a display screen, the corners of a mobile phone or reader (e.g., an iPad® or Kindle® device), the central location of a larger object, colored icons or patches on a display monitor, buttons, markings on objects, edges of colored patterns, and the like. Reference locations may be identified using visible or invisible light. They may be based on the locations of entire objects or on subsets of objects, such as corners, voids, points, or edges. Light from the reference locations may rely on ambient light, light projected from the eyewear or headwear, light generated by the reference locations themselves, and/or light from other sources. A combination of the two general approaches (i.e., recognizing both naturally occurring and intentionally placed objects) is also possible.

  In light of the above background, the devices, systems, and methods herein may provide improved eye tracking methods and systems for various applications.

  In an exemplary embodiment, the methods involve the use of a "scene camera" attached to eyewear or headwear and looking outward relative to the individual wearing the eyewear or headwear. The scene camera transmits images to a processor programmed to identify multiple reference locations within the scene camera images. Optionally, the processor may be coupled to, communicate with, or otherwise access a database of "templates" (i.e., images of known objects, configurations of reference locations, and the like) to identify the reference locations.

  According to one embodiment, systems and methods are provided that identify reference locations using image recognition techniques to identify objects, or components of objects, with known geometries and colors. A typical configuration using this approach is to identify the four corners of a computer display monitor, mobile computing/telephone device, or other electronic object. This may be done by recognizing the edges of the device frame relative to the background scene, the edges of the display screen (i.e., the backlit region in the case of LCD-based devices) relative to the display frame, or both. Corners and/or edges may be identified based on color, texture, sharp versus rounded geometry, size relative to other identifiable components, markings, and the like.
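
  For instance, the four corners of a display screen might be located in a scene camera image using standard machine vision primitives. The following is a minimal sketch, assuming Python with OpenCV and NumPy and assuming the screen appears as the largest convex quadrilateral in the frame; the function name, thresholds, and filtering criteria are illustrative only, not taken from the patent.

```python
import cv2
import numpy as np

def find_monitor_corners(scene_image):
    """Return the four corner points of the largest convex quadrilateral
    (e.g., a backlit display screen) found in a scene camera image."""
    gray = cv2.cvtColor(scene_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                      # edge map of the scene
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_area = None, 0.0
    for contour in contours:
        perimeter = cv2.arcLength(contour, True)
        approx = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
        area = cv2.contourArea(approx)
        # keep only 4-sided convex shapes large enough to be a screen
        if len(approx) == 4 and cv2.isContourConvex(approx) and area > best_area:
            best, best_area = approx, area
    if best is None:
        return None
    return best.reshape(4, 2).astype(np.float32)          # four (x, y) corners
```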

  According to another embodiment, systems and methods are provided in which reference locations are created by adding identifiable objects or surfaces to the scene at known locations. For example, the systems and methods may use pieces of paper or plastic that can be identified based on color and/or shape and that are conveniently affixed to objects (e.g., using adhesive, screws, clips, or other fasteners). Similarly, ink, paint, or other pigmented materials may be applied to objects to produce reference locations with identifiable colors or shapes. The colors and/or shapes of the applied reference surfaces may be based on measurements of reflected light, fluorescence, phosphorescence, or luminescence, any of which may be either visible or invisible.

  According to yet another embodiment, systems and methods are provided for producing bright reference points using reflective patches (e.g., made of paint, cloth, plastic, paper, and the like) that may be affixed to any surface (e.g., using adhesive, fasteners, and the like). These reflective surfaces may be based on prisms or on flat, mirror-like surfaces. They may be illuminated by ambient light and/or by one or more light sources located on the eyewear or headwear. An example of such a light source is one or more light-emitting diodes (LEDs) located adjacent to or away from the scene camera on the eyewear or headwear. The light sources may use visible or invisible wavelengths of electromagnetic radiation, such as infrared or other light outside the visible spectrum, to avoid interfering with the normal activities of the wearer and/or others. In this configuration, the timing of illumination may be controlled by the eyewear or headwear, and illumination sources requiring power from outside the eyewear or headwear may not be needed.

  According to still another embodiment, systems and methods are provided that not only produce glowing reference locations illuminated from the eyewear or headwear, but also generate reference glints from the light reflected from those reference locations onto the eyeball. By controlling the timing of illumination relative to the timing of video image acquisition, it is possible to acquire images with and without illumination of the reflective reference points and glints. Subtracting images acquired with the lighting turned on from images acquired with the lighting turned off may aid in isolating the locations of the reference points in images acquired by the scene camera, as well as the locations of the corresponding reflection sources, including glints, in images acquired by the eye tracking camera(s).

  A controller may be coupled to the camera(s) and/or light sources, and configured to use the camera(s) to sample the brightness at each reference location reflecting light from the light sources, and to modulate the light source(s) based on the sampled brightness to provide a desired brightness level in the camera images.

  A processing unit operatively coupled to the scene camera may acquire images of the device wearer's environment, e.g., to monitor and/or further analyze features of the scene. The scene processing unit and the eye tracking processing unit may be one or more separate processors, or a single processor may be used, and/or an illumination controller may be included to adjust the intensity of illumination of the device wearer's environment.

  In one embodiment, the illumination controller may be configured to amplitude-modulate at least one of the current and/or the voltage supplied to the light sources to provide a desired brightness level in the respective regions of the scene camera images. In addition or alternatively, the controller may be configured to pulse-width-modulate the current and/or voltage supplied to the light sources to provide the desired brightness levels.

  In any of these examples, illumination, reference location tracking, eye tracking, and gaze tracking may be operated substantially continuously or intermittently. For example, scene light sources may be deactivated when the scene camera is not operating; this includes the time between acquiring camera images. Processors, cameras, and illumination may also be deactivated when not in use, e.g., to conserve power. Illumination sources and other electronics may also be reduced in power or turned off for increased safety of the device wearer.

  In an exemplary embodiment, the system includes an eyewear or headwear frame, a scene camera directed to view the environment around the device wearer, at least one camera directed toward an eye of the wearer, one or more illumination sources directed toward at least one eye of the wearer, and one or more processors, e.g., a scene processing unit coupled to the scene camera to identify reference locations within the scene camera images and a processing unit for eye tracking. The system may also include one or more light sources on the frame directed away from the wearer, e.g., to provide scene illumination when reflective reference locations are used. Machine vision techniques are used within the processing unit(s) to determine the reference locations. The reference locations identified by the scene processing unit and the positions identified by the eye tracking processing unit may then be used in gaze tracking calculations.

  Other aspects and features of the present invention will become more apparent from consideration of the following description taken in conjunction with the accompanying drawings.

  The drawings illustrate exemplary embodiments of the invention.

FIG. 1 is a perspective view of an example of a system mounted on an eyeglass frame for reference location tracking and eye tracking.
FIG. 2 is a partial cutaway view of the system of FIG. 1, showing the spatial relationship between the scene camera and the eye tracking camera, and the connections among the processing unit, scene camera, eye tracking camera, and other components.
FIG. 3 is a diagram illustrating an exemplary method for detecting reference locations using object recognition within an unaltered scene that includes a mobile computing/telephone device.
FIG. 4 is a diagram illustrating another exemplary method for detecting reference locations, including providing reference objects, e.g., four identifiable colored paper disks, at the four corners of a display monitor.
FIG. 5 is a diagram illustrating yet another exemplary method for detecting reference locations, including displaying "virtual" identifiable reference objects, e.g., four colored regions, at the four corners of a display monitor.
FIG. 6 is a diagram illustrating an example of illumination paths showing a reflective surface that may be detected as a reference location by the scene camera and a glint on the surface of the eye that may be detected by the eye tracking camera.

  Turning to the drawings, FIG. 1 shows an exemplary embodiment of a system 10 that includes an eyeglass frame 11 carrying a scene camera 12, two eye tracking cameras 13a, 13b, and a processing unit 14. The scene camera 12 is oriented on the frame 11 to view a region away from the device wearer's head 15 in order to track one or more reference locations 16a, 16b within the device wearer's environment. The eye tracking cameras 13a and 13b are oriented on the frame 11 toward the head 15 in order to track the positions of the wearer's pupils, glints, and/or other reference points on one or both of the wearer's eyes.

  In this embodiment, a single processing unit 14 may be carried by the frame 11 to acquire images from the scene camera 12 as well as from the eye tracking cameras 13a, 13b, although it will be appreciated that separate processors (not shown) may be provided on the frame 11 or at a remote location (not shown) in communication with the frame 11. A power source (e.g., battery) 17 may be carried by the frame 11, e.g., encased in a stem of the frame 11 opposite the stem containing the processing unit 14. Scene illumination light sources 18a, 18b may optionally be located near the scene camera 12 or more remotely from the scene camera 12.

  In an exemplary embodiment, the scene camera 12 may include a CCD, CMOS, or other detector including an active area, e.g., having a rectangular or other array of pixels, for capturing images and generating video signals representing the images. The active area of the camera 12 may have any desired shape, e.g., a square or rectangular shape. In addition, the camera 12 may include one or more filters, lenses, and the like (e.g., the filter 67 and/or lens 66 shown in FIG. 6), if desired, e.g., to focus images onto the active area, filter out undesired intensities and/or wavelengths of light, and so on.

  In the embodiment illustrated in FIG. 1, the scene camera 12 is unobtrusively located on the nose bridge 25 (FIG. 2) of the frame 11, thereby minimizing interference with the wearer's normal vision. Other locations for the scene camera(s) are also possible, including near the outer edges of the frame 11. Alternatively, in the case of headwear, one or more scene cameras may be located, e.g., atop the head (not shown). Optionally, reflective and/or refractive optical components may be incorporated, e.g., to direct light from different regions of the environment toward the scene camera(s).

  In addition or alternatively, multiple scene cameras 19a, 19b may be provided that are spaced apart from one another and/or directed toward multiple reference locations 16a, 16b, e.g., providing separate or overlapping fields of view. The multiple scene cameras 19a, 19b may provide higher resolution, increased sensitivity under different lighting conditions, and/or a wider field of view, e.g., in addition to or instead of the scene camera 12. Another potential advantage of using multiple scene cameras is that each camera can use a different optical filter (e.g., see filter 67 in FIG. 6), e.g., to isolate reference sources that are preferentially illuminated using different colors or wavelengths of electromagnetic radiation.

  If two scene cameras are used, they may conveniently be located, for example, near each of the outer corners of the frame 11 (e.g., near the locations indicated as 19a and 19b in FIG. 1) or at the sides of headwear (not shown). The reference locations and the corresponding scene camera orientations may be within the wearer's normal field of view, or they may be outside this range, including beside or behind the head. The field(s) of view may optionally be controlled in size and/or location by reflective surfaces and refractive lenses.

  FIG. 2 is a cutaway view showing the back side of the system 10 illustrated in FIG. 1. From this perspective, the fixed spatial displacement in the X, Y, and Z directions between the scene camera 12 and the eye tracking camera 13b mounted within the eyeglass frame 11 may be seen. FIG. 2 also shows an example of a location where a single processing unit 14 for reference location tracking and eye tracking may be embedded within the stem of the frame 11. In this exemplary embodiment, the processing unit 14 is a field-programmable gate array (FPGA).

  The processing unit 14 may include one or more controllers or processors, e.g., one or more hardware components and/or software modules for operating the various components of the system 10. For example, the processing unit 14 may include separate or integrated controllers (not shown), e.g., for controlling the light sources or cameras, receiving and/or processing signals from the cameras 12, 13b, and the like. Optionally, one or more components of the processing unit 14 may be carried on the ear supports 24, on the lens supports of the frame 11, on the nose bridge 25, and/or at other locations within the eyewear or headwear, similar to embodiments described in the references identified elsewhere herein. In the exemplary embodiment shown in FIGS. 1 and 2, a single processing unit 14 is used for image acquisition and processing for both the reference location tracking and eye tracking functions.

  Cable(s) 26 may include individual cables or sets of wires coupled to the cameras 12, 13b, the battery 17 (FIG. 1), the light sources 18a, 18b (FIG. 1), and/or other components on the frame 11, and/or to the processing unit 14. For example, individual cables or sets of wires (not shown) may be embedded in the frame 11, e.g., along the rim from the cameras 12, 13b and the like until captured within the cable 26, e.g., to reduce the overall profile of the frame 11 as desired and/or to route signals around any hinged regions or corners 27 within the eyewear or headwear.

  The processing unit 14 may also include memory (not shown) for storing image signals from the camera(s) 12, 13b, filters for editing and/or processing the image signals, and elements for measurement calculations (also not shown). Optionally, the frame 11 and/or processing unit 14 may include one or more transmitters and/or receivers (not shown) for transmitting data, receiving instructions, and the like. In addition or alternatively, at least some processing may be performed by components remote from the on-board processing unit 14 and/or frame 11, similar to embodiments disclosed in the references identified elsewhere herein. For example, a data acquisition system may include one or more receivers, processors, and/or displays (not shown) at one or more remote locations from the processing unit 14 and/or frame 11, e.g., in the same room, at a nearby monitoring station, or at a more distant location. Such displays may include views generated by the scene camera(s) 12 and/or eye tracking camera(s) 13b, as well as gaze tracking measurements and related calculations.

  FIG. 3 is an example of reference location tracking in which machine vision techniques associated with object identification are used to locate objects of known geometry and/or color within an "unaltered scene" (i.e., a scene that has not been intentionally modified for the purpose of establishing reference locations by the wearer/observer or anyone else involved in the observation). In this example, the scene camera 31 may be used to track the size, orientation, and/or location of a conventional mobile phone or handheld computing device 30. Images may be focused onto the scene camera 31 (which may be similar to the scene camera 12 shown in FIGS. 1 and 2), e.g., using one or more lenses 33 that may be carried by or otherwise coupled to the scene camera(s) 31 (not shown).

  A processing unit (not shown) may scan the field of view 32 of images acquired by the scene camera 31 for objects that are similar in shape and color to an object template for the mobile computing device. For example, the processing unit may include, or otherwise access, a database of known templates, e.g., a table associating known objects with data identifying their shapes and/or colors. The database may include vertical and horizontal reference points 36, 37 of known objects, detailed color and/or shape information about reference objects mapped to particular physical objects, and the like, providing the processing unit with sufficient information to identify the encountered object. When an object with the appropriate attributes is found, a rectangle 34 may be used to delimit the device within images from the scene camera 31 (in this example of a rectangular mobile phone). The dimensions of the sides of the rectangle 34 may be used to compute the orientation of the scene camera 31 relative to reference points on the mobile computing device 30, and the overall size of the rectangle 34 within images from the scene camera 31 may be used to compute the distance between the scene camera 31 (i.e., attached to the eyewear or headwear 11) and the reference points on the mobile computing device 30.
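
  One way such a template database could be organized is as a simple lookup table keyed by object name, with each entry holding the identifying attributes and real-world reference dimensions needed to recognize the object and later convert image measurements. The following is a hypothetical sketch; the field names and example values are illustrative only and are not taken from the patent.

```python
# Hypothetical template database: known objects mapped to identifying
# attributes and real-world reference dimensions (in millimeters).
TEMPLATE_DB = {
    "mobile_phone": {
        "shape": "rectangle",
        "aspect_ratio": 0.52,          # width / height of the device outline
        "dominant_color_hsv": (0, 0, 40),
        "vertical_ref_mm": 115.0,      # distance between vertical reference points
        "horizontal_ref_mm": 60.0,     # distance between horizontal reference points
    },
    "display_monitor": {
        "shape": "rectangle",
        "aspect_ratio": 1.78,
        "dominant_color_hsv": (0, 0, 30),
        "vertical_ref_mm": 300.0,
        "horizontal_ref_mm": 530.0,
    },
}

def match_template(aspect_ratio, tolerance=0.15):
    """Return the name of the first known object whose aspect ratio matches
    the detected rectangle within a tolerance, or None if nothing matches."""
    for name, attrs in TEMPLATE_DB.items():
        if abs(aspect_ratio - attrs["aspect_ratio"]) <= tolerance:
            return name
    return None
```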

  Examples of reference locations within the reference object include the four corners of the rectangle 34 corresponding to the four corners 35a, 35b, 35c, 35d of the mobile computing device 30. The real-world dimensions of the reference object in the vertical direction 36 and the horizontal direction 37 are known to the scene camera processing unit and, together with measurements made on the scene camera images, may be used to convert distances measured within images from the scene camera 31 into real-world dimensions.
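
  Under a simple pinhole camera model, the known real-world dimensions of the reference rectangle can be combined with its apparent size in the image to estimate the camera-to-object distance and a scale factor for converting pixel measurements into real-world units. The following is a minimal sketch, assuming the scene camera's focal length in pixels is known from a prior calibration; the function and parameter names are illustrative only.

```python
def estimate_distance_and_scale(pixel_width, pixel_height,
                                real_width_mm, real_height_mm,
                                focal_length_px):
    """Estimate camera-to-object distance (mm) and a mm-per-pixel scale
    factor from the apparent size of a reference rectangle of known size."""
    # Pinhole model: pixel_size = focal_length * real_size / distance
    distance_from_width = focal_length_px * real_width_mm / pixel_width
    distance_from_height = focal_length_px * real_height_mm / pixel_height
    distance_mm = 0.5 * (distance_from_width + distance_from_height)

    # Scale for converting in-plane image distances to real-world distances
    mm_per_pixel = real_width_mm / pixel_width
    return distance_mm, mm_per_pixel
```

  For example, under these assumptions, a 60 mm wide device imaged at 120 pixels by a camera with a 600-pixel focal length would be estimated to lie roughly 300 mm from the scene camera.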

  FIG. 4 is an example of reference location tracking in which reference objects are intentionally placed within the wearer's environment. Machine vision techniques associated with object identification are used to locate these objects of known geometry and/or color within scene camera images. In this case, four disks 45a, 45b, 45c, 45d of known size(s) and color(s) have been affixed to the four corners of a display monitor 40, e.g., using an adhesive. Alternatively, the monitor 40 or another device may include reference objects permanently attached to, or otherwise incorporated into, the device at desired locations.

  If desired, any number of reference objects may be added to the wearer's environment, e.g., two, three, or more than four (not shown). The reference objects may be of any size, shape, or color. All of the reference objects may be substantially the same size, shape, and/or color, or one or more of the reference objects may differ in size, shape, and/or color. In the latter case, differences in size, shape, or color may be useful for unambiguously determining the reference locations and the exact pose of the associated object, e.g., to uniquely identify each corner of the mobile computing device 30.

  Still referring to FIG. 4, images may be focused onto the scene camera 41 (which may be similar to the scene camera 12), e.g., using a lens 43. Using the images acquired by the scene camera 41, a processing unit (not shown) may, e.g., access a template database as described elsewhere herein and scan the field of view 42 of the scene camera 41 for objects that are similar in shape and/or color to the templates for the intentionally placed reference objects. When objects with the appropriate attributes are found, the distances between the corners or edges of the reference objects 45a, 45b, 45c, 45d may be measured in the vertical direction 46 and the horizontal direction 47. These distances may then be used to compute the position and orientation of the scene camera 41 relative to the reference points 45a, 45b, 45c, 45d within the scene. The overall size of the rectangle defined by the four corners of the reference objects 45a, 45b, 45c, 45d may also be used to compute the distance between the scene camera 41 and locations within the scene. Known real-world distances between reference points in the vertical direction 46 and the horizontal direction 47 may be used to convert distances measured within images from the scene camera 41 into real-world dimensions.
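
  Intentionally placed colored disks such as 45a-45d could be located by thresholding in a hue-based color space and taking contour centroids. The following is a hedged sketch using OpenCV; the HSV range shown is an arbitrary example for a saturated red marker and the minimum-area cutoff is a placeholder, neither being values specified in the patent.

```python
import cv2
import numpy as np

def find_colored_disks(scene_image, hsv_low=(0, 120, 80),
                       hsv_high=(10, 255, 255), min_area=20):
    """Return centroids (x, y) of colored disk markers in a scene camera image."""
    hsv = cv2.cvtColor(scene_image, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_low), np.array(hsv_high))
    # remove isolated noise pixels before extracting marker contours
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:
            continue                      # ignore small specks of noise
        m = cv2.moments(contour)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```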

  One application of head tracking and gaze tracking using these techniques is controlling the position of a computer cursor 44 displayed on the monitor 40. Accurate cursor control using gaze tracking enables a wide range of applications, including using a computer to surf the Internet, control games, generate text-to-speech, turn lighting or other environmental controls on and off in home or industrial settings, and so on. Tracking head and eye movements while an observer is instructed to closely follow an object such as the cursor 44 may also be used during calibration procedures, e.g., to account for spatial aberrations within the field of view 42, such as those caused by most lenses 43.

  FIG. 5 shows another example of reference location tracking in which "virtual" reference objects are intentionally displayed on a monitor or screen 50 that is within the field of view 52 of a scene camera 51. A "virtual" reference object may be, e.g., a patch of color, an icon, a QR code, and/or another visual pattern that is distinct from the screen background. For example, the driver of the monitor 50 may be modified or replaced such that the virtual objects are superimposed on whatever images are otherwise displayed on the monitor 50. In this way, the virtual objects may be present even while the monitor is being used to display images and/or otherwise run various programs. The virtual objects may remain substantially stationary within the images presented on the monitor 50, or they may be moved around, e.g., as described elsewhere herein.

  Machine vision techniques may be used to locate these "virtual" objects of known geometry, spatial relationship, and/or color within the scene. In the example illustrated in FIG. 5, four "virtual" objects 55a, 55b, 55c, 55d are displayed at the four corners of the display monitor 50. Any number of "virtual" reference objects may be added to the field of view 52 of the scene camera 51. The "virtual" reference objects may be of any size, shape, or color. All of the "virtual" reference objects may have substantially the same size, shape, spatial distribution, and/or color, or one or more of the "virtual" reference objects may differ in size, shape, and/or color. In the latter case, differences in size, shape, spatial distribution, and/or color may be useful for unambiguously determining the rotational orientation of the reference locations, similar to other embodiments herein.

  For example, when virtual objects with the appropriate attributes are found by a processing unit analyzing images from the scene camera 51, the distances between the corners of the objects 55a, 55b, 55c, 55d may be measured in the vertical direction 56 and the horizontal direction 57. These distances may then be used to compute the orientation of the scene camera 51 relative to the reference points 55a, 55b, 55c, 55d within the device wearer's environment. The overall size of the rectangle defined by the reference objects 55a, 55b, 55c, 55d at the four corners of the display screen may be used to compute the distance between the scene camera 51 and locations within the scene. Known real-world distances between reference points in the vertical direction 56 and the horizontal direction 57 may be used to convert distances measured within images from the scene camera 51 into real-world dimensions. For example, the processing unit may include or access a database of templates containing sufficient information to identify the objects actually encountered, similar to other embodiments herein. Head tracking and gaze tracking measurements using these techniques may then be used, e.g., to control the position of a cursor 54 displayed on the computer monitor 50 and/or to otherwise interact with the display and/or other nearby objects.
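
  Once the four corner reference points of a display (physical or "virtual") have been located in the scene camera image, a planar homography can map any point in that image, such as the point being viewed by the wearer, into screen coordinates for cursor control. The following is a minimal sketch assuming OpenCV and an assumed screen resolution; the function and parameter names are illustrative only.

```python
import cv2
import numpy as np

def gaze_to_screen(corner_pixels, gaze_pixel, screen_w=1920, screen_h=1080):
    """Map a gaze point in scene camera image coordinates to display
    coordinates, given the four detected screen corners (TL, TR, BR, BL)."""
    src = np.array(corner_pixels, dtype=np.float32)           # corners in the image
    dst = np.array([[0, 0], [screen_w, 0],
                    [screen_w, screen_h], [0, screen_h]], dtype=np.float32)
    homography = cv2.getPerspectiveTransform(src, dst)
    point = np.array([[gaze_pixel]], dtype=np.float32)        # shape (1, 1, 2)
    mapped = cv2.perspectiveTransform(point, homography)
    return float(mapped[0, 0, 0]), float(mapped[0, 0, 1])     # (x, y) on screen
```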

  An advantage of using "virtual" reference objects, as depicted in FIG. 5, is the ability to generate identifiable reference objects without any (hardware) modification of real-world objects. For example, if a computer (not shown) is to be used by the wearer of the system 10 of FIG. 1, software may be loaded onto the computer that modifies or replaces the monitor driver(s) and/or otherwise includes the virtual reference objects in the images displayed on the monitor 50 during use of the system 10. Conversely, the use of physical reference objects placed at the edges of the computer monitor 40, e.g., as depicted in FIG. 4, eliminates the need for any superimposed display (and associated software modifications) within the displayable area of the monitor 40.

  With further reference to FIGS. 4 and 5, any number of physical reference objects may be combined with any number of "virtual" reference objects within a scene. Machine vision techniques using images from one or more scene cameras may be used to track any number of such objects. For example, a physical reference object may be used to identify a screen or device that is operated by, or in communication with, a system such as the system 10 of FIG. 1. Identification of "virtual" objects may then be used, e.g., after the physical objects have been used to identify the monitor or screen and when an appropriate viewing angle of the screen exists. For example, once the monitor or screen that the wearer is viewing has been determined, it may be desirable to change the locations of the "virtual" reference objects or other tracking features dynamically, e.g., to provide highly accurate gaze tracking within a local region of the screen. For example, smaller, more closely spaced "virtual" reference objects may be used to focus attention on a particular subset or region of the monitor or screen. The processing unit may then discard image data outside the region bounded by the virtual objects on the monitor or screen, e.g., to improve gaze tracking accuracy and reduce the amount of image data to be stored and/or processed.

  FIG. 6 shows an example of the illumination and optical pathways when reflective reference patches and associated locations are utilized. In this example, an illumination source (e.g., one or more LEDs) 60 is included in, or otherwise carried by, the eyewear or headwear (not shown; see, e.g., frame 11 in FIGS. 1 and 2). Electromagnetic radiation from the illumination source 60 is reflected by one or more reflective patches or surfaces 61 that have been added to, or incorporated into, one or more objects in the scene at known locations. In this exemplary embodiment, light is reflected from a disk 61 affixed to a corner of a display monitor or mobile computing device 62. The locations of this reflective surface and other reference surfaces within the scene may be determined from images acquired using a scene camera (not shown in FIG. 6; see, e.g., scene camera 12 in FIGS. 1 and 2).

  With further reference to FIG. 6, light reflected from the reflective reference surfaces can produce glints 63 on the surface of the eye 64. The glints may be detected as high-intensity bright spots in images acquired using the eye tracking camera(s) 65. Within the eyewear or headwear, a lens 66 with a short working distance is generally required to focus images of the eye 64 onto the eye tracking camera 65, and a filter 67 may optionally be included in the optical path to isolate the wavelengths of light produced by the reflective (fluorescent, phosphorescent, or luminescent) reference location surfaces.

  A line segment between the center of the glint 63 and the center of the corresponding reference location 61 yields a vector 68 that may be used as an input to gaze tracking calculations. This reference vector 68, along with the position of the center of the pupil 69, may then be used to compute a gaze tracking vector 70 relative to the reference vector 68. Additional considerations in computing the gaze tracking vector 70 include the slightly offset position of the fovea (i.e., the image-sensing region of the retina) relative to the measured center of the pupil 69, and refraction within the light path through the cornea (not shown). The gaze tracking vector 70 points to the location 71 being viewed by the observer (i.e., the wearer of the eyewear or headwear).
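
  In a simplified two-dimensional form, the reference vector 68 and gaze tracking vector 70 reduce to elementary vector arithmetic on points measured in the camera images. The sketch below illustrates the idea only; a real implementation would also model the foveal offset and corneal refraction noted above, and the single calibration gain and linear combination used here are hypothetical stand-ins for a person-specific calibration.

```python
import numpy as np

def estimate_gaze(glint_center, reference_position, pupil_center, gain=1.0):
    """Toy 2-D illustration: build the reference vector (glint -> reference
    location) and a pupil-glint offset, and combine them into a gaze estimate."""
    glint = np.asarray(glint_center, dtype=float)
    reference = np.asarray(reference_position, dtype=float)
    pupil = np.asarray(pupil_center, dtype=float)

    reference_vector = reference - glint        # analogous to vector 68 in FIG. 6
    pupil_offset = pupil - glint                # eye rotation relative to the glint

    # 'gain' stands in for the person-specific calibration that a real
    # system would determine during a calibration procedure.
    gaze_vector = reference_vector + gain * pupil_offset
    return glint + gaze_vector                  # estimated viewed position 71
```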

  Returning to FIGS. 1 and 6, an advantage of having the illumination source(s) on the eyewear or headwear, e.g., on the frame 11, is that the timing and/or intensity of illumination can be conveniently controlled relative to image acquisition by the scene camera 12 and the eye tracking cameras 13a, 13b. By subtracting scene camera and/or eye tracking camera images acquired with the illumination turned on from images acquired with the illumination turned off, reflections from the reference locations 16a, 16b may be more readily isolated within scene camera images, and reflections producing glints 63 may be more readily isolated within eye tracking camera images. Furthermore, this scheme eliminates the need for any light source or other powered component at the reference locations to be connected to the power source 17 or controller in the eyewear or headwear. Thus, if reference objects are attached to, or incorporated into, a device monitor or screen, such reference objects need not include a power source and/or controller to generate light; they may simply reflect light from the illumination source(s) 60.
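
  Because the on-board controller knows which frames were captured with the illumination source(s) turned on and which with them turned off, isolating the reflective reference points and glints can be as simple as a thresholded frame difference. The following is a minimal sketch assuming OpenCV; the threshold value is an arbitrary placeholder.

```python
import cv2

def isolate_reflections(frame_lit, frame_unlit, threshold=40):
    """Subtract an illumination-off frame from an illumination-on frame and
    threshold the result, leaving only reflections driven by the on-board
    light source(s) (reference patches in scene images, glints in eye images)."""
    lit = cv2.cvtColor(frame_lit, cv2.COLOR_BGR2GRAY)
    unlit = cv2.cvtColor(frame_unlit, cv2.COLOR_BGR2GRAY)
    difference = cv2.subtract(lit, unlit)          # ambient light cancels out
    _, mask = cv2.threshold(difference, threshold, 255, cv2.THRESH_BINARY)
    return mask
```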

  Any number of reflective surfaces 61 may be used as reference locations and/or as sources of glints 63. Any number of sources of electromagnetic radiation may produce visible or invisible light. The use of invisible light to produce reflections at the reference locations and glints on the eyes is particularly convenient in this scheme, because it causes little or no distraction on the part of the device wearer (which might otherwise result from the presence of potentially bright reflected light). CMOS cameras, in particular, are capable of detecting electromagnetic radiation in the near-infrared spectrum that is invisible to the human eye. CMOS cameras are also well suited to applications in which low power consumption and/or miniaturization are desired.

  As further described elsewhere herein, and with reference to FIG. 6, the brightness levels of glints 63 measured using the eye tracking camera 65, and of reflections from the reference locations 61 measured using the scene camera(s) 12 (not shown; see FIG. 1), may be used in a feedback mode to control the intensity of the illumination source(s) 60. One or more illumination sources 60 may be used to illuminate the reference locations, e.g., multiple illumination sources 60 (not shown) mounted at various locations around the eyewear or headwear. The use of multiple illumination sources 60 that illuminate the device wearer's environment from different angles may help maintain high-intensity reflections within camera images over a range of viewing angles.

  In one embodiment, the amplitude of either the voltage or the current driving each illumination source 60 may be used to control light intensity. This is generally referred to as "amplitude modulation." In another embodiment, the duration or "dwell time" of the controlling voltage or current may be modulated to control light intensity. This is generally referred to as "pulse-width modulation." Optionally, both schemes may be used simultaneously.
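
  A simple control loop for either modulation scheme samples the brightness of the reference reflections or glints in the camera images and nudges the drive level toward a target value. The sketch below is purely illustrative logic in Python; `set_led_duty_cycle` and `set_led_current`, the gain, and the clamping limits stand in for whatever driver interface and safe operating range the actual hardware exposes.

```python
def update_illumination(measured_brightness, target_brightness,
                        duty_cycle, current_ma,
                        set_led_duty_cycle, set_led_current,
                        kp=0.002, use_pwm=True):
    """One iteration of a proportional feedback loop that adjusts LED output
    (by pulse-width or amplitude modulation) toward a target image brightness."""
    error = target_brightness - measured_brightness
    if use_pwm:
        # Pulse-width modulation: vary the fraction of time the LED is on.
        duty_cycle = min(1.0, max(0.0, duty_cycle + kp * error))
        set_led_duty_cycle(duty_cycle)
    else:
        # Amplitude modulation: vary the drive current (clamped to a safe range).
        current_ma = min(100.0, max(0.0, current_ma + 10.0 * kp * error))
        set_led_current(current_ma)
    return duty_cycle, current_ma
```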

  In an exemplary embodiment, each illumination source 60 may include an LED (light-emitting diode) configured to emit relatively narrow-bandwidth or broad-bandwidth light, e.g., near-infrared light at one or more wavelengths between about 640 and about 700 nanometers, broadband visible light, white light, or the like. Optionally, one or more of the illumination sources 60 may include lenses, filters, diffusers, reflectors, or other features (not shown), e.g., to facilitate and/or control the uniformity of illumination of the device wearer's environment. The illumination source(s) 60 may be operated substantially continuously, periodically, or otherwise intermittently, e.g., such that desired scene images are illuminated by the source(s) 60 and the images are then processed using the systems and methods described elsewhere herein.

  The above disclosure of exemplary embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Many variations and modifications of the embodiments described herein will be apparent to those skilled in the art in light of the above disclosure.

  Further, in describing representative embodiments, the specification may have presented methods and/or processes as a particular sequence of steps. However, to the extent that a method or process does not rely on the particular order of the steps set forth herein, the method or process should not be limited to the particular sequence of steps described. As one of ordinary skill in the art would appreciate, other sequences of steps may be possible. Therefore, the particular order of the steps set forth herein should not be construed as a limitation on the claims.

  While the invention is susceptible to various modifications and alternative forms, specific examples thereof have been shown in the drawings and are described herein in detail. It should be understood, however, that the invention is not limited to the particular forms or methods disclosed; on the contrary, the invention covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.

Claims (18)

  1. A system for determining a reference position,
    A device configured to be worn on the wearer's head;
    A scene camera installed on the device facing away from the wearer to capture a scene camera image of the wearer's environment;
    An optotype tracking camera installed on the device towards the wearer's one eye to capture at least one optotype tracking position of the wearer's eye;
    One or more processors coupled to the scene camera and coupled to the optotype tracking camera to determine a scene reference position within the scene camera image;
    Wherein the one or more processors use the scene reference position and the target tracking position to determine a position viewed by the wearer,
    A database of templates mapping known objects to scene reference positions associated with each known object, the database including a table associating the known objects with data identifying the shape and color of the known objects, and the database including vertical reference points and horizontal reference points of the known objects, wherein the one or more processors are coupled to the database to identify objects associated with the scene reference positions acquired from the scene camera images, and wherein the one or more processors use a known real-world distance between the vertical reference point and the horizontal reference point of the known object to convert measured distances in the images from the scene camera into real-world dimensions;
    system.
  2.   The system of claim 1, wherein an object recognition algorithm is used by the one or more processors to identify a scene reference location within the field of view of the scene camera.
  3.   The system of claim 2, wherein object recognition is based at least in part on the shape of the object, the color of the object, and at least one edge of the object.
  4.   The system of claim 2 or claim 3, wherein the scene reference location includes a plurality of reference points of the object, and wherein the one or more processors are configured to use the reference points of the object to determine the orientation of the object relative to the scene camera and to calculate the distance to the object.
  5.   The system according to claim 1, wherein the one or more processors use object recognition to identify the position of at least one of the corners of the display device.
  6.   The system of any one of claims 1 to 5, further comprising an object within the field of view of the scene camera and an additional reference object attached to the object, the object being arranged such that the additional reference object is recognized by the one or more processors within the scene camera image.
  7.   7. A system according to any one of the preceding claims, further comprising one or more light sources on the device that are directed away from the wearer to illuminate the wearer's environment. .
  8.   The system of claim 7, further comprising an additional reflective reference object within the wearer's environment, wherein the light sources are configured to illuminate the additional reflective reference object such that the additional reflective reference object can be recognized within the scene camera images.
  9. A system for eye tracking,
    An electronic object comprising a display and a plurality of reflective reference objects disposed around the display;
    A device configured to be worn on the wearer's head;
    The device comprises:
    a) a scene camera installed on the device facing away from the wearer to capture a scene camera image of the wearer's environment;
    b) a target tracking camera installed on the device towards the eye of the wearer to capture at least one target tracking position of the wearer's eye;
    c) one or more processors coupled to the scene camera to identify the reference object in the scene camera image and coupled to the optotype tracking camera;
    Wherein the one or more processors use the position of the reference object and the target tracking position in the scene camera image to determine a position on the display viewed by the wearer And
    d) further comprising a database of templates that maps known objects to scene reference positions associated with each known object, the database including a table associating the known objects with data identifying the shape and color of the known objects, and the database including vertical and horizontal reference points of the known objects, wherein the one or more processors are coupled to the database to identify objects associated with the scene reference positions acquired from the scene camera images, and wherein the one or more processors use a known real-world distance between the vertical reference point and the horizontal reference point of the known object to convert measured distances in the images from the scene camera into real-world dimensions,
    system.
  10.   The system of claim 9, wherein the device further comprises one or more light sources on the device directed away from the wearer to illuminate the reference objects in order to enhance identification of the reference objects within the scene camera images.
  11.   11. A system according to claim 9 or claim 10, wherein one or more of the reference objects are attached to the electronic object by an adhesive or integrated into a casing of the electronic object.
  12. A system for eye tracking,
    An electronic object comprising a display, wherein the electronic object is configured to include a plurality of virtual reference objects in an image presented on the display;
    A device configured to be worn on the wearer's head;
    The device comprises:
    a) a scene camera installed on the device facing away from the wearer to capture a scene camera image of the wearer's environment;
    b) a target tracking camera installed on the device towards the eye of the wearer to capture at least one target tracking position of the wearer's eye;
    c) one or more processors coupled to the scene camera to identify the virtual reference object in the scene camera image and coupled to a target tracking camera;
    Wherein the one or more processors determine the position on the display viewed by the wearer using the position of the virtual reference object and the target tracking position in the scene camera image And
    d) a database of templates that maps known objects to scene reference objects associated with each known object, the database including a table associating the known objects with data identifying the shape and color of the known objects, and the database including a vertical reference point and a horizontal reference point of the known object, wherein the one or more processors are coupled to the database to identify the electronic object based on the scene reference objects identified within the scene camera images, and wherein the one or more processors use a known real-world distance between the vertical reference point and the horizontal reference point of the known object to convert measured distances in the images from the scene camera into real-world dimensions;
    system.
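
Claim 12 relies on virtual reference objects drawn on the display to relate scene-camera coordinates to display coordinates. One plausible way to realize that mapping, sketched below under the assumption that OpenCV and NumPy are available and that four corner markers have already been detected, is a perspective transform from the detected marker positions to the positions at which the markers were drawn; all coordinates shown are hypothetical.

```python
# Illustrative sketch (assumes OpenCV and NumPy; not the patent's own code):
# map a gaze position expressed in scene-camera pixels onto display pixels
# using four virtual reference markers shown at known display positions.
import numpy as np
import cv2

# Pixel positions of the four markers as detected in the scene-camera image.
detected_in_scene = np.float32([[212, 150], [498, 161], [489, 370], [205, 355]])

# Where those same markers were drawn on the display (hypothetical 1920x1080 layout).
drawn_on_display = np.float32([[100, 100], [1820, 100], [1820, 980], [100, 980]])

# Perspective transform from scene-camera coordinates to display coordinates.
H = cv2.getPerspectiveTransform(detected_in_scene, drawn_on_display)

# Gaze position in scene-camera coordinates, e.g. derived from the
# eye-tracking camera after calibration (hypothetical value).
gaze_scene = np.float32([[[350, 260]]])

gaze_display = cv2.perspectiveTransform(gaze_scene, H)
print(gaze_display)  # approximate display pixel being viewed
```

A perspective transform is used here because the scene camera generally views the display at an angle, so a simple affine fit would not account for the foreshortening of the marker layout.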
  13. A system for identifying an object,
    A device configured to be worn on the wearer's head;
    A scene camera mounted on the device and facing away from the wearer to capture scene camera images of the wearer's environment;
    An eye-tracking camera mounted on the device and oriented toward one of the wearer's eyes to capture at least one eye-tracking position of the wearer's eye;
    One or more processors coupled to the scene camera, and coupled to the eye-tracking camera, to determine a scene reference position within the scene camera image;
    A database of templates that maps known objects to scene reference locations associated with each known object,
    wherein the database includes a table associating each known object with data identifying the shape and color of that known object, and includes a vertical reference point and a horizontal reference point of the known object, and the one or more processors use a known real-world distance between the vertical reference point and the horizontal reference point of the known object to convert distances measured in images from the scene camera into real-world dimensions,
    system.
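
The template database of claim 13 associates each known object with shape and color data plus vertical and horizontal reference points whose real-world separations are known. A minimal sketch of such a table, with hypothetical field names and entries, might look as follows.

```python
# Minimal sketch of a template table of known objects; every field name,
# entry, and threshold below is hypothetical.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ObjectTemplate:
    name: str
    shape: str                    # coarse shape descriptor, e.g. "rectangle"
    color: Tuple[int, int, int]   # nominal RGB color of the object
    vertical_ref_mm: float        # real-world separation of the vertical reference points
    horizontal_ref_mm: float      # real-world separation of the horizontal reference points

TEMPLATE_DB = {
    "phone_display": ObjectTemplate("phone_display", "rectangle", (20, 20, 20), 110.0, 62.0),
    "exit_sign":     ObjectTemplate("exit_sign", "rectangle", (0, 160, 60), 150.0, 300.0),
}

def lookup(shape: str, color: Tuple[int, int, int], tolerance: int = 40) -> Optional[ObjectTemplate]:
    """Return the first template whose shape matches and whose color is close."""
    for template in TEMPLATE_DB.values():
        if template.shape == shape and all(
            abs(c - t) <= tolerance for c, t in zip(color, template.color)
        ):
            return template
    return None

print(lookup("rectangle", (10, 150, 70)))  # matches the hypothetical exit sign
```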
  14. A method for determining a reference position,
    Placing a device on the wearer's head;
    Providing, on the device, a scene camera positioned to capture images of the wearer's environment, the scene camera being coupled to a scene processor to determine a scene reference position within scene camera images collected from the scene camera;
    Providing, on the device, an eye-tracking camera positioned to capture an eye-tracking position of at least one of the wearer's eyes;
    Coupling the eye-tracking camera and the scene camera to a processor to determine the position viewed by the wearer using the scene reference position and the eye-tracking position;
    Providing a database of templates that maps known objects to scene reference positions associated with each known object, the database being coupled to the processor to identify objects associated with the scene reference positions collected from the scene camera images, wherein the database includes a table associating each known object with data identifying the shape and color of that known object and includes a vertical reference point and a horizontal reference point of the known object; and
    Further including converting distances measured in images from the scene camera into real-world dimensions using a known real-world distance between the vertical reference point and the horizontal reference point of the known object,
    Method.
  15.   The method of claim 14, wherein the processor uses an object recognition algorithm to identify a scene reference location within the field of view of the scene camera.
  16.   The method of claim 15, wherein the object recognition algorithm is based at least in part on the shape of the object or the color of the object, or wherein the object recognition algorithm is used to identify the position of at least one edge of the object or the location of at least one corner of a display.
  17.   17. A method according to claim 15 or claim 16, further comprising positioning additional reference objects within the wearer's environment such that these additional reference objects are recognized by the scene camera.
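
Claims 15 and 16 refer to object recognition based on shape or color and on locating edges or corners of a display. A rough sketch of one such step, assuming OpenCV 4 and NumPy and a hypothetical input frame, finds the largest quadrilateral contour in the scene-camera image and treats its corners as scene reference locations.

```python
# Hedged sketch of a shape-based recognition step: locate the four corners of
# a roughly rectangular display in a scene-camera frame. Assumes OpenCV 4;
# the input path and thresholds are hypothetical.
import cv2

frame = cv2.imread("scene_frame.png")              # hypothetical scene-camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)                   # edge map of the scene

contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

display_corners = None
for contour in sorted(contours, key=cv2.contourArea, reverse=True):
    perimeter = cv2.arcLength(contour, True)
    approx = cv2.approxPolyDP(contour, 0.02 * perimeter, True)
    if len(approx) == 4:                           # quadrilateral: candidate display outline
        display_corners = approx.reshape(4, 2)
        break

print(display_corners)  # pixel positions usable as scene reference locations
```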
  18. A method for determining a reference position,
    Placing on the wearer's head a device comprising a scene camera directed away from the wearer and an eye-tracking camera directed toward at least one of the wearer's eyes;
    Collecting scene camera images of the environment around the wearer with the scene camera;
    Identifying scene reference positions in the scene camera images so as to identify objects associated with scene reference positions in the environment, using a database of templates that maps known objects to scene reference positions associated with each of the known objects,
    wherein the database includes a table associating each known object with data identifying the shape and color of that known object and includes a vertical reference point and a horizontal reference point of the known object;
    Analyzing the scene camera images and eye-tracking images to determine the position being viewed by the wearer with respect to the object; and
    Converting distances measured in images from the scene camera into real-world dimensions using a known real-world distance between the vertical reference point and the horizontal reference point of the known object,
    Method.
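
Finally, the method of claim 18 determines the position being viewed by the wearer with respect to a recognized object and expresses scene-camera measurements in real-world dimensions. The sketch below, using only hypothetical values, combines those two steps by converting a gaze point's pixel offset from an object reference point into millimetres.

```python
# Sketch (not from the patent) of expressing the viewed position relative to
# a recognized object in real-world units; all coordinates are hypothetical.
import math

object_origin_px = (180.0, 120.0)                  # e.g. top-left reference point of the object
vertical_refs_px = ((180.0, 120.0), (180.0, 420.0))
known_vertical_mm = 150.0                          # from the template database

# Scale factor from the known real-world separation of the reference points.
sep_px = math.dist(vertical_refs_px[0], vertical_refs_px[1])
mm_per_px = known_vertical_mm / sep_px

gaze_px = (305.0, 250.0)                           # gaze point mapped into the scene image
offset_mm = (
    (gaze_px[0] - object_origin_px[0]) * mm_per_px,
    (gaze_px[1] - object_origin_px[1]) * mm_per_px,
)
print(offset_mm)  # where the wearer is looking, measured from the object's reference point
```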
JP2014512905A 2011-05-20 2012-05-19 System and method for identifying eye tracking scene reference position Active JP6308940B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US13/113,003 2011-05-20
US13/113,003 US8885877B2 (en) 2011-05-20 2011-05-20 Systems and methods for identifying gaze tracking scene reference locations
PCT/US2012/038743 WO2012162204A2 (en) 2011-05-20 2012-05-19 Systems and methods for identifying gaze tracking scene reference locations

Publications (3)

Publication Number Publication Date
JP2014520314A JP2014520314A (en) 2014-08-21
JP2014520314A5 JP2014520314A5 (en) 2015-07-09
JP6308940B2 true JP6308940B2 (en) 2018-04-11

Family

ID=47174944

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2014512905A Active JP6308940B2 (en) 2011-05-20 2012-05-19 System and method for identifying eye tracking scene reference position

Country Status (6)

Country Link
US (2) US8885877B2 (en)
EP (1) EP2710516A4 (en)
JP (1) JP6308940B2 (en)
CN (1) CN103748598B (en)
CA (1) CA2836777A1 (en)
WO (1) WO2012162204A2 (en)

Families Citing this family (112)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
EP2389095B1 (en) * 2009-01-26 2014-10-01 Tobii Technology AB Detection of gaze point assisted by optical reference signals
US8781162B2 (en) * 2011-01-05 2014-07-15 Ailive Inc. Method and system for head tracking and pose estimation
KR101114993B1 (en) * 2011-02-28 2012-03-06 (재)예수병원유지재단 Medical head lamp of tracking position of eyes
KR101046677B1 (en) * 2011-03-15 2011-07-06 동국대학교 산학협력단 Methods for tracking position of eyes and medical head lamp using thereof
US9785835B2 (en) * 2011-03-22 2017-10-10 Rochester Institute Of Technology Methods for assisting with object recognition in image sequences and devices thereof
US8787666B2 (en) * 2011-11-21 2014-07-22 Tandent Vision Science, Inc. Color analytics for a digital image
US8864310B2 (en) 2012-05-01 2014-10-21 RightEye, LLC Systems and methods for evaluating human eye tracking
US9398229B2 (en) * 2012-06-18 2016-07-19 Microsoft Technology Licensing, Llc Selective illumination of a region within a field of view
US9674436B2 (en) 2012-06-18 2017-06-06 Microsoft Technology Licensing, Llc Selective imaging zones of an imaging sensor
US9262680B2 (en) * 2012-07-31 2016-02-16 Japan Science And Technology Agency Point-of-gaze detection device, point-of-gaze detecting method, personal parameter calculating device, personal parameter calculating method, program, and computer-readable storage medium
US9443414B2 (en) * 2012-08-07 2016-09-13 Microsoft Technology Licensing, Llc Object tracking
US10321127B2 (en) 2012-08-20 2019-06-11 Intermec Ip Corp. Volume dimensioning system calibration systems and methods
US20140152558A1 (en) * 2012-11-30 2014-06-05 Tom Salter Direct hologram manipulation using imu
EP2929413A4 (en) 2012-12-06 2016-07-13 Eyefluence Inc Eye tracking wearable devices and methods for use
JP6105953B2 (en) * 2013-01-25 2017-03-29 京セラ株式会社 Electronic device, line-of-sight input program, and line-of-sight input method
JP6199038B2 (en) * 2013-02-04 2017-09-20 学校法人東海大学 Line of sight analyzer
US20140240226A1 (en) * 2013-02-27 2014-08-28 Robert Bosch Gmbh User Interface Apparatus
JP6012846B2 (en) * 2013-02-28 2016-10-25 Hoya株式会社 Eyeglass lens design system, supply system, design method and manufacturing method
US9335547B2 (en) * 2013-03-25 2016-05-10 Seiko Epson Corporation Head-mounted display device and method of controlling head-mounted display device
WO2014178047A1 (en) 2013-04-30 2014-11-06 Inuitive Ltd. System and method for video conferencing
JP6265348B2 (en) 2013-05-22 2018-01-24 国立大学法人神戸大学 Eye-gaze measuring device, eye-gaze measuring method, and eye-gaze measuring program
US20140358009A1 (en) * 2013-05-30 2014-12-04 Michael O'Leary System and Method for Collecting Eye-Movement Data
US10073518B2 (en) * 2013-08-19 2018-09-11 Qualcomm Incorporated Automatic calibration of eye tracking for optical see-through head mounted display
AT513987B1 (en) * 2013-08-23 2014-09-15 Ernst Dipl Ing Dr Pfleger Spectacles and methods for determining pupil centers of both eyes of a human
CN103557859B (en) * 2013-10-10 2015-12-23 北京智谷睿拓技术服务有限公司 The method of image acquisition and image acquisition Location Positioning System
CN103630116B (en) * 2013-10-10 2016-03-23 北京智谷睿拓技术服务有限公司 The method of image acquisition and image acquisition positioning means positioning
WO2015060869A1 (en) * 2013-10-25 2015-04-30 Intel Corporation Dynamic optimization of light source power
CN103617432B (en) * 2013-11-12 2017-10-03 华为技术有限公司 A scenario recognition method and apparatus
DE102013224962A1 (en) * 2013-12-05 2015-06-11 Robert Bosch Gmbh Arrangement for creating an image of a scene
US9292765B2 (en) * 2014-01-07 2016-03-22 Microsoft Technology Licensing, Llc Mapping glints to light sources
US9939934B2 (en) 2014-01-17 2018-04-10 Osterhout Group, Inc. External user interface for head worn computing
US10254856B2 (en) 2014-01-17 2019-04-09 Osterhout Group, Inc. External user interface for head worn computing
US9651784B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US20150205111A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Optical configurations for head worn computing
US20150205135A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. See-through computer display systems
US20150206173A1 (en) 2014-01-21 2015-07-23 Osterhout Group, Inc. Eye imaging in head worn computing
US9529195B2 (en) 2014-01-21 2016-12-27 Osterhout Group, Inc. See-through computer display systems
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US9684172B2 (en) 2014-12-03 2017-06-20 Osterhout Group, Inc. Head worn computer display systems
US9298007B2 (en) 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9538915B2 (en) 2014-01-21 2017-01-10 Osterhout Group, Inc. Eye imaging in head worn computing
US9766463B2 (en) 2014-01-21 2017-09-19 Osterhout Group, Inc. See-through computer display systems
US9715112B2 (en) 2014-01-21 2017-07-25 Osterhout Group, Inc. Suppression of stray light in head worn computing
US9594246B2 (en) 2014-01-21 2017-03-14 Osterhout Group, Inc. See-through computer display systems
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
US9753288B2 (en) 2014-01-21 2017-09-05 Osterhout Group, Inc. See-through computer display systems
US9494800B2 (en) 2014-01-21 2016-11-15 Osterhout Group, Inc. See-through computer display systems
US9651788B2 (en) 2014-01-21 2017-05-16 Osterhout Group, Inc. See-through computer display systems
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9400390B2 (en) 2014-01-24 2016-07-26 Osterhout Group, Inc. Peripheral lighting for head worn computing
US20150229915A1 (en) * 2014-02-08 2015-08-13 Microsoft Corporation Environment-dependent active illumination for stereo matching
US9401540B2 (en) 2014-02-11 2016-07-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9852545B2 (en) 2014-02-11 2017-12-26 Osterhout Group, Inc. Spatial location presentation in head worn computing
US9229233B2 (en) 2014-02-11 2016-01-05 Osterhout Group, Inc. Micro Doppler presentations in head worn computing
US9299194B2 (en) 2014-02-14 2016-03-29 Osterhout Group, Inc. Secure sharing in head worn computing
US9442292B1 (en) * 2014-02-18 2016-09-13 Google Inc. Directional array sensing module
US20150277118A1 (en) 2014-03-28 2015-10-01 Osterhout Group, Inc. Sensor dependent content position in head worn computing
KR101557152B1 (en) 2014-04-02 2015-10-05 서울시립대학교 산학협력단 Apparatuses, methods and recording medium for measuring exposured advertisement
US20150302252A1 (en) * 2014-04-16 2015-10-22 Lucas A. Herrera Authentication method using multi-factor eye gaze
US9651787B2 (en) 2014-04-25 2017-05-16 Osterhout Group, Inc. Speaker assembly for headworn computer
US9672210B2 (en) 2014-04-25 2017-06-06 Osterhout Group, Inc. Language translation with head-worn computing
US9766715B2 (en) 2014-05-01 2017-09-19 Seiko Epson Corporation Head-mount type display device, control system, method of controlling head-mount type display device, and computer program
WO2016018488A2 (en) 2014-05-09 2016-02-04 Eyefluence, Inc. Systems and methods for discerning eye signals and continuous biometric identification
US9746686B2 (en) 2014-05-19 2017-08-29 Osterhout Group, Inc. Content position calibration in head worn computing
US9841599B2 (en) 2014-06-05 2017-12-12 Osterhout Group, Inc. Optical configurations for head-worn see-through displays
JP6311461B2 (en) * 2014-06-05 2018-04-18 大日本印刷株式会社 Gaze analysis system and gaze analysis apparatus
US9575321B2 (en) 2014-06-09 2017-02-21 Osterhout Group, Inc. Content presentation in head worn computing
KR101606726B1 (en) * 2014-06-13 2016-03-28 엘지전자 주식회사 Wearable device and lighting systerm includes the wearable device
US9810906B2 (en) 2014-06-17 2017-11-07 Osterhout Group, Inc. External user interface for head worn computing
RU2017106629A3 (en) 2014-08-03 2018-09-04
US9635222B2 (en) * 2014-08-03 2017-04-25 PogoTec, Inc. Wearable camera systems and apparatus for aligning an eyewear camera
US9823059B2 (en) 2014-08-06 2017-11-21 Hand Held Products, Inc. Dimensioning system with guided alignment
US9798143B2 (en) * 2014-08-11 2017-10-24 Seiko Epson Corporation Head mounted display, information system, control method for head mounted display, and computer program
US9829707B2 (en) 2014-08-12 2017-11-28 Osterhout Group, Inc. Measuring content brightness in head worn computing
US20160048019A1 (en) * 2014-08-12 2016-02-18 Osterhout Group, Inc. Content presentation in head worn computing
US9703119B2 (en) * 2014-08-13 2017-07-11 Google Inc. Compact folding architecture for head mounted device
US9489739B2 (en) * 2014-08-13 2016-11-08 Empire Technology Development Llc Scene analysis for improved eye tracking
US9829708B1 (en) * 2014-08-19 2017-11-28 Boston Incubator Center, LLC Method and apparatus of wearable eye pointing system
US9423842B2 (en) 2014-09-18 2016-08-23 Osterhout Group, Inc. Thermal management for head-worn computer
US9671613B2 (en) 2014-09-26 2017-06-06 Osterhout Group, Inc. See-through computer display systems
US20160101936A1 (en) 2014-10-10 2016-04-14 Hand Held Products, Inc. System and method for picking validation
US9897434B2 (en) 2014-10-21 2018-02-20 Hand Held Products, Inc. Handheld dimensioning system with measurement-conformance feedback
KR20160051411A (en) * 2014-11-03 2016-05-11 삼성전자주식회사 An electoronic device for controlling an external object and a method thereof
US9936195B2 (en) * 2014-11-06 2018-04-03 Intel Corporation Calibration for eye tracking systems
US9448409B2 (en) 2014-11-26 2016-09-20 Osterhout Group, Inc. See-through computer display systems
US9911395B1 (en) * 2014-12-23 2018-03-06 Amazon Technologies, Inc. Glare correction via pixel processing
MX2017008519A (en) 2014-12-23 2018-06-27 Pogotec Inc Wireless camera system and methods.
USD751552S1 (en) 2014-12-31 2016-03-15 Osterhout Group, Inc. Computer glasses
USD753114S1 (en) 2015-01-05 2016-04-05 Osterhout Group, Inc. Air mouse
WO2016110451A1 (en) * 2015-01-07 2016-07-14 The Eye Tribe Aps Dynamic camera or light operation
EP3051386A1 (en) * 2015-01-30 2016-08-03 4tiitoo GmbH Eye tracking system and method
DK201570064A1 (en) * 2015-02-04 2016-08-22 IT-Universitetet i København A Gaze Tracker and a Gaze Tracking Method
US20160239985A1 (en) 2015-02-17 2016-08-18 Osterhout Group, Inc. See-through computer display systems
CN106155459B (en) * 2015-04-01 2019-06-14 北京智谷睿拓技术服务有限公司 Exchange method, interactive device and user equipment
CN106155288B (en) 2015-04-10 2019-02-12 北京智谷睿拓技术服务有限公司 Information acquisition method, information acquisition device and user equipment
WO2016187064A1 (en) * 2015-05-15 2016-11-24 Vertical Optics, LLC Wearable vision redirecting devices
US9690119B2 (en) 2015-05-15 2017-06-27 Vertical Optics, LLC Wearable vision redirecting devices
GB2539009A (en) * 2015-06-03 2016-12-07 Tobii Ab Gaze detection method and apparatus
CN104905764B (en) * 2015-06-08 2017-09-12 四川大学华西医院 Based on method of gaze tracking speed fpga
EP3308216A4 (en) 2015-06-10 2019-03-06 Pogotec, Inc. Eyewear with magnetic track for electronic wearable device
US20160377414A1 (en) 2015-06-23 2016-12-29 Hand Held Products, Inc. Optical pattern projector
CN105007424A (en) * 2015-07-22 2015-10-28 深圳市万姓宗祠网络科技股份有限公司 Automatic focusing system, method and wearable device based on eye tracking
KR101734287B1 (en) * 2015-08-04 2017-05-11 엘지전자 주식회사 Head mounted display and method for controlling the same
EP3335096A4 (en) * 2015-08-15 2019-03-06 Google LLC Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
JP2017046233A (en) * 2015-08-27 2017-03-02 キヤノン株式会社 Display device, information processor, and control method of the same
WO2017075405A1 (en) 2015-10-29 2017-05-04 PogoTec, Inc. Hearing aid adapted for wireless power reception
US10249030B2 (en) 2015-10-30 2019-04-02 Hand Held Products, Inc. Image transformation for indicia reading
US10339352B2 (en) * 2016-06-03 2019-07-02 Hand Held Products, Inc. Wearable metrological apparatus
US10261328B2 (en) * 2016-09-02 2019-04-16 Microsoft Technology Licensing, Llc Enhanced illumination system
GR20170100473A (en) * 2017-10-16 2019-05-24 Στυλιανοσ Γεωργιοσ Τσαπακησ Virtual reality set

Family Cites Families (169)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4595990A (en) 1980-12-31 1986-06-17 International Business Machines Corporation Eye controlled information transfer
US4439157A (en) 1982-05-03 1984-03-27 The United States Of America As Represented By The Secretary Of The Navy Helmet mounted display projector
US4568159A (en) 1982-11-26 1986-02-04 The United States Of America As Represented By The Secretary Of The Navy CCD Head and eye position indicator
EP0125808A3 (en) 1983-04-18 1986-01-29 Lee S. Weinblatt Eye movement monitoring technique
US4852988A (en) 1988-09-12 1989-08-01 Applied Science Laboratories Visor and camera providing a parallax-free field-of-view image for a head-mounted eye movement measurement system
US4950069A (en) 1988-11-04 1990-08-21 University Of Virginia Eye movement detector with improved calibration and speed
US5231674A (en) 1989-06-09 1993-07-27 Lc Technologies, Inc. Eye tracking method and apparatus
JPH03109030A (en) 1989-09-22 1991-05-09 Canon Inc Noting point detector
US5092669A (en) * 1990-03-16 1992-03-03 Migra Limited Optical device and method for using same
US5367315A (en) 1990-11-15 1994-11-22 Eyetech Corporation Method and apparatus for controlling cursor movement
US5331149A (en) 1990-12-31 1994-07-19 Kopin Corporation Eye tracking system having an array of photodetectors aligned respectively with an array of pixels
US5739912A (en) * 1991-04-26 1998-04-14 Nippon Telegraph And Telephone Corporation Object profile measuring method and apparatus
US5270748A (en) 1992-01-30 1993-12-14 Mak Technologies, Inc. High-speed eye tracking device and method
US5517021A (en) 1993-01-19 1996-05-14 The Research Foundation State University Of New York Apparatus and method for eye tracking interface
US5583590A (en) 1992-05-04 1996-12-10 Wabash Scientific Corp. Alert monitoring system
US5345281A (en) 1992-12-17 1994-09-06 John Taboada Eye tracking system and method
US5422653A (en) * 1993-01-07 1995-06-06 Maguire, Jr.; Francis J. Passive virtual reality
US5471542A (en) 1993-09-27 1995-11-28 Ragland; Richard R. Point-of-gaze tracker
GB9323970D0 (en) 1993-11-22 1994-01-12 Toad Innovations Ltd Safety device
JPH07184089A (en) * 1993-12-21 1995-07-21 Canon Inc Video camera
US5481622A (en) 1994-03-01 1996-01-02 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
US5583795A (en) 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5689241A (en) 1995-04-24 1997-11-18 Clarke, Sr.; James Russell Sleep detection and driver alert apparatus
US5585871A (en) 1995-05-26 1996-12-17 Linden; Harry Multi-function display apparatus
US6373961B1 (en) * 1996-03-26 2002-04-16 Eye Control Technologies, Inc. Eye controllable screen pointer
US5861936A (en) 1996-07-26 1999-01-19 Gillan Holdings Limited Regulating focus in accordance with relationship of features of a person's eyes
US6163281A (en) 1996-08-19 2000-12-19 Torch; William C. System and method for communication using eye movement
US6542081B2 (en) 1996-08-19 2003-04-01 William C. Torch System and method for monitoring eye movement
US6847336B1 (en) 1996-10-02 2005-01-25 Jerome H. Lemelson Selectively controllable heads-up display system
US6070098A (en) 1997-01-11 2000-05-30 Circadian Technologies, Inc. Method of and apparatus for evaluation and mitigation of microsleep events
WO1998049028A1 (en) 1997-04-25 1998-11-05 Applied Science Group, Inc. An alertness monitor
WO1999005988A2 (en) 1997-07-30 1999-02-11 Applied Science Laboratories An eye tracker using an off-axis, ring illumination source
WO1999023524A1 (en) 1997-10-30 1999-05-14 The Microoptical Corporation Eyeglass interface system
US6055322A (en) 1997-12-01 2000-04-25 Sensor, Inc. Method and apparatus for illuminating and imaging eyes through eyeglasses using multiple sources of illumination
US6088470A (en) 1998-01-27 2000-07-11 Sensar, Inc. Method and apparatus for removal of bright or dark spots by the fusion of multiple images
US6614408B1 (en) 1998-03-25 2003-09-02 W. Stephen G. Mann Eye-tap for electronic newsgathering, documentary video, photojournalism, and personal safety
US6152563A (en) 1998-02-20 2000-11-28 Hutchinson; Thomas E. Eye gaze direction tracker
US6204828B1 (en) 1998-03-31 2001-03-20 International Business Machines Corporation Integrated gaze/manual cursor positioning system
US6091378A (en) 1998-06-17 2000-07-18 Eye Control Technologies, Inc. Video processing methods and apparatus for gaze point tracking
US6184863B1 (en) * 1998-10-13 2001-02-06 The George Washington University Direct pointing apparatus and method therefor
US6282553B1 (en) 1998-11-04 2001-08-28 International Business Machines Corporation Gaze-based secure keypad entry system
US6433760B1 (en) 1999-01-14 2002-08-13 University Of Central Florida Head mounted display with eyetracking capability
US6577329B1 (en) 1999-02-25 2003-06-10 International Business Machines Corporation Method and system for relevance feedback through gaze tracking and ticker interfaces
US7120880B1 (en) 1999-02-25 2006-10-10 International Business Machines Corporation Method and system for real-time determination of a subject's interest level to media content
US6952198B2 (en) * 1999-07-06 2005-10-04 Hansen Karl C System and method for communication with enhanced optical pointer
US20010010514A1 (en) * 1999-09-07 2001-08-02 Yukinobu Ishino Position detector and attitude detector
US6346887B1 (en) 1999-09-14 2002-02-12 The United States Of America As Represented By The Secretary Of The Navy Eye activity monitor
US6330356B1 (en) * 1999-09-29 2001-12-11 Rockwell Science Center Llc Dynamic visual registration of a 3-D object with a graphical model
US6864912B1 (en) 1999-12-16 2005-03-08 International Business Machines Corp. Computer system providing hands free user input via optical means for navigation or zooming
JP2001183735A (en) * 1999-12-27 2001-07-06 Fuji Photo Film Co Ltd Method and device for image pickup
US6758563B2 (en) 1999-12-30 2004-07-06 Nokia Corporation Eye-gaze tracking
AUPQ896000A0 (en) 2000-07-24 2000-08-17 Seeing Machines Pty Ltd Facial image processing system
US20030043268A1 (en) 2001-06-26 2003-03-06 Mann W. Stephen G. EyeTap vehicle or vehicle controlled by headworn camera, or the like
US6873314B1 (en) 2000-08-29 2005-03-29 International Business Machines Corporation Method and system for the recognition of reading skimming and scanning from eye-gaze patterns
US6608615B1 (en) * 2000-09-19 2003-08-19 Intel Corporation Passive gaze-driven browsing
GB0023172D0 (en) 2000-09-20 2000-11-01 Minter Kemp Martin J Wakeup vizor
US7016532B2 (en) 2000-11-06 2006-03-21 Evryx Technologies Image capture and identification system and process
JP2002143094A (en) 2000-11-07 2002-05-21 Nac Image Technology Inc Visual axis detector
EP1357831A2 (en) 2001-02-09 2003-11-05 Sensomotoric Instruments GmbH Multidimensional eye tracking and position measurement system
DE10108064A1 (en) 2001-02-20 2002-09-05 Siemens Ag Linked eye tracking information within an augmented reality system
GB2372683A (en) 2001-02-23 2002-08-28 Ibm Eye tracking display apparatus
US6959102B2 (en) 2001-05-29 2005-10-25 International Business Machines Corporation Method for increasing the signal-to-noise in IR-based eye gaze trackers
US6886137B2 (en) 2001-05-29 2005-04-26 International Business Machines Corporation Eye gaze control of dynamic information presentation
GB0113533D0 (en) 2001-06-05 2001-07-25 Brigantia Software Ltd Apparatus and method for testing visual response
GB0119859D0 (en) 2001-08-15 2001-10-10 Qinetiq Ltd Eye tracking system
US6927694B1 (en) 2001-08-20 2005-08-09 Research Foundation Of The University Of Central Florida Algorithm for monitoring head/eye motion for driver alertness with one camera
US6997556B2 (en) 2001-10-01 2006-02-14 Ernst Pfleger Method for detecting, evaluating, and analyzing look sequences
AUPR872301A0 (en) 2001-11-08 2001-11-29 Sleep Diagnostics Pty Ltd Alertness monitor
DE10297574B4 (en) 2001-12-21 2009-09-10 Sensomotoric Instruments Gmbh Method and device for eye detection
US6659611B2 (en) 2001-12-28 2003-12-09 International Business Machines Corporation System and method for eye gaze tracking using corneal image mapping
US7197165B2 (en) 2002-02-04 2007-03-27 Canon Kabushiki Kaisha Eye tracking using image data
US7206435B2 (en) 2002-03-26 2007-04-17 Honda Giken Kogyo Kabushiki Kaisha Real-time eye detection and tracking under various light conditions
US6919907B2 (en) 2002-06-20 2005-07-19 International Business Machines Corporation Anticipatory image capture for stereoscopic remote viewing with foveal priority
US20150150455A9 (en) 2002-07-03 2015-06-04 Epley Research, Llc Stimulus-evoked vestibular evaluation system, method and apparatus
US20040061680A1 (en) 2002-07-10 2004-04-01 John Taboada Method and apparatus for computer control
NL1021496C2 (en) 2002-09-19 2004-03-22 Joannes Hermanus Heiligers Signaling Device and method for monitoring alertness of people.
US6943754B2 (en) 2002-09-27 2005-09-13 The Boeing Company Gaze tracking system, eye-tracking assembly and an associated method of calibration
SE524003C2 (en) 2002-11-21 2004-06-15 Tobii Technology Ab Process and plant for the detection and tracking an eye and the gaze angle
US7233684B2 (en) * 2002-11-25 2007-06-19 Eastman Kodak Company Imaging method and system using affective information
US7206022B2 (en) * 2002-11-25 2007-04-17 Eastman Kodak Company Camera system with eye monitoring
US6637883B1 (en) * 2003-01-23 2003-10-28 Vishwas V. Tengshe Gaze tracking system and method
US7306337B2 (en) 2003-03-06 2007-12-11 Rensselaer Polytechnic Institute Calibration-free gaze tracking under natural head movement
AU2003901528A0 (en) 2003-03-31 2003-05-01 Seeing Machines Pty Ltd Eye tracking system and method
JP2004310470A (en) 2003-04-07 2004-11-04 Advanced Telecommunication Research Institute International System for deciding and combining eye-gaze position and computer-executable program
US7259785B2 (en) * 2003-04-28 2007-08-21 Hewlett-Packard Development Company, L.P. Digital imaging method and apparatus using eye-tracking control
US7401920B1 (en) 2003-05-20 2008-07-22 Elbit Systems Ltd. Head mounted eye tracking and display system
US7391888B2 (en) 2003-05-30 2008-06-24 Microsoft Corporation Head pose assessment methods and systems
US7068815B2 (en) * 2003-06-13 2006-06-27 Sarnoff Corporation Method and apparatus for ground detection and removal in vision systems
US7145550B2 (en) 2003-08-08 2006-12-05 Lucent Technologies Inc. Method and apparatus for reducing repetitive motion injuries in a computer user
US20050047629A1 (en) 2003-08-25 2005-03-03 International Business Machines Corporation System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking
KR20050025927A (en) 2003-09-08 2005-03-14 유웅덕 The pupil detection method and shape descriptor extraction method for a iris recognition, iris feature extraction apparatus and method, and iris recognition system and method using its
US7963652B2 (en) 2003-11-14 2011-06-21 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
CA2545202C (en) 2003-11-14 2014-01-14 Queen's University At Kingston Method and apparatus for calibration-free eye tracking
US7365738B2 (en) 2003-12-02 2008-04-29 International Business Machines Corporation Guides and indicators for eye movement monitoring systems
US7561143B1 (en) 2004-03-19 2009-07-14 The University of the Arts Using gaze actions to interact with a display
GB2412431B (en) 2004-03-25 2007-11-07 Hewlett Packard Development Co Self-calibration for an eye tracker
US7331671B2 (en) 2004-03-29 2008-02-19 Delphi Technologies, Inc. Eye tracking method based on correlation and detected eye movement
US20050227217A1 (en) * 2004-03-31 2005-10-13 Wilson Andrew D Template matching on interactive surface
CA2967756C (en) * 2004-04-01 2018-08-28 Google Inc. Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US20110077548A1 (en) 2004-04-01 2011-03-31 Torch William C Biosensors, communicators, and controllers monitoring eye movement and methods for using them
PT1607840E (en) 2004-06-18 2015-05-20 Tobii Ab Eye control of computer apparatus
JP4560368B2 (en) 2004-10-08 2010-10-13 キヤノン株式会社 Eye detecting device and an image display device
JP4632417B2 (en) * 2004-10-26 2011-02-23 キヤノン株式会社 The imaging device, and a control method thereof
US20060209013A1 (en) * 2005-03-17 2006-09-21 Mr. Dirk Fengels Method of controlling a machine connected to a display by line of vision
US7676063B2 (en) 2005-03-22 2010-03-09 Microsoft Corp. System and method for eye-tracking and blink detection
BRPI0614807B1 (en) 2005-08-11 2018-02-14 Sleep Diagnostics Pty Ltd “glass frames for use in eye control system”
US7580545B2 (en) 2005-10-03 2009-08-25 Avago Technologies General Ip (Singapore) Pte. Ltd. Method and system for determining gaze direction in a pupil detection system
WO2007056287A2 (en) 2005-11-04 2007-05-18 Eye Tracking, Inc. Generation of test stimuli in visual media
WO2007062478A1 (en) 2005-11-30 2007-06-07 Seeing Machines Pty Ltd Visual tracking of eye glasses in visual head and eye tracking systems
US7522344B1 (en) 2005-12-14 2009-04-21 University Of Central Florida Research Foundation, Inc. Projection-based head-mounted display with eye-tracking capabilities
DE102006002001B4 (en) 2006-01-16 2009-07-23 Sensomotoric Instruments Gmbh Method for determining the spatial relationship of a person's eye with respect to a camera apparatus
US7747068B1 (en) 2006-01-20 2010-06-29 Andrew Paul Smyth Systems and methods for tracking the eye
CN101336089A (en) 2006-01-26 2008-12-31 诺基亚公司 Eye tracker equipment
US9213404B2 (en) 2006-02-01 2015-12-15 Tobii Technology Ab Generation of graphical feedback in a computer system
WO2007092512A2 (en) 2006-02-07 2007-08-16 Attention Technologies, Inc. Driver drowsiness and distraction monitor
US8064647B2 (en) 2006-03-03 2011-11-22 Honeywell International Inc. System for iris detection tracking and recognition at a distance
US7542210B2 (en) 2006-06-29 2009-06-02 Chirieleison Sr Anthony Eye tracking head mounted display
US7986816B1 (en) 2006-09-27 2011-07-26 University Of Alaska Methods and systems for multiple factor authentication using gaze tracking and iris scanning
US7646422B2 (en) 2006-10-04 2010-01-12 Branislav Kisacanin Illumination and imaging system with glare reduction and method therefor
US8225229B2 (en) 2006-11-09 2012-07-17 Sony Mobile Communications Ab Adjusting display brightness and/or refresh rates based on eye tracking
SE0602545L (en) 2006-11-29 2008-05-30 Tobii Technology Ab Eye tracking illumination
US7783077B2 (en) 2006-12-01 2010-08-24 The Boeing Company Eye gaze tracker system and method
KR100850357B1 (en) 2006-12-06 2008-08-04 한국전자통신연구원 System and method for tracking gaze
US20080166052A1 (en) 2007-01-10 2008-07-10 Toshinobu Hatano Face condition determining device and imaging device
WO2008141460A1 (en) 2007-05-23 2008-11-27 The University Of British Columbia Methods and apparatus for estimating point-of-gaze in three dimensions
US7556377B2 (en) 2007-09-28 2009-07-07 International Business Machines Corporation System and method of detecting eye fixations using adaptive thresholds
ITRM20070526A1 (en) * 2007-10-05 2009-04-06 Univ Roma Apparatus acquisition and processing of information related to human activities eyepieces
US20090196460A1 (en) 2008-01-17 2009-08-06 Thomas Jakobs Eye tracking system and method
CN101945612B (en) 2008-02-14 2013-09-25 诺基亚公司 Device and method for determining gaze direction
TWI349214B (en) 2008-05-05 2011-09-21 Utechzone Co Ltd
WO2010003410A1 (en) 2008-07-08 2010-01-14 It-University Of Copenhagen Eye gaze tracking
EP2303627A4 (en) 2008-07-18 2015-07-29 Optalert Pty Ltd Alertness sensing device
US7736000B2 (en) * 2008-08-27 2010-06-15 Locarna Systems, Inc. Method and apparatus for tracking eye movement
TWI432172B (en) 2008-10-27 2014-04-01 Utechzone Co Ltd
US8730266B2 (en) 2008-11-13 2014-05-20 Queen's University At Kingston System and method for integrating gaze tracking with virtual reality or augmented reality
US20100128118A1 (en) 2008-11-26 2010-05-27 Locarna Systems, Inc. Identification of visual fixations in a video stream
WO2010071928A1 (en) 2008-12-22 2010-07-01 Seeing Machines Limited Automatic calibration of a gaze direction algorithm from user behaviour
US8401248B1 (en) * 2008-12-30 2013-03-19 Videomining Corporation Method and system for measuring emotional and attentional response to dynamic digital media content
EP2389095B1 (en) 2009-01-26 2014-10-01 Tobii Technology AB Detection of gaze point assisted by optical reference signals
US7819525B2 (en) 2009-02-15 2010-10-26 International Business Machines Corporation Automatic direct gaze detection based on pupil symmetry
CN101807110B (en) 2009-02-17 2012-07-04 由田新技股份有限公司 Pupil positioning method and system
WO2010102037A2 (en) 2009-03-03 2010-09-10 The Ohio State University Gaze tracking measurement and training system and method
WO2010106414A1 (en) 2009-03-16 2010-09-23 Nokia Corporation A controller for a directional antenna and associated apparatus and methods
TWI398796B (en) 2009-03-27 2013-06-11 Utechzone Co Ltd
EP2237237B1 (en) 2009-03-30 2013-03-20 Tobii Technology AB Eye closure detection using structured illumination
EP2236074A1 (en) 2009-04-01 2010-10-06 Tobii Technology AB Visual display with illuminators for gaze tracking
WO2010118292A1 (en) 2009-04-09 2010-10-14 Dynavox Systems, Llc Calibration free, motion tolerant eye-gaze direction detector with contextually aware computer interaction and communication methods
US20100295774A1 (en) 2009-05-19 2010-11-25 Mirametrix Research Incorporated Method for Automatic Mapping of Eye Tracker Data to Hypermedia Content
GB0909126D0 (en) 2009-05-27 2009-07-01 Qinetiq Ltd Eye tracking apparatus
US8320623B2 (en) 2009-06-17 2012-11-27 Lc Technologies, Inc. Systems and methods for 3-D target location
EP2275020B1 (en) 2009-07-16 2018-02-21 Tobii AB Eye detection system and method using sequential data flow
TWI397865B (en) 2009-08-12 2013-06-01 Utechzone Co Ltd
EP2470061A1 (en) 2009-08-26 2012-07-04 Ecole Polytechnique Fédérale de Lausanne (EPFL) Wearable systems for audio, visual and gaze monitoring
IL200627A (en) * 2009-08-27 2014-05-28 Erez Berkovich Method for varying dynamically a visible indication on display
US8323216B2 (en) 2009-09-29 2012-12-04 William Fabian System and method for applied kinesiology feedback
EP2309307A1 (en) 2009-10-08 2011-04-13 Tobii Technology AB Eye tracking using a GPU
EP2502410B1 (en) * 2009-11-19 2019-05-01 eSight Corporation A method for augmenting sight
US20110170061A1 (en) 2010-01-08 2011-07-14 Gordon Gary B Gaze Point Tracking Using Polarized Light
US9507418B2 (en) 2010-01-21 2016-11-29 Tobii Ab Eye tracker based contextual action
US8552850B2 (en) 2010-02-17 2013-10-08 Honeywell International Inc. Near-to-eye tracking for adaptive operation
US20110262887A1 (en) 2010-04-21 2011-10-27 Lc Technologies Inc. Systems and methods for gaze based attention training
CN101901485B (en) 2010-08-11 2014-12-03 华中科技大学 3D free head moving type gaze tracking system
WO2012021967A1 (en) 2010-08-16 2012-02-23 Tandemlaunch Technologies Inc. System and method for analyzing three-dimensional (3d) media content
US8941559B2 (en) 2010-09-21 2015-01-27 Microsoft Corporation Opacity filter for display device
WO2012083415A1 (en) 2010-11-15 2012-06-28 Tandemlaunch Technologies Inc. System and method for interacting with and analyzing media on a display using eye gaze tracking
US8408706B2 (en) 2010-12-13 2013-04-02 Microsoft Corporation 3D gaze tracker
US9213405B2 (en) * 2010-12-16 2015-12-15 Microsoft Technology Licensing, Llc Comprehension and intent-based content for augmented reality displays
US9690099B2 (en) 2010-12-17 2017-06-27 Microsoft Technology Licensing, Llc Optimized focal area for augmented reality displays
US8510166B2 (en) * 2011-05-11 2013-08-13 Google Inc. Gaze tracking system

Also Published As

Publication number Publication date
EP2710516A2 (en) 2014-03-26
US20150169050A1 (en) 2015-06-18
US20120294478A1 (en) 2012-11-22
CA2836777A1 (en) 2012-11-29
JP2014520314A (en) 2014-08-21
WO2012162204A3 (en) 2013-03-14
US8885877B2 (en) 2014-11-11
EP2710516A4 (en) 2014-10-15
WO2012162204A2 (en) 2012-11-29
US9405365B2 (en) 2016-08-02
CN103748598B (en) 2017-06-09
CN103748598A (en) 2014-04-23

Similar Documents

Publication Publication Date Title
US9971156B2 (en) See-through computer display systems
EP0350957B1 (en) Image pickup apparatus
US9651784B2 (en) See-through computer display systems
ES2327633T3 (en) Installation to see and keep an eye and gaze direction thereof.
US9841599B2 (en) Optical configurations for head-worn see-through displays
US9720241B2 (en) Content presentation in head worn computing
EP2923638B1 (en) Optical measuring device and system
EP0604430B1 (en) Gaze tracking for field analyzer
US7809160B2 (en) Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
JP5887026B2 (en) Head mounted system and method for computing and rendering a stream of digital images using the head mounted system
CN103091843B (en) Through display brightness control
US8824779B1 (en) Apparatus and method for determining eye gaze from stereo-optic views
US20180267605A1 (en) Eye glint imaging in see-through computer display systems
US9594246B2 (en) See-through computer display systems
US9310610B2 (en) See-through computer display systems
US10078224B2 (en) See-through computer display systems
US20160207457A1 (en) System for assisted operator safety using an hmd
US20130176533A1 (en) Structured Light for Eye-Tracking
JP4750721B2 (en) Custom glasses manufacturing method
US9671613B2 (en) See-through computer display systems
US10036889B2 (en) Head worn computer display systems
US20160085071A1 (en) See-through computer display systems
US20160131904A1 (en) Power management for head worn computing
RU2565482C2 (en) System and method for tracing point of observer's look
US9529195B2 (en) See-through computer display systems

Legal Events

Code  Title  (Free format text; effective date)

A621  Written request for application examination  (JAPANESE INTERMEDIATE CODE: A621; effective date 20150519)
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523; effective date 20150519)
A131  Notification of reasons for refusal  (JAPANESE INTERMEDIATE CODE: A131; effective date 20160614)
A601  Written request for extension of time  (JAPANESE INTERMEDIATE CODE: A601; effective date 20160831)
A601  Written request for extension of time  (JAPANESE INTERMEDIATE CODE: A601; effective date 20161110)
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523; effective date 20161213)
A131  Notification of reasons for refusal  (JAPANESE INTERMEDIATE CODE: A131; effective date 20170207)
A601  Written request for extension of time  (JAPANESE INTERMEDIATE CODE: A601; effective date 20170501)
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523; effective date 20170803)
A02   Decision of refusal  (JAPANESE INTERMEDIATE CODE: A02; effective date 20170912)
A711  Notification of change in applicant  (JAPANESE INTERMEDIATE CODE: A711; effective date 20171010)
A521  Written amendment  (JAPANESE INTERMEDIATE CODE: A523; effective date 20180111)
A911  Transfer of reconsideration by examiner before appeal (zenchi)  (JAPANESE INTERMEDIATE CODE: A911; effective date 20180117)
TRDD  Decision of grant or rejection written
A01   Written decision to grant a patent or to grant a registration (utility model)  (JAPANESE INTERMEDIATE CODE: A01; effective date 20180213)
A61   First payment of annual fees (during grant procedure)  (JAPANESE INTERMEDIATE CODE: A61; effective date 20180313)
R150  Certificate of patent or registration of utility model  (JAPANESE INTERMEDIATE CODE: R150; Ref document number: 6308940; Country of ref document: JP)