WO2014033722A1 - Computer vision stereoscopic tracking of a hand - Google Patents


Info

Publication number
WO2014033722A1
WO2014033722A1 PCT/IL2013/050745 IL2013050745W WO2014033722A1 WO 2014033722 A1 WO2014033722 A1 WO 2014033722A1 IL 2013050745 W IL2013050745 W IL 2013050745W WO 2014033722 A1 WO2014033722 A1 WO 2014033722A1
Authority
WO
WIPO (PCT)
Prior art keywords
hand
user
movement
display
detecting
Prior art date
Application number
PCT/IL2013/050745
Other languages
French (fr)
Inventor
Eran Eilat
Original Assignee
Pointgrab Ltd.
Priority date
Filing date
Publication date
Priority to US 61/696,252 (US201261696252P)
Application filed by Pointgrab Ltd. filed Critical Pointgrab Ltd.
Publication of WO2014033722A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

A method for computer vision based tracking of a user's hand is provided to control movement of displayed content. The method includes tracking movement of a user's hand through a sequence of images, translating the location of the hand on a virtual arc-like surface (a surface which is defined by movement of the user's hand) to a location on a two dimensional display, and moving content on the display according to the location of the hand on the virtual surface. Content may be moved based on an angle of pitch of the hand.

Description

COMPUTER VISION STEREOSCOPIC TRACKING OF A HAND

FIELD OF THE INVENTION

[0001] The present invention relates to the field of machine-user interaction. Specifically, the invention relates to user control of electronic devices based on movement of the user's hand and to translation of the user's hand movement to movement of displayed content.

BACKGROUND

[0002] The need for more convenient, intuitive and portable input devices increases as computers and other electronic devices become more prevalent in our everyday life.

[0003] Recently, human gesturing, such as hand gesturing, has been suggested as a user interface input tool in which a hand gesture is detected by a camera and is translated into a specific command. Gesture recognition enables humans to interface with machines naturally without any mechanical appliances. Additionally, gesture recognition enables operating devices from a distance; the user need not touch a keyboard or a touchscreen in order to control the device.

[0004] Typically, when operating a device having a display, once a user's hand is identified, an icon appears on the display to symbolize the user's hand, and movement of the user's hand is translated to movement of the icon on the display of the device. The user may move his hand to bring the icon to a desired location on the display to interact with the display at that location (e.g., to emulate a mouse right or left click by hand posturing or gesturing). Similarly, movement of the user's hand may be translated to movement of other content displayed on a screen.

[0005] Usually, movement of a user's hand up/down and from side to side on a plane in space (a virtual control plane) is translated to up/down and side to side movements on the plane of the screen. However, holding up a hand to gesture over an extended period of time may be tiring and inconvenient for the user.

SUMMARY

[0006] A method for computer vision based tracking of a user's hand, according to embodiments of the invention, provides the user with the option of resting his/her arm while gesturing.

[0007] Gesturing while resting an arm or elbow on a support, such as a table top, may include horizontal movement of the hand which is similar to the horizontal movement of an un-supported hand. However, vertical movement is not possible for a hand resting on a support without leaving the support. The virtual control surface for a hand resting on a support is typically a spherical or arc-like surface rather than the two dimensional flat plane which is usually used in touchless control of devices.

[0008] Embodiments of the present invention make it possible to translate movement of a hand, even a hand resting on a support, to both horizontal and vertical movements on a display; namely, embodiments of the invention propose a control mode in which the virtual control surface is a sphere determined by the location of a user's hand for a fixed position of the user's elbow.

[0009] Movement of an arm which is resting on a support (or the elbow of which is resting on a support) towards or away from the camera (on the Z axis) will typically change the angle of pitch of the arm.

[0010] According to one embodiment a method for tracking a user's hand to control movement of displayed content, includes tracking movement of a user's hand through a sequence of images; detecting an angle of pitch of the hand; and moving content on a display according to the detected angle of pitch.

[0011] Thus, a hand may be moved without the arm/elbow leaving the support, and this movement of a hand on the Z axis may be translated to a vertical movement (on the Y axis) of content on the screen.

BRIEF DESCRIPTION OF THE FIGURES

[0012] The invention will now be described in relation to certain examples and embodiments with reference to the following illustrative figures so that it may be more fully understood. In the figures:

[0013] Figs. 1A-C schematically illustrate a method and system for computer vision based tracking of a user's hand to control movement of displayed content, according to one embodiment of the invention;

[0014] Fig. 2 schematically illustrates a method which includes detecting a change in the pitch angle and moving the content on a vertical axis of the display according to the change in pitch angle, according to an embodiment of the invention;

[0015] Fig. 3 schematically illustrates a method including detecting a left-right movement of the hand and moving content on a horizontal axis of the display according to the detected left-right movement, according to an embodiment of the invention;

[0016] Fig. 4 schematically illustrates a method in which detecting an angle of pitch of the hand comprises detecting a change of shape of the hand, according to an embodiment of the invention;

[0017] Fig. 5 schematically illustrates a method in which location of display of content is based on the initial location of the user's hand, according to an embodiment of the invention; and

[0018] Fig. 6 schematically illustrates a system for identifying a posture of a user's hand according to an embodiment of the invention.

DETAILED DESCRIPTION OF THE INVENTION

[0019] Methods according to embodiments of the invention may be implemented in a user-device interaction system (for example as schematically illustrated in Fig. 1B and described below) which includes a device to be operated and controlled by user commands, typically by touchlessly interacting with a display of the device, and a camera. According to embodiments of the invention user commands are based on identification and tracking of the user's hand. The system identifies the user's hand in the images obtained by the camera. Once a user's hand is identified it is tracked such that movement of the hand may be followed and translated into operating and control commands. For example, the device may include a display, and movement of a hand may be translated into movement of an icon or symbol, such as a cursor or any other displayed object, on the display, or into another manipulation of content on the display.

[0020] The camera may be a standard 2D camera and may be associated with a processor and a storage device for storing image data. The storage device may be integrated within the camera or may be external to the camera. According to some embodiments image data may be stored in the processor, for example in a cache memory.

[0021] In some embodiments image data of a field of view (which includes a user's hand) is sent to the processor for analysis. A user command is generated by the processor, based on the image analysis, and is sent to a device, which may be any electronic device that can accept user commands, e.g., TV, DVD player, PC, mobile phone, camera, STB (Set Top Box), streamer, etc.

[0022] According to one embodiment the device is an electronic device available with an integrated standard 2D camera. According to other embodiments a camera is an external accessory to the device. According to some embodiments more than one 2D camera is provided to enable obtaining 3D information. According to some embodiments the system includes a 3D camera.

[0023] One or more detectors may be used for correct identification of a moving object and for identification of different postures of a hand. For example, a contour detector may be used together with a feature detector.

[0024] Methods for tracking a user's hand may include using an optical flow algorithm or other known tracking methods.
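
A minimal illustration of such tracking, assuming OpenCV is available and that an initial hand bounding box has already been provided by a detector (the camera index and box coordinates below are placeholders): points inside the box are followed from frame to frame with pyramidal Lucas-Kanade optical flow, and their mean displacement is taken as the hand's motion. This is only a sketch of one possible tracking step, not the method claimed by the application.

```python
# Illustrative sketch only: pyramidal Lucas-Kanade tracking of points inside
# an (assumed) hand bounding box. The box would normally come from a hand
# detector; here it is hard-coded for demonstration.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # assumed: default webcam
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

x, y, w, h = 200, 150, 120, 160                # assumed initial hand box
mask = np.zeros_like(prev_gray)
mask[y:y + h, x:x + w] = 255
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                              qualityLevel=0.01, minDistance=5, mask=mask)

while ok and pts is not None and len(pts) > 0:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = new_pts[status.ravel() == 1]
    if len(good) > 0:
        # mean displacement of the tracked hand points in this frame
        dx, dy = (good - pts[status.ravel() == 1]).reshape(-1, 2).mean(axis=0)
        print("hand moved by", dx, dy)
    prev_gray, pts = gray, good.reshape(-1, 1, 2)

cap.release()
```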

[0025] In the following description, various aspects of the present invention will be described. For purposes of explanation, specific configurations and details are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details presented herein. Furthermore, well known features may be omitted or simplified in order not to obscure the present invention.

[0026] Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing," "computing," "calculating," "determining," or the like, refer to the action and/or processes of a computer or computing system, or similar electronic computing device, that manipulates and/or transforms data represented as physical, such as electronic, quantities within the computing system's registers and/or memories into other data similarly represented as physical quantities within the computing system's memories, registers or other such information storage, transmission or display devices.

[0027] Some devices may require more extensive gesturing than others for their operation. For example, operating a TV or video typically requires gesturing to turn the device on/off and, during operation, some gesturing to change channels and/or adjust volume, brightness or other parameters. Usually, these activities are limited during operation of a TV or video. However, operating a device such as a PC, tablet or phone typically requires more activity during operation of the device, for example, on/off, bringing up a required application or software, opening the application, working within the application etc. Extensive gesturing may be tiring for the user and the user may want to have support for his gesturing hand during operation. For example, the user may want to place his elbow on a table and continue gesturing while his arm/hand is supported by the table.

[0028] Horizontal movement of the hand is possible while resting an arm or elbow on a support; however, vertical movement is not possible for a hand resting on a support without leaving the support. Embodiments of the invention provide a solution which enables accurate translation of hand movements and the possibility of freely manipulating displayed content even while the gesturing hand is resting on a support.

[0029] The term "hand" may include a hand or any part of a hand such as one or more fingers.

[0030] Reference is now made to Figs. 1A-C which schematically illustrate methods and systems for computer vision based tracking of a user's hand to control movement of displayed content, according to embodiments of the invention.

[0031] According to one embodiment a method includes tracking movement of a user's hand through a sequence of images; detecting movement of the hand on a virtual surface (a surface defined by movement of the user's hand, in which the radius of the surface is essentially the length of the user's arm between the elbow and the tip of the hand; movement of the user's hand may include, for example, movement along a Z axis originating from the user's elbow); and moving content on a two dimensional display based on movement of the hand on the virtual surface (e.g., moving content along a Y axis of the display according to the detected movement of the hand along the Z axis).

[0032] Detecting movement on a spherical or arc-like virtual surface or along a Z axis may be done, for example, by detecting a pitch angle (as described for example in Fig. 1C), by detecting a change of size or shape of the hand (as described for example with reference to Fig. 2), by detecting a transformation of movement of selected points/pixels from within images of a hand, determining changes of scale along X and Y axes from the transformations and determining movement along the Z axis from the scale changes, or by any other appropriate method, for example, by using stereoscopy or 3D imagers.
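
By way of illustration of the scale-based option in paragraph [0032], a similarity (partial affine) transform can be fitted to hand points matched between two frames and its uniform scale read off: a scale noticeably above 1 suggests motion toward the camera along the Z axis, below 1 motion away from it. The sketch below is an assumed implementation using OpenCV's partial-affine estimator, with hypothetical point data.

```python
# Sketch: infer Z-axis (toward/away from camera) motion of a hand from the
# scale of a similarity transform fitted to tracked hand points.
import cv2
import numpy as np

def z_motion_from_points(prev_pts, curr_pts):
    """prev_pts, curr_pts: (N, 2) float32 arrays of matched hand points."""
    M, _ = cv2.estimateAffinePartial2D(prev_pts, curr_pts)
    if M is None:
        return None
    # For a partial affine [sR | t], the uniform scale is the norm of the
    # first column of the rotation-scale block.
    scale = float(np.hypot(M[0, 0], M[1, 0]))
    if scale > 1.02:
        return "toward camera (hand appears larger)"
    if scale < 0.98:
        return "away from camera (hand appears smaller)"
    return "no significant Z motion"

# Hypothetical matched points for demonstration: the hand grew by ~10%.
prev = np.array([[100, 100], [180, 100], [180, 180], [100, 180]], np.float32)
curr = prev * 1.1
print(z_motion_from_points(prev, curr))
```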

[0033] Alternatively, detecting an angle of pitch of the hand may be done by detecting movement of the hand along a Z axis, detecting a change of size, detecting a change of aspect ratio or detecting a shape of the hand. Movement along a Z axis relative to the camera imaging the hand results in a change of the location of the hand on the Y axis (relative to the camera imaging the hand) too, although the changes of location on the Y axis are smaller than the changes of location along the Z axis. The location of the hand on the Y axis may also be used, alone or in combination with the location of the hand on the Z axis and/or in combination with other parameters, to detect the angle of pitch of the hand.

[0034] According to one embodiment the method includes tracking movement of a user's hand through a sequence of images (102); translating location of the hand on a virtual surface to a location on a 2 dimensional display (106) and moving content on the display according to the location of the hand on the virtual surface (108).

[0035] A spherical or arc-like virtual surface may be calculated, for example, by detecting the user's hand or finger or other point at the end of the user's arm and estimating the location of the user's elbow (e.g., by analyzing the user's hand motions), thereby extracting the radius of the sphere or arc. A location on the virtual surface can be translated to a location on a display using, for example, epipolar geometry.
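
To make the translation in paragraph [0035] concrete: once the elbow (the centre of the arc-like surface) and the arm length (its radius) are estimated, the hand's position on the surface can be described by two angles about the elbow and mapped linearly to display coordinates. The following sketch illustrates that idea; the angle ranges, screen size and coordinate convention are assumptions rather than details taken from the application.

```python
# Sketch: map a hand position on an elbow-centred arc-like surface to a point
# on a 2D display. Angle ranges and screen size are assumptions.
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080
YAW_RANGE = (-0.6, 0.6)      # assumed left-right sweep of the forearm (rad)
PITCH_RANGE = (0.2, 1.3)     # assumed pitch range of the forearm (rad)

def arc_to_display(hand_xyz, elbow_xyz):
    """hand_xyz, elbow_xyz: 3D positions (X right, Y up, Z toward camera)."""
    v = np.asarray(hand_xyz, float) - np.asarray(elbow_xyz, float)
    pitch = np.arccos(np.clip(v[1] / np.linalg.norm(v), -1.0, 1.0))  # from vertical
    yaw = np.arctan2(v[0], v[2])                                     # left-right
    u = (yaw - YAW_RANGE[0]) / (YAW_RANGE[1] - YAW_RANGE[0])
    w = (pitch - PITCH_RANGE[0]) / (PITCH_RANGE[1] - PITCH_RANGE[0])
    x = int(np.clip(u, 0, 1) * (SCREEN_W - 1))
    y = int(np.clip(w, 0, 1) * (SCREEN_H - 1))   # larger pitch -> lower on screen
    return x, y

# Example: hand roughly 35 cm from the elbow, tilted forward and slightly right.
print(arc_to_display(hand_xyz=(0.10, 0.20, 0.25), elbow_xyz=(0.0, 0.0, 0.0)))
```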

[0036] According to another embodiment the method includes tracking movement of a user's hand through a sequence of images (112); detecting a pitch angle of the hand (114); and moving content on a display according to the detected pitch angle (116). For example, a user 10 may be using hand gestures and/or postures to operate a device 11, such as a PC, having a display 17. A camera 15 which is in communication with the device 11 obtains images of the user's hand 14 and a processor 122 translates the hand 14 posture and/or movement to operating commands, such as to movement of content on the display 17 of the device according to movement of the hand 14.

[0037] Communication between the image sensor or camera 15 and processor 122 and/or between the processor 122 and the device 11 may be through a wired or wireless link, such as through infrared (IR) communication, radio transmission, Bluetooth technology and other suitable communication routes.

[0038] According to embodiments of the invention a hand or a part of a hand may be detected by applying shape recognition algorithms to identify the hand by its shape. Shape recognition algorithms may include, for example, an algorithm which calculates Haar-like features in a Viola-Jones object detection framework.
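
For illustration, the Viola-Jones framework is exposed in OpenCV through its cascade classifier interface; the sketch below scans a frame for hand-shaped regions. The cascade file name is an assumption (a trained hand cascade would have to be supplied; OpenCV does not ship one), as is the input image.

```python
# Sketch: detect a hand-shaped region with a Viola-Jones cascade classifier.
# "hand_cascade.xml" is a placeholder; a trained hand cascade must be provided.
import cv2

detector = cv2.CascadeClassifier("hand_cascade.xml")   # assumed cascade file
frame = cv2.imread("frame.jpg")                         # assumed input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

hands = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                  minSize=(40, 40))
for (x, y, w, h) in hands:
    print("candidate hand at", (x, y), "size", (w, h))
```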

[0039] When discussed herein, a processor such as processor 122, which may carry out all or part of a method as discussed herein, may be configured to carry out the method by, for example, being associated with or connected to a memory storing code or software which, when executed by the processor, carries out the method.

[0040] Different embodiments are disclosed herein. Features of certain embodiments may be combined with features of other embodiments; thus certain embodiments may be combinations of features of multiple embodiments.

[0041] Embodiments of the invention may include an article such as a computer or processor readable non-transitory storage medium, such as for example a memory, a disk drive, or a USB flash memory, encoding, including or storing instructions, e.g., computer-executable instructions, which when executed by a processor or controller, cause the processor or controller to carry out methods disclosed herein.

[0042] Processor 122 may include, for example, a central processing unit (CPU), a digital signal processor (DSP), a microprocessor, a controller, a chip, a microchip, an integrated circuit (IC), cache memory, or any other suitable multi-purpose or specific processor or controller, and may comprise one or more processors. The system may also include a storage device, such as, for example, a random access memory (RAM), a dynamic RAM (DRAM), a flash memory, a volatile memory, a non-volatile memory, a cache memory, a buffer, a short term memory unit, a long term memory unit (e.g. a disk drive), or other suitable memory units or storage units.

[0043] The user 10 may place his elbow 144 on a support 13 (such as a desk top or an arm of an armchair or other arm rests) to rest his arm 12 so as not to tire while gesturing. Movement of the user's arm on arc-like surface 16 (which is the virtual surface defined by movement of the arm 12 and/or hand 14 or by the location of a user's hand or part of hand for a fixed position of the user's elbow 144) can then be translated to movement of content on display 17.

[0044] According to one embodiment movement of an arm 12 from side to side may be translated to horizontal movement of content on a display, whereas movement of the hand 14 towards and away from the display 17 (on the Z axis) may be translated to vertical movement of content on a display, thus providing horizontal and vertical movement of displayed content without the user having to lift his arm/hand from the support 13.

[0045] Movement of the arm 12 or hand 14 towards and away from the display is a movement having a pitch angle, the pitch angle α typically being the angle between the user's arm 12 and a perpendicular axis (Y) originating from the user's elbow 144. The pitch angle may be used to determine the location of the hand (e.g., the location of the hand on a spherical or arc-like surface, the center of which is the user's elbow, can be detected using spherical trigonometry) and also movement of the hand along a Z axis.
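
A minimal sketch of computing the pitch angle α of paragraph [0045], assuming 3D estimates of the elbow and hand positions are available (for example from a 3D camera or stereo views): α is the angle between the elbow-to-hand vector and the vertical (Y) axis.

```python
# Sketch: pitch angle between the forearm (elbow -> hand) and the vertical axis.
import numpy as np

def pitch_angle(elbow_xyz, hand_xyz):
    v = np.asarray(hand_xyz, float) - np.asarray(elbow_xyz, float)
    y_axis = np.array([0.0, 1.0, 0.0])
    cos_a = np.dot(v, y_axis) / np.linalg.norm(v)
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

# Upright forearm -> small angle; forearm tilted toward the camera -> larger angle.
print(pitch_angle((0, 0, 0), (0.0, 0.35, 0.05)))   # roughly 8 degrees
print(pitch_angle((0, 0, 0), (0.0, 0.20, 0.30)))   # roughly 56 degrees
```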

[0046] According to some embodiments a three dimensional camera may be used to obtain the sequence of images. According to other embodiments a plurality of two dimensional cameras may be used to obtain the sequence of images. The cameras may be positioned relative to each other to obtain a stereoscopic view of the user's hand and movement along a Z axis may be determined from the stereoscopic views of the hand.
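
As an assumed illustration of the stereoscopic option, with rectified cameras the hand's distance can be recovered from the disparity d between the two views as Z ≈ f·B/d, where f is the focal length in pixels and B the baseline between the cameras. The block-matching sketch below shows the idea; the focal length, baseline, file names and hand bounding box are all placeholders.

```python
# Sketch: estimate the hand's distance (Z) from a rectified stereo pair using
# block matching. Focal length, baseline and file names are assumptions.
import cv2
import numpy as np

FOCAL_PX = 700.0      # assumed focal length in pixels
BASELINE_M = 0.06     # assumed distance between the two cameras, in metres

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point

x, y, w, h = 200, 150, 120, 160          # assumed hand bounding box in the left view
hand_disp = disparity[y:y + h, x:x + w]
d = np.median(hand_disp[hand_disp > 0])  # median valid disparity over the hand

if d > 0:
    z = FOCAL_PX * BASELINE_M / d        # Z is approximately f * B / d
    print("hand is roughly %.2f m from the cameras" % z)
```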

[0047] According to some embodiments the method includes detecting a change in the pitch angle and moving the content on a vertical axis of the display according to the change in pitch angle, for example, as schematically illustrated in Fig. 2. A hand 24 in a first position (a) creates a pitch angle α whereas the hand 24 in a second position (b) creates a pitch angle β. According to one embodiment detecting the change of the angle from α to β causes movement of an icon 23 on a display 27. According to one embodiment if the pitch angle increases (β > α) the content (e.g., icon 23) is moved towards the bottom of the display 27 (to location b') and if the angle decreases the content is moved towards the top of the display 27 (to location a'). Icon 23 may be a cursor or any other suitable symbol.
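
One simple (assumed) realization of this mapping normalizes the measured pitch angle into a 0 to 1 range and uses it directly as the icon's vertical position, so an increasing angle (α to β) moves the icon toward the bottom of the display and a decreasing angle moves it toward the top. The pitch range and screen height below are placeholders.

```python
# Sketch: map the detected pitch angle to a vertical icon position on the display.
SCREEN_H = 1080
PITCH_MIN_DEG = 10.0    # assumed pitch when the forearm is nearly upright
PITCH_MAX_DEG = 80.0    # assumed pitch when the forearm leans far forward

def icon_row(pitch_deg):
    t = (pitch_deg - PITCH_MIN_DEG) / (PITCH_MAX_DEG - PITCH_MIN_DEG)
    t = min(max(t, 0.0), 1.0)
    return int(t * (SCREEN_H - 1))   # larger pitch -> closer to the bottom

print(icon_row(15.0))   # near the top of the display
print(icon_row(70.0))   # near the bottom of the display
```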

[0048] According to some embodiments movement of content on a vertical axis of the display is dependent on detecting a change in size or aspect ratio of the hand. For example, when the hand 24 is moved from its location in a to its location in b, the hand 24 comes closer to the imager 25 and thus grows larger in the images obtained by imager 25. The change of size or aspect ratio of the hand 24 may be detected (for example, by using shape recognition to detect scale changes or by detecting a change of transformation of selected points/pixels from within images of a hand between images in a sequence of images or by other appropriate methods) and may be translated to movement of content on a vertical axis of the display 27. For example, when a hand grows larger content may be moved towards the bottom of the display (e.g., to location b') and when a hand grows smaller content may be moved towards the upper part of the display (e.g., to location a').

[0049] Thus, according to one embodiment, the size of the user's hand may be an indication of the angle of pitch and may be used in calculating the angle of pitch. This embodiment may be advantageous when operating devices from a large distance (e.g., from a distance of 5 meters from the device).

[0050] Other parameters, such as the location of the hand on a Z axis and/or Y axis or a change of size or aspect ratio, may be an indication of the angle of pitch and may be used to detect it.

[0051] According to some embodiments the method may include detecting a left-right movement of the hand and moving content on a horizontal axis of the display according to the detected left-right movement, as schematically illustrated in Fig. 3.

[0052] Horizontal (left-right) movement of a supported hand (e.g., supported on surface 33) may resemble the left-right movement of an un-supported hand during short movements (short movements being schematically shown by arrows S and long movements being schematically shown by arrows L). However, the left-right movement of a supported hand 34 is actually a rotational motion of the hand 34 about a point. The point may be the user's elbow 304 or the user's wrist 305. Thus, during longer movements of the hand, or towards the end of the motion, a supported hand's movement may be different from an un-supported hand's horizontal motion.

[0053] According to one embodiment of the invention a user's hand may be detected and tracked, rotational motion of the user's hand may be detected from the tracking of the hand, and displayed content may be moved horizontally on a display according to the rotational motion of the user's hand.

[0054] Fig. 4 schematically illustrates a method according to an embodiment of the invention, in which detecting an angle of pitch of the hand comprises detecting a change of shape of the hand.

[0055] When a hand 44 held in a specific posture (e.g., a hand with all finger tips brought together or a hand in a fist) is viewed by a camera 45 while the hand is at a first angle of pitch (α), the image of the hand (b) is different than the image (b') captured of that same hand 44 in the same posture but at a different angle of pitch (α').

[0056] According to one embodiment the method includes identifying a posture or gesture of the hand and activating a user command according to the identified posture or gesture.

[0057] For example, when the posture of the hand is a hand with all finger tips brought together or a hand in a fist a "select" user command may be activated. Displayed content may thus be selected and moved according to movement of the hand while in the "select" posture. Movement of the hand in the Z axis may be translated to vertical movement of an icon or other content on the display as described above.

[0058] According to one embodiment the method includes detecting a shape of the user's hand within a sequence of images; comparing the detected shape to a database of hands; and identifying a posture of the hand based on the comparison. Typically, the database of hands comprises hands positioned at different angles relative to a camera used to obtain the sequence of images.
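
One possible (assumed) way to carry out the comparison is to describe each hand by its outer contour and rank the database entries by a contour-similarity score, for example with OpenCV's Hu-moment based shape matcher, as in the sketch below. The database format and file names are hypothetical.

```python
# Sketch: identify the closest database hand posture by contour similarity.
# The database here is a hypothetical dict of binary template images.
import cv2

def main_contour(binary_img):
    contours, _ = cv2.findContours(binary_img, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)

def best_match(hand_binary, database):
    """database: {posture_name: binary template image of a hand}."""
    query = main_contour(hand_binary)
    scores = {name: cv2.matchShapes(query, main_contour(tpl),
                                    cv2.CONTOURS_MATCH_I1, 0.0)
              for name, tpl in database.items()}
    return min(scores, key=scores.get)   # lowest score = most similar shape

# Usage (hypothetical file names):
# db = {"fist_pitch_30": cv2.imread("fist_30.png", 0),
#       "fist_pitch_60": cv2.imread("fist_60.png", 0)}
# print(best_match(cv2.imread("query.png", 0), db))
```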

[0059] Thus, for example, an image 44b of a hand at angle of pitch α will be stored in a database of hands and may later be used to indicate or calculate the angle of pitch α. An image 44b' of a hand may be used as an indicator or may be used to calculate angle of pitch α'.

[0060] According to some embodiments movement of a cursor (or other symbol or icon) on a display may be accelerated according to the angle of pitch. Since the angle of pitch or other parameters such as size, aspect ratio or shape of the hand may be used to translate movement of the hand on the Z axis to movement on a vertical axis of the display, these parameters may be used to correctly decipher the speed of movement of the user's hand on the Z axis and the correctly deciphered speed of movement of the hand may be used to control the acceleration of an icon, as known in the art. Thus, an icon (such as a cursor or other displayed object) movement can be normalized based on parameters such as hand size, aspect ratio, location on Z axis and shape.
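
A minimal sketch of such normalization, under the assumption that the hand's apparent width shrinks roughly in proportion to its distance from the camera: the measured pixel displacement is divided by the apparent hand size before an ordinary pointer-acceleration curve is applied, so the same physical motion moves the cursor by a comparable amount whether the hand is near or far. The constants and the acceleration curve are placeholders, not details from the application.

```python
# Sketch: normalize hand displacement by apparent hand size before applying
# a simple pointer-acceleration curve. Constants are assumptions.
REFERENCE_HAND_WIDTH_PX = 120.0   # assumed hand width at a reference distance
GAIN = 1.5                        # assumed acceleration exponent

def cursor_step(dx_px, dy_px, hand_width_px):
    scale = REFERENCE_HAND_WIDTH_PX / max(hand_width_px, 1.0)
    nx, ny = dx_px * scale, dy_px * scale          # size-normalized motion
    speed = (nx * nx + ny * ny) ** 0.5
    accel = (1.0 + speed / 20.0) ** (GAIN - 1.0)   # faster motion -> more gain
    return nx * accel, ny * accel

print(cursor_step(10, 0, hand_width_px=120))   # hand close to the camera
print(cursor_step(10, 0, hand_width_px=60))    # same pixels, hand farther away
```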

[0061] According to one embodiment, which is schematically illustrated in Fig. 5, the system may take into account movement limitations which are typical of a hand or arm resting on a support. For example, if a user initially positions his hand at an angle of pitch α which is close to 0°, the user may be limited in how much further back (how much smaller an angle of pitch) he may bend his arm, in which case he may be limited in moving content vertically on a display. Thus, according to one embodiment, the method includes detecting an initial location of the user's hand (502) (or alternatively, detecting the angle of pitch when the hand is at an initial location) and displaying content, such as an icon, in proximity to a top or bottom area of the display (e.g., in the top or bottom third of the screen) according to the initial location of the user's hand (504) (or according to the angle of pitch when the hand is at the initial location). So, if an initial location of a hand 54 is relatively high (e.g., angle of pitch α is close to 0°) an icon 53, such as a cursor, may initially be positioned close to the top of the display 57 so that the user may then be able to move the icon 53 to the upmost limit of the display 57 without having to decrease the angle of pitch α much more. Similarly, if the user initially positions his hand at a larger angle of pitch (e.g., α is close to 90°) he will be limited as to how much further he may lower his hand. In this case an icon may be positioned close to the bottom part of the display so that the user may be able to move the icon to the lowest limit of the display regardless of the limitation of movement.
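
A sketch of the placement rule just described, with assumed pitch thresholds splitting the range into high, middle and low starting positions for the icon:

```python
# Sketch: choose the icon's starting row from the initial pitch angle so the
# user keeps room to move in the constrained direction. Thresholds are assumed.
SCREEN_H = 1080

def initial_icon_row(initial_pitch_deg):
    if initial_pitch_deg < 25:              # arm held almost upright
        return SCREEN_H // 10               # start near the top of the display
    if initial_pitch_deg > 65:              # arm leaning far forward
        return SCREEN_H - SCREEN_H // 10    # start near the bottom
    return SCREEN_H // 2                    # otherwise start in the middle

print(initial_icon_row(15))   # near the top
print(initial_icon_row(80))   # near the bottom
```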

[0062] A system for identifying a posture of a user's hand, according to an embodiment of the invention, is schematically illustrated in Fig. 6. The system typically includes a camera 65 to obtain an image of the user's hand 64, a database of hands 600, the database containing hands positioned at different angles or positions relative to the camera 65, and a processor 62 to compare an image of the user's hand to the database of hands 600 to identify the posture of the hand.

[0063] The system may operate using machine learning techniques or other suitable techniques.

[0064] Typical databases used for identifying shapes of hands include hands positioned mostly parallel relative to a camera, since these databases are meant for identifying hand gestures and postures of un-supported hands, which are typically held parallel to the camera. The database according to one embodiment of the invention is unique in that it includes images of hands that are typically positioned on a plane which is not parallel to a plane of the camera.

[0065] The database may be a pre-formed database or may be formed on-line using machine learning techniques.

[0066] Machine learning techniques may include supervised learning techniques, in which a set of training examples is presented to the computer. Each example typically includes a pair consisting of an input object and a desired output value. A supervised learning algorithm analyzes the training data and produces an inferred function, a classifier if the output is discrete or a regression function if the output is continuous. The classifier is then used in the identification of future objects. Thus, a hand in a first image may be identified as a hand in a specific posture by using a pre-constructed database. In this case, a hand is identified in the first frame by using a semi-automated process, in which a user assists or directs machine construction of a database of hands in specific postures which are positioned at different angles relative to a virtual line connecting the hand and the camera, and in the following frames the posture of the hand is identified by using a fully automated process in which the machine construction of a database of hand objects is automatic. An identified hand in a specific posture, or information of an identified hand in the specific posture, may be added to the first, semi-automatically constructed database, or a newly identified hand (or information of the hand) may be stored or added to a new, fully automatic machine-constructed database.
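
As an illustration of the supervised step, a classifier can be trained on labelled hand crops (posture plus viewing angle) and then used to label hands in later frames. The sketch below trains a linear SVM on flattened grayscale crops purely for demonstration; the feature choice, the scikit-learn dependency and the random stand-in data are assumptions, not details from the application.

```python
# Sketch: train a posture classifier from labelled hand crops, then predict
# the posture of a new crop. Uses scikit-learn as an assumed dependency.
import numpy as np
from sklearn.svm import SVC

def to_feature(crop_32x32_gray):
    """Flatten a 32x32 grayscale crop into a normalized feature vector."""
    return np.asarray(crop_32x32_gray, np.float32).reshape(-1) / 255.0

# Hypothetical training data: random stand-ins for real labelled crops.
rng = np.random.default_rng(0)
crops = rng.integers(0, 256, size=(40, 32, 32))
labels = ["fist_shallow_pitch"] * 20 + ["fist_steep_pitch"] * 20

X = np.stack([to_feature(c) for c in crops])
clf = SVC(kernel="linear").fit(X, labels)

new_crop = rng.integers(0, 256, size=(32, 32))
print(clf.predict([to_feature(new_crop)])[0])
```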

[0067] The system may be used to detect movement of a user's supported hand along the Z axis. A supported hand moving towards or away from a camera will appear different (e.g., may have a different shape or size) to the camera at different angles of pitch. Thus, the posture or shape of a hand may be detected, and displayed content may be moved vertically on the display based on the detected shape of the hand. Additionally, content, such as a cursor, may be accelerated based on the detected shape of the hand.

Claims

1. A method for computer vision based tracking of a user's hand to control movement of displayed content, the method comprising: tracking movement of a user's hand through a sequence of images; detecting an angle of pitch of the hand; and moving content on a display according to the detected angle of pitch.
2. The method of claim 1 comprising detecting a change in the pitch angle and moving the content on a vertical axis of the display according to the change in pitch angle.
3. The method of claim 2 comprising moving content towards a top of the display when the pitch angle decreases and moving content towards a bottom of the display when the pitch angle increases.
4. The method of claim 1 wherein detecting an angle of pitch of the hand comprises detecting location of the hand along a Z axis and/or along a Y axis, detecting a size or change of size of the hand, detecting an aspect ratio or change of aspect ratio of the hand or detecting a shape of the hand.
5. The method of claim 1 wherein the content is an icon on the display.
6. The method of claim 5 wherein the icon is a cursor.
7. The method of claim 6 comprising detecting an initial location of the user's hand and displaying the icon in proximity to a top or bottom area of the display according to the initial location of the user's hand.
PCT/IL2013/050745, filed 2013-09-03 (priority date 2012-09-03): Computer vision stereoscopic tracking of a hand, WO2014033722A1 (en)

Priority Applications (2)

US201261696252P: priority date 2012-09-03, filing date 2012-09-03
US61/696,252: priority date 2012-09-03

Publications (1)

WO2014033722A1 (en): published 2014-03-06

Family

Family ID: 50182612

Family Applications (1)

PCT/IL2013/050745 (WO2014033722A1, en): priority date 2012-09-03, filing date 2013-09-03, title: Computer vision stereoscopic tracking of a hand

Country Status (1)

Country Link
WO (1) WO2014033722A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9836117B2 (en) 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
US9898864B2 (en) 2015-05-28 2018-02-20 Microsoft Technology Licensing, Llc Shared tactile interaction and user safety in shared space multi-person immersive virtual reality
US9911232B2 (en) 2015-02-27 2018-03-06 Microsoft Technology Licensing, Llc Molding and anchoring physically constrained virtual environments to real-world environments

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011137226A1 (en) * 2010-04-30 2011-11-03 Verizon Patent And Licensing Inc. Spatial-input-based cursor projection systems and methods
WO2011138775A1 (en) * 2010-05-04 2011-11-10 Timocco Ltd. System and method for tracking and mapping an object to a target
US20110289456A1 (en) * 2010-05-18 2011-11-24 Microsoft Corporation Gestures And Gesture Modifiers For Manipulating A User-Interface
US20120113223A1 (en) * 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality



Legal Events

121: EP - the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 13833382; Country of ref document: EP; Kind code of ref document: A1)

NENP: Non-entry into the national phase (Ref country code: DE)

122: EP - PCT application not entered into the European phase (Ref document number: 13833382; Country of ref document: EP; Kind code of ref document: A1)