WO2014072786A2 - 3d hand-gesture to convey cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the cartesian axes - Google Patents

3d hand-gesture to convey cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the cartesian axes Download PDF

Info

Publication number
WO2014072786A2
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
hand
axes
cartesian
freedom
Prior art date
Application number
PCT/IB2013/002438
Other languages
French (fr)
Other versions
WO2014072786A9 (en)
Inventor
Camillo TREVISAN
Original Assignee
Università Iuav Di Venezia
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Università Iuav Di Venezia filed Critical Università Iuav Di Venezia
Publication of WO2014072786A2 publication Critical patent/WO2014072786A2/en
Publication of WO2014072786A9 publication Critical patent/WO2014072786A9/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The 3D gesture is identified by the specific position of three digits of one hand (either the right or the left), stretched and arranged perpendicularly to one another in order to identify the three axes of a Cartesian reference system. In particular, the thumb pointing upward indicates the positive direction of the vertical axis of the reference system; the index finger pointed forward indicates the positive direction of a horizontal axis; and the middle finger pointed laterally indicates the positive (left hand) or negative (right hand) direction of the other horizontal axis. This allows gesture control over all six degrees of freedom (three translations and three rotations), either simultaneously or separately, useful for defining the movements of a virtual camera in a 3D digital model. Further applications are possible, for example in robotics and in the remote control of vehicles and other equipment.

Description

Title
3D hand-gesture to convey Cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the Cartesian axes
Purpose and technical scope
The invention consists of establishing a three-dimensional gesture. It uses a specific body posture which is easy to reproduce and instantly recognizable: in this case, three digits of a user's hand - thumb, index and middle finger - with the other two fingers folded. Such a 3D gesture is useful for conveying the arrangement of the tri-orthogonal Cartesian system in an unambiguous manner. It is also useful for simultaneous and/or individual control, with six degrees of freedom - three translations and three rotations with respect to the three Cartesian axes - of the movements of the projection center and of the direction of view of a virtual camera used to represent a 3D digital model, corresponding to the movements of an observer navigating within the model itself. Furthermore, the
simultaneous and/or individual control of the six degrees of freedom can find applications in other fields such as, but not exclusively, robotics and the remote control of vehicles or equipment.
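To make the notion of six simultaneous degrees of freedom concrete, the following minimal Python sketch models a virtual-camera pose that can be translated along and rotated about the three Cartesian axes. It is a sketch under stated assumptions: the class CameraPose and its methods are illustrative names, not taken from the patent.

```python
# Minimal sketch (illustrative, not from the patent): a virtual-camera pose
# with six degrees of freedom - translations along and rotations about the
# three Cartesian axes.
import numpy as np


def _axis_rotation(axis: np.ndarray, angle: float) -> np.ndarray:
    """Rotation matrix for `angle` radians about a unit `axis` (Rodrigues' formula)."""
    x, y, z = axis
    k = np.array([[0.0, -z, y],
                  [z, 0.0, -x],
                  [-y, x, 0.0]])
    return np.eye(3) + np.sin(angle) * k + (1.0 - np.cos(angle)) * (k @ k)


class CameraPose:
    """Projection center plus orientation of a virtual camera in a 3D model."""

    def __init__(self) -> None:
        self.position = np.zeros(3)     # projection center (X, Y, Z)
        self.orientation = np.eye(3)    # columns are the camera's own axes

    def translate(self, dx: float, dy: float, dz: float) -> None:
        # Three translational degrees of freedom along the Cartesian axes.
        self.position += np.array([dx, dy, dz])

    def rotate(self, rx: float, ry: float, rz: float) -> None:
        # Three rotational degrees of freedom about the Cartesian axes.
        for axis, angle in zip(np.eye(3), (rx, ry, rz)):
            self.orientation = _axis_rotation(axis, angle) @ self.orientation
```

For example, with Z taken as the vertical axis, rotate(0.0, 0.0, 0.1) yaws the view while translate(0.0, 0.2, 0.0) advances it along Y; the gesture is meant to drive any combination of these at once.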
Demonstration of the invention
The state of technology
Currently, some devices, such as Microsoft Kinect© or Asus Xtion©, recognize certain 3D gestures - specific positions of the human body in three-dimensional space - used in video games as well as in other professional and amateur applications (via their SDKs, Software Development Kits). Navigation in virtual 3D digital models is carried out in various ways, usually by identifying the position of the human body and its limbs.
A complete control of navigation would entail, inter alia, a free orientation of the direction of view and the ability to translate the virtual camera's perspective along the three Cartesian axes.
For example, when simulating the movements of both hands gripping a steering wheel, not only are both hands engaged, but the available degrees of freedom are also limited. One can likewise simulate one or both hands gripping an airplane's joystick, but only a basic familiarity with piloting is needed to appreciate the resulting lack of control over the throttle or rudder pedals.
In other fields - such as, but not exclusively, robotics - all six degrees of freedom (yaw, pitch and roll, plus translations along the X, Y and Z axes) are typically controlled via hardware apparatus.
For further references regarding 3D hand gestures, see:
http://airccse.org/journal/ijaia/papers/3412ijaia12.pdf
http://www.cs.rutgers.edu/~vladimir/pub/pavlovic97pami.pdf
http://airccj.org/CSCP/vol2/csit2320.pdf
http://www.ics.forth.gr/~zabulis/2009_06_book_hci_gestures.pdf
http://cs-people.bu.edu/athitsos/publications/athitsos_cvpr2004.pdf
http://cs-people.bu.edu/athitsos/publications/athitsos_cues2001.pdf
http://reference.kfupm.edu.sa/content/v/i/vision_based_gesture_recognition_a_revi_291732.pdf
http://www.ampublisher.com/Mar%202012/IPCV-1203-015-Hand-Gesture-Modeling-Recognition-Geometric-Features-Review.pdf
http://cs.anu.edu.au/student/projects/1S2/Reports/Jianming%20Guo.pdf
http://cg.cs.uni-kl.de/~denker/publications/pdf/muc12.pdf
Disadvantages
Using the actions of the human body to freely orient and translate the direction of view and the position of the virtual camera - and therefore the position and direction of view of the virtual observer, or the movements of a vehicle - may involve some drawbacks:
1. Control may be difficult or incomplete if the user is seated.
2. These mechanisms sometimes require a relatively large maneuvering space.
3. It is often almost impossible to use a mouse and/or keyboard simultaneously with such gestures.
4. Any involuntary movements of the user might easily be misinterpreted and result in unintended movements of the virtual camera/vehicle/apparatus.
Objectives
The goal is to allow the user, even when seated, complete control - simultaneous or individual - of the six degrees of freedom (three translations and three rotations) using one hand only, either the right or the left, without the need for a large maneuvering space, thus leaving the other hand free to operate the keyboard or mouse, or to give further guidance to the system through other 3D gestures. A further goal is to permit the interactive navigation of 3D digital models in more traditional configurations (notebooks, ultrabooks, desktops, etc.) and with limited-range 3D gesture-detection devices, for example Leapmotion Leap3D© or Creative Interactive Gesture Camera© (typically limited to about 1 meter).
Solution
The goal is achieved by claims 1, 2, 3, 4, 5 and 6.
Advantages
The dependent claims show some favorable developments. The invention has the following advantages:
1. Complete control, simultaneous and/or individual, of the movements of a virtual camera with six degrees of freedom, three rotations around the three axes and three translations along all three axes.
2. Use of one hand only, either the right or the left, thus making the gesture fully compatible for left-handers.
3. Need for a reduced operating space, typically less than one meter.
4. Possibility of use even when seated in the immediate vicinity of the monitor.
5. Freedom to operate the mouse, keyboard or other 3D gesturing commands with the other hand.
6. The simultaneous and/or individual control of the six degrees of freedom can also find applications in other fields such as, but not exclusively, robotics and the remote control of equipment or vehicles.
7. The 3D gesture is identified by a clear and unambiguous hand posture - easily detected even if the fingers are not fully stretched or exactly orthogonal to each other - which is simple, natural and immediately associated with the movements it indicates, as well as quick and easy to start and finish.
List of figures
The invention is explained in detail in the following two diagrams, concerning the use of the left and right hand respectively.
Figure 1: Configuration of the digits of the left hand; the thumb points upwards to identify the positive direction of the vertical Cartesian axis (here the Z-axis), the index finger identifies the positive direction of one of the two horizontal axes (here the Y-axis), and the middle finger identifies the positive direction of the second horizontal Cartesian axis (here the X-axis).
Figure 2: Configuration of the digits of the right hand. In this case, the middle finger identifies the negative direction of the X-axis.
Realization / description of the invention
The detection of the 3D gesture - the arrangement of digits of the right or left hand in order to indicate the three Cartesian axes of the reference system - may be
accomplished either via hardware (by means of suitable 3D gesture-detection devices such as, for example, Leapmotion Leap3D©, Creative Interactive Gesture Camera©, Microsoft Kinect© or Asus Xtion©) or via software (using specially designed computer-vision libraries, for example OpenCV©). Once the 3D gesture and the hand (left or right) have been identified, one can simultaneously control the three rotations (about the coordinate axes X, Y and Z) of the direction of view of the virtual camera representing the 3D digital model and the three translations (again with respect to the X, Y and Z axes) of the point of view of the virtual camera itself.
The same can be done for the remote control of equipment or vehicles.
In other words, a hand with three fingers arranged as axes reproduces the movements and rotations of the observer's view that explores the 3D digital model or the movements and rotations of controlled equipment.
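A minimal sketch of how such a finger arrangement could be turned into a Cartesian reference frame, assuming the tracking layer (a device SDK or an OpenCV-based tracker) supplies approximate direction vectors for the thumb, index and middle finger; the function gesture_frame and its inputs are illustrative names, not taken from the patent.

```python
# Sketch only: deriving the Cartesian frame conveyed by the gesture from three
# measured finger directions. The tracking inputs are assumed to come from a
# device SDK or an OpenCV-based tracker; they are not specified by the patent.
import numpy as np


def gesture_frame(thumb, index, middle):
    """Return (rotation_matrix, hand) for the detected 3D gesture.

    thumb  -> positive vertical axis (Z)
    index  -> positive horizontal axis (Y)
    middle -> +X for the left hand, -X for the right hand
    The fingers need not be perfectly orthogonal: the frame is re-orthonormalized.
    """
    z = np.asarray(thumb, dtype=float)
    z /= np.linalg.norm(z)
    y = np.asarray(index, dtype=float)
    y -= (y @ z) * z                      # drop the component along the thumb
    y /= np.linalg.norm(y)
    x = np.cross(y, z)                    # completes a right-handed frame
    hand = "left" if np.asarray(middle, dtype=float) @ x > 0 else "right"
    return np.column_stack([x, y, z]), hand
```

Comparing the returned matrix from frame to frame would yield the rotation deltas applied to the virtual camera, while the motion of the frame's origin would yield the translation deltas.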
Every time the user raises a hand and arranges the digits to form the 3D gesture, the gesture is recorded and, after an initial latency (the time needed to pose the hand in the correct initial posture), rotations and offsets are measured continuously, both with respect to the initial position and with respect to the previous position. By closing the hand, or otherwise abandoning the posture, the 3D gesture is interrupted. The slight delay between the action of the hand and the application of the command is necessary to enable the recognition of the gesture, the application of the geometrical transformations and the representation of the model and, above all, to allow an interruption of the action without accidentally altering the parameters.
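The lifecycle just described - an initial latency while the posture settles, continuous measurement of rotations and offsets against both the initial and the previous pose, and interruption when the posture is abandoned - could be organized roughly as in the following sketch; the GestureSession class, the GesturePose container and the latency value are assumptions, not values taken from the patent.

```python
# Rough sketch of the gesture lifecycle described above; the pose container,
# the session class and the latency value are assumptions, not patent values.
import time
from dataclasses import dataclass
from typing import Optional

import numpy as np


@dataclass
class GesturePose:
    rotation: np.ndarray   # 3x3 orientation of the finger frame
    origin: np.ndarray     # ideal origin at the root of the three digits


LATENCY_S = 0.3            # settling time before commands are issued (assumed)


class GestureSession:
    def __init__(self) -> None:
        self.start_time: Optional[float] = None
        self.initial: Optional[GesturePose] = None
        self.previous: Optional[GesturePose] = None

    def update(self, pose: Optional[GesturePose]):
        """Feed one tracking frame; return (delta_from_start, delta_from_previous) or None."""
        if pose is None:                          # hand closed: posture abandoned
            self.start_time = self.initial = self.previous = None
            return None
        now = time.monotonic()
        if self.start_time is None:               # gesture just started
            self.start_time = now
            self.initial = self.previous = pose
            return None
        if now - self.start_time < LATENCY_S:     # initial latency: no commands yet
            self.previous = pose
            return None
        d_start = (pose.rotation @ self.initial.rotation.T,
                   pose.origin - self.initial.origin)
        d_prev = (pose.rotation @ self.previous.rotation.T,
                  pose.origin - self.previous.origin)
        self.previous = pose
        return d_start, d_prev
```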
It should be noted that the movements (both rotations and translations) can be carried out by moving the wrist, the forearm or the whole arm, since the detection system locates the ideal origin of the reference system at the root of the three digits - as well as the position of the user's head, elbow and shoulder - and is therefore able to identify the overall movement of the user's arm and thus recognize and interpret the user's possible 'intentions'. For example, if the user rotates the forearm rapidly upward and then returns it to its initial position while maintaining the posture of the 3D gesture with the three pointing digits, this movement can be interpreted as a jump to be performed by the virtual observer, thus further extending the possible commands associated with the 3D gesture.
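As one possible reading of such an 'intention', a quick rise and return of the gesture origin while the three-finger posture is held could be flagged as a jump command. The following sketch is illustrative only; the class name and thresholds are assumptions, not taken from the patent.

```python
# Illustrative only: flagging a quick rise-and-return of the gesture origin as
# a "jump" command while the 3D posture is maintained. Thresholds are assumed.
from collections import deque


JUMP_RISE_M = 0.15      # how far the gesture origin must rise (assumed, meters)
WINDOW_FRAMES = 20      # number of recent tracking frames examined (assumed)


class JumpDetector:
    def __init__(self) -> None:
        self.heights = deque(maxlen=WINDOW_FRAMES)

    def update(self, gesture_active: bool, origin_height: float) -> bool:
        """Return True once when an up-and-back motion of the hand is seen."""
        if not gesture_active:              # posture dropped: reset the history
            self.heights.clear()
            return False
        self.heights.append(origin_height)
        if len(self.heights) < self.heights.maxlen:
            return False
        base, peak, last = self.heights[0], max(self.heights), self.heights[-1]
        # Jump: the hand rose well above its starting height and came back down.
        if peak - base > JUMP_RISE_M and abs(last - base) < JUMP_RISE_M / 3:
            self.heights.clear()
            return True
        return False
```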

Claims

Claims
1. 3D gesture uniquely defined by a specific posture of three digits of one hand, characterized by the fact that the thumb is pointing upwards, the index finger is pointed straight ahead and the middle finger is pointed orthogonally to the plane formed by the other two fingers, with both the ring and little fingers folded.
2. 3D gesture that, according to claim 1, allows the control, simultaneous and/or individual, of six degrees of freedom, characterized in that - if the three fingers identify the three Cartesian axes of the reference system, in particular the thumb the vertical axis, the index finger one of the two horizontal axes and the middle finger the other horizontal axis - the movement of the hand is used to uniquely define the position of the Cartesian system and to dynamically control the three rotations about the X, Y and Z axes and the three translations along each of those axes.
3. 3D gesture that, according to claims 1 and 2, is characterized by the fact that it allows one to operate in three dimensional space.
4. 3D gesture that, according to claims 1, 2, 3, is characterized by the fact that it allows one to operate in a limited field of operation.
5. Gesture that, according to claims 1, 2, 3, is characterized by the fact that it allows one to operate with either the right or left hand.
6. Gesture that, according to claims 1, 2, 3, 4, 5, is characterized by the fact that it allows one to operate, while maintaining the posture of the three fingers, by moving, either in combination or alternatively, the hand, wrist, elbow, shoulder and/or the user's body.
PCT/IB2013/002438 2012-11-09 2013-11-04 3d hand-gesture to convey cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the cartesian axes WO2014072786A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CH2301/12 2012-11-09
CH23012012A CH707208A2 (en) 2012-11-09 2012-11-09 3D gesture for the definition of the plane of arrangement of a Cartesian system and the control of six degrees of freedom: three rotations and three translations along the Cartesian axes.

Publications (2)

Publication Number Publication Date
WO2014072786A2 true WO2014072786A2 (en) 2014-05-15
WO2014072786A9 WO2014072786A9 (en) 2014-07-03

Family

ID=50102131

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/002438 WO2014072786A2 (en) 2012-11-09 2013-11-04 3d hand-gesture to convey cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the cartesian axes

Country Status (2)

Country Link
CH (1) CH707208A2 (en)
WO (1) WO2014072786A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108108018A (en) * 2017-12-12 2018-06-01 歌尔科技有限公司 Commanding and training method, equipment and system based on virtual reality
US11262849B2 (en) 2016-02-17 2022-03-01 Volkswagen Aktiengesellschaft User interface, a means of transportation and a method for classifying a user gesture performed freely in space

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None

Also Published As

Publication number Publication date
WO2014072786A9 (en) 2014-07-03
CH707208A2 (en) 2014-05-15

Similar Documents

Publication Publication Date Title
US11500475B2 (en) Dynamically balanced, multi-degrees-of-freedom hand controller
US9342151B2 (en) Hand motion-capturing device with force feedback system
Jin et al. Multi-LeapMotion sensor based demonstration for robotic refine tabletop object manipulation task
US10664002B2 (en) Multi-degrees-of-freedom hand held controller
US20190041891A1 (en) Dynamically Balanced Multi-Degrees-of-Freedom Hand Controller
US20200310561A1 (en) Input device for use in 2d and 3d environments
US10191544B2 (en) Hand gesture recognition system for controlling electronically controlled devices
EP3234742A2 (en) Methods and apparatus for high intuitive human-computer interface
WO2017009707A1 (en) Apparatus and method for hybrid type of input of buttons/keys and "finger writing" and low profile/variable geometry hand-based controller
BR112012011321B1 (en) method and system for manual control of a minimally invasive teleoperated auxiliary surgical instrument
NO339941B1 (en) System and method for a gesture-based management system
Duval et al. Skewer: a 3d interaction technique for 2-user collaborative manipulation of objects in virtual environments
Vargas et al. Gesture recognition system for surgical robot's manipulation
KR102170638B1 (en) Method for controlling interaction in virtual reality by tracking fingertips and VR system using it
Tanjung et al. The use of virtual reality controllers and comparison between vive, leap motion and senso gloves applied in the anatomy learning system
Segen et al. Look ma, no mouse!
Bai et al. Asymmetric Bimanual Interaction for Mobile Virtual Reality.
Aguerreche et al. Short paper: 3-hand manipulation of virtual objects
WO2014072786A2 (en) 3d hand-gesture to convey cartesian space arrangement plus all six degrees of freedom: three rotations and three translations along the cartesian axes
Cannan et al. A Multi-sensor armband based on muscle and motion measurements
Tseng et al. FingerMapper: Enabling Arm Interaction in Confined Spaces for Virtual Reality through Finger Mappings
Schlattmann et al. Real-time bare-hands-tracking for 3D games
Tatzgern et al. Exploring input approximations for control panels in virtual reality
Worrallo A multiple optical tracking based approach for enhancing hand-based interaction in virtual reality simulations
Minh et al. Haptic Smart Glove for Augmented and Virtual Reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13829069

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13829069

Country of ref document: EP

Kind code of ref document: A2