WO2017021902A1 - System and method for gesture based measurement of virtual reality space - Google Patents

System and method for gesture based measurement of virtual reality space

Info

Publication number
WO2017021902A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
hand
augmented reality
hands
environment
Application number
PCT/IB2016/054679
Other languages
French (fr)
Inventor
Gautam TEWARI
Original Assignee
Smartvizs Private Limited
Application filed by Smartvizs Private Limited filed Critical Smartvizs Private Limited
Publication of WO2017021902A1 publication Critical patent/WO2017021902A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye


Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention provides a novel system and an easy-to-use, intuitive method for measurements in a virtual reality environment by a user who controls the environment while being in it, as opposed to being an external operator, using wearable technology by way of a Head Mounted Display to interact with the virtual reality environment. The system can be utilized for measuring the area of a space, the volume of an object, or angles and distances between two objects or points in a virtual or augmented reality environment in real time, utilizing wearable computing technology and human gestures as the input method.

Description

SYSTEM AND METHOD FOR GESTURE BASED MEASUREMENTS OF VIRTUAL REALITY SPACE
FIELD OF INVENTION
[001] The invention relates to the field of virtual reality (VR). More specifically, the invention relates to measurements in virtual or augmented reality environments using hand gestures.
BACKGROUND
[002] Virtual reality is a technology for creation of and interacting in a computer simulated environment. It also includes the field of Augmented Reality (AR), which concerns providing an environment where the perception of a real-world environment (or data representing a real-world environment) is augmented or modified with computer-generated virtual data. For example, data representing a real-world environment may be captured in real-time using sensory input devices such as a camera or microphone and augmented with computer-generated virtual data including virtual images and virtual sounds. The virtual data may also include information related to the real-world environment, such as a text description associated with a real-world object in the real-world environment. A VR or AR implementation may be used to enhance numerous applications, including video games, mapping, navigation, and mobile device applications.
[003] Most current virtual reality environments are primarily visual experiences, displayed either on a computer screen or through special stereoscopic displays. More recently, the Head Mounted Display (HMD) has been added to the list of devices on which virtual reality environments / content can be displayed. An HMD may be worn by a user to view the mixed imagery of virtual and real objects.
[004] Conventionally, a virtual reality environment is controlled by an operator who is outside the virtual environment. A user could interact with a virtual reality environment through standard input devices such as a keyboard and mouse, through multimodal devices such as a wired glove, or through hand gestures.
[005] With the advancement of technology, a user can now control the virtual reality environment while being in it. Gestures of the user have also come to be utilized as an input method in such scenarios, as some computer processing systems interpret human gestures as user input. Typically, gestures originate from human body motion, such as hand gestures. User gesture input has grown in popularity as an alternative to established touch, pointing device, and speech input techniques.
[006] Whether real or imagined, a virtual reality environment consists of objects with which a user interacts. Systems are available in the prior art for determining the location and tracking the position of an object in a virtual reality environment. Location systems are known which allow the presence or absence of an object in a specified environment to be confirmed, and which identify, relative to one or more reference points, where in the environment the object is located.
[007] In a real world environment, the measurement of distances between two reference points has been achieved by various methods, such as the use of ultrasonic or electromagnetic pulses to calculate distances by starting a high speed counter simultaneously with the emission of a pulse from a movable pointer, and then stopping the counter when the pulse is received at one or more stationary positions. Such methods have been employed in a number of forms, such as for locating vehicles in automatic guidance transport systems.
[008] Such measuring technology employed in the real world is not only complicated and difficult to learn but is also counter-intuitive and error-prone. Further, it is difficult to perceive the scale of measurement when using conventional techniques for measurements in a real world environment.
[009] Virtual reality systems using HMDs require tracking of the orientation and position of a user's head and hands with respect to a world coordinate frame in order to control view parameters for head mounted devices and allow manual interactions with the virtual world. This tracking in VR setups has been achieved with a variety of mechanical, acoustic, magnetic, and optical systems. These systems require propagation of a signal between a fixed "source" and the tracked "sensor" and therefore limit the range of operation. They also require a degree of care in setting up the source or preparing the site that reduces their utility for field use.
[0010] Further, the emerging fields of wearable computing and augmented reality require tracking systems to be wearable and capable of operating essentially immediately in arbitrary environments. Virtual environments need to be intuitive in order to facilitate computer interaction.
[0011] The available methods for measurement in virtual environments provide on-screen tools that require user interaction through input devices such as a keyboard and mouse. This approach does not hold good in a virtual or augmented reality environment where the user operates the virtual environment from the inside, such as when wearing a head mounted display.
PRIOR ART
[0012] While some prior art virtual reality measurement techniques have attempted to overcome one or more of the problems identified above, no system and method exists in the prior art which provides for measuring the area of a space, the volume of an object, or angles and distances between two objects or points in a virtual reality or augmented reality environment utilizing human gestures and wearable computing technology.
[0013] Some of the prior art patents and publications are discussed below:
[0014] United States patent number US 7610558: The invention provides for measurement of the position of a hand and of the view position and view direction of a head mounted display for generating and displaying a mixed reality image of a mixed reality space. The invention does not provide for measuring distances, areas or volumes using hand gestures.
[0015] United States patent number 9013396: The invention provides for locating and tracking an object in a virtual reality environment using a virtual control panel within the virtual reality environment. The invention does not provide for measurements of or between objects using hand gestures in a virtual reality environment.
[0016] United States patent number 6892162: The invention provides for a position measuring apparatus comprising position and distance determining units operating from a reference position using sensors. The invention, however, does not provide for determining any other unit of measurement, such as area or volume, and does not utilize hand gestures for determining real world measurements for objects in the virtual reality environment.
[0017] International Patent Application no. PCT/US2014/023739: The invention provides a method wherein virtual world data is associated with a physical object sensed by a head mounted display, and virtual world data is associated with the movement, location or direction of the user, enabling interactions with the virtual world data using the head mounted display. The invention does not provide for interacting with virtual world data using hand gestures, and also does not provide for enabling measurements in virtual world data to correspond with real world data.
[0018] The above systems do not measure characteristics other than position and direction. A number of other systems are also available with similar inherent disadvantages. Therefore, there is a need in the art for a system and method for measurement of various aspects of virtual space, to be utilized by a user within a virtual or augmented reality environment, such as by using hand gestures as the input method.
SUMMARY OF INVENTION
[0019] In view of the foregoing disadvantages inherent in the known systems of measurements in virtual or augmented reality environments, the present invention provides a novel system and method for measuring real world distances between two objects or points, area of a space or volume of an object in a VR or AR environment in real-time.
[0020] It is an object of the invention to provide a system and method for comparison of measurements among two or more objects placed in a VR or AR environment.
[0021] It is also an object of the invention to utilize human gestures as an input method for the measurement of distances, areas or volumes in VR or AR environments.
[0022] It is also an object of the invention to provide an easy to use and intuitive method for measurements in a virtual reality environment for a user who is controlling the environment while being in it as opposed to being an external operator.
[0023] It is also an object of this invention to use wearable technology by way of Head Mounted Display for interacting with the virtual reality environment for the purpose of measuring the distance between the objects, area of a space or volume of an object in the virtual or augmented environment.
[0024] It is also an object of this invention to significantly improve on measuring 3D spaces in real time using intuitive gestures that do not involve the use of active input devices and rely purely on passive devices such as motion trackers / sensors. This provides a natural way to measure distances in a Virtual Reality / Augmented Reality environment.
[0025] Additional aspects, advantages, features and objects of the present disclosure would be apparent from the description of the illustrative embodiments that follow.
BRIEF DESCRIPTION OF DRAWINGS
[0026] The following detailed description of illustrative embodiments is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the invention is not limited to specific methods and instrumentalities disclosed herein. Moreover, those in the art will understand that the drawings are not to scale. Wherever possible, like elements have been indicated by identical numbers.
[0027] Figure 1 illustrates the hand gesture with the initial position of stretched hands in the AR/VR environment in accordance with an embodiment of the invention.
[0028] Figure 2 illustrates the hand gesture of moving the hands away from each other in order to measure the distance in the AR/VR environment in accordance with an embodiment of the invention.
[0029] Figure 3 illustrates the hand gesture of turning the palm into a fist in order to save / mark the points in the AR/VR environment in accordance with an embodiment of the invention.
[0030] Figure 4 illustrates the hand gesture of moving the hand (palm) in any direction within the AR/VR environment in accordance with an embodiment of the invention.
[0031] Figure 5 illustrates the hand gesture of turning the hand(s) into a fist to save / mark other points in any direction within the AR/VR environment in accordance with an embodiment of the invention.
[0032] Figure 6 illustrates the hand gesture of resting the hands at the sides of the user, indicating the ending of the hand gesture in the AR/VR environment in accordance with an embodiment of the invention.
[0033] Figure 7 is a flow chart illustrating the method of gesture based measurements of virtual reality space in accordance with an embodiment of the present invention.
DESCRIPTION
[0034] The invention is described in detail below with reference to several embodiments and examples. Such discussion is for purposes of illustration only. Modifications to the examples, within the spirit and scope of the present invention, will be readily apparent to one skilled in the art. Terminology used throughout the specification herein is given its ordinary meaning as supplemented by the discussion immediately below. As used in the specification, the singular forms "a", "an" and "the" include plural references unless the context clearly dictates otherwise.
[0035] Those with ordinary skill in the art will appreciate that the elements in the Figures are illustrated for simplicity and clarity and are not necessarily drawn to scale. There may be additional components described in the foregoing application that are not depicted in one of the described drawings. In the event such a component is described but not depicted in a drawing, the absence of such a drawing should not be considered as an omission of such design from the specification.
[0036] The disclosed system comprises a Head Mounted Display (HMD) 102 for displaying the virtual or augmented reality environment to the user 108. The virtual reality image can be projected in front of the eyes of a user 108 using the HMD 102, so that the only image seen by the user 108 is the virtual image. Translucent or partially reflective materials and cameras at the back of the HMD 102 can also be used so that a user 108 can view both the real world environment and the virtual environment at the same time.
[0037] The system also comprises an Infrared (IR) radiation based depth sensor mounted on top of the HMD 102. Alternatively, two hand held motion controllers with inbuilt sensors, such as a gyroscope, an accelerometer and IR emitters, can be utilized.
[0038] The system further comprises an infrared camera to detect the world position of the HMD 102 and the hand held motion controllers.
[0039] The system components are in wired or wireless communication with a computer system 106, which may or may not be connected to a network. The virtual image or Computer Generated Image (CGI) may also be superimposed on a real image, i.e. the virtual / augmented image can be displayed on the HMD 102 by tethering the image onto the HMD 102 from the computer system 106.
[0040] In an embodiment of the present invention, the HMD 102 can comprise an on-board microprocessor and operating system which allow applications to run locally on the HMD 102, thus enabling it as a standalone device without the need for any external device for generating video.
[0041] Depending upon the placement of the processor, at least one of the microprocessor based computing system 106 or the HMD 102 further comprises software programme instructions enabling the HMD 102 to project the VR or AR environment.
[0042] In accordance with the present embodiment, the virtual or augmented environment is displayed to the user 108 through the Head Mounted Display 102. The user 108 then initiates a gesture, such as a hand based gesture, by bringing the hands to a pre-defined spatial position 100.
[0043] Referring now to the Figures, Fig. 1 illustrates the hand gesture for the initial position 100 of stretched hands 104 in the AR/VR environment in accordance with an embodiment of the invention.
[0044] The infrared based depth sensor on the top of the head mounted display 102 recognizes the gesture and reads the position of the hands and fingers 104. After determining the position of the hands 104, the position of the head is recognized with respect to the hands 104.
[0045] In an embodiment of the present invention, the position of the hands 104 is recognized by the motion controllers in the two hands, utilizing sensors such as a gyroscope, an accelerometer and IR emitters.
[0046] Infrared rays are cast from the IR emitters on the top of the HMD 102 in the general direction of the hands 104 of the user 108. When the user 108 moves either of the hands 104 from the pre-defined initial position 100, or both hands 104 away from each other 200 as illustrated in Fig. 2 in accordance with the present embodiment, the displaced position with respect to the head of the user 108 is recognized in real time.
[0047] The raw positional data of the hands 104 in reference to the head of the user 108 is continuously fed into the computer system 106, which is in communication with the position measuring devices, i.e. the head mounted display 102 and the infrared depth sensor or motion sensors.
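To make the head-relative tracking of [0047] concrete, the following is a minimal Python sketch of converting a hand position reported relative to the HMD into world coordinates, given the HMD's tracked pose. The function name and the rotation-matrix representation are assumptions for illustration only; the patent does not prescribe a specific data format.

```python
import numpy as np

def hand_world_position(head_position, head_rotation, hand_in_head_frame):
    """Convert a hand position reported relative to the HMD into world
    coordinates, given the HMD's tracked pose.

    head_rotation is a 3x3 rotation matrix of the HMD in the world frame;
    this representation is an assumption for illustration, not the patent's.
    """
    head_position = np.asarray(head_position, dtype=float)
    head_rotation = np.asarray(head_rotation, dtype=float)
    hand_in_head_frame = np.asarray(hand_in_head_frame, dtype=float)
    return head_position + head_rotation @ hand_in_head_frame

# Example: a hand 0.4 m in front of and 0.3 m to the right of the headset,
# with the headset at 1.6 m height and not rotated.
p = hand_world_position([0.0, 1.6, 0.0], np.eye(3), [0.3, -0.2, -0.4])
print(p)  # [ 0.3  1.4 -0.4]
```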
[0048] A collision point in the virtual space is obtained corresponding to the position of the hands 104 in the real world. The direction of the moving hands 200 is obtained as a vector from a starting reference point, which could be the tip or palm of one hand 104, to the tip or palm of the other hand 104. The colliding points of the rays cast from the hand positions act as the starting and ending points for the distance calculation.
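The vector and distance computation described in [0048] reduces to standard vector arithmetic. Below is a minimal Python sketch, assuming the two collision points have already been resolved in virtual-world coordinates from the ray casts; the helper name `measure_between` and the metre units are hypothetical.

```python
import numpy as np

def measure_between(start_point, end_point):
    """Distance and direction between two collision points obtained from
    rays cast at the tracked hand positions (virtual-world coordinates,
    metres assumed)."""
    start = np.asarray(start_point, dtype=float)
    end = np.asarray(end_point, dtype=float)
    displacement = end - start                 # vector from one hand's point to the other's
    distance = float(np.linalg.norm(displacement))
    direction = displacement / distance if distance > 0 else displacement
    return distance, direction

# Example: collision points resolved for the left and right hands.
dist, direction = measure_between([0.0, 1.2, 2.0], [0.8, 1.2, 2.6])
print(f"{dist:.2f} m along {direction}")       # 1.00 m along [0.8 0.  0.6]
```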
[0049] The software programme instructions deployed on the microprocessor based computing device 106 calculate the distance between the two points or objects in the virtual environment based on the positional information of the hands 104 in the real world environment. The calculated distance is rendered onto the head mounted display 102 for the user 108 to visualize.
[0050] Referring to Fig. 3, in accordance with the present embodiment of the invention, the hand gesture of turning the hand into a fist 302 is recognized by the HMD 102 as a save / mark location in the AR/VR environment. When the fist 302 is reopened, the virtual world position is recognized as an intermediary position 300, which is to be utilized for measurement of various aspects between the saved / marked points.
[0051] Referring to Figure 4, the hand 104 is moved in any direction 400 and the sensor on the HMD 102 keeps track of the position of the hand 104 within the AR/VR environment.
[0052] Figure 5 illustrates the hand gesture of turning the hand 104 into a fist 302 to save / mark other points in any direction 500 within the AR/VR environment in accordance with an embodiment of the invention. The system thus recognizes a set of intermediary positions 300 in the virtual or augmented reality environment.
[0053] In an embodiment of the present invention, several intermediary positions 300 are detected in the same or multiple planes of the AR/VR environment, thus allowing measurement of distances, area, volume, surface area, angles and other similar quantities. The system indicates such measurements in real time, visible to the user 108 through the HMD 102. The present invention thus provides measurement information of the objects, or between the objects, at varying depths and distances in the AR/VR environment.
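As a sketch of how a set of saved intermediary positions 300 could yield these higher-order measurements, the snippet below derives path length, the angle at a marked vertex, and the area of a planar polygon from an ordered list of marked points. This is standard vector geometry offered as an illustration, not the patent's actual implementation; all function names are hypothetical.

```python
import numpy as np

def path_length(points):
    """Total distance along the marked points, taken in order."""
    pts = np.asarray(points, dtype=float)
    return float(np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1)))

def angle_at(points, i):
    """Angle (degrees) at marked point i between its two neighbours."""
    pts = np.asarray(points, dtype=float)
    u = pts[i - 1] - pts[i]
    v = pts[i + 1] - pts[i]
    cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

def polygon_area(points):
    """Area of a planar polygon given its marked vertices in order
    (cross-product form, valid when the points lie in one plane)."""
    pts = np.asarray(points, dtype=float)
    total = np.zeros(3)
    for a, b in zip(pts, np.roll(pts, -1, axis=0)):
        total += np.cross(a, b)
    return float(np.linalg.norm(total) / 2.0)

# Example: four marks traced around a 2 m x 3 m floor area.
marks = [[0, 0, 0], [2, 0, 0], [2, 3, 0], [0, 3, 0]]
print(path_length(marks), angle_at(marks, 1), polygon_area(marks))  # 7.0 90.0 6.0
```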
[0054] Referring now to Figure 6, the user 108 rests the hand(s) 104 at his sides, indicating the end of the hand gesture 600. The HMD 102 and / or the computer system 106 displays the indicated measurements between the initial, intermediary and end positions.
[0055] Figure 7 is a flow chart illustrating the steps involved in a method of taking gesture based measurements of virtual reality space. The process is initiated at step 702 when the hands are brought to the initial position. At step 704, the orientation of the hands is detected with the help of the depth sensors on the HMD or the motion controllers on the hands.
[0056] At step 706, the computing system calculates the world positions of the hands based on the data generated by the depth sensor. At step 708, the computing system detects movement of the hands from the initial position and calculates the distance between the two hands / points.
[0057] At step 710, measurements such as distance, area, volume, angles, etc. are displayed on the display means in real time. At step 712, the system detects whether the palms of the user are closed and then reopened, which indicates the gesture for an intermediate position. If yes, the system returns to step 704, where the orientation and movement of the user's hands are detected in real time. If no, the system waits for the end gesture by the user, for example, bringing the hands down to the sides of the user in accordance with the present embodiment.
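The flow of Figure 7 can be read as a simple event loop. The sketch below models steps 702-712 with hypothetical tracker and display interfaces (`read_hand_pose`, `fist_reopened` and `hands_at_sides` stand in for the depth-sensor / motion-controller pipeline, which the patent does not define as an API), reusing the `path_length` helper from the earlier sketch.

```python
def measurement_loop(tracker, display):
    """Event-loop reading of Figure 7 (steps 702-712).

    `tracker` and `display` are hypothetical interfaces standing in for
    the HMD depth sensor / motion controllers and the HMD render path.
    """
    marks = [tracker.read_hand_pose()]             # step 702: hands at the initial position
    while True:
        pose = tracker.read_hand_pose()            # steps 704-706: orientation and world position
        live = path_length(marks + [pose])         # step 708: distance from the saved points
        display.show(live)                         # step 710: real-time readout on the HMD
        if tracker.fist_reopened():                # step 712: fist closed then reopened?
            marks.append(pose)                     # yes: save an intermediary position, loop to 704
        elif tracker.hands_at_sides():             # no: watch for the end gesture instead
            break
    display.show_summary(marks)                    # measurements between initial, intermediary, end
```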
[0058] In an exemplary embodiment of the present invention, the head mounted display 102 has an on-board microprocessor and operating system for measuring and displaying the measurements calculated based on the raw positional data of the hands 104 in reference to the head.
[0059] The measurements displayed on the head mounted display 102 are updated in real time with the change in position of the hands 104. Such measurements can be displayed simultaneously on the HMD 102 as well as the connected computer system 106, if any.
[0060] The process of displaying and updating the measurements continues until the head mounted display 102 recognizes the human gesture suggesting the end of the gesture 600, such as the user 108 putting his hands down in accordance with the present embodiment.
[0061] In an embodiment of the present invention, the measurement of distance in the virtual or augmented environment happens in multiple planes or in multiple dimensions.
[0062] In another embodiment of the present invention, the user 108 can move his hands 104 in any desired space or path, and the updated measurements, based on the position of the hands 104, are displayed to the user 108 in real time.
[0063] In an embodiment of the present invention, the user 108 is provided with visual clues, such as a line drawn between the tips of the hands 104, to determine the movement of the hand 104 in the real world environment and the corresponding location of the objects or points in the virtual environment.
[0064] In another embodiment of the present invention, the visual clue for the user 108 can also comprise a virtual ruler or scale appearing along with the line drawn between the tips of the hands 104, to determine the movement of the hand 104 corresponding with the virtual reality position. The virtual scale can provide the distance in any unit of distance measurement as may be desired by the user 108.
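As an illustration of the virtual ruler of [0064], the sketch below generates tick positions at a chosen unit spacing along the line between the hand tips; rendering the ticks and labels is left to the display pipeline, and the function name and default spacing are assumptions.

```python
import numpy as np

def ruler_ticks(start, end, spacing=0.1):
    """Tick positions along the measurement line between the hand tips,
    one every `spacing` units (e.g. 0.1 m), for drawing a virtual ruler."""
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    length = np.linalg.norm(end - start)
    if length == 0:
        return [start]                          # hands together: nothing to draw yet
    direction = (end - start) / length
    return [start + direction * d
            for d in np.arange(0.0, length + 1e-9, spacing)]

# Example: quarter-metre ticks between the two hand tips.
ticks = ruler_ticks([0.0, 1.2, 2.0], [0.8, 1.2, 2.6], spacing=0.25)
```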
[0065] While the invention has been described in detail, modifications within the spirit and scope of the invention will be readily apparent to those of skill in the art. Such modifications are also to be considered as part of the present invention. In view of the foregoing discussion, relevant knowledge in the art and references or information discussed above in connection with the Background of the Invention, the disclosures of which are all incorporated herein by reference, further description is deemed unnecessary. In addition, it should be understood that aspects of the invention and portions of various embodiments may be combined or interchanged either in whole or in part. Furthermore, those of ordinary skill in the art will appreciate that the foregoing description is by way of example only, and is not intended to limit the invention.
[0066] Thus, the present invention has been described herein with reference to a particular embodiment for a particular application. Those having ordinary skill in the art and access to the present teachings may recognize that additional various substitutions and alterations are also possible without departing from the spirit and scope of the present invention, as defined by the following claims.

Claims

1. A virtual / augmented reality space measuring system, comprising: a display system for displaying the virtual / augmented reality images; a detection system for detecting the user's hands and their movements; and a processing system for representing a hand as a vector, for processing the vectors to recognize hand gestures, and for analysing the hand gesture to determine measurements in a virtual / augmented reality space.
2. A virtual / augmented reality space measuring system as claimed in claim 1, wherein the detection system comprises: at least one infra-red radiation emitter / depth sensor for emitting infra-red rays to collide with the hand and detect hand position; and at least one infrared camera for detecting the world position of the infra-red radiation emitter / depth sensor.
3. A virtual / augmented reality space measuring system as claimed in claim 1, wherein the detection system comprises: hand held motion controllers with embedded infra-red Light Emitting Diodes (LEDs) and inbuilt sensors, namely a gyroscope and an accelerometer, to track hand position and motion; and at least one infrared camera for detecting the world position of the hand held motion controllers.
4. A virtual / augmented reality space measuring system as claimed in claim 1, wherein the display system is a wearable Head Mounted Display (HMD), the HMD further comprising at least one infrared emitter for tracking head positioning.
5. A virtual / augmented reality space measuring system as claimed in claim 1, wherein the processing system consists of: at least one processor; memory; and machine readable instructions stored in the memory that, when executed by the at least one processor, cause the system to carry out the functions of detecting the hands and their movement for determining a hand gesture and analysing the hand gesture to determine measurements in a virtual / augmented reality space.
6. A virtual / augmented reality space measuring method comprising the steps of: determining, from the collision of the infrared rays from the infrared emitter onto the two open hands of the user, the start of the hand gesture; determining, by the infrared camera, the position of the head with respect to the hands; determining the initial position of the two open hands as the reference position in the virtual / augmented reality environment; detecting movement of at least one open hand from the reference position to a second position in the virtual / augmented reality space; feeding the positional data of the hands in reference to the head / HMD into the processing means; determining the direction of the moving hands as a vector from the starting reference point; determining the distance between the reference position and the second position in terms of the virtual / augmented reality environment; and displaying the measured distance on the display means.
7. The method as claimed in claim 6, further comprising the steps of: making the determination of the hand gesture of closing the hand into a fist at the second position and then reopening it, to determine the second position as an intermediary position; and detecting movement of the hand from one intermediary position to another intermediary position or to the end position, wherein the end position is determined by putting down at least one hand.
8. The method as claimed in Claim 6 wherein the hands are moved and hand gestures are used to define several intermediary positions in multiple planes of the virtual / augmented reality environment allowing measurement of distance, area, volume, surface area, angle and other similar measurements.
9. The method as claimed in Claim 6 wherein the depth of an object is measured by moving the hand towards an object in the virtual / augmented reality environment.
10. The method as claimed in Claim 6 wherein the dimensions of an object are measured by moving the hand around the edges of the object in the virtual / augmented reality environment.
11. The method as claimed in Claim 6 wherein visual clues are shown in the display means by way of a line between the hands, optionally with a virtual ruler or scale appearing along with the line drawn between the hands, to determine the movement of the hand corresponding with the virtual / augmented reality position.
PCT/IB2016/054679 2015-08-03 2016-08-03 System and method for gesture based measurement of virtual reality space WO2017021902A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN2371DE2015 2015-08-03
IN2371/DEL/2015 2015-08-03

Publications (1)

Publication Number Publication Date
WO2017021902A1 true WO2017021902A1 (en) 2017-02-09

Family

ID=57942537

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2016/054679 WO2017021902A1 (en) 2015-08-03 2016-08-03 System and method for gesture based measurement of virtual reality space

Country Status (1)

Country Link
WO (1) WO2017021902A1 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010012011A1 (en) * 1997-03-31 2001-08-09 Mark Leavy Camera-based interface to a virtual reality application

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170302908A1 (en) * 2016-04-19 2017-10-19 Motorola Mobility Llc Method and apparatus for user interaction for virtual measurement using a depth camera system
CN109101120B (en) * 2017-06-21 2021-09-28 腾讯科技(深圳)有限公司 Method and device for displaying image
CN109101120A (en) * 2017-06-21 2018-12-28 腾讯科技(深圳)有限公司 The method and apparatus that image is shown
CN110892364A (en) * 2017-07-20 2020-03-17 高通股份有限公司 Augmented reality virtual assistant
US11727625B2 (en) 2017-07-20 2023-08-15 Qualcomm Incorporated Content positioning in extended reality systems
CN111108462A (en) * 2017-09-27 2020-05-05 苹果公司 Ranging and accessory tracking for head-mounted display systems
CN111108462B (en) * 2017-09-27 2023-07-18 苹果公司 Ranging and accessory tracking for head mounted display systems
CN111448542B (en) * 2017-09-29 2023-07-11 苹果公司 Display application
CN111448542A (en) * 2017-09-29 2020-07-24 苹果公司 Displaying applications in a simulated reality environment
US11960641B2 (en) 2018-09-28 2024-04-16 Apple Inc. Application placement based on head position
US11765318B2 (en) 2019-09-16 2023-09-19 Qualcomm Incorporated Placement of virtual content in environments with a plurality of physical participants
US11893964B2 (en) 2019-09-26 2024-02-06 Apple Inc. Controlling displays
US11800059B2 (en) 2019-09-27 2023-10-24 Apple Inc. Environment for remote communication
WO2022179279A1 (en) * 2021-02-26 2022-09-01 华为技术有限公司 Interaction method, electronic device, and interaction system

Similar Documents

Publication Publication Date Title
WO2017021902A1 (en) System and method for gesture based measurement of virtual reality space
CN110647237B (en) Gesture-based content sharing in an artificial reality environment
US11157725B2 (en) Gesture-based casting and manipulation of virtual content in artificial-reality environments
JP7283506B2 (en) Information processing device, information processing method, and information processing program
JP6979475B2 (en) Head-mounted display tracking
US10783712B2 (en) Visual flairs for emphasizing gestures in artificial-reality environments
US20190238755A1 (en) Method and apparatus for push interaction
JP2022535316A (en) Artificial reality system with sliding menu
WO2017136125A2 (en) Object motion tracking with remote device
US20180225837A1 (en) Scenario extraction method, object locating method and system thereof
TW201911133A (en) Controller tracking for multiple degrees of freedom
WO2014093608A1 (en) Direct interaction system for mixed reality environments
KR20160080109A (en) Systems and techniques for user interface control
KR20140059109A (en) System and method for human computer interaction
CN110603510A (en) Position and orientation tracking of virtual controllers in virtual reality systems
WO2018038136A1 (en) Image display device, image display method, and image display program
KR20130068191A (en) 3d interface device and method based motion tracking of user
JP6777391B2 (en) Computer equipment and methods for 3D interactions
EP3811186B1 (en) Input scaling to keep controller inside field of view
JP2009258884A (en) User interface
RU2670649C9 (en) Method of manufacturing virtual reality gloves (options)
JP4678428B2 (en) Virtual space position pointing device
RU2673406C1 (en) Method of manufacturing virtual reality glove
US20190369713A1 (en) Display control apparatus, display control method, and program
KR20110045564A (en) Apparatus for controlling 3 dimensional avatar and method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16832401

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16832401

Country of ref document: EP

Kind code of ref document: A1