WO2015123771A1 - Gesture tracking and control in augmented and virtual reality - Google Patents
Gesture tracking and control in augmented and virtual reality
- Publication number
- WO2015123771A1 (PCT/CA2015/050120)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- gesture
- information
- orientation
- processor
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/015—Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
Definitions
- the following relates generally to biometric tracking systems for augmented reality and virtual reality applications, and more specifically to biometric tracking systems for tracking a user's position and gestures.
- AR augmented reality
- VR virtual reality
- a system for gesture tracking and control for augmented and virtual reality applications comprising: (a) a user object coupled to a user, the user object comprising: (i) a biometric module comprising a biometric sensor for determining biometric information for a user; biometric information comprising gesture information; and (ii) a location, motion and orientation module for determining location, motion and orientation information for the user; and (b) a processor in communication with the user object, the processor configured to: (i) obtain the location, motion and orientation information for the user object; (ii) obtain the biometric information for the user object; (iii) determine location and orientation of the user object from the location, motion and orientation information; and (iv) determine one or more gestures being performed by the user from the biometric information.
- a method for gesture tracking and control for augmented and virtual reality applications comprising: (a) providing a user object to be coupled to a user, the user object comprising: (i) a biometric module comprising a biometric sensor for determining biometric information for a user; biometric information comprising gesture information; and (ii) a location, motion and orientation module for determining location, motion and orientation information for the user; and (b) configuring a processor to be in communication with the user object, the processor configured to: (i) obtain the location, motion and orientation information for the user object; (ii) obtain the biometric information for the user object; (iii) determine location and orientation of the user object from the location, motion and orientation information; and (iv) determine one or more gestures being performed by the user from the biometric information.
- FIG. 1 illustrates in schematic form a system for tracking user objects in a physical environment occupied by a plurality of users equipped with head mounted displays;
- FIG. 2 illustrates an exemplary configuration of biometric tracking systems upon a user equipped with an HMD;
- FIG. 3 illustrates an exemplary configuration of a head mounted display comprising components of a biometric tracking system;
- FIG. 4 illustrates an exemplary configuration of a peripheral comprising a biometric tracking system;
- FIG. 5 is a flowchart illustrating a method of processing biometric information relating to a gesture;
- FIG. 6 is a flowchart illustrating a method of processing biometric information relating to a user's vital signs; and
- FIG. 7 is a flowchart illustrating a method of processing information from location, motion and orientation systems as well as biometric information.
- any module, unit, component, server, computer, terminal or device exemplified herein that executes instructions may include or otherwise have access to computer readable media such as storage media, computer storage media, or data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape.
- Computer storage media may include volatile and non-volatile, removable and nonremovable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data.
- Examples of computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by an application, module, or both. Any such computer storage media may be part of the device or accessible or connectable thereto.
- any processor or controller set out herein may be implemented as a singular processor or as a plurality of processors. The plurality of processors may be arrayed or distributed, and any processing function referred to herein may be carried out by one or by a plurality of processors, even though a single processor may be exemplified. Any method, application or module herein described may be implemented using computer readable/executable instructions that may be stored or otherwise held by such computer readable media and executed by the one or more processors.
- AR includes: the interaction by a user with real physical objects and structures along with virtual objects and structures overlaid thereon; and the interaction by a user with a fully virtual set of objects and structures that are generated to include renderings of physical objects and structures and that may comply with scaled versions of physical environments to which virtual objects and structures are applied, which may alternatively be referred to as an "enhanced virtual reality".
- the virtual objects and structures could be dispensed with altogether, and the AR system may display to the user a version of the physical environment which solely comprises an image stream of the physical environment.
- the system provides user objects, which include biometric tracking systems that comprise a biometric tracker in communication with a processor.
- the biometric tracker is configured to obtain biometric information of a user and comprises one or more location, motion and orientation tracking modules.
- the processor is configured to obtain the biometric information and location, motion and/or orientation information from the biometric tracker.
- the processor processes the biometric and location, motion and/or orientation information in order to determine characteristics of the user at the user object.
- the user object is a wearable device worn by a user.
- the wearable device comprises an electromyogram, electrocardiogram and/or photoplethysmogram.
- Embodiments of biometric tracking (“BT”) systems are provided herein for use in AR applications for tracking location, motion and/or orientation (“LMO”) information and biometric information of users.
- the BT systems may be provided in head mounted displays (“HMDs”) worn by users, as well as in wearable peripherals (“peripherals”) coupled to the user.
- User associated HMDs and peripherals are collectively referred to as "user objects”.
- Embodiments of BT systems comprise location, motion and orientation modules ("LMO modules") for tracking location, motion and/or orientation information of a user object, and further comprise biometric modules comprising biometric sensors for tracking biometric information of the user at the user object.
- LMO information may be provided to a processor for processing in order to determine the LMO of the user object.
- the LMO of the user object may further be processed according to a predetermined kinematic relationship between the user object and an associated virtual object in an augmented reality or virtual reality application, to determine the position and orientation of the virtual object associated with the user object.
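The predetermined kinematic relationship described above can be sketched in code. The following is a minimal illustration only, not an implementation from the patent: it assumes a planar pose (2-D position plus yaw), and the function name and offset values are hypothetical.

```python
import math

def virtual_object_pose(user_pos, user_yaw, offset, yaw_offset=0.0):
    """Derive a virtual object's pose from a tracked user object's pose via
    a fixed kinematic relationship: a rigid offset expressed in the user
    object's local frame plus a relative yaw."""
    ox, oy = offset
    # Rotate the local-frame offset into the world frame by the user
    # object's yaw, then translate by the user object's position.
    wx = user_pos[0] + ox * math.cos(user_yaw) - oy * math.sin(user_yaw)
    wy = user_pos[1] + ox * math.sin(user_yaw) + oy * math.cos(user_yaw)
    return (wx, wy), user_yaw + yaw_offset

# E.g. a virtual gun held 0.2 m in front of a wrist peripheral facing +x:
pos, yaw = virtual_object_pose((1.0, 2.0), 0.0, (0.2, 0.0))
```

A full system would use 3-D rotations (quaternions or rotation matrices), but the frame composition is the same idea.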
- BT systems further comprise biometric tracking modules.
- the biometric tracking modules generate biometric information corresponding to the user at the user object.
- the biometric information may include, for example, information related to the gestures, heart rate, or other biometric information.
- the biometric information may be provided to a processor for processing in order to track, for example, gestures performed by a user, or to determine a user's vital signs, such as the user's heart rate.
- biometric information may be compared to a library of gestures to determine a particular gesture performed by a user.
- a user object may be an armband worn on the user's forearm, the armband comprising an electromyogram and the armband being in data communication with the processor.
- the electromyogram is operable to measure the electrical activity of muscles in the forearm. As a result, movement of the forearm, wrist, hand, fingers or thumb will be measurable by the user object and can be communicated to the processor.
- the processor may have access to a database of known gestures, such as wrist or finger movements, and their associated musculatory response. The processor can compare the obtained biometric information to the known gestures to determine a most likely gesture having been made by a user.
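The comparison of obtained biometric information against known gestures could take many forms; the patent does not specify one. One simple sketch, with an entirely hypothetical gesture library and a nearest-neighbour match on per-channel RMS features of the EMG signal:

```python
import math

# Hypothetical gesture library: each predetermined gesture maps to a
# reference feature vector (per-channel RMS of the rectified EMG signal).
GESTURE_LIBRARY = {
    "trigger_pull": [0.8, 0.1, 0.1],
    "fist_close":   [0.7, 0.7, 0.7],
    "hand_open":    [0.1, 0.6, 0.2],
}

def rms_features(channels):
    """Reduce raw multi-channel EMG samples to one RMS value per channel."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in channels]

def classify_gesture(features, library=GESTURE_LIBRARY):
    """Return the library gesture whose reference vector is nearest
    (Euclidean distance) to the measured feature vector."""
    def dist(ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, ref)))
    return min(library, key=lambda name: dist(library[name]))

# Strong activity on the first channel resembles the trigger-pull entry:
gesture = classify_gesture(rms_features([[0.9, -0.7], [0.1, -0.1], [0.1, 0.1]]))
```

A production classifier would use many more channels and a trained model, but "most likely gesture" reduces to this kind of similarity search over the library.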
- Examples of uses for gesture information include providing user input to augmented reality applications.
- the corresponding location, motion and orientation information can be used to map additional motion information relating to tracked gestures to a virtual map.
- the BT systems described herein may provide sufficient LMO information and biometric information to avoid the traditional requirement of using physical controllers to receive user input or provide user tracking.
- Referring now to Fig. 1, an exemplary scenario is shown in which multiple users occupy a physical environment.
- the users are equipped with HMDs 12 and peripherals 13.
- Each HMD and peripheral may be equipped with a BT system to provide LMO information and biometric information to a processor.
- the singular "processor" is used herein, but it will be appreciated that the processor may be distributed amongst the components occupying the physical environment, or located in a server 14 in communication with a network 17 accessible from the physical environment.
- the processor may be distributed between the HMDs 12 and a console 11, or over the Internet via the network 17.
- the HMD 12 may communicate with the peripherals 13, or the HMD 12 and peripherals 13 may communicate directly with console 11 or server 14 located over a network 17 accessible from the physical environment, as shown.
- Communication between the processor and BT systems of the user objects may be via suitable wired or wireless communication.
- Each BT system may comprise a communication module configured to transmit LMO information and biometric information for the system to the processor, or communication of information may be routed through HMDs, as shown in Fig. 1.
- the processor can access a gesture library 19 which stores information relating to predetermined gestures, as described in more detail below. As illustrated, the gesture library 19 may be communicatively linked with server 14.
- Fig. 2 illustrates a user 1 having disposed thereon various user objects, including an HMD 12 and peripherals 13.
- the user objects in this case are wearable devices, wherein each peripheral is an armband, wristband, legband, etc.
- Other suitable configurations are contemplated herein.
- peripherals 13 comprise BT systems for tracking LMO information and biometric information.
- the user 1 may be equipped with an HMD 12, as well as peripherals 13 disposed upon her hands and feet.
- Each peripheral 13 may comprise a BT system for providing LMO information and biometric information for the limb or body part upon which it is disposed.
- the HMD 12 may comprise a processor 130 for generating a rendered image stream comprising CGI.
- the processor 130 shown in Fig. 3 is shown mounted within the HMD 12; however, as previously described, the processor may be located separately from the HMD 12.
- the processor may communicate with the following components of the HMD 12: (i) a scanning system for scanning the physical environment surrounding the HMD 12; (ii) an LMO module 21 for determining LMO information for the HMD;
- a biometric module 22 comprising at least one biometric sensor for providing biometric information, which may include biometric information relating to gesture tracking of the HMD;
- an imaging system such as, for example, a camera system comprising one or more cameras 123, to capture a physical image stream of the physical environment
- a display system 121 for displaying to a user of the HMD 12 the physical image stream and/or the rendered image stream
- a power management system 113 for distributing power to the components
- a sensory feedback system comprising, for example, haptic feedback devices 120, for providing sensory feedback to the user
- an audio system 124 with audio input and output to provide audio interaction.
- the HMD 12 may further comprise a wireless communication system 126 having, for example, antennae, to communicate with other components in an AR and/or VR system, such as, for example, other HMDs, peripherals, actors, a gaming console, or a router.
- the LMO module 21 of the HMD comprises a transmitter or receiver of a magnetic tracking system.
- the LMO module 21 comprises a tracking marker configured to be tracked by a camera to determine LMO information of the HMD.
- the BT system 15 comprises an LMO module 21 for providing LMO information and a biometric module 22 for providing biometric information.
- the LMO module 21 may provide LMO information of the peripheral in a physical environment, or may provide LMO information for the peripheral relative to the HMD 12.
- the biometric module 22 may provide biometric information to the processor for detecting gestures performed by the user, or to detect the user's vital signs, such as a user's heartbeat.
- a user 1 wears the peripheral 13 comprising the BT system 15 on their forearm.
- the LMO module 21 provides real-time LMO information which the processor may process to map the user's motions to a virtual map.
- the LMO module 21 may comprise a transmitter/receiver of a magnetic tracking system, or may comprise a marker imaged by an associated camera in an active tracking system, either of which may provide LMO information for the peripheral to the processor via a communication module.
- the BT system 15 receives biometric information from a biometric sensor of the biometric module 22, such as an electromyographic sensor, and provides the biometric information to the processor to determine any gestures performed by the user.
- the processor may determine that biometric information detected by the biometric sensor corresponds to a predetermined gesture stored in a gesture library, such as a trigger pulling gesture associated with flexing of the index finger 16.
- the processor may further receive information from the gesture library relating to the predetermined gesture, such as associated motion and user input information.
- the processor may provide user input to an AR or VR application, such as to control a virtual gun to fire.
- Motion information received for the predetermined gesture may cause the processor to map the user's finger motion to the virtual map based on an approximate finger motion expected for the trigger pulling gesture.
- the processor may process the LMO information for the user object to determine a position and orientation of a virtual object associated with the user object in an AR application - such as a virtual gun.
- the processor may correlate the LMO information of a given peripheral 13 to a virtual object by way of a predetermined kinematic relationship between the peripheral and the virtual object. For example, where a virtual gun is associated with the peripheral 13 in an AR application, the LMO information may be processed so that any virtual bullet fired from the virtual gun is represented in a rendered image stream according to the determined position and orientation of the virtual gun at a time of firing, along the trajectory 18.
- the magnetic receiver is preferably located near the cameras 123 at a fixed distance therefrom.
- the magnetic receiver provides the relative distance and orientation of the peripheral relative to the receiver. Since the offset between the receiver and the cameras is known, the processor may transform the relative position and orientation from relative to the receiver to a relative position and orientation relative to the camera. The processor may then map motions of the peripheral 13 to the virtual map so that the peripheral 13 may be rendered as a virtual object having motions corresponding to the peripheral's 13 motions within the physical environment.
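The frame change described above is a simple translation when the receiver and camera axes are aligned. A minimal sketch under that assumption (the patent does not give formulas, and the function name and the 0.05 m mounting offset below are hypothetical):

```python
def receiver_to_camera(p_rel_receiver, receiver_offset_in_camera):
    """Re-express a peripheral's position, measured relative to the magnetic
    receiver, as a position relative to the camera, using the known fixed
    receiver-to-camera offset. Assumes both frames share the same axes."""
    return tuple(p + o for p, o in zip(p_rel_receiver, receiver_offset_in_camera))

# Assumed mounting: receiver sits 0.05 m above the camera along y.
p_cam = receiver_to_camera((0.3, -0.1, 0.5), (0.0, 0.05, 0.0))
```

If the receiver were also rotated relative to the camera, the measured point would additionally be rotated by that fixed mounting rotation before translating.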
- LMO information from the peripheral 13 may be used to determine the position and orientation of a virtual object associated with the peripheral in an AR application. It will be understood that the position and orientation of a virtual object may be determined based on what, if any, virtual object is currently associated with the peripheral in the AR and VR application, and that various virtual objects are contemplated.
- a virtual object may comprise a flashlight
- the processor may process LMO information from the peripheral to determine and render the orientation of the flashlight and its emitted light, and update the orientation of the flashlight in real time.
- An LMO module 21 may comprise one or more of the following: (i) a position sensor configured to track a position or orientation, the position sensor comprising, for example, one or more of a magnetic, ultrasound, radio-frequency (RF), LiDAR or radar type of sensor, or a tracking marker with an associated camera, whether alone or in combination; and (ii) a motion sensor such as an inertial measurement unit configured to track motions, the motion sensor comprising, for example, one or more of the following: an accelerometer, a gyroscope, a magnetometer, whether alone or in combination.
- the sensors in the LMO module may be, for example, MEMS-type sensors. Certain sensor configurations may provide all LMO information as a single suite. For example, a magnetic position tracking system alone may provide real-time location, motion and orientation information.
- a user object comprises an LMO module having more than one position sensor or motion sensor
- readings from more than one sensor may be processed together to provide a more accurate reading of LMO for the user object.
- readings from an inertial measurement unit may be combined with measurements from a magnetic tracking system to improve the accuracy of the determined position and orientation.
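One common way to combine a drift-free but noisy absolute measurement (the magnetic tracker) with a smooth but drifting rate measurement (the IMU's gyroscope) is a complementary filter. The patent does not name a fusion method, so the following is only an illustrative sketch; the blend factor `alpha` is a hypothetical tuning value.

```python
def complementary_filter(angle_prev, gyro_rate, magnetic_angle, dt, alpha=0.98):
    """Blend a gyroscope-integrated angle (smooth, but drifts over time)
    with a magnetic tracker's absolute angle (noisy, but drift-free)."""
    predicted = angle_prev + gyro_rate * dt   # integrate the gyro rate
    return alpha * predicted + (1 - alpha) * magnetic_angle

# With a stationary gyro and a magnetic reference of 1.0 rad, the
# estimate is pulled toward the absolute reading over time:
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, 0.0, 1.0, 0.01)
```

A Kalman filter would be the heavier-weight alternative when sensor noise statistics are known.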
- an LMO module 21 comprises a receiver or a transmitter of a receiver-transmitter pair for determining relative LMO information of the receiver with respect to the transmitter (or vice versa).
- the receiver of the receiver-transmitter pair may be mounted to an HMD worn by the user or may be located elsewhere in the physical environment nearby the user.
- the transmitter may be coupled to the user object and may transmit a signal comprising information which can be processed for determining the position of the transmitter relative to the receiver.
- the receiver may be configured to provide information relating to a detected signal from the transmitter to the processor.
- the transmitter and receiver repeatedly and continuously transmit and receive, respectively, signals to provide substantially real-time LMO information for the LMO module.
- the transmitter may be a magnetic source and the receiver may be a magnetic receiver/antenna.
- the pair can be configured so that the receiver is able to detect the field of the transmitter, and the receiver may communicate characteristics of the detected field to the processor for determining LMO information of the receiver relative to the transmitter.
- the processor may process signals detected by the receiver to provide LMO information of the peripheral relative to the HMD.
- the processor may use the LMO information to map the substantially real-time location and orientation of the peripheral relative to the position of the HMD.
- the magnetic receiver is preferably mounted to the HMD and the magnetic source is preferably mounted to (or otherwise disposed upon) the peripheral.
- the receiver mounted to the HMD may provide information to the processor from a plurality of transmitters mounted to a plurality of peripherals, without requiring additional wireless or wired communications systems.
- the transmitters may vary transmission characteristics (such as frequency or modulation), to facilitate identification by the receiver of the source transmitter of transmitted information.
- each magnetic source provides a transmission having different magnetic field characteristics.
- the receiver may be mounted to the peripheral while the transmitter may be mounted to the HMD, and the LMO information for the HMD and peripheral may be provided to the processor from the peripheral via wired or wireless communication.
- an LMO module on a user object comprises a tracking marker for tracking by a camera to determine LMO information of the user object.
- Active tracking may determine LMO information for a user object by: (1) mounting a visual marker on the LMO module of a user object; (2) imaging the marker with a camera; (3) providing the camera's captured images to a processor for processing; and (4) if the processor detects a marker in the captured images, the processor may determine the position and orientation of the marker, and its associated user object from an image of the marker in each captured image.
- the user object's LMO can continue to be tracked as the camera continues to capture additional images and sends them to the processor for processing to determine additional LMO information of the user object.
- the camera may be oriented to have the user object within its field of view. Markers may include light emitters, such as active LEDs or infrared markers.
- the LMO module of the user object may further comprise an inertial measurement unit (IMU) to provide orientation measurements of the user object to the processor and to be combined by the processor with LMO information determined from the captured images.
- the IMU may allow for more accurate measurements of location, motion and orientation of user objects.
- markers can be independently controlled by signals from the processor to change in color to reflect a specific assigned controller number for each of a plurality of user objects, such as peripherals. For example, if a user wears four peripherals, an LED color can be provided for each peripheral (e.g. blue, red, green, yellow, etc.). This may allow for more accurate tracking.
- the use of different LED marker colours is not limited to controller identity, but could extend to other applications requiring tracking of different user objects.
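The per-peripheral colour assignment described above amounts to a small lookup. A trivial sketch (colour names and peripheral identifiers are illustrative, not from the patent):

```python
# Hypothetical palette of distinguishable LED marker colours.
LED_COLOURS = ["blue", "red", "green", "yellow"]

def assign_marker_colours(peripheral_ids):
    """Map each tracked peripheral to an LED colour, cycling through the
    palette, so the camera can tell the markers apart."""
    return {pid: LED_COLOURS[i % len(LED_COLOURS)]
            for i, pid in enumerate(peripheral_ids)}

colours = assign_marker_colours(["left_hand", "right_hand", "left_foot", "right_foot"])
```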
- the LMO module of a peripheral coupled to a particular HMD may provide LMO information as values or vectors relative to the location and orientation of the HMD at a point in time.
- the HMD and peripheral may comprise a magnetic position tracking transmitter/receiver pair, configured to provide location and orientation information for the peripheral relative to the HMD.
- the processor may map the location and orientation of the peripheral with reference to the HMD's location and orientation.
- Various embodiments and components of the biometric module 22 will now be described.
- Biometric modules comprise biometric sensors configured to gather biometric information from the user and provide that information to a processor for various uses in AR and VR applications.
- Biometric sensors may comprise sensors for use in tracking a user's gestures, such as myography sensors, including, for example, electromyography (EMG), mechanomyography (MMG) or phonomyography sensors, to measure the user's gestures and body movements.
- Tracked gestures may be mapped to a virtual map, processed to provide an associated user input or may control functionality of an HMD, such as activating or deactivating the HMD.
- Biometric sensors may further comprise sensors for measuring a user's vital signs, such as a user's heart rate.
- Biometric sensors for measuring a user's heart rate may include sensors for providing an electrocardiogram (ECG) or a photoplethysmogram (PPG).
- biometric sensor readings may be processed having regard to where a biometric sensor is mounted on a user. Accordingly, sensor readings provided by a biometric sensor mounted to a peripheral on a user's calf may be processed having regard to the fact that they relate to muscle activity of the calf.
- EMG sensors may measure a user's muscle activity at the surface of the skin to provide sensor readings.
- a processor may correlate the sensor readings to predetermined gestures. Predetermined gestures may relate to measured biometric information such as a contraction or stretching of individual or a combination of muscles. While performing a gesture, activated muscles can be determined and measured.
- Sensors for providing an ECG may include electrodes which measure the electrical activity of the heart to provide sensor signals. Specifically, the sensors generate sensor signals relating to detected electrical discharges on the skin caused by heart activity.
- Sensors for providing a PPG may include pulse oximeters. These generate sensor signals relating to measured volumetric changes of the human body, such as lung displacement or, in a specific example, volumetric changes in arteries, which can be used to calculate heart rate.
- One method of measuring heart rate comprises utilizing light-emitting diodes (LED) to illuminate a user's skin. The amount of light that is transmitted or reflected to a photodiode can be correlated to pulse rate.
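A very rough sketch of how pulse rate could be recovered from the photodiode signal by counting threshold crossings. The patent does not specify an algorithm; this illustration assumes a clean periodic signal and uses a synthetic waveform.

```python
import math

def estimate_heart_rate(samples, sample_rate_hz, threshold=None):
    """Estimate beats per minute by counting upward threshold crossings
    of the photodiode signal (roughly one crossing per cardiac pulse)."""
    if threshold is None:
        threshold = sum(samples) / len(samples)  # centre on the signal mean
    beats = sum(1 for a, b in zip(samples, samples[1:]) if a < threshold <= b)
    duration_min = len(samples) / sample_rate_hz / 60.0
    return beats / duration_min

# Synthetic 1.2 Hz (72 bpm) pulse waveform, sampled at 50 Hz for 10 s:
signal = [math.sin(2 * math.pi * 1.2 * t / 50) for t in range(500)]
bpm = estimate_heart_rate(signal, 50)
```

Real PPG signals would first need the filtering and motion-artifact removal described elsewhere in this document.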
- Biometric information from biometric sensors for use in tracking gestures may be processed by a processor according to the blocks 300 illustrated in Fig. 5 in order to correlate measured sensor readings from biometric sensors to predetermined gestures stored in a gesture library.
- sensor readings may be received by a processor from a biometric sensor for use in tracking a user's gestures, such as a myographic sensor.
- the sensor readings may be processed to provide a clean gesture signal. More specifically, the processing may comprise digital signal processing (DSP), noise reduction, amplification, elimination of motion artifacts and filtering.
- the processing can be done through hardware or software filters, rectifiers, or other techniques known to those of skill in the art of signal processing.
- Such filters can be built or programmed for filtering of specific frequencies, or for providing other signal processing.
- the processor may compare and analyze the clean gesture signal to pre-determined gestures stored in a gesture library.
- the processor may determine the particular predetermined gesture relating to the clean gesture signal, and the processor may retrieve information relating to the predetermined gesture, such as associated motion information, and associated user input information.
- the particular predetermined gesture and its associated motion and user input information may be processed for use in various AR applications. As illustrated, the steps performed in relation to the blocks 300 may then repeat.
- the processor may repeatedly call the biometric sensors to provide sensor readings.
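The repeating loop described for blocks 300 (receive readings, clean the signal, match against the gesture library, act on the result) can be sketched as follows. The cleaning and matching stand-ins here are hypothetical placeholders, not the patent's actual signal processing.

```python
def process_gesture_readings(raw_readings, clean, match, dispatch):
    """One pass of the blocks-300 loop: clean the raw sensor readings,
    match the clean signal against a gesture library, then hand any
    matched gesture to the application as user input."""
    signal = clean(raw_readings)   # DSP, noise reduction, filtering
    gesture = match(signal)        # gesture library comparison
    if gesture is not None:
        dispatch(gesture)          # motion mapping / user input
    return gesture

# Hypothetical stand-ins for the cleaning and matching stages:
events = []
cleaned = lambda xs: [x for x in xs if abs(x) < 5.0]  # crude artifact rejection
matcher = lambda sig: "trigger_pull" if max(sig) > 0.5 else None
result = process_gesture_readings([0.1, 0.9, 12.0, 0.2], cleaned, matcher, events.append)
```

In the described system this function would be called repeatedly as the processor polls the biometric sensors.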
- the gesture library may comprise a plurality of predetermined gestures, and associated information for each gesture of the plurality of gestures, such as associated motion information, and user input information.
- Motion information may provide information for mapping a gesture performed by a user to a virtual map - e.g. for mapping a finger moving through a trigger pull gesture to a virtual map.
- User input information may identify user input associated with a gesture.
- a trigger pull gesture may be identified, and user input information associated with the trigger pull gesture may be processed at block 310 in order to cause a virtual gun to be actuated.
- the gesture library may be expanded as needed. Some gestures that may be stored in the gesture library include tapping a finger, giving a thumbs up, closing a fist, flexing open a hand and pulling a trigger (i.e. curling an index finger).
- Any predetermined gesture determined to have been performed by the user, and its associated user input, may be processed in relation to dynamic game rules, and in relation to characteristics of an AR or VR environment. For example, pulling a trigger may serve as user input to cause a user to actuate a virtual gun, if a gun is held. Flexing open a hand may cause a user to drop the user's weapon, if a weapon is held. Closing a fist may cause a user to pick up a weapon, if the user is near a virtual weapon in a virtual environment and does not currently hold a weapon. Conversely, closing a fist may cause a user to punch if the user is not near a weapon in the virtual environment. It will be understood that various predetermined gestures may correlate to various user inputs and game mechanics, and that the above examples are merely illustrative.
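The context-dependent rules above can be sketched as a small dispatch function. The text does not say what a closed fist does while the user already holds a weapon, so the `no_op` branch is an assumption added for completeness:

```python
def resolve_fist_input(holds_weapon, near_weapon):
    """Map a 'close fist' gesture to a context-dependent game action,
    following the illustrative rules in the description (names hypothetical)."""
    if holds_weapon:
        return "no_op"            # unspecified in the text; assumed inert
    if near_weapon:
        return "pick_up_weapon"   # unarmed and near a virtual weapon
    return "punch"                # unarmed and not near a weapon
```

A full implementation would consult the AR/VR environment state rather than receive booleans directly.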
- Predetermined gestures may be associated with different user inputs depending on which peripheral they are detected from.
- a user could wear one peripheral which may be tracked to permit the user to open, close and navigate a virtual game menu, while the user may wear another peripheral to track user inputs relating to gameplay.
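The peripheral-dependent behaviour might be modeled by keying the lookup on (peripheral, gesture) pairs, so the same gesture resolves to different user inputs. Peripheral names and bindings below are invented for illustration:

```python
# Hypothetical bindings: the same predetermined gesture yields different
# user inputs depending on which peripheral detected it.
PERIPHERAL_BINDINGS = {
    ("menu_band", "close_fist"): "open_menu",
    ("gameplay_band", "close_fist"): "grab_or_punch",
}

def resolve_input(peripheral, gesture):
    """Return the user input bound to a gesture on a given peripheral."""
    return PERIPHERAL_BINDINGS.get((peripheral, gesture), "unmapped")
```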
- Biometric information from biometric sensors for use in tracking a user's vital signs may be processed by a processor according to the blocks 320 as illustrated in Fig. 6.
- sensor readings may be received from a biometric sensor for use in tracking a vital sign, such as the user's heart rate.
- the sensor readings may be processed to provide a clean vital sign signal. More specifically, the processing may comprise digital signal processing (DSP), noise reduction, amplification, elimination of motion artifacts and filtering.
- the processing can be done through hardware or software filters, rectifiers, or other techniques known to those of skill in the art of signal processing. Such filters can be built or programmed for filtering of specific frequencies, or for providing other signal processing.
- the processor may process the clean vital sign signal to determine a particular vital sign, such as a user's current heart rate.
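As a hedged sketch of how the processor might derive a heart rate from a clean vital sign signal, one common approach averages the intervals between detected pulse peaks. Peak detection itself is assumed to have already happened; the function name is hypothetical:

```python
def heart_rate_bpm(beat_times):
    """Estimate heart rate in beats per minute from the timestamps
    (in seconds) of detected pulse peaks in a cleaned PPG signal."""
    if len(beat_times) < 2:
        raise ValueError("need at least two detected beats")
    # Average the inter-beat intervals, then convert to beats per minute.
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval
```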
- the processor may further process the user's vital sign for use in various AR and VR applications. As illustrated, the steps performed in relation to the blocks 320 may then repeat.
- the processor may repeatedly call the biometric sensors to provide sensor readings.
- vital signs comprising a user's heart rate may be provided to a display of an HMD. Further, vital signs may be processed and provided to a feedback system to provide physical feedback, visual feedback or audio feedback. For example, biometric information indicating that a user has a rapid heart rate may be processed and provided to a feedback system to generate pulsing haptic feedback correlating to measured pulses of the user's heart rate.
- biometric sensors need not be restricted to locations as illustrated in Fig. 2, but may be positioned in various locations for measurement of specific biometric data.
- biometric sensors may be positioned proximally to particular muscular tissue that is desired to be measured for muscular activity for detecting a particular gesture.
- a biometric module may comprise any combination of biometric sensors.
- a user may be provided with a biometric module comprising both an EMG and a PPG sensor.
- the user may be playing a horror game, and wearing a peripheral on one of their forearms comprising an EMG and a PPG sensor.
- the EMG could detect a user's gesture of holding a fist, which may be processed such that the user holds up a virtual flashlight. If the EMG detects that the user opens their hand, the flashlight could turn off. Owing to the scary nature of horror games, the user's virtual avatar may be in a dark and eerie setting, creating a sense of fear.
- the peripheral's PPG may measure the user's heart rate.
- biometric sensors can be provided in combination to provide different inputs or user interaction. It will also be understood that different biometric sensors may be provided on different peripherals to provide the same or additional functionality; for example, a user may be wearing a peripheral comprising an EMG sensor on one arm, and a second peripheral comprising a PPG sensor on the user's other arm. [0058] Referring now to Figs.
- blocks 340 illustrate steps performed in processing information from a biometric tracking system 15 for tracking a user object's LMO, gestures performed by the user, and the user's vital signs.
- the biometric system 15 comprises an LMO module 21 and a biometric module 22.
- the biometric module 22 comprises biometric sensors for gesture tracking and biometric sensors for tracking a user's vital signs.
- the LMO module 21 receives LMO sensor readings relating to the position and orientation of a user object.
- a processor receives the LMO sensor readings and processes the readings to determine the LMO of the user object.
- the biometric module receives biometric information from biometric sensors for gesture tracking, such as myographic sensors.
- the biometric information is processed to determine gestures performed by the user in proximity to the user object.
- the processor may receive information associated with the determined gestures, such as associated user input information for providing a user input associated with the gesture to AR applications, and associated motion information for use in mapping any determined gestures to a virtual map. The steps performed in relation to block 354 are further described above in relation to Fig. 5.
- the biometric module retrieves further biometric information from further biometric sensors for tracking a user's vital signs.
- the processor processes the further biometric information to determine a current vital sign of the user, such as the user's heart rate.
- the steps performed in relation to block 356 are further described above in relation to Fig. 6.
- the processor processes the determined LMO of the user objects, determined gestures and the user's vital signs to generate outputs, for use as inputs for AR applications.
- the processor may further process the outputs to generate a rendered image stream comprising computer generated imagery ("CGI") with respect to the user objects.
- the processor may process the LMO information to determine the LMO of a virtual object associated with the user object in an AR application according to a predetermined kinematic relationship.
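A predetermined kinematic relationship could be as simple as a fixed offset in the user object's local frame; the yaw-only planar transform below is a deliberately minimal assumption for illustration, not the application's actual method:

```python
import math

def virtual_object_pose(user_pos, user_yaw, offset):
    """Place a virtual object at a constant offset in the user object's
    local frame (a hypothetical, minimal kinematic relationship).

    user_pos: (x, y) of the tracked user object
    user_yaw: orientation of the user object in radians
    offset:   (dx, dy) offset expressed in the user object's frame
    """
    c, s = math.cos(user_yaw), math.sin(user_yaw)
    dx, dy = offset
    # Rotate the local offset into the world frame, then translate.
    return (user_pos[0] + c * dx - s * dy,
            user_pos[1] + s * dx + c * dy,
            user_yaw)
```

A real system would use full 3-D poses (position plus quaternion orientation), but the structure of the mapping is the same.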
- the processor may process the information associated with the determined gestures to map additional motion information relating to a determined gesture to a virtual map or to provide user input to AR applications.
- the processor may then transmit a rendered image stream to the display system of an HMD for display to the user thereof.
- the steps described in relation to blocks 340 may be repeated, as the biometric system 15 may continuously poll the LMO module and biometric module.
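The continuous polling of the LMO module and biometric module described in blocks 340 might look like the following sketch, with stub classes standing in for real hardware interfaces (all class, method, and field names are hypothetical):

```python
class StubLMOModule:
    """Hypothetical stand-in for the LMO module: returns a fixed
    position and orientation reading."""
    def read_lmo(self):
        return (1.0, 2.0, 0.5), (0.0, 0.0, 0.0)

class StubBiometricModule:
    """Hypothetical stand-in for the biometric module, covering both
    gesture sensors and vital sign sensors."""
    def read_gesture(self):
        return "trigger_pull"
    def read_vital_sign(self):
        return 72.0

def poll_tracking_system(lmo_module, biometric_module, steps=1):
    """Each iteration polls both modules and combines the readings into
    one output record for use as input to AR applications."""
    outputs = []
    for _ in range(steps):
        position, orientation = lmo_module.read_lmo()
        outputs.append({
            "position": position,
            "orientation": orientation,
            "gesture": biometric_module.read_gesture(),
            "heart_rate": biometric_module.read_vital_sign(),
        })
    return outputs
```

In the described system this loop would run continuously, with each record also feeding the rendering and feedback stages.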
- the outputs may be provided to a feedback system for providing feedback to the user relating to the outputs.
- at least one user object such as a peripheral, may comprise a feedback system.
- the feedback system may be configured to provide physical, audible, or visual feedback to a user in response to detected gestures or vital signs.
- Physical feedback may include vibrations and micro-shocks.
- Visual feedback may include activation of blinking LEDs.
- Audible feedback may include an audible notification.
- the physical feedback could be activated in response to detection of a gesture indicating that a user has pulled a trigger.
- Visual feedback could be activated in response to particular game parameters, as determined by the processor.
- visual feedback could be provided to the user when a particular task is completed in an AR/VR application.
- the feedback system may not necessarily be solely activated in relation to outputs from the processor, for example the feedback system may be configured to provide an indication to the user when the HMD is being powered on or off.
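The feedback examples above can be illustrated as a small event-to-channel dispatch table; the event names and channel pairings are invented for the example:

```python
def feedback_for_event(event):
    """Map a detected event to a feedback channel and effect, following
    the illustrative examples in the description (names hypothetical)."""
    table = {
        "trigger_pull": ("physical", "vibration"),      # haptic on trigger pull
        "task_complete": ("visual", "blink_leds"),      # LEDs on task completion
        "power_state_change": ("audible", "notification_tone"),
    }
    return table.get(event, ("none", None))
```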
- the foregoing systems and methods may enable a user equipped with a wearable LMO module, and optionally biometric modules, to perform exercises, the motions of which may be mapped and tracked for analysis by, for example, the user's trainer.
- the foregoing systems and methods may enable a user to use gesture controls to interact with a processor, for example to initiate AR or VR gameplay in a physical environment.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Neurosurgery (AREA)
- Neurology (AREA)
- Health & Medical Sciences (AREA)
- Dermatology (AREA)
- Biomedical Technology (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Optics & Photonics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention relates to tracking the position, movement and orientation of user objects within a physical environment for presentation to a user equipped with a head-mounted display. The invention further relates to tracking biometric information of the user. A processor maps the substantially real-time position and orientation of the user objects onto a virtual map of the physical environment, and further maps gestures performed by the user, as detected by biometric sensors.
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201461941055P | 2014-02-18 | 2014-02-18 | |
US61/941,055 | 2014-02-18 | ||
US201462052863P | 2014-09-19 | 2014-09-19 | |
US62/052,863 | 2014-09-19 | ||
US201462097331P | 2014-12-29 | 2014-12-29 | |
US62/097,331 | 2014-12-29 | ||
US201562099418P | 2015-01-02 | 2015-01-02 | |
US62/099,418 | 2015-01-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015123771A1 true WO2015123771A1 (fr) | 2015-08-27 |
Family
ID=53877474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CA2015/050120 WO2015123771A1 (fr) | 2014-02-18 | 2015-02-18 | Suivi de gestes et commande en réalité augmentée et virtuelle |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2015123771A1 (fr) |
Cited By (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105487673A (zh) * | 2016-01-04 | 2016-04-13 | 京东方科技集团股份有限公司 | 一种人机交互系统、方法及装置 |
WO2017048439A1 (fr) * | 2015-09-16 | 2017-03-23 | Intel Corporation | Techniques permettant une reconnaissance des gestes à l'aide d'un capteur photopléthysmographique (ppmg) et dispositif de reconnaissance des gestes portable de faible puissance utilisant lesdites techniques |
WO2017163096A1 (fr) * | 2016-03-25 | 2017-09-28 | Zero Latency PTY LTD | Système et procédé pour déterminer une orientation à l'aide de cameras de suivi et de mesures inertielles |
WO2017213903A1 (fr) * | 2016-06-06 | 2017-12-14 | Microsoft Technology Licensing, Llc | Amélioration optique du suivi électromagnétique en réalité mixte |
WO2018011497A1 (fr) * | 2016-07-13 | 2018-01-18 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Système et procédé de capture embarquée et de reproduction 3d/360° du mouvement d'un opérateur dans son environnement |
US9916496B2 (en) | 2016-03-25 | 2018-03-13 | Zero Latency PTY LTD | Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects |
US9924265B2 (en) | 2015-09-15 | 2018-03-20 | Intel Corporation | System for voice capture via nasal vibration sensing |
WO2018102615A1 (fr) * | 2016-11-30 | 2018-06-07 | Logitech Europe S.A. | Système pour importer des dispositifs d'interface utilisateur dans une réalité virtuelle/augmentée |
WO2018111656A1 (fr) * | 2016-12-12 | 2018-06-21 | Microsoft Technology Licensing, Llc | Cadre rigide virtuel pour sous-système de capteur |
WO2018183390A1 (fr) * | 2017-03-28 | 2018-10-04 | Magic Leap, Inc. | Système de réalité augmentée avec audio spatialisé lié à un objet virtuel manipulé par l'utilisateur |
US10206620B2 (en) | 2016-03-23 | 2019-02-19 | Intel Corporation | User's physiological context measurement method and apparatus |
US10241583B2 (en) | 2016-08-30 | 2019-03-26 | Intel Corporation | User command determination based on a vibration pattern |
US10298282B2 (en) | 2016-06-16 | 2019-05-21 | Intel Corporation | Multi-modal sensing wearable device for physiological context measurement |
US10324494B2 (en) | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
US10421012B2 (en) | 2016-03-25 | 2019-09-24 | Zero Latency PTY LTD | System and method for tracking using multiple slave servers and a master server |
US10486061B2 (en) | 2016-03-25 | 2019-11-26 | Zero Latency Pty Ltd. | Interference damping for continuous game play |
CN111103967A (zh) * | 2018-10-25 | 2020-05-05 | 北京微播视界科技有限公司 | 虚拟对象的控制方法和装置 |
US10717001B2 (en) | 2016-03-25 | 2020-07-21 | Zero Latency PTY LTD | System and method for saving tracked data in the game server for replay, review and training |
US10751609B2 (en) | 2016-08-12 | 2020-08-25 | Zero Latency PTY LTD | Mapping arena movements into a 3-D virtual world |
US10860090B2 (en) | 2018-03-07 | 2020-12-08 | Magic Leap, Inc. | Visual tracking of peripheral devices |
WO2021067916A1 (fr) * | 2019-10-04 | 2021-04-08 | Tactual Labs Co. | Mécanomyographie capacitive |
CN112752945A (zh) * | 2018-07-02 | 2021-05-04 | 梦境沉浸股份有限公司 | 用于虚拟现实系统的枪械模拟布置 |
CN112783326A (zh) * | 2021-01-28 | 2021-05-11 | 唐庆圆 | 手势识别装置和手势识别系统 |
WO2021207174A1 (fr) * | 2020-04-06 | 2021-10-14 | Tactual Labs Co. | Détection de préhension |
WO2022000045A1 (fr) * | 2020-07-02 | 2022-01-06 | VirtuReal Pty Ltd | Système de réalité virtuelle |
EP3404624B1 (fr) * | 2016-01-15 | 2022-10-12 | Meleap Inc. | Système d'affichage d'images, procédé de commande d'un système d'affichage d'images, système de distribution d'images et visiocasque |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011055326A1 (fr) * | 2009-11-04 | 2011-05-12 | Igal Firsov | Interface d'entrée-sortie universelle pour utilisateur humain |
WO2012114791A1 (fr) * | 2011-02-24 | 2012-08-30 | 日本電気株式会社 | Système à commande gestuelle |
US20130346168A1 (en) * | 2011-07-18 | 2013-12-26 | Dylan T X Zhou | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
-
2015
- 2015-02-18 WO PCT/CA2015/050120 patent/WO2015123771A1/fr active Application Filing
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011055326A1 (fr) * | 2009-11-04 | 2011-05-12 | Igal Firsov | Interface d'entrée-sortie universelle pour utilisateur humain |
WO2012114791A1 (fr) * | 2011-02-24 | 2012-08-30 | 日本電気株式会社 | Système à commande gestuelle |
US20130346168A1 (en) * | 2011-07-18 | 2013-12-26 | Dylan T X Zhou | Wearable augmented reality eyeglass communication device including mobile phone and mobile computing via virtual touch screen gesture control and neuron command |
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9924265B2 (en) | 2015-09-15 | 2018-03-20 | Intel Corporation | System for voice capture via nasal vibration sensing |
WO2017048439A1 (fr) * | 2015-09-16 | 2017-03-23 | Intel Corporation | Techniques permettant une reconnaissance des gestes à l'aide d'un capteur photopléthysmographique (ppmg) et dispositif de reconnaissance des gestes portable de faible puissance utilisant lesdites techniques |
US10348355B2 (en) | 2015-09-16 | 2019-07-09 | Intel Corporation | Techniques for gesture recognition using photoplethysmographic (PPMG) sensor and low-power wearable gesture recognition device using the same |
TWI712877B (zh) * | 2015-09-16 | 2020-12-11 | 美商英特爾股份有限公司 | 使用光體積變化描述波形(ppmg)感測器之姿勢識別的技術以及使用ppmg感測器之低功率可穿戴式姿勢識別裝置 |
US10324494B2 (en) | 2015-11-25 | 2019-06-18 | Intel Corporation | Apparatus for detecting electromagnetic field change in response to gesture |
CN105487673A (zh) * | 2016-01-04 | 2016-04-13 | 京东方科技集团股份有限公司 | 一种人机交互系统、方法及装置 |
US10585488B2 (en) | 2016-01-04 | 2020-03-10 | Boe Technology Group Co., Ltd. | System, method, and apparatus for man-machine interaction |
EP3404624B1 (fr) * | 2016-01-15 | 2022-10-12 | Meleap Inc. | Système d'affichage d'images, procédé de commande d'un système d'affichage d'images, système de distribution d'images et visiocasque |
US10206620B2 (en) | 2016-03-23 | 2019-02-19 | Intel Corporation | User's physiological context measurement method and apparatus |
US9916496B2 (en) | 2016-03-25 | 2018-03-13 | Zero Latency PTY LTD | Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects |
US10071306B2 (en) | 2016-03-25 | 2018-09-11 | Zero Latency PTY LTD | System and method for determining orientation using tracking cameras and inertial measurements |
US10717001B2 (en) | 2016-03-25 | 2020-07-21 | Zero Latency PTY LTD | System and method for saving tracked data in the game server for replay, review and training |
US10486061B2 (en) | 2016-03-25 | 2019-11-26 | Zero Latency Pty Ltd. | Interference damping for continuous game play |
WO2017163096A1 (fr) * | 2016-03-25 | 2017-09-28 | Zero Latency PTY LTD | Système et procédé pour déterminer une orientation à l'aide de cameras de suivi et de mesures inertielles |
US10421012B2 (en) | 2016-03-25 | 2019-09-24 | Zero Latency PTY LTD | System and method for tracking using multiple slave servers and a master server |
US10430646B2 (en) | 2016-03-25 | 2019-10-01 | Zero Latency PTY LTD | Systems and methods for operating a virtual reality environment using colored marker lights attached to game objects |
US10254546B2 (en) | 2016-06-06 | 2019-04-09 | Microsoft Technology Licensing, Llc | Optically augmenting electromagnetic tracking in mixed reality |
WO2017213903A1 (fr) * | 2016-06-06 | 2017-12-14 | Microsoft Technology Licensing, Llc | Amélioration optique du suivi électromagnétique en réalité mixte |
US10298282B2 (en) | 2016-06-16 | 2019-05-21 | Intel Corporation | Multi-modal sensing wearable device for physiological context measurement |
WO2018011497A1 (fr) * | 2016-07-13 | 2018-01-18 | Commissariat A L'energie Atomique Et Aux Energies Alternatives | Système et procédé de capture embarquée et de reproduction 3d/360° du mouvement d'un opérateur dans son environnement |
US10751609B2 (en) | 2016-08-12 | 2020-08-25 | Zero Latency PTY LTD | Mapping arena movements into a 3-D virtual world |
US10241583B2 (en) | 2016-08-30 | 2019-03-26 | Intel Corporation | User command determination based on a vibration pattern |
WO2018102615A1 (fr) * | 2016-11-30 | 2018-06-07 | Logitech Europe S.A. | Système pour importer des dispositifs d'interface utilisateur dans une réalité virtuelle/augmentée |
EP3552053A1 (fr) * | 2016-12-12 | 2019-10-16 | Microsoft Technology Licensing, LLC | Cadre rigide virtuel pour sous-système de capteur |
CN110050221A (zh) * | 2016-12-12 | 2019-07-23 | 微软技术许可有限责任公司 | 用于传感器子系统的虚拟刚性框架 |
US10248191B2 (en) | 2016-12-12 | 2019-04-02 | Microsoft Technology Licensing, Llc | Virtual rigid framework for sensor subsystem |
WO2018111656A1 (fr) * | 2016-12-12 | 2018-06-21 | Microsoft Technology Licensing, Llc | Cadre rigide virtuel pour sous-système de capteur |
WO2018183390A1 (fr) * | 2017-03-28 | 2018-10-04 | Magic Leap, Inc. | Système de réalité augmentée avec audio spatialisé lié à un objet virtuel manipulé par l'utilisateur |
US10747301B2 (en) | 2017-03-28 | 2020-08-18 | Magic Leap, Inc. | Augmented reality system with spatialized audio tied to user manipulated virtual object |
US11231770B2 (en) | 2017-03-28 | 2022-01-25 | Magic Leap, Inc. | Augmented reality system with spatialized audio tied to user manipulated virtual object |
US10860090B2 (en) | 2018-03-07 | 2020-12-08 | Magic Leap, Inc. | Visual tracking of peripheral devices |
US11181974B2 (en) | 2018-03-07 | 2021-11-23 | Magic Leap, Inc. | Visual tracking of peripheral devices |
US11625090B2 (en) | 2018-03-07 | 2023-04-11 | Magic Leap, Inc. | Visual tracking of peripheral devices |
US11989339B2 (en) | 2018-03-07 | 2024-05-21 | Magic Leap, Inc. | Visual tracking of peripheral devices |
CN112752945A (zh) * | 2018-07-02 | 2021-05-04 | 梦境沉浸股份有限公司 | 用于虚拟现实系统的枪械模拟布置 |
CN111103967A (zh) * | 2018-10-25 | 2020-05-05 | 北京微播视界科技有限公司 | 虚拟对象的控制方法和装置 |
WO2021067916A1 (fr) * | 2019-10-04 | 2021-04-08 | Tactual Labs Co. | Mécanomyographie capacitive |
WO2021207174A1 (fr) * | 2020-04-06 | 2021-10-14 | Tactual Labs Co. | Détection de préhension |
WO2022000045A1 (fr) * | 2020-07-02 | 2022-01-06 | VirtuReal Pty Ltd | Système de réalité virtuelle |
CN112783326A (zh) * | 2021-01-28 | 2021-05-11 | 唐庆圆 | 手势识别装置和手势识别系统 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2015123771A1 (fr) | Suivi de gestes et commande en réalité augmentée et virtuelle | |
JP6938542B2 (ja) | 組込みセンサと外界センサとを組み合わせる多関節トラッキングのための方法およびプログラム製品 | |
US11157725B2 (en) | Gesture-based casting and manipulation of virtual content in artificial-reality environments | |
US9939911B2 (en) | Computer interface for remotely controlled objects and wearable articles with absolute pose detection component | |
RU2746686C2 (ru) | Носимая система отслеживания движения | |
JP6664512B2 (ja) | アイブレインインターフェースシステムのキャリブレーション方法、及びシステム内のスレーブデバイス、ホストデバイス | |
CN106575159B (zh) | 手套接口对象 | |
JP6669069B2 (ja) | 検出装置、検出方法、制御装置、および制御方法 | |
US7826641B2 (en) | Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features | |
CA2767656C (fr) | Suivi de mouvement oculaire et corporel pour test et/ou entrainement | |
US20190286234A1 (en) | System and method for synchronized neural marketing in a virtual environment | |
Heo et al. | A realistic game system using multi-modal user interfaces | |
KR20130027006A (ko) | 최소 침습 수술 시스템에서 손 제스처 제어를 위한 방법 및 장치 | |
KR20120115487A (ko) | 원격조종 최소 침습 종속 수술 기구의 손 제어를 위한 방법 및 시스템 | |
KR20140015144A (ko) | 최소 침습 수술 시스템에서 손 존재 검출을 위한 방법 및 시스템 | |
US20240054816A1 (en) | Automated eye tracking assessment solution for skill development | |
WO2023095321A1 (fr) | Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations | |
Khaksar | A Framework for Gamification of Human Joint Remote Rehabilitation, Incorporating Non-Invasive Sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15751278 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 15751278 Country of ref document: EP Kind code of ref document: A1 |