EP3188075A1 - Apparatus and method for recognizing hand gestures in a virtual reality headset - Google Patents
Apparatus and method for recognizing hand gestures in a virtual reality headset
- Publication number
- EP3188075A1 (application EP16181893.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- headset
- vision sensor
- identified
- identified object
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/211—Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/213—Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/25—Output arrangements for video game devices
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5255—Changing parameters of virtual cameras according to dedicated instructions from a player, e.g. using a secondary joystick to rotate the camera around a player's character
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/20—Scenes; Scene-specific elements in augmented reality scenes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0187—Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0261—Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/10—Mixing of images, i.e. displayed pixel being the result of an operation, e.g. adding, on the corresponding input pixels
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
Definitions
- the present application relates generally to a virtual reality (VR) headset and, in particular, to an apparatus and method for identifying the hand gestures of a user of a VR headset.
- the VR headset comprises: i) a forward-looking vision sensor for detecting objects in the forward field of view of the VR headset; ii) a downward-looking vision sensor for detecting objects in the downward field of view of the VR headset; iii) a controller coupled to the forward-looking vision sensor and the downward-looking vision sensor.
- the controller is configured to: a) detect a hand in a first image captured by the forward-looking vision sensor; b) detect an arm of the user in a second image captured by the downward-looking vision sensor; and c) determine whether the detected hand in the first image is a hand of the user.
- the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative position of the detected hand in the first image and a relative position of the detected arm of the user in the second image.
- the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative movement of the detected hand in the first image and a relative movement of the detected arm of the user in the second image.
- the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative alignment of the detected hand in the first image and a relative alignment of the detected arm of the user in the second image.
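The position, movement, and alignment comparisons described in the three embodiments above can be illustrated with a minimal lateral-alignment check. The normalized-coordinate representation and the tolerance value below are illustrative assumptions, not details from the patent.

```python
def is_user_hand(hand_x_norm: float, arm_x_norm: float,
                 tolerance: float = 0.15) -> bool:
    """Decide whether a hand seen by the forward-looking sensor belongs
    to the headset wearer, by comparing its lateral position with an arm
    seen by the downward-looking sensor.

    Both coordinates are assumed normalized to [0, 1] across their image
    widths, so the wearer's own hand and forearm should occupy similar
    lateral positions in the two views, while an intruder's hand
    typically has no matching arm nearby in the downward view.
    """
    return abs(hand_x_norm - arm_x_norm) <= tolerance
```

A hand at normalized position 0.50 with the wearer's forearm at 0.55 would pass this check; a hand at 0.10 with the only visible forearm at 0.90 would not.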
- a VR apparatus includes a detector comprising a first vision sensor and a second vision sensor and a controller operatively coupled with the detector.
- the controller is configured to identify a first object through the first vision sensor and determine whether the first identified object is an object of a user of the VR apparatus, based on a second object identified through the second vision sensor.
- a method of operating a VR apparatus includes identifying a first object through a first vision sensor and determining whether the first identified object is an object of a user of the VR apparatus, based on a second object identified through a second vision sensor.
- FIGURES 1A through 8, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged virtual reality headset.
- "main user" or "user" refers to the person actually wearing and operating the virtual reality (VR) head mounted display (HMD) or headset.
- "intruder" refers to any person other than the user whose hand gestures intentionally or accidentally trigger undesirable effects on the VR user interface of the HMD/headset.
- FIGURE 1A is a perspective view of VR headset 100 according to one embodiment of the disclosure.
- FIGURE 1B is a front view of VR headset 100 according to one embodiment of the disclosure.
- VR headset 100 comprises chassis (or housing) 105, forward vision sensor 110, head strap 120, and downward vision sensor 130.
- Chassis 105 houses the electronics of VR headset 100.
- a user places VR headset 100 on his or her head and tightens head strap 120 to hold VR headset 100 in place.
- Forward vision sensor 110 captures forward field of view (FOV) 150 and displays forward FOV 150 on the internal display of VR headset 100. The user may then view on the internal display any objects in the forward FOV 150.
- When the forward vision sensor 110 and the internal processor(s) of VR headset 100 detect a hand in forward FOV 150 for the purpose of determining hand gestures, it may be difficult to determine whether the hand belongs to the main user or to an intruder. It is necessary to prevent a hand gesture from an intruder from causing undesirable interference with the user interface.
- the present disclosure provides a method of distinguishing legitimate user hand gestures from intruder hand gestures by using downward vision sensor 130, which captures downward FOV 160.
- downward vision sensor 130 and the internal processor(s) of VR headset 100 are operable to detect and to identify the arm(s) of the user in downward FOV 160 and then to correlate and/or to associate the user hand movements with the user arm movements.
- VR headset 100 is capable of determining if a detected hand in the forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder. Once this determination is made, the internal processor(s) of VR headset 100 will only process hand gesture commands from the user and will ignore hand gestures from an intruder.
- FIGURE 2 illustrates a hand gesture detection operation of VR headset 100 according to one embodiment of the disclosure.
- the user extends her arm and hand forward to interact with object(s) in the virtual world.
- Forward vision sensor 110 detects user hand 210 in forward FOV 150 and downward vision sensor 130 detects user arm 220 in downward FOV 160.
- VR headset 100 determines whether user hand 210 belongs to the user by comparing the alignments and/or positions of user hand 210 and user arm 220.
- VR headset 100 may also determine whether user hand 210 belongs to the user by comparing the relative movements of user hand 210 and user arm 220.
- the tracked movements may include left-right (lateral) movement of the hands and arms, up-down (vertical) movement of the hands and arms, and/or forward-backward (extension) movements of the hands and arms away from or toward the body of the user.
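One way to compare the relative lateral, vertical, or extension movements described above is a normalized cross-correlation of per-frame displacement series from the two sensors. This is a sketch under assumed representations (equal-length displacement lists, an illustrative 0.8 threshold); the patent does not specify a particular correlation method.

```python
import math

def movements_correlated(hand_dx: list, arm_dx: list,
                         threshold: float = 0.8) -> bool:
    """Return True if two equal-length per-frame displacement series
    (e.g., lateral motion of a hand in the forward FOV and of an arm in
    the downward FOV) are strongly positively correlated."""
    n = len(hand_dx)
    mh = sum(hand_dx) / n
    ma = sum(arm_dx) / n
    num = sum((h - mh) * (a - ma) for h, a in zip(hand_dx, arm_dx))
    dh = math.sqrt(sum((h - mh) ** 2 for h in hand_dx))
    da = math.sqrt(sum((a - ma) ** 2 for a in arm_dx))
    if dh == 0.0 or da == 0.0:
        # One series shows no movement: correlation is undefined, so a
        # real system would fall back to position/alignment checks here.
        return False
    return num / (dh * da) >= threshold
```

A hand and forearm sweeping left-right together produce near-perfect correlation and are attributed to the wearer; an intruder's hand moving independently of the wearer's arms would fail the test.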
- FIGURE 3 illustrates detected hands 310 and 320 in forward FOV 150 of forward vision sensor 110 and detected arms 311 and 321 in the downward FOV 160 of downward vision sensor 130 of VR headset 100 according to one embodiment of the disclosure.
- the user will only see detected hands 310 and 320 in forward FOV 150 on the internal display of VR headset 100.
- Detected arms 311 and 321 are only seen and analyzed by the internal processor(s) of VR headset 100.
- the lateral movements of detected arms 311 and 321 (indicated by left-right arrows) may be correlated with similar lateral movements of detected hands 310 and 320, thereby identifying detected hands 310 and 320 as the hands of the user of VR headset 100 and not the hands of an intruder.
- FIGURE 4 is a schematic block diagram of VR headset 100 according to one embodiment of the disclosure.
- VR headset 100 comprises forward vision sensor (VS) 110 and downward VS 130.
- VR headset 100 further comprises VR headset controller 410, memory 420, VR source video 430, video processor 440, display 450, and speakers 460.
- forward VS 110 and downward VS 130 may comprise conventional video cameras (e.g., RGB video cameras).
- VR headset controller 410 is a microprocessor or microcontroller that controls the overall operation of VR headset 100 by executing an operating system program and one or more application programs stored in memory 420.
- Video processor 440 receives source video from VR source video 430, which video processor 440 then displays on one or more screens of display 450.
- VR source video 430 may be an external VR video player coupled wirelessly or by wireline to VR headset 100.
- VR source video 430 may be an internal memory (including a part of memory 420), in which VR video content is stored.
- VR headset controller 410 directs the real-world outputs of forward VS 110 and downward VS 130 to video processor 440 so that the user can see the real world around the user on display 450, as well as augmented reality (AR) video content.
- VR headset controller 410 is configured to direct video processor 440 to detect the hand(s) of the user in forward FOV 150 in the video output of forward VS 110 and to detect the arm(s) of the user in downward FOV 160 in the video output of downward VS 130.
- VR headset controller 410 is further configured to direct video processor 440 to correlate and/or to associate the user hand movements with the user arm movements. In this way, video processor 440 is capable of determining if a detected hand in forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder.
- FIGURE 5 is a flow diagram illustrating the operation of VR headset 100 according to one embodiment of the disclosure.
- the user activates VR headset 100 and places VR headset 100 on his or her head (step 505).
- the user may launch an application that may be controlled by user hand gestures.
- video processor 440 detects one or more hand(s) in forward FOV 150 (step 510).
- Video processor 440 also detects a portion (e.g., a forearm) of at least one arm of the user in downward FOV 160 (step 515).
- Video processor 440 attempts to determine whether a detected hand in forward FOV 150 is the hand of the user or of an intruder. Video processor 440 may do this by comparing and analyzing detected objects in forward FOV 150 and downward FOV 160 in order to correlate the alignments and/or movements of the detected hand(s) and the detected forearm(s) (step 520). From this comparison, video processor 440 identifies the hand(s) of the legitimate user of VR headset 100 and ignores the detected hand(s) of intruder(s) (step 525). Thereafter, video processor 440 and/or VR headset controller 410 process the hand gestures of the legitimate user (step 530).
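The per-frame flow of steps 510 through 525 can be sketched as a single routine. Representing each detection as a normalized lateral position is a hypothetical simplification; real detections would carry bounding boxes and gesture labels.

```python
def attribute_hands_to_user(forward_hand_xs, downward_arm_xs,
                            tolerance=0.15):
    """Steps 510-525 in miniature: given the lateral positions of hands
    detected in the forward FOV (step 510) and of forearms detected in
    the downward FOV (step 515), both normalized to [0, 1], keep only
    the hands that align with some forearm of the wearer (step 520);
    the remaining hands are treated as intruder hands and ignored
    (step 525)."""
    return [h for h in forward_hand_xs
            if any(abs(h - a) <= tolerance for a in downward_arm_xs)]
```

Only the hands returned by this routine would then feed the gesture-command processing of step 530.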
- FIGURE 6 is a flow diagram for determining whether an identified hand is an object of a user of a VR headset according to one embodiment of the disclosure.
- VR headset 100 identifies a first object through a first vision sensor (step 605).
- the first vision sensor may be a forward vision sensor.
- the first object may be hand 210 of the user.
- VR headset 100 determines whether the first identified object is the object of the user, based on a second object identified through a second vision sensor (step 610).
- the second vision sensor may be downward vision sensor 130.
- the second object may be arm 220 of the user.
- VR headset 100 determines whether the first identified object is the object of the user by determining whether the first identified object is correlated with the second identified object. If the first identified object is the object of the user, VR headset 100 permits an input by the first identified object and ignores an input by an intruder. Also, VR headset 100 may identify the object of the user using only the first vision sensor and the second vision sensor, without additional hardware (e.g., an armband). Here, the intruder may be any third party.
- FIGURE 7 is a flow diagram for ignoring a hand gesture of an intruder other than a user of a VR headset according to one embodiment of the disclosure.
- VR headset 100 identifies hands of an intruder in forward FOV 150 (step 705). VR headset 100 determines whether the identified hands are hands of the intruder according to various embodiments of the disclosure. VR headset 100 identifies arms 311 and 321 of the user of VR headset 100 in downward FOV 160 (step 710). Although FIGURE 7 illustrates step 705 and step 710 being performed in sequence, the two steps may be performed simultaneously, or in the reverse order, depending on the implementation. VR headset 100 compares the identified objects in forward FOV 150 and downward FOV 160 to determine whether the positions, alignments, and/or movements of the identified hands are correlated with those of the identified arms 311 and 321 (step 715).
- VR headset 100 may compare the identified objects using a method corresponding to the method described in step 520 of FIGURE 5.
- VR headset 100 identifies the hands of the intruder (step 720). In other words, VR headset 100 determines which of the objects compared in step 715 are the hands of the intruder.
- VR headset 100 ignores hand gestures of the intruder (step 725). As a result, VR headset 100 may prevent the hand gestures of the intruder from causing undesirable effects on the user interface.
- FIGURE 8 is a flow diagram for determining whether a gesture of an intruder other than a user of a VR headset is permitted according to one embodiment of the disclosure.
- VR headset 100 determines whether the gesture of the intruder is permitted in VR (step 805). If VR headset 100 permits the gesture of the intruder in VR, VR headset 100 may accept an input by the gesture of the intruder in an application (step 810). For example, if VR headset 100 permits the gesture of the intruder, VR headset 100 may run an application that the user and the intruder use cooperatively. If VR headset 100 ignores the gesture of the intruder in VR, VR headset 100 may display the gesture of the intruder semi-transparently (step 815). As a result, VR headset 100 may indicate the presence of an object of the intruder in the forward FOV by displaying the gesture of the intruder semi-transparently in VR.
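The FIGURE 8 policy amounts to a small two-way dispatch. The function name, flag, and return values below are illustrative, not from the patent.

```python
def handle_intruder_gesture(gesture: str, cooperative_mode: bool) -> tuple:
    """Sketch of steps 805-815: in a cooperative application the
    intruder's gesture is accepted as input; otherwise it is not
    processed as a command but is still rendered semi-transparently,
    so the wearer knows another person's hand is in the forward FOV."""
    if cooperative_mode:
        return ("accept_input", gesture)        # step 810
    return ("render_semitransparent", gesture)  # step 815
```

The semi-transparent branch keeps the intruder visible without letting the intruder's gestures drive the user interface.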
Description
- The present application relates generally to a virtual reality (VR) headset and, in particular, to an apparatus and method for identifying the hand gestures of a user of a VR headset.
- There is a need in the art for an improved apparatus and method for identifying hand gestures of the user of a virtual reality (VR) headset while minimizing the effect of an intruder other than the user of the VR headset.
- To address the above-discussed deficiencies of the prior art, it is a primary object to provide a virtual reality (VR) headset configured to be worn by a user. In a preferred embodiment of the disclosure, the VR headset comprises: i) a forward-looking vision sensor for detecting objects in the forward field of view of the VR headset; ii) a downward-looking vision sensor for detecting objects in the downward field of view of the VR headset; iii) a controller coupled to the forward-looking vision sensor and the downward-looking vision sensor. The controller is configured to: a) detect a hand in a first image captured by the forward-looking vision sensor; b) detect an arm of the user in a second image captured by the downward-looking vision sensor; and c) determine whether the detected hand in the first image is a hand of the user.
- In one embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative position of the detected hand in the first image and a relative position of the detected arm of the user in the second image.
- In another embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative movement of the detected hand in the first image and a relative movement of the detected arm of the user in the second image.
- In still another embodiment, the controller determines whether the detected hand in the first image is the hand of the user by comparing a relative alignment of the detected hand in the first image and a relative alignment of the detected arm of the user in the second image.
- A VR apparatus includes a detector comprising a first vision sensor and a second vision sensor and a controller operatively coupled with the detector. The controller is configured to identify a first object through the first vision sensor and determine whether the first identified object is an object of a user of the VR apparatus, based on a second object identified through the second vision sensor.
- A method of operating a VR apparatus includes identifying a first object through a first vision sensor and determining whether the first identified object is an object of a user of the VR apparatus, based on a second object identified through a second vision sensor.
- Before undertaking the DETAILED DESCRIPTION below, it may be advantageous to set forth definitions of certain words and phrases used throughout this patent document: the terms "include" and "comprise," as well as derivatives thereof, mean inclusion without limitation; the term "or," is inclusive, meaning and/or; the phrases "associated with" and "associated therewith," as well as derivatives thereof, may mean to include, be included within, interconnect with, contain, be contained within, connect to or with, couple to or with, be communicable with, cooperate with, interleave, juxtapose, be proximate to, be bound to or with, have, have a property of, or the like; and the term "controller" means any device, system or part thereof that controls at least one operation, such a device may be implemented in hardware, firmware or software, or some combination of at least two of the same. It should be noted that the functionality associated with any particular controller may be centralized or distributed, whether locally or remotely. Definitions for certain words and phrases are provided throughout this patent document, those of ordinary skill in the art should understand that in many, if not most instances, such definitions apply to prior, as well as future uses of such defined words and phrases.
- For a more complete understanding of the present disclosure and its advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts:
-
FIGURE 1A is a perspective view of a virtual reality (VR) headset according to one embodiment of the disclosure. -
FIGURE 1B is a front view of a VR headset according to one embodiment of the disclosure. -
FIGURE 2 illustrates a hand gesture detection operation of a VR headset according to one embodiment of the disclosure. -
FIGURE 3 illustrates detected hands in the field of view of the forward-looking vision sensor and detected arms in the field of view of the downward-looking vision sensor of a VR headset according to one embodiment of the disclosure. -
FIGURE 4 is a schematic block diagram of a VR headset according to one embodiment of the disclosure. -
FIGURE 5 is a flow diagram illustrating the operation of a VR headset according to one embodiment of the disclosure. -
FIGURE 6 is a flow diagram for determining whether an identified hand is an object of a user of a VR headset according to one embodiment of the disclosure. -
FIGURE 7 is a flow diagram for ignoring a hand gesture of an intruder other than a user of a VR headset according to one embodiment of the disclosure. -
FIGURE 8 is a flow diagram for determining whether a gesture of an intruder other than a user of a VR headset is permitted according to one embodiment of the disclosure. -
FIGURES 1A through 8, discussed below, and the various embodiments used to describe the principles of the present disclosure in this patent document are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. Those skilled in the art will understand that the principles of the present disclosure may be implemented in any suitably arranged virtual reality headset.
- In the disclosure below, the phrase "virtual reality" will be used generically for both virtual reality and augmented reality in order to simplify the descriptions that follow. Also, the following terms have the following meanings unless otherwise specified: i) "vision sensor" refers to any video camera (e.g., RGB camera), depth sensor, or motion detection circuitry; ii) "main user" or "user" refers to the person actually wearing and operating the virtual reality (VR) head mounted display (HMD) or headset; and iii) "intruder" refers to any person other than the user whose hand gestures intentionally or accidentally trigger undesirable effects on the VR user interface of the HMD/headset.
-
FIGURE 1A is a perspective view of VR headset 100 according to one embodiment of the disclosure. FIGURE 1B is a front view of VR headset 100 according to one embodiment of the disclosure. VR headset 100 comprises chassis (or housing) 105, forward vision sensor 110, head strap 120, and downward vision sensor 130. Chassis 105 houses the electronics of VR headset 100. A user places VR headset 100 on his or her head and tightens head strap 120 to hold VR headset 100 in place. Forward vision sensor 110 captures forward field of view (FOV) 150 and displays forward FOV 150 on the internal display of VR headset 100. The user may then view on the internal display any objects in forward FOV 150. - When the
forward vision sensor 110 and the internal processor(s) of VR headset 100 detect a hand in forward FOV 150 for the purpose of determining hand gestures, it may be difficult to determine whether the hand belongs to the main user or to an intruder. It is necessary to prevent a hand gesture from an intruder from causing undesirable interference to the user interface. The present disclosure provides a method of distinguishing legitimate user hand gestures from intruder hand gestures by using downward vision sensor 130, which captures downward FOV 160. Downward vision sensor 130 and the internal processor(s) of VR headset 100 are operable to detect and to identify the arm(s) of the user in downward FOV 160 and then to correlate and/or to associate the user hand movements with the user arm movements. In this way, VR headset 100 is capable of determining if a detected hand in forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder. Once this determination is made, the internal processor(s) of VR headset 100 will only process hand gesture commands from the user and will ignore hand gestures from an intruder. -
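The correlation between hand movements in the forward FOV and arm movements in the downward FOV can be sketched as follows. This is a minimal illustration, not the patent's implementation: the (x, y) track format, the use of net-displacement cosine similarity, and the 0.8 threshold are all assumptions made for the example.

```python
def net_displacement(track):
    """Net (x, y) displacement from the first to the last tracked position."""
    (x0, y0), (x1, y1) = track[0], track[-1]
    return (x1 - x0, y1 - y0)

def motion_similarity(track_a, track_b):
    """Cosine similarity between the net displacements of two tracks."""
    ax, ay = net_displacement(track_a)
    bx, by = net_displacement(track_b)
    norm_a = (ax * ax + ay * ay) ** 0.5
    norm_b = (bx * bx + by * by) ** 0.5
    if norm_a == 0.0 or norm_b == 0.0:
        return 0.0  # a stationary track cannot be correlated by motion alone
    return (ax * bx + ay * by) / (norm_a * nor_b) if False else (ax * bx + ay * by) / (norm_a * norm_b)

def is_user_hand(hand_track, user_arm_tracks, threshold=0.8):
    """Attribute a hand to the user if it moves like any of the user's arms."""
    return any(motion_similarity(hand_track, arm_track) >= threshold
               for arm_track in user_arm_tracks)
```

Under this sketch, a hand sweeping left in step with the user's arm yields a similarity near 1.0 and is attributed to the user, while an intruder's hand moving along a different path falls below the threshold and is ignored.
-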
FIGURE 2 illustrates a hand gesture detection operation of VR headset 100 according to one embodiment of the disclosure. In FIGURE 2, the user extends her arm and hand forward to interact with object(s) in the virtual world. Forward vision sensor 110 detects user hand 210 in forward FOV 150 and downward vision sensor 130 detects user arm 220 in downward FOV 160. VR headset 100 then determines whether user hand 210 belongs to the user by comparing the alignments and/or positions of user hand 210 and user arm 220. VR headset 100 may also determine whether user hand 210 belongs to the user by comparing the relative movements of user hand 210 and user arm 220. The tracked movements may include left-right (lateral) movement of the hands and arms, up-down (vertical) movement of the hands and arms, and/or forward-backward (extension) movements of the hands and arms away from or toward the body of the user. -
FIGURE 3 illustrates detected hands in forward FOV 150 of forward vision sensor 110 and detected arms in downward FOV 160 of downward vision sensor 130 of VR headset 100 according to one embodiment of the disclosure. Generally, the user will only see the detected hands in forward FOV 150 on the internal display of VR headset 100. Detected arms 311 and 321 in downward FOV 160 are generally not shown on the internal display of VR headset 100. In FIGURE 3, the lateral movements of detected arms 311 and 321 (indicated by left-right arrows) may be correlated with similar lateral movements of the detected hands, indicating that the correlated hands are the hands of the user of VR headset 100 and not the hands of an intruder. -
FIGURE 4 is a schematic block diagram of VR headset 100 according to one embodiment of the disclosure. VR headset 100 comprises forward vision sensor (VS) 110 and downward VS 130. VR headset 100 further comprises VR headset controller 410, memory 420, VR source video 430, video processor 440, display 450, and speakers 460. In an exemplary embodiment, forward VS 110 and downward VS 130 may comprise conventional video cameras (e.g., RGB video cameras). -
VR headset controller 410 is a microprocessor or microcontroller that controls the overall operation of VR headset 100 by executing an operating system program and one or more application programs stored in memory 420. Video processor 440 receives source video from VR source video 430, which video processor 440 then displays on one or more screens of display 450. VR source video 430 may be an external VR video player coupled wirelessly or by wireline to VR headset 100. Alternatively, VR source video 430 may be an internal memory (including a part of memory 420) in which VR video content is stored. In camera mode, VR headset controller 410 directs the real-world outputs of forward VS 110 and downward VS 130 to video processor 440 so that the user can see the real world around the user on display 450, as well as augmented reality (AR) video content. - According to the principles of the disclosure,
VR headset controller 410 is configured to direct video processor 440 to detect the hand(s) of the user in forward FOV 150 in the video output of forward VS 110 and to detect the arm(s) of the user in downward FOV 160 in the video output of downward VS 130. VR headset controller 410 is further configured to direct video processor 440 to correlate and/or to associate the user hand movements with the user arm movements. In this way, video processor 440 is capable of determining if a detected hand in forward FOV 150 belongs to the legitimate user of VR headset 100 or to an intruder. -
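A hypothetical sketch of this division of labor between the controller and the video processor is given below. The class names, constructor signatures, and detector callables are illustrative assumptions, not the actual firmware interfaces of VR headset 100.

```python
class VideoProcessor:
    """Runs detection on both sensor streams and labels each hand (sketch)."""

    def __init__(self, detect_hands, detect_arms, correlate):
        self.detect_hands = detect_hands  # operates on the forward-VS frame
        self.detect_arms = detect_arms    # operates on the downward-VS frame
        self.correlate = correlate        # (hand, arms) -> belongs-to-user flag

    def classify_hands(self, forward_frame, downward_frame):
        """Map each detected hand to True (user) or False (intruder)."""
        arms = self.detect_arms(downward_frame)
        return {hand: self.correlate(hand, arms)
                for hand in self.detect_hands(forward_frame)}

class VRHeadsetController:
    """Directs the video processor and keeps only user-owned hands (sketch)."""

    def __init__(self, video_processor):
        self.video_processor = video_processor

    def user_hands(self, forward_frame, downward_frame):
        labels = self.video_processor.classify_hands(forward_frame,
                                                     downward_frame)
        return [hand for hand, is_user in labels.items() if is_user]
```

In this sketch the controller never touches raw frames itself; it only directs the video processor and consumes the resulting ownership labels, mirroring the pipeline described above.
-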
FIGURE 5 is a flow diagram illustrating the operation of VR headset 100 according to one embodiment of the disclosure. Initially, the user activates VR headset 100 and places VR headset 100 on his or her head (step 505). After activation, the user may launch an application that may be controlled by user hand gestures. In response, video processor 440 detects one or more hand(s) in forward FOV 150 (step 510). Video processor 440 also detects a portion (e.g., a forearm) of at least one arm of the user in downward FOV 160 (step 515). -
Video processor 440 then attempts to determine if a detected hand in forward FOV 150 is the hand of the user or of an intruder. Video processor 440 may do this by comparing and analyzing detected objects in forward FOV 150 and downward FOV 160 in order to correlate the alignments and/or movements of a detected hand(s) and a detected forearm(s) (step 520). From this comparison, video processor 440 identifies the hand(s) of the legitimate user of VR headset 100 and ignores the detected hand(s) of intruder(s) (step 525). Thereafter, video processor 440 and/or VR headset controller 410 process the hand gestures of the legitimate user (step 530). -
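The steps above (510 through 530) can be sketched as a single per-frame pass. The callables stand in for the detection and correlation stages; this is an illustrative outline of the flow, not the patent's code.

```python
def process_frame_pair(forward_frame, downward_frame,
                       detect_hands, detect_arms, belongs_to_user,
                       handle_gesture):
    """One pass of the FIGURE 5 flow for a pair of sensor frames (sketch)."""
    hands = detect_hands(forward_frame)    # step 510: hands in forward FOV
    arms = detect_arms(downward_frame)     # step 515: user arms in downward FOV
    for hand in hands:                     # steps 520 and 525: correlate/label
        if belongs_to_user(hand, arms):
            handle_gesture(hand)           # step 530: process user gesture
        # hands that fail the correlation test are simply ignored
```
-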
FIGURE 6 is a flow diagram for determining whether an identified hand is an object of a user of a VR headset according to one embodiment of the disclosure. -
VR headset 100 identifies a first object through a first vision sensor (step 605). The first vision sensor may be a forward vision sensor. The first object may be hand 210 of the user. VR headset 100 determines whether the first identified object is the object of the user, based on a second identified object through a second vision sensor (step 610). The second vision sensor may be downward vision sensor 130. The second object may be arm 220 of the user. -
VR headset 100 determines whether the first identified object is the object of the user by determining whether the first identified object is correlated with the second identified object. If the first identified object is the object of the user, VR headset 100 permits an input by the first identified object and ignores an input by an intruder. Also, VR headset 100 may determine the object of the user using only the first vision sensor and the second vision sensor, without additional hardware (e.g., an armband). Here, the intruder means a third party. -
FIGURE 7 is a flow diagram for ignoring a hand gesture of an intruder other than a user of a VR headset according to one embodiment of the disclosure. -
VR headset 100 identifies hands of an intruder in forward FOV 150 (step 705). VR headset 100 determines whether the identified hands are hands of the intruder according to the various embodiments of the disclosure. VR headset 100 identifies arms 311 and 321 of the user of VR headset 100 in downward FOV 160 (step 710). FIGURE 7 illustrates step 705 and step 710 performed in sequence, but the two steps may be performed simultaneously, or in reverse order, depending on the implementation. VR headset 100 compares the identified objects in forward FOV 150 and downward FOV 160 to determine whether the positions, alignments, and/or movements of the identified hands are correlated with those of the identified arms 311 and 321 (step 715). For example, VR headset 100 may compare the identified objects using a method corresponding to the one described in step 520 of FIGURE 5. VR headset 100 then determines which hands belong to the intruder (step 720); in other words, VR headset 100 determines the hands of the intruder among the objects compared in step 715. VR headset 100 ignores hand gestures of the intruder (step 725). As a result, VR headset 100 may prevent the hand gestures of the intruder from causing undesirable effects on the user interface. -
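Once hands have been labeled in steps 715 and 720, step 725 amounts to dropping gesture events that originate from intruder hands before they reach the user interface. A minimal sketch, assuming an illustrative (hand_id, command) event format that is not specified in the disclosure:

```python
def filter_intruder_gestures(gesture_events, user_hand_ids):
    """Keep only gesture events raised by hands attributed to the user."""
    return [(hand_id, command)
            for hand_id, command in gesture_events
            if hand_id in user_hand_ids]
```

With this filter in place, commands from intruder hands never reach the UI layer, which is the undesirable-interference prevention described above.
-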
FIGURE 8 is a flow diagram for determining whether a gesture of an intruder other than a user of a VR headset is permitted according to one embodiment of the disclosure. -
VR headset 100 determines whether the gesture of the intruder is permitted in VR (step 805). If VR headset 100 permits the gesture of the intruder in VR, VR headset 100 may accept an input by the gesture of the intruder in an application (step 810). For example, if VR headset 100 permits the gesture of the intruder, VR headset 100 may run an application that both the user and the intruder may use cooperatively. If VR headset 100 ignores the gesture of the intruder in VR, VR headset 100 may display the gesture of the intruder semitransparently (step 815). As a result, VR headset 100 may indicate the existence of an object of the intruder in forward FOV 150 by displaying the gesture of the intruder semitransparently in VR. - Although the present disclosure has been described with an exemplary embodiment, various changes and modifications may be suggested to one skilled in the art. It is intended that the present disclosure encompass such changes and modifications as fall within the scope of the appended claims.
Claims (15)
- A virtual reality (VR) apparatus comprising:
a detector comprising a first vision sensor and a second vision sensor; and
a controller operatively coupled with the detector,
wherein the controller is configured to:
identify a first object through the first vision sensor, and
determine whether the first identified object is an object of a user of the VR apparatus, based on a second identified object through the second vision sensor.
- The VR apparatus as set forth in Claim 1, wherein the controller is configured to:
capture an image comprising the first object through the first vision sensor, and
identify the first object from the captured image.
- The VR apparatus as set forth in Claim 2, wherein the controller is configured to:
capture an image comprising the second object through the second vision sensor, and
identify the second object from the captured image.
- The VR apparatus as set forth in Claim 3, wherein the controller is configured to determine whether the first identified object is the object of the user, based on at least one of a relative position, a relative movement, and a relative alignment of the first identified object and the second identified object.
- The VR apparatus as set forth in Claim 1, wherein the controller is further configured to:
identify a third object belonging to a third party other than the user through the first vision sensor, and
determine whether the first identified object is the object of the user, based on the second identified object and the third identified object.
- The VR apparatus as set forth in Claim 5, wherein the controller is configured to determine whether the third identified object is the object of the user, based on at least one of a relative position, a relative movement, and a relative alignment of the second identified object and the third identified object.
- The VR apparatus as set forth in Claim 6, wherein the controller is further configured to identify a gesture of the first identified object if the first identified object is the object of the user.
- The VR apparatus as set forth in Claim 7, wherein the controller is further configured to display the gesture of the identified third object semitransparently in VR.
- A method of operating a virtual reality (VR) apparatus comprising:
identifying a first object through a first vision sensor; and
determining whether the first identified object is an object of a user of the VR apparatus, based on a second identified object through a second vision sensor.
- The method as set forth in Claim 9,
identifying a first object comprising:
capturing an image comprising the first object through the first vision sensor; and
identifying the first object from the captured image. - The method as set forth in Claim 10,
determining whether the first identified object is an object of a user of the VR apparatus comprising:
capturing an image comprising the second object through the second vision sensor;
identifying the second object from the captured image; and
determining whether the first identified object is the object of the user of the VR apparatus, based on the second identified object. - The method as set forth in Claim 11, determining whether the first identified object is the object of the user of the VR apparatus comprising determining whether the first identified object is the object of the user, based on at least one of a relative position, a relative movement, and a relative alignment of the first identified object and the second identified object.
- The method as set forth in Claim 9, further comprising:
identifying a third object belonging to a third party other than the user through the first vision sensor; and
determining whether the first identified object is the object of the user, based on the second identified object and the third identified object.
- The method as set forth in Claim 13, determining whether the first identified object is the object of the user comprising determining whether the third identified object is the object of the user, based on a relative position, a relative movement, and a relative alignment of the second identified object and the third identified object.
- The method as set forth in Claim 14, further comprising identifying a gesture of the first identified object if the first identified object is the object of the user.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/982,299 US10140507B2 (en) | 2015-12-29 | 2015-12-29 | Apparatus and method for recognizing hand gestures in a virtual reality headset |
KR1020160026357A KR102568708B1 (en) | 2015-12-29 | 2016-03-04 | Apparatus and method for recognizing hand gestures in a virtual reality headset |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3188075A1 true EP3188075A1 (en) | 2017-07-05 |
EP3188075B1 EP3188075B1 (en) | 2023-02-22 |
Family
ID=56684462
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP16181893.5A Active EP3188075B1 (en) | 2015-12-29 | 2016-07-29 | Apparatus and method for recognizing hand gestures in a virtual reality headset |
Country Status (1)
Country | Link |
---|---|
EP (1) | EP3188075B1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220253130A1 (en) * | 2021-02-08 | 2022-08-11 | Multinarity Ltd | Keyboard sensor for augmenting smart glasses sensor |
US11475650B2 (en) | 2021-02-08 | 2022-10-18 | Multinarity Ltd | Environmentally adaptive extended reality display system |
US11480791B2 (en) | 2021-02-08 | 2022-10-25 | Multinarity Ltd | Virtual content sharing across smart glasses |
US11748056B2 (en) | 2021-07-28 | 2023-09-05 | Sightful Computers Ltd | Tying a virtual speaker to a physical space |
US11877203B2 (en) | 2022-01-25 | 2024-01-16 | Sightful Computers Ltd | Controlled exposure to location-based virtual content |
US11948263B1 (en) | 2023-03-14 | 2024-04-02 | Sightful Computers Ltd | Recording the complete physical and extended reality environments of a user |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120249416A1 (en) * | 2011-03-29 | 2012-10-04 | Giuliano Maciocci | Modular mobile connected pico projectors for a local multi-user collaboration |
US20150199824A1 (en) * | 2014-01-10 | 2015-07-16 | Electronics And Telecommunications Research Institute | Apparatus and method for detecting multiple arms and hands by using three-dimensional image |
-
2016
- 2016-07-29 EP EP16181893.5A patent/EP3188075B1/en active Active
Non-Patent Citations (2)
Title |
---|
DANIEL LAU: "LEADING EDGE VIEWS: 3-D Imaging Advances Capabilities of Machine Vision: Part I - Vision Systems Design", 1 April 2012 (2012-04-01), XP055249233, Retrieved from the Internet <URL:http://www.vision-systems.com/articles/print/volume-17/issue-4/departments/leading-edge-views/3-d-imaging-advances-capabilities-of-machine-vision-part-i.html> [retrieved on 20160211] * |
S THELEN ET AL: "Enhancing Large Display Interaction with User Tracking Data", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMPUTER GRAPHICS AND VIRTUAL REALITY (CGVR), 1 January 2012 (2012-01-01), Atlanta, pages 1, XP055366711, Retrieved from the Internet <URL:http://world-comp.org/p2012/CGV2412.pdf> * |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11609607B2 (en) | 2021-02-08 | 2023-03-21 | Multinarity Ltd | Evolving docking based on detected keyboard positions |
US11481963B2 (en) | 2021-02-08 | 2022-10-25 | Multinarity Ltd | Virtual display changes based on positions of viewers |
US11601580B2 (en) | 2021-02-08 | 2023-03-07 | Multinarity Ltd | Keyboard cover with integrated camera |
US11599148B2 (en) | 2021-02-08 | 2023-03-07 | Multinarity Ltd | Keyboard with touch sensors dedicated for virtual keys |
US11496571B2 (en) | 2021-02-08 | 2022-11-08 | Multinarity Ltd | Systems and methods for moving content between virtual and physical displays |
US11514656B2 (en) | 2021-02-08 | 2022-11-29 | Multinarity Ltd | Dual mode control of virtual objects in 3D space |
US11516297B2 (en) | 2021-02-08 | 2022-11-29 | Multinarity Ltd | Location-based virtual content placement restrictions |
US11561579B2 (en) | 2021-02-08 | 2023-01-24 | Multinarity Ltd | Integrated computational interface device with holder for wearable extended reality appliance |
US11567535B2 (en) | 2021-02-08 | 2023-01-31 | Multinarity Ltd | Temperature-controlled wearable extended reality appliance |
US11574451B2 (en) | 2021-02-08 | 2023-02-07 | Multinarity Ltd | Controlling 3D positions in relation to multiple virtual planes |
US11574452B2 (en) | 2021-02-08 | 2023-02-07 | Multinarity Ltd | Systems and methods for controlling cursor behavior |
US11582312B2 (en) | 2021-02-08 | 2023-02-14 | Multinarity Ltd | Color-sensitive virtual markings of objects |
US11580711B2 (en) | 2021-02-08 | 2023-02-14 | Multinarity Ltd | Systems and methods for controlling virtual scene perspective via physical touch input |
US11588897B2 (en) | 2021-02-08 | 2023-02-21 | Multinarity Ltd | Simulating user interactions over shared content |
US11592872B2 (en) | 2021-02-08 | 2023-02-28 | Multinarity Ltd | Systems and methods for configuring displays based on paired keyboard |
US11592871B2 (en) | 2021-02-08 | 2023-02-28 | Multinarity Ltd | Systems and methods for extending working display beyond screen edges |
US11480791B2 (en) | 2021-02-08 | 2022-10-25 | Multinarity Ltd | Virtual content sharing across smart glasses |
US20220253130A1 (en) * | 2021-02-08 | 2022-08-11 | Multinarity Ltd | Keyboard sensor for augmenting smart glasses sensor |
US11475650B2 (en) | 2021-02-08 | 2022-10-18 | Multinarity Ltd | Environmentally adaptive extended reality display system |
US11620799B2 (en) | 2021-02-08 | 2023-04-04 | Multinarity Ltd | Gesture interaction with invisible virtual objects |
US11627172B2 (en) | 2021-02-08 | 2023-04-11 | Multinarity Ltd | Systems and methods for virtual whiteboards |
US11650626B2 (en) | 2021-02-08 | 2023-05-16 | Multinarity Ltd | Systems and methods for extending a keyboard to a surrounding surface using a wearable extended reality appliance |
US11927986B2 (en) | 2021-02-08 | 2024-03-12 | Sightful Computers Ltd. | Integrated computational interface device with holder for wearable extended reality appliance |
US11797051B2 (en) | 2021-02-08 | 2023-10-24 | Multinarity Ltd | Keyboard sensor for augmenting smart glasses sensor |
US11924283B2 (en) | 2021-02-08 | 2024-03-05 | Multinarity Ltd | Moving content between virtual and physical displays |
US11811876B2 (en) | 2021-02-08 | 2023-11-07 | Sightful Computers Ltd | Virtual display changes based on positions of viewers |
US11882189B2 (en) | 2021-02-08 | 2024-01-23 | Sightful Computers Ltd | Color-sensitive virtual markings of objects |
US11863311B2 (en) | 2021-02-08 | 2024-01-02 | Sightful Computers Ltd | Systems and methods for virtual whiteboards |
US11861061B2 (en) | 2021-07-28 | 2024-01-02 | Sightful Computers Ltd | Virtual sharing of physical notebook |
US11829524B2 (en) | 2021-07-28 | 2023-11-28 | Multinarity Ltd. | Moving content between a virtual display and an extended reality environment |
US11816256B2 (en) | 2021-07-28 | 2023-11-14 | Multinarity Ltd. | Interpreting commands in extended reality environments based on distances from physical input devices |
US11809213B2 (en) | 2021-07-28 | 2023-11-07 | Multinarity Ltd | Controlling duty cycle in wearable extended reality appliances |
US11748056B2 (en) | 2021-07-28 | 2023-09-05 | Sightful Computers Ltd | Tying a virtual speaker to a physical space |
US11877203B2 (en) | 2022-01-25 | 2024-01-16 | Sightful Computers Ltd | Controlled exposure to location-based virtual content |
US11941149B2 (en) | 2022-01-25 | 2024-03-26 | Sightful Computers Ltd | Positioning participants of an extended reality conference |
US11948263B1 (en) | 2023-03-14 | 2024-04-02 | Sightful Computers Ltd | Recording the complete physical and extended reality environments of a user |
Also Published As
Publication number | Publication date |
---|---|
EP3188075B1 (en) | 2023-02-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10140507B2 (en) | Apparatus and method for recognizing hand gestures in a virtual reality headset | |
EP3188075B1 (en) | Apparatus and method for recognizing hand gestures in a virtual reality headset | |
US9696814B2 (en) | Information processing device, gesture detection method, and gesture detection program | |
EP2393042A1 (en) | Selecting view orientation in portable device via image analysis | |
US20110080337A1 (en) | Image display device and display control method thereof | |
US10234955B2 (en) | Input recognition apparatus, input recognition method using maker location, and non-transitory computer-readable storage program | |
CN105847540A (en) | Method and mobile phone for controlling picture movement of VR (Virtual Reality) glasses based on eyeball tracking and VR glasses | |
US20140267004A1 (en) | User Adjustable Gesture Space | |
KR101470243B1 (en) | Gaze detecting apparatus and gaze detecting method thereof | |
WO2016008265A1 (en) | Method and apparatus for locating position | |
JP4968922B2 (en) | Device control apparatus and control method | |
US20130308835A1 (en) | Mobile Communication Device with Image Recognition and Method of Operation Therefor | |
JP2012238293A (en) | Input device | |
WO2015104884A1 (en) | Information processing system, information processing method, and program | |
WO2018198499A1 (en) | Information processing device, information processing method, and recording medium | |
US11501552B2 (en) | Control apparatus, information processing system, control method, and program | |
WO2018146922A1 (en) | Information processing device, information processing method, and program | |
US10902627B2 (en) | Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm | |
WO2020054760A1 (en) | Image display control device and program for controlling image display | |
US11944897B2 (en) | Device including plurality of markers | |
KR101477181B1 (en) | Touch Input Control Method and Apparatus | |
US20150070459A1 (en) | Information processing apparatus and information processing method | |
JP6373546B2 (en) | Information processing apparatus, information processing method, and program | |
JP2015126369A (en) | Imaging apparatus | |
US9697419B2 (en) | Information processing apparatus, non-transitory storage medium encoded with computer readable information processing program, information processing system, and information processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20171009 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20200728 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602016077911 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06K0009000000 Ipc: A63F0013212000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A63F 13/213 20140101ALI20221021BHEP Ipc: G06V 40/20 20220101ALI20221021BHEP Ipc: G06V 40/10 20220101ALI20221021BHEP Ipc: G06V 20/20 20220101ALI20221021BHEP Ipc: G06V 10/147 20220101ALI20221021BHEP Ipc: G06F 3/147 20060101ALI20221021BHEP Ipc: G06F 3/14 20060101ALI20221021BHEP Ipc: G06F 3/03 20060101ALI20221021BHEP Ipc: G02B 27/01 20060101ALI20221021BHEP Ipc: A63F 13/5255 20140101ALI20221021BHEP Ipc: A63F 13/26 20140101ALI20221021BHEP Ipc: A63F 13/25 20140101ALI20221021BHEP Ipc: A63F 13/212 20140101ALI20221021BHEP Ipc: A63F 13/211 20140101ALI20221021BHEP Ipc: A63F 13/00 20140101ALI20221021BHEP Ipc: G06F 3/01 20060101ALI20221021BHEP Ipc: G06K 9/00 20060101AFI20221021BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20221209 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06V 40/20 20220101ALI20221125BHEP Ipc: G06V 40/10 20220101ALI20221125BHEP Ipc: G06V 20/20 20220101ALI20221125BHEP Ipc: G06V 10/147 20220101ALI20221125BHEP Ipc: G06F 3/147 20060101ALI20221125BHEP Ipc: G06F 3/14 20060101ALI20221125BHEP Ipc: G06F 3/03 20060101ALI20221125BHEP Ipc: G02B 27/01 20060101ALI20221125BHEP Ipc: A63F 13/5255 20140101ALI20221125BHEP Ipc: A63F 13/26 20140101ALI20221125BHEP Ipc: A63F 13/25 20140101ALI20221125BHEP Ipc: A63F 13/213 20140101ALI20221125BHEP Ipc: G06F 3/01 20060101ALI20221125BHEP Ipc: A63F 13/00 20140101ALI20221125BHEP Ipc: A63F 13/211 20140101ALI20221125BHEP Ipc: A63F 13/212 20140101AFI20221125BHEP |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 1549148 Country of ref document: AT Kind code of ref document: T Effective date: 20230315 Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602016077911 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20230222 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1549148 Country of ref document: AT Kind code of ref document: T Effective date: 20230222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230622 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230522 Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230622 |
Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230523 |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20230620 Year of fee payment: 8 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602016077911 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: DE Payment date: 20230620 Year of fee payment: 8 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20231123 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20230222 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20230731 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20230729 |