EP3218781A1 - Interaction spatiale en réalité augmentée - Google Patents

Interaction spatiale en réalité augmentée

Info

Publication number
EP3218781A1
Authority
EP
European Patent Office
Prior art keywords
user
scene
hand
coordinate system
mapping mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP15784798.9A
Other languages
German (de)
English (en)
Other versions
EP3218781B1 (fr)
Inventor
Hartmut SEICHTER
Dieter Schmalstieg
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Publication of EP3218781A1 publication Critical patent/EP3218781A1/fr
Application granted granted Critical
Publication of EP3218781B1 publication Critical patent/EP3218781B1/fr
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof

Definitions

  • aspects of the present disclosure relate generally to augmented reality and in particular to spatial interaction in an augmented reality environment.
  • Augmented reality is a burgeoning technology that combines real-world imagery with computer-generated data, such as graphics or textual information. Augmented reality technology superimposes graphics, audio, and other sensory enhancements onto the real world, thus enhancing reality, using advanced user devices such as tablets, smart phones, and the like. Augmented reality allows the user to interact with real objects in real time, registering the virtual imagery with the real world.
  • a method for spatial interaction in Augmented Reality includes displaying an AR scene that includes an image of a real-world scene, a virtual target object, and a virtual cursor.
  • a position of the virtual cursor is provided according to a first coordinate system within the AR scene.
  • a user device tracks a pose of the user device relative to a user hand according to a second coordinate system.
  • the second coordinate system is mapped to the first coordinate system to control movements of the virtual cursor.
  • in a first mapping mode, virtual cursor movement is controlled to change a distance between the virtual cursor and the virtual target object.
  • in a second mapping mode, virtual cursor movement is controlled to manipulate the virtual target object.
  • User input is detected to control which of the first mapping mode or the second mapping mode is used.
  • in another aspect, an apparatus includes means for acquiring, by a user device, an image of a real-world scene and means for displaying, on the user device, an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor.
  • a position of the virtual cursor is provided according to a first coordinate system within the AR scene.
  • the apparatus further includes means for tracking a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand.
  • a means for mapping the second coordinate system to the first coordinate system is also included in the apparatus to control movement of the virtual cursor in the AR scene in response to movements of the user hand.
  • the means for mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, where the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and where the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene.
  • the apparatus also includes means for detecting, at the user device, a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
  • a user device that includes a camera, a display, memory, and a processing unit.
  • the camera is configured to capture an image of a real-world scene and the display is configured to display an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor.
  • a position of the virtual cursor is provided according to a first coordinate system within the AR scene.
  • the memory is adapted to store program code and the processing unit is coupled to the memory to access and execute instructions included in the program code to direct the user device to: (i) track a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand; (ii) map the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, wherein mapping the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, wherein the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and wherein the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and (iii) detect a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
  • a non-transitory computer-readable medium includes program code stored thereon.
  • the program code includes instructions which, when executed, cause a user device to: (i) acquire an image of a real-world scene; (ii) display an augmented reality (AR) scene that includes the image of the real-world scene, a virtual target object, and a virtual cursor, where a position of the virtual cursor is provided according to a first coordinate system within the AR scene; (iii) track a pose of the user device relative to a user hand according to a second coordinate system that defines a relationship between the user device and the user hand; (iv) map the second coordinate system to the first coordinate system to control movement of the virtual cursor in the AR scene in response to movements of the user hand, where the mapping of the second coordinate system to the first coordinate system includes a first mapping mode and a second mapping mode, where the first mapping mode is configured to control movement of the virtual cursor to change a distance between the virtual cursor and the virtual target object in the AR scene, and where the second mapping mode is configured to control movement of the virtual cursor to manipulate the virtual target object within the AR scene; and (v) detect a user input to control which of the first mapping mode and the second mapping mode is used to control movement of the virtual cursor in the AR scene.
  • FIG. 1 illustrates an augmented reality environment according to one or more implementations of the technology described herein.
  • FIG. 2 illustrates a spatial interaction mechanism for use in an augmented reality environment according to implementations of the technology described herein.
  • FIG. 3 illustrates a spatial interaction mechanism for use in an augmented reality environment according to alternative implementations of the technology described herein.
  • FIG. 4 is a flowchart of a method of operating a spatial interaction mechanism for use in an augmented reality environment according to implementations of the technology described herein.
  • FIGs. 5A through 5C are pictorial representations illustrating a relationship between a target and a hand in various poses according to one or more implementations of the technology described herein.
  • FIGs. 6 through 9 are pictorial representations illustrating a spatial interaction mechanism for use in an augmented reality environment according to alternative implementations of the technology described herein.
  • FIG. 10 is a functional block diagram illustrating an apparatus 1000 capable of performing the processes discussed herein.
  • FIG. 11 is a simplified block diagram illustrating several sample aspects of components that may be employed in a user device configured to provide spatial interaction with an augmented reality scene, as taught herein.
  • a user device has multiple coordinate systems: an object-to-device coordinate system (for an object space), a hand-to-device coordinate system (for a control space), and a virtual cursor coordinate system (for cursor space).
  • the user device acquires translational and rotational movements of a user hand and/or translational and rotational movements of the user device itself to control movement of a "virtual hand" or a "virtual cursor" in three dimensions (3D).
  • the virtual cursor may then interact with the augmented reality (AR) scene. Because a user hand may have twenty-seven degrees of freedom, more options for mapping coordinates between coordinate systems may be provided to control movement of the virtual cursor.
  • the examples described herein may implement two different mapping modes for mapping the hand-to-device coordinate system to the virtual cursor coordinate system to control movements of the virtual hand in the AR scene in response to movements of the user hand.
  • the first mapping mode (also referred to herein as a GoGo mechanism) uses the metaphor of interactively growing the user's arm and non-linear mapping for reaching and manipulating distant objects in the AR scene to enable seamless direct manipulation of both nearby objects and those at a distance.
  • mapping the hand-to-device coordinate system to the virtual cursor coordinate system according to the first mapping mode takes into account translational and rotational movement of the user hand relative to the user device as well as translational and rotational movement of the user device relative to the virtual target object when controlling movement of the virtual cursor.
  • the second mapping mode (also referred to herein as a Magic Hand mechanism) enables precise manipulation of virtual target objects included in the AR scene.
  • the user device-to-hand coordinate system may be mapped directly to the virtual cursor coordinate system, such that the control space is put directly into the object space. This allows for very precise direct manipulation of objects.
  • mapping the hand-to-device coordinate system to the virtual cursor coordinate system according to the second mapping mode takes into account translational and rotational movement of the user hand relative to the user device independent of any translational and rotational movement of the user device relative to the virtual target object when controlling movement of the virtual cursor.
  • the GoGo Magic mechanism also includes gesture and/or posture detections for fingers.
  • Interpretation of the finger posture allows for switching between the GoGo mechanism (open hand) and the Magic Hand mechanism (index finger pointing).
  • Interpretation of the gesture and/or posture detections for fingers also allows for distinguishing between selection and manipulation.
  • the GoGo Magic mechanism may interpret a fist as a request to switch between selection provided by the GoGo mechanism and manipulation provided by the Magic Hand mechanism.
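  • A minimal sketch of this posture-driven mode switching is given below. It assumes a posture classifier already exists; the Posture and MappingMode names and the toggle-on-fist behaviour are illustrative readings of the description above, not identifiers or code from the patent.

        from enum import Enum, auto

        class Posture(Enum):
            OPEN_HAND = auto()       # reach/selection with the GoGo mechanism
            INDEX_POINTING = auto()  # precise manipulation with the Magic Hand mechanism
            FIST = auto()            # request to switch between selection and manipulation

        class MappingMode(Enum):
            GOGO = auto()        # first mapping mode
            MAGIC_HAND = auto()  # second mapping mode

        def next_mode(current: MappingMode, posture: Posture) -> MappingMode:
            """Interpret a detected finger posture as a mapping-mode request."""
            if posture is Posture.OPEN_HAND:
                return MappingMode.GOGO
            if posture is Posture.INDEX_POINTING:
                return MappingMode.MAGIC_HAND
            if posture is Posture.FIST:
                # a fist toggles between the two modes
                return (MappingMode.MAGIC_HAND
                        if current is MappingMode.GOGO else MappingMode.GOGO)
            return current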
  • a Chopsticks mechanism uses the tips of chopsticks to create points that are on one side of the user device.
  • the Chopsticks mechanism also uses the center point of the user device as a cube. The user can then use the Chopsticks to select objects in the augmented reality scene with the tips of the Chopsticks.
  • a user could use virtual chopsticks.
  • a user may simulate chopstick motion using a thumb and index finger, or any two fingers, to accomplish a pinching gesture to zoom virtual or actual tips of the chopsticks in and out.
  • Other suitable finger poses and/or gestures include varying the pressure on a button in the hand, etc.
  • Other user inputs are possible as well, including, but not limited to voice, touch, and the like.
  • FIG. 1 illustrates an augmented reality environment 100 according to one or more implementations of the technology described herein.
  • the illustrated augmented reality environment 100 includes a user hand 102, a user device 104, and a target 106.
  • a camera included in the user device 104 captures, or otherwise acquires an image of a real-world scene that includes the target 106.
  • the user device 104 may then render an AR scene on the display 112 that includes the image of the scene, a virtual target object 108 (e.g., a virtual house), and a virtual hand 114 (i.e., a virtual cursor).
  • the illustrated user hand 102 includes an index finger 110.
  • the virtual hand 114 includes a virtual index finger 116.
  • the augmented reality environment 100 also includes a first coordinate system 120 and a second coordinate system 118.
  • a position of the virtual hand 114 is provided within the AR scene according to a first coordinate system 120.
  • a second coordinate system 118 defines a relationship between the user device 104 and the user hand 102.
  • a pose of the user device 104 relative to user hand 102 may be expressed using coordinates from the second coordinate system 118.
  • the augmented reality environment 100 also includes a sensor 122 coupled to the user device 104.
  • the augmented reality environment 100 includes a virtual space 124.
  • the augmented reality environment 100 uses a vision-based tracking system by way of a camera coupled to or embedded within user device 104 to track the pose of user device 104 relative to the virtual target object 108.
  • the augmented reality environment 100 then may determine the three-dimensional (3D) relationship between the virtual target object 108 and the user device 104.
  • the augmented reality environment 100 also tracks a pose of the user device 104 relative to the user hand 102 using sensor 122 with reference to coordinate system 118.
  • the coordinate system 118 is then mapped to the coordinate system 120 to control movements of virtual hand 114 in the AR scene in response to movements of user hand 102.
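  • The relationship between the two tracked poses can be made concrete with a small sketch: the vision-based tracker yields the device pose relative to target 106, sensor 122 yields the hand pose relative to the device (coordinate system 118), and chaining the two places the hand in scene coordinates (coordinate system 120). The 4x4 homogeneous-transform representation and all names below are assumptions for illustration, not the patent's notation.

        import numpy as np

        def hand_pose_in_scene(T_target_from_device: np.ndarray,
                               T_device_from_hand: np.ndarray) -> np.ndarray:
            """Compose the two tracked poses to express the user hand in scene coordinates.

            T_target_from_device: device pose relative to target 106 (vision-based tracking).
            T_device_from_hand:   hand pose relative to the device (sensor 122, coordinate system 118).
            """
            # chaining the transforms maps control-space coordinates into the scene frame
            return T_target_from_device @ T_device_from_hand

        # usage: a hand 10 cm in front of the device, device 50 cm in front of the target
        T_td = np.eye(4); T_td[2, 3] = 0.5
        T_dh = np.eye(4); T_dh[2, 3] = 0.1
        T_th = hand_pose_in_scene(T_td, T_dh)  # hand pose expressed in scene coordinates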
  • the augmented reality environment 100 may capture an image of a real-world scene, identify objects in the scene using a scene recognition algorithm, retrieve information based on the identified objects, and create a combined display of an image of the physical scene and information related to the identified objects, and thereby augment the physical scene.
  • a user's view of the real world is enhanced with virtual computer-generated graphics (e.g., virtual target object 108). These graphics are spatially registered so that they appear aligned with the real world from the perspective of the viewing user. For example, the spatial registration can make a virtual character appear to be standing on a real table.
  • the augmented reality environment 100 may be used in a gaming setting, an instructional setting, industrial design, sports and entertainment, a medical environment, or other suitable environment that can benefit from the use of augmented reality technology.
  • the user hand 102 may be part of any human viewer of the user device 104. As such, the user hand 102 can interact with the user device 104 using a variety of modalities. Of course, the user hand 102 can be part of any mechanical device such as a robot capable of interacting with the user device 104, under the control of a human, for example.
  • the user device 104 may be any user equipment such as telephones, tablet computers, "phablet (phone + tablet)" computers, smart phones, laptop and desktop computers, and the like.
  • the user device 104 may include one or more orientation sensing modules, cameras, wireless transceivers, graphics engines, processors, user interfaces (e.g. display 112, keypad), eye movement detection modules, hand movement detection modules, voice detection module, speech recognition module, facial expression recognition module, head tracking module, and the like.
  • the user device 104 may have six degrees of freedom.
  • the target 106 may be a place, object, general direction, person, or other similar item.
  • the target 106 may be stationary or moving.
  • target 106 is an object that includes distinguishing observable features, such as texture, shape, pattern, or size that are recognized by the user device 104 for tracking and aiding in pose estimates by the user device 104 for the generation of virtual target objects in the AR scene.
  • target 106 may be a printout of a previously determined pattern that is recognizable by the vision-based tracking system of the user device 104.
  • the camera of user device 104 may capture one or more images of a real-world scene that includes target 106, perform one or more recognition algorithms on the images to detect the presence of target 106, then track target 106 in subsequent images such that the pose of user device 104 may be tracked relative to target 106 as well as relative to any virtual target objects (e.g., 108) that are generated with respect to target 106.
  • the illustrated display 112 may be a touch screen display, a holographic display, etc., that is capable of displaying text, images, and the like.
  • the illustrated coordinate system 118 is the coordinate system for the user hand 102 relative to the user device 104.
  • the coordinate system 120 is the coordinate system for providing a position of the virtual hand 114. Having the coordinate system 118 and the coordinate system 120 provides twelve degrees of freedom because the user hand 102 can move, the user hand 102 can turn, the user device 104 can move, and the user device 104 can turn.
  • the illustrated sensor 122 may be any suitable device that is capable of sensing an articulated hand, such as an infrared sensor.
  • the sensor 122 may be a magnetic, acoustic, inertial, optical, mechanical, etc., sensor that is capable of tracking and/or detecting movement of a user hand, finger poses, fist, etc.
  • the sensor 122 may implement mechanisms to determine the position of the user device 104.
  • the sensor 122 may implement mechanisms that include using data from a network, including triangulation, Wi-Fi positioning, and the like.
  • the sensor 122 may distinguish fingers, palm, etc., and provide the data to the user device 104.
  • the illustrated sensor 122 may be any suitable three-dimensional (3D) sensing device.
  • FIG. 2 illustrates an implementation of a first mapping mode (e.g., a GoGo mechanism) according to implementations of the technology described herein.
  • a user hand 102 can reach objects at a distance greater than the user's arm's length.
  • the illustrated GoGo mechanism 200 may include a control space 202 (including coordinate system 118), a cursor space 204 (including coordinate system 120), and an object space 206.
  • the object space 206 is associated with the user device 104 and may provide for a pose of the user device 104 relative to target 106.
  • the control space 202 spans between the user device 104 and the user hand 102 (shown in FIG. 1).
  • the size of the control space 202 may be dynamically mapped to the object space 206 to provide the cursor space 204.
  • the target 106 and/or virtual target object 108 are also registered in the cursor space 204.
  • user hand 102 may be able to always reach into the scene on the screen on the display 112.
  • user device 104 may track the pose of user device 104 relative to user hand 102 in order to detect a translational movement of the user hand 102 for a first distance 130A (e.g., user hand 102 moves closer/towards user device 104).
  • the first mapping mode includes normalizing the detected translational movement of the user hand 102 to a translational movement of the virtual hand 114 a second distance 130B in the AR scene (e.g., virtual hand 114 moves closer/towards virtual target object 108). However, because of the normalizing of the translational movement into the coordinate system 120, the second distance 130B is greater than the first distance 130A.
  • user device 104 may track the pose of user device 104 relative to user hand 102 in order to detect a translational movement of the user hand 102 for a first distance 132A (e.g., user hand 102 moves away from user device 104).
  • the first mapping mode also includes normalizing the detected translational movement of the user hand 102 to a translational movement of the virtual hand 114 a second distance 132B in the AR scene (e.g., virtual hand 114 moves away from virtual target object 108).
  • the second distance 132B is greater than the first distance 132A.
  • the user device may detect translational and/or rotational movement of user device 104, where the detected translational and rotational movement of the user device is combined with the translational and rotational movement of the user hand to provide translational and rotational movement of the virtual hand 114.
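  • A hedged sketch of this first mapping mode is shown below: the hand's reach in control space is amplified so the virtual hand 114 travels a larger distance in cursor space (distance 130B greater than distance 130A). The specific non-linear gain function is an assumption in the spirit of the GoGo arm-growing metaphor named above; the patent does not give a formula, and the threshold and gain values are placeholders.

        def gogo_cursor_distance(hand_distance: float,
                                 threshold: float = 0.15,
                                 gain: float = 8.0) -> float:
            """Map the hand-to-device distance (metres, control space) to the
            virtual hand's distance in the AR scene (cursor space)."""
            if hand_distance <= threshold:
                # near the device the mapping stays roughly one-to-one
                return hand_distance
            # beyond the threshold the "arm" grows non-linearly, so a small hand
            # movement can reach a distant virtual target object
            extra = hand_distance - threshold
            return hand_distance + gain * extra * extra

        # example: a 30 cm reach in control space places the cursor about 48 cm into the scene
        print(round(gogo_cursor_distance(0.30), 3))  # 0.48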
  • FIG. 3 illustrates an implementation of a second mapping mode (e.g., the Magic Hands mechanism 300) according to implementations of the technology described herein.
  • a control space 302, which is generated by the sensor 122 on the user device 104, is mapped (e.g., directly mapped) to a cursor space 304 that is associated with object space 306.
  • the cursor space 304 is detached from the user device 104. That is, the mapping may be a one-to-one mapping so that objects (e.g., virtual target object 108) in the object space 306 may be precisely manipulated.
  • the Magic Hands mechanism may allow for precise manipulation of objects in the scene shown on the display 112 of the user device 104.
  • Manipulation of the virtual target object 108 may include selecting, activating, touching, moving, resizing, rotating, or otherwise interacting with the virtual target object 108 through virtual hand 114.
  • the sensor 122 may be implemented using infrared optics, cameras, depth sensors, and the like.
  • the cursor space 304 is where objects can be manipulated and where the virtual hand 114 is moving around.
  • the Magic Hands mechanism 300 allows for fine grained manipulations of objects.
  • user device 104 may track the pose of user device 104 relative to user hand 102 in order to detect a translational movement of the user hand 102 for a first distance 130A (e.g., user hand 102 moves closer or towards user device 104).
  • the second mapping mode includes directly mapping the detected translational movement of the user hand 102 to a translational movement of the virtual hand 114 a second distance 130B in the AR scene (e.g., virtual hand 114 moves closer or towards virtual target object 108).
  • the second distance 130B is equal to the first distance 130A.
  • the user device may detect rotational movement of user hand 102, where the detected rotational movement is directly mapped to a rotational movement of virtual hand 114 (e.g., a 10 degree rotation of user hand 102 results in a 10 degree rotation of virtual hand 114).
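  • For contrast with the GoGo sketch above, a minimal sketch of the second mapping mode follows: the hand's incremental translation and rotation are applied one-to-one to the virtual hand 114, with no gain, which is what permits fine-grained manipulation. The 4x4 transform representation and the names are illustrative assumptions.

        import numpy as np

        def magic_hand_update(cursor_pose: np.ndarray,
                              delta_hand_pose: np.ndarray) -> np.ndarray:
            """Apply the hand's incremental 4x4 motion directly to the virtual hand.

            A 10 degree rotation or 5 cm translation of user hand 102 produces exactly
            a 10 degree rotation or 5 cm translation of virtual hand 114.
            """
            # direct (one-to-one) mapping: no scaling of translation, no gain on rotation
            return cursor_pose @ delta_hand_pose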
  • FIG. 4 is a flowchart of a process 400 illustrating operation of the augmented reality environment 100 to allow a user to select and manipulate a virtual target object in accordance with one or more implementations.
  • Process 400 is one possible process performed by user device 104 of FIG. 1.
  • in a process block 402, user device 104 captures, or otherwise acquires, an image of a real-world scene.
  • the display 112 displays an AR scene that includes the image of the real-world scene, the virtual target object 108, and the virtual hand 114.
  • a position of the virtual hand is provided according to a first coordinate system 120 within the AR scene.
  • the user device 104 tracks a pose of the user device 104 relative to the user hand 102 according to the second coordinate system 118.
  • the user device 104 detects user input. As mentioned above, detecting user input may be performed by way of sensor 122 to detect at least one of a finger posture and/or a hand gesture of user hand 102.
  • in a decision block 412, the user device 104 determines whether the user input indicates that the user device 104 should operate in the first mapping mode or the second mapping mode to control movement of the virtual hand 114 in the AR scene.
  • a finger pointing posture (e.g., by index finger 110) is detected in order to switch to the second mapping mode (e.g., the Magic Hand mechanism).
  • an open hand gesture of user hand 102 is detected in order to switch to the first mapping mode (e.g., GoGo mechanism).
  • a fist posture of user hand 102 may be detected to switch from the first mapping mode to the second mapping mode.
  • user hand 102 may, in effect, reach for a virtual target object 108 in the first mapping mode by way of an open hand gesture, and then switch to control or manipulation of the virtual target object 108 in the second mapping mode by then closing the open hand to a fist posture.
  • process 400 proceeds to process block 414 where the second coordinate system 118 is mapped to the first coordinate system 120 to control movement of the virtual hand 114 according to the first mapping mode, such as described above with reference to the GoGo mechanism 200 of FIG. 2.
  • process 400 proceeds to process block 416 where the second coordinate system 118 is mapped to the first coordinate system 120 to control movement of the virtual hand 114 according to the second mapping mode, such as described above with reference to the Magic Hands mechanism 300 of FIG. 3.
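  • The decision and mapping steps of process 400 can be tied together in one short sketch that reuses the next_mode, gogo_cursor_distance, and magic_hand_update helpers sketched earlier. The block numbers in the comments refer to FIG. 4; everything else (argument names, the idea of pushing the cursor along the device's view direction in GoGo mode) is an illustrative assumption, not the patent's implementation.

        def update_virtual_hand(mode, posture, cursor_pose, delta_hand_pose, hand_distance):
            """One per-frame update of the virtual hand 114 for process 400."""
            mode = next_mode(mode, posture)                       # decision block 412
            if mode is MappingMode.GOGO:                          # process block 414
                reach = gogo_cursor_distance(hand_distance)
                new_pose = cursor_pose.copy()
                new_pose[2, 3] = reach                            # amplified reach into the scene
            else:                                                 # process block 416
                new_pose = magic_hand_update(cursor_pose, delta_hand_pose)
            return mode, new_pose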
  • FIGs. 5A through 5C illustrate a relationship between the target 106 and the hand 102 in the augmented reality environment 100 according to one or more implementations of the technology described herein.
  • FIG. 5A illustrates an index finger 110 pointing posture according to one or more implementations of the technology described herein.
  • FIG. 5B illustrates an open hand gesture of user hand 102 according to one or more implementations of the technology described herein.
  • FIG. 5C illustrates a fist posture 502 according to one or more implementations of the technology described herein.
  • the user device 104 may interpret the fist posture 502 as a request to switch between virtual hand 114 movement provided by the GoGo mechanism 200 and virtual target object 108 manipulation provided by the Magic Hand mechanism 300.
  • FIG. 6 is a picture illustrating a Chopsticks mechanism 600 according to one or more implementations of the technology described herein.
  • the Chopsticks mechanism 600 includes tips 602 to control a target point in the augmented reality scene.
  • the target point controlled by the chopsticks mechanism 600 is a virtual cursor as described above.
  • the target point controlled by the chopsticks mechanism 600 is the virtual target object, as described above. The user can then use the Chopsticks mechanism 600 to select and/or manipulate objects in the augmented reality scene with the tips 602.
  • FIG. 7 is a pictorial diagram illustrating Chopsticks mechanism 600 operation according to one or more implementations of the technology described herein.
  • the Chopsticks mechanism 600 measures the distance h of the hand 102 from the screen and the distance d between the thumb 702 and the forefinger 704 using a depth camera included in the user device 104.
  • a target point lies on a ray 706 through the midpoint between thumb 702 and the forefinger 704.
  • the origin of the ray 706 may be fixed relative to the screen on the display 112, or may be determined from tracking of a user 708's head with the camera in the user device 104 that faces the user.
  • the constant k is chosen by the user 708.
  • the constant k is predetermined by the user device 104.
  • FIGS. 8A and 8B are diagrams illustrating the Chopsticks mechanism 600 operation according to one or more implementations of the technology described herein.
  • as illustrated in FIG. 8A, if the user 708 increases the distance d between the thumb 702 and the forefinger 704, then the user device 104 may control the AR scene such that the distance p to the target point becomes larger.
  • the user device 104 controls the distance p by moving the target point, such as virtual hand 114, within the AR scene.
  • as illustrated in FIG. 8B, if the user 708 decreases the distance d between the thumb 702 and the forefinger 704, then user device 104 controls the AR scene such that the distance p to the target point becomes smaller.
  • FIGS. 9A and 9B are diagrams illustrating the Chopsticks mechanism 600 operation according to one or more implementations of the technology described herein.
  • with the Chopsticks mechanism 600 illustrated in FIG. 9A, if the user 708 moves his or her hand 102 towards the screen on the display 112, decreasing the distance h, then the user device 104 controls the AR scene such that the distance p to the target point becomes smaller.
  • as illustrated in FIG. 9B, if the user 708 moves the hand 102 away from the screen, increasing the distance h, then the user device 104 controls the AR scene such that the distance p to the target point becomes larger.
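  • A hedged sketch of the Chopsticks mechanism 600 follows. The description above fixes only the qualitative behaviour: the target point lies on ray 706, and its distance p grows when the pinch distance d or the hand-to-screen distance h grows, scaled by the constant k. The product form p = k·d·h below is an assumption that merely reproduces the behaviour of FIGs. 8A-9B, not a formula stated in the patent, and the default value of k is a placeholder.

        import numpy as np

        def chopsticks_target_point(origin: np.ndarray, midpoint: np.ndarray,
                                    d: float, h: float, k: float = 5.0) -> np.ndarray:
            """Return the 3D target point along ray 706.

            origin:   ray origin (fixed at the display 112, or at the tracked head of user 708)
            midpoint: midpoint between the tips of thumb 702 and forefinger 704
            d:        thumb-to-forefinger (pinch) distance, metres
            h:        hand-to-screen distance, metres
            k:        scale constant chosen by the user or preset by the device
            """
            direction = midpoint - origin
            direction = direction / np.linalg.norm(direction)
            p = k * d * h  # assumed monotone mapping: larger d or h reaches farther
            return origin + p * direction

        # usage: pinch of 4 cm, hand 30 cm from the screen, ray pointing straight into the scene
        target = chopsticks_target_point(np.zeros(3), np.array([0.0, 0.0, 1.0]), d=0.04, h=0.30)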
  • FIG. 10 is a functional block diagram illustrating an apparatus 1000 capable of performing the processes discussed herein.
  • apparatus 1000 is a user device (e.g., user device 104) capable of performing spatial interaction with an AR scene, such as process 400, described above.
  • Apparatus 1000 may include a camera 1002 as well as a sensor 1004.
  • camera 1002 is a back-facing camera, such that it may capture images of a real-world scene where target 106 is located, while sensor 1004 is front-facing, such that it may track the pose of the apparatus 1000 relative to a user hand 102 on the display-side of the apparatus 1000.
  • the sensor 1004 may include detectors and/or trackers that may detect and/or track the movement of a user hand 102, finger poses, fist, etc.
  • the sensor 1004 may receive inertial information for the apparatus 1000 from an inertial measurement unit (IMU) to determine whether and how the apparatus 1000 has moved. Additionally, the sensor 1004 may implement mechanisms to determine the position of the apparatus 1000. Such mechanisms may include using data from a network, including triangulation, Wi-Fi positioning, and the like.
  • Apparatus 1000 also includes a user interface 1008 that includes the display 1026 capable of displaying the AR scene generated by the apparatus 1000.
  • the AR scene includes images of the real-world scene captured by the camera 1002, as well as the virtual target object 108 and the virtual hand 114.
  • User interface 1008 may also include a keypad 1028 or other input device through which the user can input information into the apparatus 1000. If desired, the keypad 1028 may be obviated by integrating a virtual keypad into the display 1026 with a touch sensor.
  • User interface 1008 may also include a microphone 1030 and speaker 1032.
  • Apparatus 1000 also includes a control unit 1006 that is connected to and communicates with the camera 1002, sensor 1004, and user interface 1008.
  • the control unit 1006 accepts and processes images received from the camera 1002 and/or from network adapter 1020.
  • the control unit 1006 also accepts and processes data received from sensor 1004 for the tracking of the pose of apparatus 1000 relative to a user hand 102.
  • Control unit 1006 may be provided by a processing unit 1010 and associated memory 1016, hardware 1012, firmware 1014, software 1018, and graphics engine 1024.
  • Control unit 1006 may further include an augmented reality (AR) engine 1022.
  • AR engine 1022 may be configured to perform one or more spatial interaction procedures, such as described above with reference to process 400 of FIG. 4. Both the images captured by camera 1002 as well as the data provided by sensor 1004 may be provided to the AR engine 1022. The AR engine 1022 may then render or otherwise generate visual elements of the AR scene in an image on the display 1026.
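  • As a rough structural sketch of that data flow, the fragment below models AR engine 1022 as a component that takes a camera frame and a cursor pose and returns a composited frame for display 1026. All class and method names are assumptions for illustration; the real engine would also perform the tracking and mapping steps sketched earlier.

        from dataclasses import dataclass, field
        from typing import Any, List

        @dataclass
        class ARFrame:
            background: Any            # image of the real-world scene from camera 1002
            overlays: List[Any] = field(default_factory=list)  # e.g., virtual target object 108
            cursor_pose: Any = None    # pose of virtual hand 114 in scene coordinates

        class AREngine:
            """Consumes camera images and sensor 1004 data; produces frames for display 1026."""
            def __init__(self):
                self.virtual_objects: List[Any] = []

            def render(self, camera_image: Any, cursor_pose: Any) -> ARFrame:
                # composite the registered virtual content over the captured image
                return ARFrame(background=camera_image,
                               overlays=list(self.virtual_objects),
                               cursor_pose=cursor_pose)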
  • Processing unit 1010 and AR engine 1022 are illustrated separately for clarity, but may be a single unit and/or implemented in the processing unit 1010 based on instructions in the software 1018 which is run in the processing unit 1010.
  • Processing unit 1010, as well as the AR engine 1022, can include, but need not necessarily include, one or more microprocessors, embedded processors, controllers, application specific integrated circuits (ASICs), digital signal processors (DSPs), and the like.
  • the terms processor and processing unit describe the functions implemented by the system rather than specific hardware.
  • the term memory refers to any type of computer storage medium, including long term, short term, or other memory associated with apparatus 1000, and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the processes described herein may be implemented by various means depending upon the application. For example, these processes may be implemented in hardware 1012, firmware 1014, a combination of hardware 1012 and software 1018, or any combination thereof.
  • the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, electronic devices, other electronic units designed to perform the functions described herein, or a combination thereof.
  • the processes may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein.
  • Any computer-readable medium tangibly embodying instructions may be used in implementing the processes described herein.
  • program code may be stored in memory 1016 and executed by the processing unit 1010.
  • Memory may be implemented within or external to the processing unit 1010.
  • the functions may be stored as one or more instructions or code on a computer-readable medium. Examples include non-transitory computer-readable media encoded with a data structure and computer-readable media encoded with a computer program.
  • Computer-readable media includes physical computer storage media. A storage medium may be any available medium that can be accessed by a computer.
  • such computer-readable media can comprise RAM, ROM, Flash Memory, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer; disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • FIG. 11 is a simplified block diagram illustrating several sample aspects of components that may be employed in a user device configured to provide spatial interaction with an augmented reality scene, as taught herein.
  • User device 1100 is one possible implementation of user device 104 of FIG. 1, and/or apparatus 1000 of FIG. 10, represented as a series of interrelated functional modules.
  • a module 1110 for acquiring an image of a real-world scene may correspond at least in some aspects to, for example, a camera 1002 of FIG. 10.
  • a module 1120 for displaying an AR scene may correspond at least in some aspects to, for example, a display 1026 of FIG. 10.
  • a module 1140 for tracking a pose of the user device relative to a user hand according to a second coordinate system may correspond at least in some aspects to, for example, AR engine 1022 in combination with sensor 1004, of FIG. 10.
  • a module 1150 for mapping the second coordinate system to the first coordinate system to control movements of the virtual hand may correspond at least in some aspects to, for example, AR engine 1022, of FIG. 10.
  • a module 1160 for detecting user input to control which of the first mapping mode and the second mapping mode is used by module 1150 may correspond at least in some aspects to, for example, AR engine 1022 in combination with sensor 1004, of FIG. 10.
  • modules 1110-1160 of FIG. 11 may be implemented in various ways consistent with the teachings herein.
  • the functionality of these modules 1110-1160 may be implemented as one or more electrical components.
  • the functionality of these modules 1110-1160 may be implemented as a processing system including one or more processor components.
  • the functionality of these modules 1110-1160 may be implemented using, for example, at least a portion of one or more integrated circuits (e.g., an ASIC).
  • an integrated circuit may include a processor, software, other related components, or some combination thereof.
  • modules may be implemented, for example, as different subsets of an integrated circuit, as different subsets of a set of software modules, or a combination thereof.
  • a given subset e.g., of an integrated circuit and/or of a set of software modules
  • the components of FIG. 11 may be implemented using any suitable means. Such means also may be implemented, at least in part, using corresponding structure as taught herein.
  • the components described above in conjunction with the "module for" components of FIG. 11 also may correspond to similarly designated "means for" functionality.
  • one or more of such means may be implemented using one or more of processor components, integrated circuits, or other suitable structure as taught herein.
  • One or more implementations are described herein with reference to illustrations for particular applications. It should be understood that the implementations are not intended to be limiting.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for spatial interaction in augmented reality (AR) includes displaying an AR scene that includes an image of a real-world scene, a virtual target object, and a virtual cursor. A position of the virtual cursor is provided according to a first coordinate system within the AR scene. A user device tracks a pose of the user device relative to a user hand according to a second coordinate system. The second coordinate system is mapped to the first coordinate system to control movements of the virtual cursor. In a first mapping mode, virtual cursor movement is controlled to change a distance between the virtual cursor and the virtual target object. In a second mapping mode, virtual cursor movement is controlled to manipulate the virtual target object. A user input is detected to control which of the first mapping mode or the second mapping mode is used.
EP15784798.9A 2014-11-14 2015-09-21 Interaction spatiale en réalité augmentée Active EP3218781B1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201462080050P 2014-11-14 2014-11-14
US14/858,777 US9911235B2 (en) 2014-11-14 2015-09-18 Spatial interaction in augmented reality
PCT/US2015/051237 WO2016076951A1 (fr) 2014-11-14 2015-09-21 Interaction spatiale en réalité augmentée

Publications (2)

Publication Number Publication Date
EP3218781A1 true EP3218781A1 (fr) 2017-09-20
EP3218781B1 EP3218781B1 (fr) 2022-03-23

Family

ID=54347809

Family Applications (1)

Application Number Title Priority Date Filing Date
EP15784798.9A Active EP3218781B1 (fr) 2014-11-14 2015-09-21 Interaction spatiale en réalité augmentée

Country Status (4)

Country Link
US (1) US9911235B2 (fr)
EP (1) EP3218781B1 (fr)
CN (1) CN107077169B (fr)
WO (1) WO2016076951A1 (fr)

Families Citing this family (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10409443B2 (en) * 2015-06-24 2019-09-10 Microsoft Technology Licensing, Llc Contextual cursor display based on hand tracking
JP6841232B2 (ja) * 2015-12-18 2021-03-10 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
US9972134B2 (en) * 2016-06-30 2018-05-15 Microsoft Technology Licensing, Llc Adaptive smoothing based on user focus on a target object
DE202017104928U1 (de) * 2016-08-23 2017-11-24 Google Inc. Manipulation virtueller Objekte anhand von Controllern mit sechs Freiheitsgraden in erweiterten bzw. virtuellen Realitätsumgebungen
CN106339488B (zh) * 2016-08-30 2019-08-30 西安小光子网络科技有限公司 一种基于光标签的虚拟设施插入定制实现方法
CN106200831A (zh) * 2016-08-31 2016-12-07 广州数娱信息科技有限公司 一种ar、全息智能装置
WO2018100575A1 (fr) 2016-11-29 2018-06-07 Real View Imaging Ltd. Retour d'information tactile dans un système d'affichage
US10453273B2 (en) 2017-04-25 2019-10-22 Microsoft Technology Licensing, Llc Method and system for providing an object in virtual or semi-virtual space based on a user characteristic
EP3616035B1 (fr) 2017-06-19 2024-04-24 Apple Inc. Interface de réalité augmentée pour interagir avec des cartes affichées
WO2018232742A1 (fr) 2017-06-23 2018-12-27 Tencent Technology (Shenzhen) Company Limited Procédé et dispositif pour pointer un objet dans une scène de réalité virtuelle (vr), et appareil vr
IL253432A0 (en) 2017-07-11 2017-09-28 Elbit Systems Ltd System and method for correcting a rolling display effect
WO2019012522A1 (fr) * 2017-07-11 2019-01-17 Elbit Systems Ltd. Système et procédé de correction d'un effet d'affichage roulant
WO2019059938A1 (fr) 2017-09-25 2019-03-28 Hewlett-Packard Development Company, L.P. Système de réalité augmentée à reciblage haptique
US10777007B2 (en) 2017-09-29 2020-09-15 Apple Inc. Cooperative augmented reality map interface
CN109710054B (zh) * 2017-10-26 2022-04-26 北京京东尚科信息技术有限公司 用于头戴式显示设备的虚拟物体呈现方法和装置
CN108427499A (zh) * 2018-02-13 2018-08-21 视辰信息科技(上海)有限公司 一种ar系统及ar设备
US20210102820A1 (en) * 2018-02-23 2021-04-08 Google Llc Transitioning between map view and augmented reality view
KR102616850B1 (ko) 2018-03-05 2023-12-26 삼성전자주식회사 전자 장치, 전자 장치와 결합 가능한 외부 디바이스 및 이의 디스플레이 방법
CN108519817A (zh) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 基于增强现实的交互方法、装置、存储介质及电子设备
KR101938754B1 (ko) * 2018-08-13 2019-01-15 김종길 젓가락 마우스
US10770035B2 (en) 2018-08-22 2020-09-08 Google Llc Smartphone-based radar system for facilitating awareness of user presence and orientation
US10890653B2 (en) 2018-08-22 2021-01-12 Google Llc Radar-based gesture enhancement for voice interfaces
US10698603B2 (en) * 2018-08-24 2020-06-30 Google Llc Smartphone-based radar system facilitating ease and accuracy of user interactions with displayed objects in an augmented-reality interface
US10788880B2 (en) 2018-10-22 2020-09-29 Google Llc Smartphone-based radar system for determining user intention in a lower-power mode
US10761611B2 (en) 2018-11-13 2020-09-01 Google Llc Radar-image shaper for radar-based applications
CN110147196A (zh) * 2018-12-04 2019-08-20 腾讯科技(深圳)有限公司 交互控制方法和装置、存储介质及电子装置
WO2020226832A1 (fr) 2019-05-06 2020-11-12 Apple Inc. Dispositif, procédé et support lisible par ordinateur pour présenter des fichiers de réalité générée par ordinateur
WO2020226833A1 (fr) 2019-05-06 2020-11-12 Apple Inc. Dispositif, procédé, et interface utilisateur graphique pour composer des fichiers cgr
CN110727345B (zh) * 2019-09-19 2023-12-26 北京耐德佳显示技术有限公司 一种通过手指交叉点移动实现人机交互的方法及系统
EP3850468B1 (fr) * 2019-12-04 2024-01-31 Google LLC Plage de capture pour des objets de réalité augmentée
EP4058874A4 (fr) * 2019-12-05 2023-05-03 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Procédé et système d'association de systèmes de coordonnées de dispositif dans un système de ra à plusieurs personnes
CN110989842A (zh) * 2019-12-06 2020-04-10 国网浙江省电力有限公司培训中心 基于虚拟现实的培训方法、系统和电子设备
CN111240476B (zh) * 2020-01-06 2021-06-08 腾讯科技(深圳)有限公司 基于增强现实的交互方法、装置、存储介质和计算机设备
CN111665943B (zh) * 2020-06-08 2023-09-19 浙江商汤科技开发有限公司 一种位姿信息展示方法及装置
CN111857341B (zh) * 2020-06-10 2023-06-13 浙江商汤科技开发有限公司 一种展示控制方法及装置
JP2022025461A (ja) * 2020-07-29 2022-02-10 株式会社AniCast RM アニメーション制作システム
CN111882674A (zh) * 2020-07-30 2020-11-03 北京市商汤科技开发有限公司 虚拟对象的调整方法、装置、电子设备及存储介质
CN111880657B (zh) * 2020-07-30 2023-04-11 北京市商汤科技开发有限公司 一种虚拟对象的控制方法、装置、电子设备及存储介质
CN112068703B (zh) * 2020-09-07 2021-11-16 北京字节跳动网络技术有限公司 目标物体的控制方法、装置、电子设备及存储介质
CN112991553B (zh) * 2021-03-11 2022-08-26 深圳市慧鲤科技有限公司 信息展示方法及装置、电子设备和存储介质
CN113703571B (zh) * 2021-08-24 2024-02-06 梁枫 一种虚拟现实人机交互的方法、装置、设备和介质
CN113741698B (zh) * 2021-09-09 2023-12-15 亮风台(上海)信息科技有限公司 一种确定和呈现目标标记信息的方法与设备
CN114115544B (zh) * 2021-11-30 2024-01-05 杭州海康威视数字技术股份有限公司 人机交互方法、三维显示设备及存储介质
CN114237403A (zh) * 2021-12-27 2022-03-25 郑州捷安高科股份有限公司 基于vr交互设备的操作手势检测处理方法、设备及介质
CN114764327B (zh) * 2022-05-09 2023-05-05 北京未来时空科技有限公司 一种三维可交互媒体的制作方法、装置及存储介质
DE102022112930A1 (de) * 2022-05-23 2023-11-23 Gestigon Gmbh Erfassungssystem und verfahren zur erfassung von kontaktlosen gerichteten benutzereingaben und verfahren zur kalibrierung des erfassungssystems

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6611610B1 (en) 1997-04-02 2003-08-26 Gentex Corporation Vehicle lamp control
JP4807439B2 (ja) 2009-06-15 2011-11-02 株式会社デンソー 霧画像復元装置及び運転支援システム
DE102010002488A1 (de) 2010-03-02 2011-09-08 Robert Bosch Gmbh Verfahren und Vorrichtung zur Nebelerkennung mittels Spektroskopie
WO2011136784A1 (fr) 2010-04-29 2011-11-03 Hewlett-Packard Development Company, L.P. Collaboration de participants sur version affichée d'objet
TWI501130B (zh) 2010-10-18 2015-09-21 Ind Tech Res Inst 虛擬觸控輸入系統
US20120113223A1 (en) * 2010-11-05 2012-05-10 Microsoft Corporation User Interaction in Augmented Reality
KR101727899B1 (ko) * 2010-11-26 2017-04-18 엘지전자 주식회사 휴대 단말기 및 그 동작 제어방법
US8514295B2 (en) 2010-12-17 2013-08-20 Qualcomm Incorporated Augmented reality processing based on eye capture in handheld device
US8488888B2 (en) * 2010-12-28 2013-07-16 Microsoft Corporation Classification of posture states
WO2014111947A1 (fr) 2013-01-21 2014-07-24 Pointgrab Ltd. Commande gestuelle en réalité augmentée
JP5561396B1 (ja) 2013-02-19 2014-07-30 日本電気株式会社 運転支援システムおよび運転支援方法

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BOWMAN D A ET AL: "AN EVALUATION OF TECHNIQUES FOR GRABBING AND MANIPULATING REMOTE OBJECTS IN IMMERSIVE VIRTUAL ENVIRONMENTS", PROCEEDINGS OF 1997 SYMPOSIUM ON INTERACTIVE 3 D GRAPHICS 27-30 APRIL 1997 PROVIDENCE, RI, USA; [PROCEEDINGS OF THE SYMPOSIUM ON INTERACTIVE 3D GRAPHICS], ACM, PROCEEDINGS 1997 SYMPOSIUM ON INTERACTIVE 3D GRAPHICS ACM NEW YORK, NY, USA, 27 April 1997 (1997-04-27), pages 35 - 38, XP000725357, ISBN: 978-0-89791-884-8, DOI: 10.1145/253284.253301 *
DANIEL VOGEL ET AL: "Distant freehand pointing and clicking on very large, high resolution displays", PROCEEDINGS OF THE 18TH ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY , UIST '05, 27 October 2005 (2005-10-27), New York, New York, USA, pages 33, XP055038902, ISBN: 978-1-59-593271-6, DOI: 10.1145/1095034.1095041 *
See also references of WO2016076951A1 *

Also Published As

Publication number Publication date
US20160140763A1 (en) 2016-05-19
CN107077169B (zh) 2020-04-28
US9911235B2 (en) 2018-03-06
CN107077169A (zh) 2017-08-18
EP3218781B1 (fr) 2022-03-23
WO2016076951A1 (fr) 2016-05-19

Similar Documents

Publication Publication Date Title
US9911235B2 (en) Spatial interaction in augmented reality
US10761612B2 (en) Gesture recognition techniques
US10248218B2 (en) Systems and methods of direct pointing detection for interaction with a digital device
EP3855288B1 (fr) Relations spatiales pour l'intégration d'images visuelles d'environnement physique dans la réalité virtuelle
US20210011556A1 (en) Virtual user interface using a peripheral device in artificial reality environments
KR101791366B1 (ko) 증강된 가상 터치패드 및 터치스크린
KR101688355B1 (ko) 다수의 지각 감지 입력의 상호작용
TWI470534B (zh) 藉由使用運動的性質在一顯示器上之三維使用者介面效應
EP2630563B1 (fr) Appareil et méthode d'entrée d'utilisateur permettant de commander des informations affichées
EP2558924B1 (fr) Appareil, procédé et programme d'entrée d'utilisateur à l'aide d'une caméra
JP4323180B2 (ja) 自己画像表示を用いたインタフェース方法、装置、およびプログラム
WO2016109409A1 (fr) Lasers virtuels pour interagir avec des environnements de réalité augmentée
EP3327544B1 (fr) Appareil, procédé associé et support lisible par ordinateur associé
WO2013008236A1 (fr) Système et procédé de vision par ordinateur basée sur une identification d'un geste de la main
JP6591411B2 (ja) 空間的対話における追加モダリティのための顔追跡
WO2013190538A1 (fr) Procédé de commande sans contact d'un dispositif
US11886643B2 (en) Information processing apparatus and information processing method
KR20180044535A (ko) 홀로그래피 스마트홈 시스템 및 제어방법
Park et al. 3D Gesture-based view manipulator for large scale entity model review
AU2015252151A1 (en) Enhanced virtual touchpad and touchscreen
JP2012108826A (ja) 表示制御装置及び表示制御装置の制御方法、プログラム
Takaki et al. Using gaze for 3-D direct manipulation interface
KR20160113498A (ko) 홀로그래피 터치 방법 및 프로젝터 터치 방법
KR20160013501A (ko) 홀로그래피 터치 방법 및 프로젝터 터치 방법
GB2502946A (en) Maintaining augmented reality image when marker leaves field of view

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20170331

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20200401

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Ref document number: 602015077729

Country of ref document: DE

Free format text: PREVIOUS MAIN CLASS: G06F0001160000

Ipc: G06F0003010000

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: GRANT OF PATENT IS INTENDED

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 5/225 20060101ALN20210915BHEP

Ipc: G06F 3/0481 20130101ALI20210915BHEP

Ipc: G06F 1/16 20060101ALI20210915BHEP

Ipc: G06F 3/03 20060101ALI20210915BHEP

Ipc: G06F 3/01 20060101AFI20210915BHEP

RIC1 Information provided on ipc code assigned before grant

Ipc: H04N 5/225 20060101ALN20210924BHEP

Ipc: G06F 3/0481 20130101ALI20210924BHEP

Ipc: G06F 1/16 20060101ALI20210924BHEP

Ipc: G06F 3/03 20060101ALI20210924BHEP

Ipc: G06F 3/01 20060101AFI20210924BHEP

INTG Intention to grant announced

Effective date: 20211015

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE PATENT HAS BEEN GRANTED

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602015077729

Country of ref document: DE

REG Reference to a national code

Ref country code: AT

Ref legal event code: REF

Ref document number: 1477908

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220415

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG9D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220623

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220623

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 1477908

Country of ref document: AT

Kind code of ref document: T

Effective date: 20220323

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220624

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220725

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220723

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602015077729

Country of ref document: DE

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

26N No opposition filed

Effective date: 20230102

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: BE

Ref legal event code: MM

Effective date: 20220930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220921

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220930

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220921

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220930

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220930

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20230810

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20230810

Year of fee payment: 9

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20230808

Year of fee payment: 9

Ref country code: DE

Payment date: 20230808

Year of fee payment: 9

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20150921

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20220323