WO2019032014A1 - Virtual-reality touch interaction system - Google Patents

Virtual-reality touch interaction system

Info

Publication number
WO2019032014A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch
sensitive apparatus
touch sensitive
user
image data
Prior art date
Application number
PCT/SE2018/050781
Other languages
English (en)
Inventor
Kristofer JAKOBSON
Tomas Christiansson
Mattias KRUS
Original Assignee
Flatfrog Laboratories Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Flatfrog Laboratories Ab filed Critical Flatfrog Laboratories Ab
Publication of WO2019032014A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The present invention relates generally to the field of virtual-reality (VR) interaction systems. More particularly, the present invention relates to a touch-based VR interaction system and a related method.
  • Touch-sensitive panels are being used for providing input data to computers, gaming devices, presentation equipment and the like.
  • Virtual reality presents the user with an environment that is partially, if not fully, disconnected from the user's actual physical environment.
  • Various ways of interacting with this environment have been tried. These include IR-tracked gloves, IR-tracked wands or other gesturing tools, and gyroscope- or accelerometer-tracked objects.
  • The IR-tracked objects are typically tracked using one or more IR sensors configured to view and triangulate IR light sources on the tracked objects.
  • Such interaction systems provide high-latency, low-accuracy user input to the virtual environment.
  • One objective is to provide a VR interaction system with a high-precision interface.
  • Another objective is to provide a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in physical reality whilst viewing the interaction in virtual reality.
  • a touch-based virtual-reality (VR) interaction system comprising a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system.
  • the processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.
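  • As an illustration only, the sketch below shows how these four components could be composed in software; all class, method and frame names are assumptions made for this example and are not taken from the publication.

```python
from dataclasses import dataclass
from typing import Protocol

import numpy as np


class PositioningUnit(Protocol):
    def apparatus_pose_in_user_frame(self) -> np.ndarray:
        """4x4 homogeneous pose of the touch apparatus relative to the user."""
        ...


class VROutputDevice(Protocol):
    def show_apparatus(self, vr_pose: np.ndarray) -> None: ...
    def show_touch(self, vr_point: np.ndarray) -> None: ...


@dataclass
class ProcessingUnit:
    user_pose_in_vr: np.ndarray  # 4x4 transform: user frame -> VR environment frame

    def communicate_apparatus_pose(self, positioning: PositioningUnit,
                                   output: VROutputDevice) -> None:
        # Map the apparatus pose, measured relative to the user, into the
        # VR environment coordinate system and hand it to the VR output device.
        vr_pose = self.user_pose_in_vr @ positioning.apparatus_pose_in_user_frame()
        output.show_apparatus(vr_pose)
```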
  • a method in a touch-based VR interaction system having a touch sensitive apparatus configured to receive touch input from a user, and a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space.
  • The method comprises providing spatial information of the position of the touch sensitive apparatus relative to the user, mapping the spatial position information of the touch sensitive apparatus to the VR environment coordinate system, and communicating a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
  • Some examples of the disclosure provide for a touch-based VR interaction system in which a VR user interacts with a high-precision touch sensitive apparatus in physical reality whilst viewing the interaction in virtual reality.
  • Some examples of the disclosure provide for an enhanced VR experience via interaction with a touch panel.
  • Some examples of the disclosure provide for capturing input from a user's interaction with a VR environment with a high accuracy.
  • Fig. 1 shows a touch-based virtual-reality (VR) interaction system according to examples of the disclosure
  • Fig. 2 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 3 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 4 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 5 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 6 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 7 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 8 shows a touch-based VR interaction system according to examples of the disclosure
  • Fig. 9 shows a touch-based VR interaction system according to examples of the disclosure.
  • Fig. 10 shows a VR environment in which a plurality of virtual representations of a touch sensitive apparatus are displayed, according to examples of the disclosure.
  • Fig. 11 is a flowchart of a method in a touch-based VR interaction system according to examples of the disclosure.
  • Fig. 1 is a schematic illustration of a touch-based virtual-reality (VR) interaction system 100 comprising a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space.
  • the touch sensitive apparatus 101 may be configured to receive input using e.g. one or more fingers, a pointer or a stylus on a touch panel 101' of the touch sensitive apparatus 101.
  • the VR output device 102 may be configured to be wearable by the user, and may thus comprise a VR headset.
  • the VR output device 102 presents a virtual space to the user, as well as a virtual representation of the touch input, when the user provides touch input to the touch sensitive apparatus 101.
  • a virtual representation of the user, such as one or more fingers, and/or a pointer or stylus may be presented in the VR output device 102 to facilitate orientation in the virtual space.
  • the touch-based VR interaction system 100 comprises a positioning unit 103 configured to provide spatial position information of the position of the touch sensitive apparatus 101 relative to the user, and a processing unit 104 configured to map the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system.
  • the processing unit 104 is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus 101 to the VR output device 102 so that the touch sensitive apparatus 101 is displayed within the virtual space together with the virtual representation of the touch input. The VR user may thus reliably interact with a high-precision touch sensitive apparatus 101 in physical reality whilst viewing the interaction in VR.
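  • A minimal numerical sketch of this mapping, assuming the apparatus pose is available as a 4x4 homogeneous transform and that touches are reported in the plane of the panel; all names and values are illustrative assumptions, not details from the publication.

```python
import numpy as np


def touch_to_vr(touch_xy, apparatus_to_vr):
    """Map a 2D touch on the panel surface to a 3D point in VR environment coordinates.

    touch_xy        -- (x, y) touch position on the panel surface, in metres
    apparatus_to_vr -- 4x4 homogeneous transform from the panel frame to the VR frame
    """
    x, y = touch_xy
    p_panel = np.array([x, y, 0.0, 1.0])   # the panel surface is the z = 0 plane
    p_vr = apparatus_to_vr @ p_panel       # rigid transform into the VR frame
    return p_vr[:3]


# Example: a panel whose origin sits one metre in front of the VR origin.
T = np.eye(4)
T[:3, 3] = [0.0, 0.0, -1.0]
print(touch_to_vr((0.12, 0.08), T))        # -> [ 0.12  0.08 -1.  ]
```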
  • Various input from the user's interaction with a VR environment may thus be captured with an increased accuracy.
  • touch input of fine details of a component for a machine presented in the VR space may be captured with the increased accuracy and low latency of the touch sensitive apparatus 101, which otherwise would not be resolved by the typical spatial sensors of previous VR systems.
  • Mapping the position of the touch sensitive apparatus 101 to the VR environment further provides for an enhanced VR experience, combining the freedom of customizing different VR environments to the user's tasks with the tactile interaction provided by the touch sensitive apparatus 101.
  • the simultaneous interaction with the touch sensitive apparatus 101 allows for a more viable handling of user input from a VR environment, such as the communication of a user's input to various related systems and applications. A realistic and more practical utilization of VR may thus be provided, across a range of applications and technical fields.
  • There are numerous known techniques for providing touch sensitivity to the touch panel 101', e.g. by using cameras to capture light scattered off the point(s) of touch on the panel, by using cameras to directly observe the objects interacting with the panel, or by incorporating resistive wire grids, capacitive sensors, strain gauges, etc. into the panel.
  • a plurality of optical emitters and optical receivers are arranged around the periphery of the touch surface of the panel 101' to create a grid of intersecting light paths (otherwise known as detection lines) above the touch surface. Each light path extends between a respective emitter/receiver pair. An object that touches the touch surface will block or attenuate some of the light paths. Based on the identity of the receivers detecting a blocked light path, a processor can determine the location of the intercept between the blocked light paths.
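  • The geometric idea can be illustrated with a least-squares intersection of the attenuated detection lines, as in the sketch below for a single touch. This is only a toy reconstruction of the principle, not the actual signal processing used in such panels; all names are assumptions.

```python
import numpy as np


def estimate_touch(blocked_lines):
    """Least-squares intersection of attenuated detection lines.

    blocked_lines -- list of (emitter_xy, receiver_xy) pairs, each a 2D point.
    At least two non-parallel blocked lines are needed.
    Returns the point minimizing the summed squared distance to all lines.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for emitter, receiver in blocked_lines:
        e, r = np.asarray(emitter, float), np.asarray(receiver, float)
        d = (r - e) / np.linalg.norm(r - e)   # unit direction of the light path
        P = np.eye(2) - np.outer(d, d)        # projector onto the line's normal space
        A += P
        b += P @ e
    return np.linalg.solve(A, b)


# Two crossing detection lines on a 100 x 60 panel, touch near (40, 30).
lines = [((0, 30), (100, 30)), ((40, 0), (40, 60))]
print(estimate_touch(lines))  # -> [40. 30.]
```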
  • the touch-based VR interaction system 100 may comprise at least one spatial marker 105, 105', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 2.
  • the positioning unit 103 may be configured to track the at least one spatial marker 105, 105', to determine an associated position of the touch sensitive apparatus 101 relative to the user.
  • the at least one spatial marker 105, 105' may comprise IR markers such as IR light sources, or any other marker configured for allowing tracking by the positioning unit 103, such as markers of different shapes and configurations being physically provided on parts of the touch sensitive apparatus 101 and/or displayed on the touch panel 101' thereof. Accurate mapping of the obtained spatial position information to the VR environment coordinate system may then be provided.
  • Fig. 2 illustrates first and second spatial markers 105, 105', but it is conceivable that the number of spatial markers may be varied to provide for an optimized position detection.
  • the touch-based VR interaction system 100 may comprise an image sensor device 106 configured to be wearable by the user, as schematically illustrated in Figs. 3 - 8.
  • the image sensor device 106 may be configured to capture image data 107, 107', 107", 107"', associated with the position of the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data, such as by a triangulation process of the obtained image data. Since the image sensor device 106 may be arranged at the position of the user, i.e. by being wearable, the relative position between the user and the touch sensitive apparatus 101 may be accurately determined. This provides for accurately determining the VR environment coordinates of the touch sensitive apparatus 101.
  • the touch-based VR interaction system 100 thus enables high-resolution input and more complex tasks to be carried out by the user in the VR space.
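  • By way of example, such a relative pose could be recovered from captured marker image data with a standard perspective-n-point solve, as sketched below using OpenCV; the marker layout, camera intrinsics and function names are assumptions made for the illustration, not details from the publication.

```python
import numpy as np
import cv2


def apparatus_pose_from_markers(marker_points_3d, marker_pixels_2d, camera_matrix):
    """Estimate the pose of the touch apparatus in the head-worn camera's frame.

    marker_points_3d -- Nx3 marker coordinates in the apparatus frame (metres)
    marker_pixels_2d -- Nx2 detected marker centres in the image (pixels)
    At least four markers are needed for the default iterative solver.
    """
    obj = np.asarray(marker_points_3d, dtype=np.float64)
    img = np.asarray(marker_pixels_2d, dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, camera_matrix, np.zeros(5))
    if not ok:
        raise RuntimeError("PnP solve failed")
    R, _ = cv2.Rodrigues(rvec)              # rotation: apparatus frame -> camera frame
    pose = np.eye(4)
    pose[:3, :3], pose[:3, 3] = R, tvec.ravel()
    return pose                             # 4x4 pose of the apparatus in the camera frame
```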
  • the image sensor device 106 may be configured to capture image data 107 of the at least one spatial marker 105, 105', and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data.
  • Fig. 3 illustrates an example where the image sensor device 106 locates the position of the touch sensitive apparatus 101 based on spatial markers 105, 105'.
  • the processing unit 104 may then accurately map the retrieved spatial position information to the VR environment coordinate system.
  • the image sensor device 106 may be configured to capture image data 107' displayed by the touch sensitive apparatus 101 and communicate the image data to the positioning unit 103, as schematically illustrated in Fig. 4.
  • the image data 107' displayed by the touch sensitive apparatus 101 may comprise objects of varying shapes and configurations that allow for a calibration of the position of the touch sensitive apparatus 101 in the VR environment coordinate system. A flexible and highly optimizable calibration may thus be provided, since the displayed image data may be varied for different conditions and applications.
  • the touch sensitive apparatus 101 may be configured to display image data comprising at least one orientation tag 107", as schematically illustrated in Fig. 5.
  • the positioning unit 103 may be configured to track the position of the at least one orientation tag 107" to determine an associated position of the touch sensitive apparatus 101 relative to the user.
  • the number of tags 107" displayed and the configurations thereof may vary to provide for a precise positioning procedure and a VR environment which is accurately anchored to the physical reality, i.e. the touch sensitive apparatus 101.
  • the touch sensitive apparatus 101 may be configured to display a calibration image 108 at (or at a defined distance to) the position of a user input device 109 on the touch sensitive apparatus 101 when the touch sensitive apparatus receives touch input from the user input device 109, as schematically illustrated in Fig. 6.
  • the image sensor device 106 may be configured to capture image data comprising the calibration image 108 and the user input device 109 and/or the user 111.
  • the positioning unit 103 may be configured to determine an orientation of the user input device 109 and/or the user 111 (such as one or more fingers, a hand or a lower arm of the user) relative to the touch sensitive apparatus 101 based on a projected image 110 of the user input device 109 and/or the user 111 on the calibration image 108.
  • the positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111.
  • Such spatial position information is then mapped to the VR environment coordinate system as described, which provides for a facilitated interaction with the touch sensitive apparatus 101, e.g. by displaying a virtual representation of the user input device 109 and/or the user 111 in the VR space.
  • This also allows for providing sufficient information to allow effective palm rejection, e.g. by identifying a stylus tip from the projected image and ignoring all other touches around that stylus tip position.
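  • A simplified sketch of such a palm-rejection rule is shown below: once a stylus tip has been identified from the projected image, contacts close to (but not at) the tip are treated as palm contacts and ignored. The thresholds and names are assumptions made for the example.

```python
import math


def reject_palm(touches, stylus_tip, radius=80.0):
    """Keep the stylus touch and drop other contacts near it (likely the palm).

    touches    -- list of (x, y) touch positions reported by the panel (mm)
    stylus_tip -- (x, y) tip position identified from the projected image (mm)
    radius     -- rejection radius around the tip, in millimetres
    """
    kept = []
    for x, y in touches:
        d = math.hypot(x - stylus_tip[0], y - stylus_tip[1])
        if d < 5.0:              # this contact *is* the stylus tip
            kept.append((x, y))
        elif d > radius:         # far enough away to be an intentional touch
            kept.append((x, y))
        # contacts inside the radius but away from the tip are treated as palm
    return kept


touches = [(100, 100), (140, 95), (160, 120)]       # tip plus two palm contacts
print(reject_palm(touches, stylus_tip=(100, 100)))  # -> [(100, 100)]
```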
  • the calibration image 108 is advantageously displayed around the user input device 109 and/or the hand of the user 111.
  • the touch sensitive apparatus 101 may be configured to display the calibration image 108 tracking the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101.
  • the calibration image 108 may thus follow the position of the user input device 109, and/or the user 111, on the touch sensitive apparatus 101, which may improve the detection of the above-mentioned spatial position information.
  • the touch-based VR interaction system 100 may comprise a light emitter 116 arranged at a determined spatial position relative to the touch sensitive apparatus 101, as schematically illustrated in Fig. 8.
  • the image sensor device 106 may be configured to capture image data 107"' of light emitted by the light emitter and communicate the image data to the positioning unit 103, which is configured to determine the position of the touch sensitive apparatus 101 relative to the user based on the captured image data.
  • the light may be IR light or light of any other wavelength suitable for detection by the image sensor device 106.
  • the image sensor device 106 may be arranged at the VR output device 102, as schematically illustrated in Figs. 3 - 8. It is conceivable, however, that the image sensor device 106 may be displaced from the VR output device 102 but arranged at a predetermined distance from the touch sensitive apparatus 101 and in communication with the positioning unit 103, so that the image data 107 - 107"' may be received by the positioning unit 103.
  • the touch-based VR interaction system 100 may comprise a second image sensor device 113, 113', arranged on the touch sensitive apparatus 101, as schematically illustrated in Fig. 7.
  • the second image sensor device 113, 113' may be configured to capture image data of the user 111 and/or a user input device 109 and communicate the image data to the positioning unit 103, which is configured to determine an orientation of the user 111 and/or a user input device 109 relative to the touch sensitive apparatus 101 based on the captured image data.
  • the second image sensor device 113, 113' may comprise depth cameras for accurately determining the spatial positioning information. Inertia sensors may also track the movement of the user input device 109 for defined periods of time, such as the time between letters when writing a word.
  • the positioning unit 103 may determine the orientation, position, or dynamics of the movement, such as the speed or acceleration, of the user input device 109 and/or the user 111 from the image data.
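  • Speed and acceleration of a tracked input device can, for instance, be approximated from successive position samples by finite differences, as in the hedged sketch below (illustrative names; positions in metres, timestamps in seconds).

```python
import numpy as np


def motion_dynamics(positions, timestamps):
    """Estimate speed and acceleration of a tracked input device.

    positions  -- Nx3 array of sampled positions (metres)
    timestamps -- N monotonically increasing sample times (seconds)
    Returns (speeds, accelerations) with lengths N-1 and N-2.
    """
    p = np.asarray(positions, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    dt = np.diff(t)
    velocities = np.diff(p, axis=0) / dt[:, None]      # finite-difference velocity
    speeds = np.linalg.norm(velocities, axis=1)
    accelerations = np.diff(velocities, axis=0) / dt[1:, None]
    return speeds, accelerations
```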
  • the processing unit 104 may subsequently map such spatial position information to the VR environment coordinate system as described above for providing a precise representation of the user input device 109 and/or the user 111 in the VR space.
  • the accuracy of the virtual representation of the touch input in the VR environment coordinate system may thus be improved so that the user may experience a more direct connection between physical movements of e.g. the input device 109 and the resulting virtual representation, which is critical for fine touch input gestures, e.g. in high-resolution tasks.
  • Such improved VR representation and tracking of the user input device 109 and/or the user 111 further enhances the overall VR experience.
  • the processing unit 104 may thus be configured to map spatial position information associated with the determined orientation of the user 111 and/or a user input device 109 to the VR environment coordinate system, and the VR output device 102 may be configured to display the orientation of the user 111 and/or a user input device 109 in the virtual space.
  • the positioning unit 103 may be configured to determine a calibration position of a user input device 109 in the VR environment coordinate system when touching at least one physical coordinate 112 on the touch sensitive apparatus 101 (i.e. on the touch panel 101' thereof).
  • the processing unit 104 may be configured to map the position of the at least one physical coordinate to the VR environment coordinate system by registering the at least one physical coordinate to the calibration position when detecting the touch of the at least one physical coordinate 112.
  • the user may calibrate the position of the touch sensitive apparatus 101 in the VR space with a few touches on the touch panel 101'.
  • Each touch with the user input device 109 connects the respective physical coordinate at the touch site of the touch panel 101' with the coordinate of the user input device 109 in the VR environment coordinate system at the same point in time.
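  • One generic way of turning a handful of such touch/VR coordinate pairs into a panel-to-VR transform is a rigid Kabsch/Procrustes fit, sketched below under the assumption of at least three non-collinear correspondences. This is an illustration, not the procedure prescribed by the publication, and all names are assumptions.

```python
import numpy as np


def fit_panel_to_vr(panel_pts, vr_pts):
    """Rigid transform (rotation + translation) mapping panel points to VR points.

    panel_pts -- Nx3 physical touch coordinates on the panel (z = 0 on the surface)
    vr_pts    -- Nx3 matching calibration positions of the input device in VR
    Requires N >= 3 non-collinear correspondences.
    """
    P = np.asarray(panel_pts, float)
    Q = np.asarray(vr_pts, float)
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = cq - R @ cp
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T                                  # 4x4 panel -> VR transform
```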
  • the VR output device 102 may be configured to display the touch sensitive apparatus as a plurality of virtual representations 114 thereof in the virtual space, as schematically illustrated in Fig. 10.
  • the processing unit 104 may be configured to associate at least a second virtual representation 115 of the plurality of virtual representations 114 with a second set of VR environment coordinates in response to a user input, so that the VR output device 102 displays the second virtual representation 115 as being separated within the virtual space from a virtual representation 115' of the touch sensitive apparatus receiving touch input.
  • the VR output device 102 displays a presentation session in the VR space in one application, in which a plurality of virtual representations 114 of a touch sensitive apparatus is displayed to a user 111 or a plurality of users.
  • a user 111 may interact with a first virtual representation 115' of the touch sensitive apparatus.
  • the user may subsequently provide a dedicated touch input, such as a swipe gesture, to shift the first virtual representation 115' to a different location in the VR space (e.g. as denoted by reference 115 in Fig. 10) and continue interaction with another virtual representation of the touch sensitive apparatus in the VR space, but with the same physical touch sensitive apparatus 101.
  • a plurality of virtual representations 114 may be arranged in the VR space for viewing and further interaction by the participating VR users.
  • a user may then 'activate' any of the virtual representations 114 for touch input, by again anchoring a virtual representation 115 to the VR coordinates represented by the touch sensitive apparatus 101.
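  • The bookkeeping implied by these examples could look roughly like the sketch below, in which several virtual boards share one physical panel and only the board anchored to the panel receives live touch input; all class and method names are invented for the illustration.

```python
from dataclasses import dataclass, field

import numpy as np


@dataclass
class VirtualBoard:
    pose_in_vr: np.ndarray                     # 4x4 pose of this representation
    content: list = field(default_factory=list)


class BoardManager:
    """Several virtual representations of one physical touch apparatus."""

    def __init__(self, physical_pose_in_vr: np.ndarray):
        self.physical_pose = physical_pose_in_vr
        self.boards = [VirtualBoard(physical_pose_in_vr.copy())]
        self.active = self.boards[0]            # anchored to the real panel

    def park_active(self, new_pose_in_vr: np.ndarray) -> VirtualBoard:
        """Shift the active board elsewhere in VR (e.g. after a swipe gesture)."""
        self.active.pose_in_vr = new_pose_in_vr
        fresh = VirtualBoard(self.physical_pose.copy())
        self.boards.append(fresh)
        self.active = fresh
        return fresh

    def activate(self, board: VirtualBoard) -> None:
        """Re-anchor a parked board to the physical panel for touch input."""
        board.pose_in_vr = self.physical_pose.copy()
        self.active = board

    def on_touch(self, touch) -> None:
        self.active.content.append(touch)       # only the anchored board gets input
```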
  • the virtual representation 115' aligned with the physical touch sensitive apparatus 101 may be highlighted e.g. with a different color in the VR space to facilitate user orientation.
  • the touch-based VR interaction system 100 thus provides for a highly dynamic interaction with the freedom to utilize the VR space while ensuring that all of the user's input is structured and retained, with high resolution and accuracy. It is conceivable that several touch sensitive apparatuses 101 are connected over a communication network, where the touch-based VR interaction system 100 incorporates the touch sensitive apparatuses 101 so that simultaneous input to the plurality of touch panels 101' can be provided and mapped to the VR space for simultaneous interaction and viewing by a plurality of users in a network.
  • Fig. 11 illustrates a flow chart of a method 200 in a touch-based VR interaction system. The order in which the steps of the method 200 are described and illustrated should not be construed as limiting, and it is conceivable that the steps may be carried out in a different order.
  • the touch-based VR interaction system 100 has a touch sensitive apparatus 101 configured to receive touch input from a user, and a VR output device 102 configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space.
  • the method 200 comprises providing 201 spatial information of the position of the touch sensitive apparatus 101 relative to the user, and mapping 202 the spatial position information of the touch sensitive apparatus 101 to the VR environment coordinate system.
  • the method 200 comprises communicating 203 a set of VR environment coordinates of the touch sensitive apparatus to the VR output device 102 so that the touch sensitive apparatus 101 is displayed 204 within the virtual space together with the virtual representation of the touch input.
  • the method 200 thus provides for the advantageous benefits as described above in relation to the system 100 and Figs. 1 - 10.
  • a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual-reality (VR) touch interaction system is disclosed. The VR interaction system comprises a touch sensitive apparatus configured to receive touch input from a user, a VR output device configured to display a position of the user and a virtual representation of the touch input in a VR environment coordinate system within a virtual space, a positioning unit configured to provide spatial position information of the position of the touch sensitive apparatus relative to the user, and a processing unit configured to map the spatial position information of the touch sensitive apparatus to the VR environment coordinate system. The processing unit is configured to communicate a set of VR environment coordinates of the touch sensitive apparatus to the VR output device so that the touch sensitive apparatus is displayed within the virtual space together with the virtual representation of the touch input. A related method is also disclosed.
PCT/SE2018/050781 2017-08-07 2018-07-31 Virtual-reality touch interaction system WO2019032014A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE1730206-8 2017-08-07
SE1730206 2017-08-07

Publications (1)

Publication Number Publication Date
WO2019032014A1 (fr) 2019-02-14

Family

ID=65270988

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2018/050781 WO2019032014A1 (fr) 2017-08-07 2018-07-31 Virtual-reality touch interaction system

Country Status (1)

Country Link
WO (1) WO2019032014A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023234822A1 (fr) * 2022-05-31 2023-12-07 Flatfrog Laboratories Ab Extended reality interaction system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030227470A1 (en) * 2002-06-06 2003-12-11 Yakup Genc System and method for measuring the registration accuracy of an augmented reality system
US20120206323A1 (en) * 2010-02-28 2012-08-16 Osterhout Group, Inc. Ar glasses with event and sensor triggered ar eyepiece interface to external devices
US20130265393A1 (en) * 2006-08-10 2013-10-10 Canon Kabushiki Kaisha Image capture environment calibration method and information processing apparatus
EP2981079A1 (fr) * 2013-03-28 2016-02-03 Sony Corporation Image processing device and method, and program
US20160093105A1 (en) * 2014-09-30 2016-03-31 Sony Computer Entertainment Inc. Display of text information on a head-mounted display
US20170061700A1 (en) * 2015-02-13 2017-03-02 Julian Michael Urbach Intercommunication between a head mounted display and a real world object
US20170199580A1 (en) * 2012-10-17 2017-07-13 Microsoft Technology Licensing, Llc Grasping virtual objects in augmented reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18843741
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

122 Ep: pct application non-entry in european phase
    Ref document number: 18843741
    Country of ref document: EP
    Kind code of ref document: A1