WO2023234823A1 - A virtual-reality interaction system with haptic feedback - Google Patents

A virtual-reality interaction system with haptic feedback

Info

Publication number
WO2023234823A1
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
user
interaction
controller
motion
Application number
PCT/SE2023/050457
Other languages
French (fr)
Inventor
Mattias KRUS
Original Assignee
Flatfrog Laboratories Ab
Application filed by Flatfrog Laboratories Ab filed Critical Flatfrog Laboratories Ab
Publication of WO2023234823A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/016: Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual-reality (VR) interaction system is disclosed comprising a VR controller configured to generate a virtual space within a VR environment coordinate system to be displayed in a wearable display device for a user. The VR controller determines VR object coordinates of virtual objects in relation to a virtual user location in the virtual space, detects a probability for an interaction event between the user and a virtual object, and sends an interaction rule to a feedback controller as the probability satisfies a defined probability threshold. The feedback controller is configured to compare the interaction rule, which comprises a motion criterion for tactile output, with user motion data received from a sensor, and to send tactile output instructions to a haptic feedback device when the motion data satisfies the motion criterion. A related method is disclosed.

Description

A VIRTUAL-REALITY INTERACTION SYSTEM WITH HAPTIC FEEDBACK
Technical Field
The present invention relates generally to the field of virtual-reality (VR) interaction systems. More particularly, the present invention relates to a VR interaction system to send tactile output instructions to a haptic feedback device and a related method.
Background
To an increasing extent, virtual-reality (VR) interaction systems are being used as an interaction tool in a wide range of business and recreational applications. Virtual reality presents the user with an environment partially, if not fully, disconnected from the actual physical environment of the user. Augmented reality (AR) allows the overlay of virtual objects in the physical environment. The user interacts with the VR environment in various ways, such as with IR-tracked gloves or gyroscope-/accelerometer-tracked objects. Haptic output may be used to generate a tactile sensation for the user in order to increase the sense of involvement and feedback from the interaction. VR interaction systems are, however, typically associated with high latency and sub-optimal accuracy in the user feedback, while the key to an immersive experience is low latency between the interaction and the feedback. The limitations of a high-latency interaction hinder the potential of VR as an interaction tool and make it tiring to the user over time.
Summary
It is an objective of the invention to at least partly overcome one or more of the above-identified limitations of the prior art.
One objective is to provide a VR interaction system with low-latency user feedback. One or more of these objectives, and other objectives that may appear from the description below, are at least partly achieved by means of a VR interaction system and a related method according to the independent claims, embodiments thereof being defined by the dependent claims.
According to a first aspect a virtual-reality (VR) interaction system is provided comprising a VR controller configured to generate a virtual space within a VR environment coordinate system to be displayed in a wearable display device for a user, wherein the VR controller determines VR object coordinates of virtual objects in relation to a virtual user location in the virtual space, a sensor configured to detect user motion in a room having room coordinates, a feedback controller in communication with the sensor and a haptic feedback device to generate a tactile output, wherein the sensor is configured to communicate user motion data to the feedback controller, wherein the VR controller is configured to detect a probability for an interaction event, between the user and a virtual object, based on a position and/or movement of the virtual object in the virtual space and the virtual user location, access an interaction rule for the interaction event as the probability satisfies a defined probability threshold, and send the interaction rule to the feedback controller, wherein the feedback controller is configured to receive the interaction rule and compare the interaction rule with the user motion data received from the sensor, wherein the interaction rule comprises a motion criterion for tactile output, and send tactile output instructions to the haptic feedback device when the motion data satisfies the motion criterion.
According to a second aspect a method in a virtual-reality (VR) interaction system is provided comprising generating a virtual space within a VR environment coordinate system in a VR controller to be displayed in a wearable display device for a user, the VR controller determining VR object coordinates of virtual objects in relation to a virtual user location in the virtual space, communicating user motion data in a room having room coordinates to a feedback controller being in communication with a haptic feedback device to generate a tactile output, detecting a probability for an interaction event, between the user and a virtual object, based on a position and/or movement of the virtual object in the virtual space and the virtual user location, accessing an interaction rule for the interaction event as the probability satisfies a defined probability threshold, sending the interaction rule to the feedback controller, receiving the interaction rule at the feedback controller to compare the interaction rule with the received user motion data, wherein the interaction rule comprises a motion criterion for tactile output, and sending tactile output instructions to the haptic feedback device when the motion data satisfies the motion criterion.
According to a third aspect a computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to the second aspect.
Further examples of the invention are defined in the dependent claims, wherein features for the first aspect may be implemented for the second and subsequent aspects, and vice versa.
Some examples of the disclosure provide for a VR interaction system with low-latency user feedback.
Some examples of the disclosure provide for a VR interaction system with user feedback of high precision.
Some examples of the disclosure provide for a high accuracy in the user interaction with a VR environment.
Some examples of the disclosure provide for a VR interaction system with an enhanced VR experience.
It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
Brief Description of the Drawings
These and other aspects, features and advantages of which examples of the invention are capable of will be apparent and elucidated from the following description of examples of the present invention, reference being made to the accompanying schematic drawings, in which:
Fig. 1 shows a VR interaction system according to an example of the disclosure;
Figs. 2a-b show images of a virtual object and a user in a virtual space;
Fig. 2c shows a VR interaction system according to an example of the disclosure;
Fig. 2d shows a VR interaction system according to an example of the disclosure;
Fig. 3 shows a VR interaction system according to an example of the disclosure; and
Fig. 4 is a flowchart of a method in a VR interaction system according to an example of the disclosure.
Detailed Description
Specific examples of the invention will now be described with reference to the accompanying drawings. This invention may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these examples are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. The terminology used in the detailed description of the examples illustrated in the accompanying drawings is not intended to be limiting of the invention. In the drawings, like numbers refer to like elements.
Fig. 1 is a schematic illustration of a virtual-reality (VR) interaction system 100 comprising a VR controller 101 configured to generate a virtual space within a VR environment coordinate system (vx, vy, vz), as further schematically depicted in Fig. 1 within the circular dashed lines. The virtual space is to be displayed in a wearable display device 102, such as a VR headset, for a user 107. The VR controller 101 is configured to determine VR object coordinates (ovx, ovy, ovz) of virtual objects 103 in relation to a virtual user location (uvx, uvy, uvz) in the virtual space, as further exemplified in Figs. 2a-c. The VR object coordinates and the virtual user location are below denoted ov and uv, respectively, for brevity. The virtual user location uv may comprise a set of coordinates in the VR environment coordinate system (vx, vy, vz) that defines the user in three dimensions (3D). The VR controller 101 may thus be configured to generate a 3D model 108 of the user 107 in the virtual space, as exemplified in Fig. 2a, showing a 3D model 108 of the user’s hand in relation to a virtual object 103.
The VR interaction system 100 comprises a sensor 104 configured to detect user motion in a room having room coordinates (x,y,z). The VR interaction system 100 further comprises a feedback controller 105 in communication with the sensor 104 and a haptic feedback device 106. The sensor 104 is configured to communicate user motion data to the feedback controller 105. The haptic feedback device 106 is configured to generate a tactile output for the user 107, such as a vibration. The haptic feedback device 106 may be arranged in any object which is in contact with the user 107, such as in a VR glove 109 or other hand controllers, wrist band 110, or a pen (not shown).
The VR controller 101 is configured to detect a probability for an interaction event, between the user 107 and a virtual object 103, based on a position and/or movement of the virtual object 103 in the virtual space and the virtual user location uv. Fig. 2a shows an example of a virtual object 103, such as a 3D object defined by a set of VR object coordinates (ovx, ovy, ovz) in the virtual space. The VR object coordinates ov may vary over time so that the virtual object 103 has a trajectory with a velocity and acceleration in the virtual space. Likewise, the virtual user location uv varies over time with the user’s motion so that the 3D model 108 moves in the virtual space in relation to the virtual object 103, as indicated with the arrow in Fig. 2a. Fig. 2b indicates a separation between a virtual object 103 and the 3D model 108 as a distance (l), such as a vector (l) defined by a set of coordinates (lvx, lvy, lvz). The distance (l) thus varies with the relative movement between the user’s 3D model 108 and the virtual object 103. As the user 107 moves through the virtual space to interact with different virtual objects 103, i.e. to engage in an interaction event, the user 107 intuitively moves closer to a selected virtual object 103, e.g. by reaching out the hand. The user 107 may wish to interact with the virtual object 103 in different ways, such as manipulating or touching it. The associated 3D model 108 of the hand thus has a variable distance (l) to the selected virtual object 103, which typically decreases as the user reaches out to it. The 3D model 108 may move in a sporadic, non-linear fashion, typical of a hand movement, e.g. as the user 107 spontaneously taps some of the fingers on a surface of the virtual object 103, or quickly corrects a finger movement when interacting with a set of control elements of a GUI in the virtual space. The probability for an interaction event may accordingly depend on the relative movement between the virtual object 103 and the 3D model 108, such as the distance (l), speed, and/or acceleration of the trajectories associated with the virtual object 103 and the 3D model 108.
The VR controller 101 is configured to access an interaction rule (R) for the interaction event as the probability for the interaction event satisfies a defined probability threshold. As elucidated above, the probability threshold may depend on the distance (l), the speed, and/or the acceleration. E.g. reducing the distance (l) with a greater acceleration can be indicative of a more confident choice of an interaction, as opposed to a slow movement, which can be indicative of a hesitating user 107, in which case the probability for an interaction event is lower than in the former case. The probability threshold may thus be dependent on the motion data. Fig. 2c shows an example where the VR controller 101 determines the probability threshold for an interaction event as satisfied when the distance (l) equals a defined interaction distance denoted lR, i.e. l = lR, in the VR environment coordinate system (vx, vy, vz). The probability may also take into account the frequency by which the VR controller 101 updates the VR object coordinates ov and the user location uv in the virtual space, and the latency by which the virtual space is updated. For example, the VR controller 101 may determine that an interaction event may occur within the latency period, e.g. within 50-100 ms, and in such case determine that the probability threshold is satisfied. The interaction rule (R) comprises criteria for tactile feedback to the user 107, as described in the following.
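As a minimal illustrative sketch (our own reading, not part of the disclosure), the probability detection described above can be approximated as a time-to-contact test: the threshold is treated as satisfied when the separation (l) has closed to an assumed interaction distance lR, or is predicted to close to it within the VR controller's latency window. All function names and numeric values below are assumptions.

```python
import numpy as np

# Assumed constants: a 75 ms latency window (within the 50-100 ms range quoted
# above) and an interaction distance l_R of 2 cm in virtual-space units.
UPDATE_LATENCY_S = 0.075
INTERACTION_DISTANCE_LR = 0.02

def interaction_probable(u_v, o_v, u_v_vel, o_v_vel) -> bool:
    """Treat the probability threshold as satisfied if the 3D model is at, or
    predicted to reach, the interaction distance l_R within the latency window.

    u_v, o_v: user-model and object coordinates in (vx, vy, vz).
    u_v_vel, o_v_vel: their velocities in the same coordinate system.
    """
    l = np.asarray(o_v, float) - np.asarray(u_v, float)  # separation vector (l)
    distance = float(np.linalg.norm(l))
    if distance <= INTERACTION_DISTANCE_LR:              # l = l_R already
        return True
    # Closing speed along the separation direction; positive means approaching.
    closing = float(np.dot(np.asarray(u_v_vel, float) - np.asarray(o_v_vel, float),
                           l / distance))
    if closing <= 0.0:
        return False
    time_to_contact = (distance - INTERACTION_DISTANCE_LR) / closing
    return time_to_contact <= UPDATE_LATENCY_S

# Hand 5 cm from the object, closing at 1 m/s: contact in ~30 ms, so True.
print(interaction_probable((0, 0, 0), (0.05, 0, 0), (1, 0, 0), (0, 0, 0)))
```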
The VR controller 101 is configured to send the interaction rule (R) to the feedback controller 105. The feedback controller 105 may be part of a local sensor loop 111 comprising the sensor 104 and the haptic feedback device 106. Such local sensor loop 111 may run at a higher frequency than the frequency by which the VR controller 101 updates the virtual space. For example, local sensor loop 111 may run at 100-1000 Hz, while the virtual space is updated at 30-60 Hz.
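A sketch of the two update rates under an assumed threaded layout; the 500 Hz and 60 Hz figures are picked from the ranges quoted above, and the loop bodies are placeholders for the actual sensing, rule evaluation and rendering work.

```python
import threading
import time

def local_sensor_loop(stop: threading.Event, hz: float = 500.0):
    period = 1.0 / hz
    while not stop.is_set():
        # Read the sensor, evaluate loaded interaction rules, fire haptics.
        time.sleep(period)

def vr_update_loop(stop: threading.Event, hz: float = 60.0):
    period = 1.0 / hz
    while not stop.is_set():
        # Update the virtual space, detect probable interactions, push rules.
        time.sleep(period)

stop = threading.Event()
threading.Thread(target=local_sensor_loop, args=(stop,), daemon=True).start()
threading.Thread(target=vr_update_loop, args=(stop,), daemon=True).start()
time.sleep(0.1)   # let both loops run briefly for demonstration
stop.set()
```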
The sensor 104 communicates user motion data to the feedback controller 105, where the motion data may comprise spatial position information of the user 107 in the room, having room coordinates (x, y, z). Further, the motion data may comprise a velocity of the user 107 and/or an acceleration of the user 107 in the room.
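One possible shape (our assumption, not specified in the disclosure) for a motion sample streamed from the sensor 104 to the feedback controller 105 at the local loop rate:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class MotionSample:
    t: float                                  # timestamp, seconds
    position: Tuple[float, float, float]      # room coordinates (x, y, z)
    velocity: Tuple[float, float, float]      # optional per the text above
    acceleration: Tuple[float, float, float]  # e.g. from a wearable accelerometer
```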
The sensor 104 may be worn by the user 107, such as being provided in a hand controller or VR glove 109, as indicated in the example of Figs. 2c-d. The sensor 104 may thus be a wearable sensor, and may comprise an accelerometer to detect user motion. The sensor 104 may in such case be directly connected to the feedback controller 105 and the haptic feedback device 106. The sensor 104 may alternatively be arranged as a peripheral sensor with a viewpoint towards the user 107, as indicated in the example of Fig. 3. The sensor 104 may communicate wirelessly with the feedback controller 105 in such case. The sensor 104 may be arranged in the wearable display device 102. The sensor 104 may comprise an image sensor configured to capture image data of the user 107, where the motion data is determined based on the captured image data, such as by a triangulation process of the obtained image data. The sensor 104 may comprise an IR sensor. The sensor 104 may comprise a line-of-sight sensor.
In one example, as the probability for the interaction event satisfies the defined probability threshold, the VR controller 101 may communicate the interaction rule (R) to the feedback controller 105, as schematically indicated in Fig. 2c. The feedback controller 105 is configured to receive the interaction rule (R) and compare the interaction rule with the user motion data received from the sensor 104.
The interaction rule (R) comprises a motion criterion for tactile output to the user 107. The feedback controller 105 is configured to send tactile output instructions to the haptic feedback device 106 when the motion data satisfies the motion criterion. Fig. 2d shows an example where the sensor 104 detects user motion from a first set of coordinates (x0, y0, z0) to a second set of coordinates (x’, y’, z’). The associated distance (d) by which the user 107 moves may be determined as satisfying the received motion criterion for generating tactile output to the user 107. E.g. the user 107 may sense a vibration when moving a finger a distance (d), as schematically indicated in Fig. 2d. At this point the VR controller 101 has already determined that an interaction event is probable and satisfies the probability threshold, while it is the feedback controller 105 which assesses whether haptic feedback should be generated, based directly on sensing the user’s physical movement with the sensor 104. This provides for a tactile sensation which is directly coupled to the user’s movement in the room, such as the distance (d) in the example in Fig. 2d, without having to rely on a trigger for haptic output based on exact tracking and positioning of the 3D model 108 and virtual objects 103 in the virtual space. The latency of the tactile feedback to the user 107 can thus be reduced and the user 107 is able to get more precise feedback on the movement.
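A hypothetical sketch of the feedback controller's side of this exchange: on receiving a rule carrying a distance threshold (d), it latches the onset coordinate (x0, y0, z0) and fires the haptic device as soon as the sensed movement away from it reaches (d). The class layout and the vibrate() call on the device are assumptions, not the patent's API.

```python
import math

class FeedbackController:
    def __init__(self, haptic_device, distance_threshold_m: float):
        self.haptic = haptic_device   # assumed to expose a vibrate() method
        self.d = distance_threshold_m
        self.onset = None             # (x0, y0, z0), set when a rule arrives
        self.fired = False

    def receive_rule(self, current_position):
        """Latch the user's position at the moment the rule is received."""
        self.onset = current_position
        self.fired = False

    def on_motion_sample(self, position):
        """Called from the local sensor loop, e.g. at 100-1000 Hz."""
        if self.onset is None or self.fired:
            return
        if math.dist(self.onset, position) >= self.d:  # criterion satisfied
            self.haptic.vibrate()                      # tactile output
            self.fired = True
```

Because the check involves only a distance against a latched point, it runs comfortably at the local loop rate without consulting the VR controller.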
For example, the user 107 and the associated 3D model 108 may approach a virtual keyboard in the virtual space. The VR controller 101 sends an interaction rule (R) to the feedback controller 105 as the 3D model 108 of the user’s hand approaches the virtual keyboard within a distance (l) that satisfies the probability threshold for an interaction event. The user 107 may at this stage visualize the model 108 of the hand as hovering above the virtual keyboard, while the physical position of the hand in the room may correspond to the position (x0, y0, z0) of a finger of the VR glove 109 in Fig. 2c or the hand in Fig. 3. The virtual keyboard may be a virtual object 103 in full VR or in augmented reality (AR). As the user 107 decides to press a particular virtual key and moves the aforementioned finger, the sensor 104 detects the movement and the feedback controller 105 sends instructions for tactile feedback as the movement satisfies the motion criterion, e.g. by the finger moving to (x’, y’, z’) in Figs. 2d and 3. The user 107 may move another finger slightly, e.g. by hesitating or correcting a movement, so that the moved distance does not satisfy the motion criterion, whereby the user 107 does not receive any tactile output. The user 107 will thus get tactile feedback without having to rely on the visual impression of the position of the model 108 in the virtual space. This gives a consistent feeling of an object. In previous solutions, if the user taps multiple times on a virtual flat table but can clearly feel that the haptic feedback is given at varying heights from the virtual surface, the feedback does not give a consistent world view.
While the VR controller 101 may generate a corresponding movement of the model 108 to provide a sense of control and involvement to the user 107, the VR controller 101 does not have to detect the precise timing of the model 108 intersecting, i.e. “touching”, the virtual object 103 to trigger the haptic feedback, since the tactile output instructions are based directly on the motion in the room as sensed by the sensor 104. The tactile output is thus controlled by the local sensor loop 111 as mentioned above, and could thus run at a significantly higher frequency than the VR controller 101 responsible for generating the virtual space, thus allowing for reducing latency even in highly complex VR environments. I.e. the feedback controller 105 allows for providing feedback to the user 107 without involving the VR controller 101, which may be seen as a model variant of the human “reflex action”, where instructions are sent to the muscles without the involvement of the brain (e.g. the ‘touching the hot plate’ scenario).
Hence, upon receiving the interaction rule (R), the feedback controller 105 is configured to autonomously send the tactile output instructions to the haptic feedback device 106, as the motion data satisfies the motion criterion, independently of the VR controller 101 generating the virtual space.
Having a local sensor loop 111 as described also provides for reducing the communication latency, in addition to the latency reduction from allowing a higher frequency. I.e. the user motion data is evaluated directly for the haptic feedback device 106, via the feedback controller 105, as opposed to previous solutions where the user motion data from a sensor is typically transmitted to a VR headset and then to a haptic feedback device. Communication latency may be particularly high in previous solutions where the motion data is sent wirelessly to such a VR headset, and then to a haptic feedback device. Having a feedback controller 105 in communication with the haptic feedback device 106, configured to receive the user motion data, compare it with the motion criterion, and send tactile output instructions in response, provides for reducing such communication latency.
The VR interaction system 100 thus provides low-latency user feedback that allows for a more natural user experience, even in complex VR applications.
The interaction rule (R) comprises the motion criterion for tactile output. As exemplified above, the motion criterion may comprise a distance threshold (d) for the user’s motion, and the tactile output instructions may be sent to the haptic feedback device 106 when the motion data satisfies the distance threshold (d). In another example, the motion criterion comprises a velocity threshold and/or an acceleration threshold for the user’s motion. I.e., the tactile output instructions may be sent to the haptic feedback device 106 when the motion data, detected by the sensor 104, satisfies the velocity threshold and/or the acceleration threshold. The motion criterion may be compared to motion data of movement from an onset coordinate (x0, y0, z0) corresponding to the user’s position when the probability threshold is satisfied and the interaction rule (R) is received. Thus, tactile output instructions may be sent to the haptic feedback device 106 depending on the user’s further movements from the onset coordinates (x0, y0, z0), as schematically shown in Figs. 2c-d and 3.
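The alternative criteria above could be folded into one rule object; a sketch under the assumption that any configured threshold being met triggers output (the disclosure leaves the combination logic open):

```python
import math
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InteractionRule:
    distance_threshold: Optional[float] = None      # (d), metres from onset
    velocity_threshold: Optional[float] = None      # m/s
    acceleration_threshold: Optional[float] = None  # m/s^2

def criterion_met(rule: InteractionRule,
                  onset: Tuple[float, float, float],
                  position: Tuple[float, float, float],
                  speed: float, accel: float) -> bool:
    # Distance from the latched onset coordinate (x0, y0, z0).
    if (rule.distance_threshold is not None
            and math.dist(onset, position) >= rule.distance_threshold):
        return True
    if rule.velocity_threshold is not None and speed >= rule.velocity_threshold:
        return True
    return (rule.acceleration_threshold is not None
            and accel >= rule.acceleration_threshold)
```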
The interaction rule (R) and the associated motion criterion for tactile output may depend on a type of the virtual object 103. For example, the VR controller 101 may send different interaction rules to the feedback controller 105 as the user 107 and associated model 108 approach different virtual objects 103 in the virtual space. E.g. a virtual object 103 such as a virtual keyboard may have an associated interaction rule with a different motion criterion than that of a continuously moving virtual object 103 in a game application or a simulated machine model. The latter case may take into account the speed and/or acceleration of the user 107, while a distance threshold (d) may be sufficient in the former case. The VR controller 101 may thus add and subtract interaction rules (R) in communication with the feedback controller 105 depending on the user’s navigation and engagement with different applications in the virtual space.
The interaction rule (R) may comprise a predefined interaction geometry against which the aforementioned local sensor loop 111 assesses the user motion data received from the sensor 104. For example, a predefined interaction geometry may comprise a virtual interaction plane, and a movement crossing such a plane may satisfy the motion criterion for sending tactile output instructions. The motion criterion may e.g. specify that a movement crossing an interaction plane, placed 12 mm in a direction (x,y,z) from the current location, should trigger tactile output. Other interaction geometries may be points, spheres, or other simple primitives, against which the user motion data may be evaluated via the local sensor loop 111. Having such geometries defined by the interaction rule (R) provides for a quicker evaluation of the collision/interaction event for haptic feedback by the local sensor loop 111, as opposed to evaluation against the full 3D world in the virtual space, which typically is a large triangle mesh that is too complex for such quick evaluation.
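The 12 mm plane example can be written as a signed-distance test; the vector formulation and names below are ours, not claim language:

```python
import numpy as np

def crosses_interaction_plane(onset, position, direction, offset_m=0.012):
    """True once `position` has passed the plane placed `offset_m` along
    `direction` from the onset point (the rule's interaction plane)."""
    n = np.asarray(direction, float)
    n /= np.linalg.norm(n)                    # unit normal of the plane
    plane_point = np.asarray(onset, float) + offset_m * n
    # Signed distance of the current position to the plane: non-negative
    # once the user has moved past it along the direction n.
    return float(np.dot(np.asarray(position, float) - plane_point, n)) >= 0.0

# A finger starting at the origin and pressing 15 mm along +z crosses the
# plane placed 12 mm away, so this prints True.
print(crosses_interaction_plane((0, 0, 0), (0, 0, 0.015), (0, 0, 1)))
```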
In one example the VR controller 101 may send the interaction rule (R) to the feedback controller 105 as soon as the probability threshold for an interaction event is satisfied. In other examples the VR controller 101 is configured to send the interaction rule (R) to the feedback controller 105 in dependence on further sub-criteria. For example, the VR controller 101 may prioritize different interaction rules (R) against each other. The user 107 may be engaged with several possible interaction events, having a hierarchy of importance. A higher-ranked interaction event may have a different probability threshold than a lower-ranked interaction event. Thus, even though a possible interaction is determined for the lower-ranked event, it may be overruled by the higher-ranked event. Further sub-criteria may relate to the capacity of the sensor 104 and the local sensor loop 111, e.g. given a certain memory and processing capacity. For example, if the memory and processor have the capacity to handle a defined number of interaction rules (R) at a desired target frequency, such as ten interaction rules (R), the VR controller 101 may be configured to have the ten most important interaction rules (R) loaded to the local sensor loop 111. E.g. if the VR controller 101 would need to load more interaction rules (R), it may remove the least prioritized one that has already been loaded, to allocate space for the new interaction rule (R). A further sub-criterion may relate to prediction accuracy. E.g. the VR controller 101 may know that the local sensor loop 111 has a certain prediction error. In one example, the sensor 104 may sense small movements more accurately than larger movements. The VR controller 101 may be configured to add/remove interaction rules (R) related to the known accuracy of the sensor 104. This may prevent triggering of an interaction rule (R) due to an inaccuracy in the sensing of the sensor 104.
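The capacity sub-criterion could look like the following sketch: at most a fixed number of rules stay loaded in the local loop, and a new rule evicts the least-prioritized resident only if it outranks it. The store layout and priority ordering are our assumptions.

```python
MAX_RULES = 10   # e.g. ten rules at the target loop frequency, per the text

class RuleStore:
    def __init__(self):
        self.rules = {}   # rule_id -> (priority, rule); higher outranks lower

    def load(self, rule_id, priority, rule) -> bool:
        """Load a rule, evicting the least-prioritized one if at capacity."""
        if rule_id not in self.rules and len(self.rules) >= MAX_RULES:
            victim = min(self.rules, key=lambda k: self.rules[k][0])
            if self.rules[victim][0] >= priority:
                return False          # new rule does not outrank any resident
            del self.rules[victim]
        self.rules[rule_id] = (priority, rule)
        return True
```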
The tactile output instructions sent to the haptic feedback device 106 may be determined based on a type of the virtual object 103. E.g. the interaction rule (R) may comprise information on the current virtual object 103 engaged by the user 107, which may be incorporated in the tactile output instructions. For example, a virtual keyboard may generate a different tactile output than a virtual stream of water. The tactile output instructions may comprise a defined sequence of tactile outputs for generation by the haptic feedback device 106.
The tactile output instructions may be determined based on the user motion data, detected by sensor 104. E.g. a fast acceleration may generate stronger tactile feedback.
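A sketch combining both ideas, with invented object types and (amplitude, duration) tuples standing in for whatever drive parameters the haptic feedback device 106 actually takes:

```python
TACTILE_PATTERNS = {
    "keyboard": [(1.0, 0.010)],      # one short, crisp click per keypress
    "water":    [(0.3, 0.050)] * 6,  # a train of soft pulses
}

def tactile_instructions(object_type: str, acceleration_mps2: float,
                         accel_ref: float = 10.0):
    """Look up the pattern for the engaged object type and scale its
    amplitude with the sensed acceleration, capped at full drive."""
    base = TACTILE_PATTERNS.get(object_type, [(0.5, 0.020)])
    gain = min(1.0, acceleration_mps2 / accel_ref)
    return [(amp * gain, dur) for amp, dur in base]

print(tactile_instructions("keyboard", 8.0))   # [(0.8, 0.01)]
```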
Fig. 4 illustrates a flow chart of a method 200 in a VR interaction system 100. The order in which the steps of the method 200 are described and illustrated should not be construed as limiting and it is conceivable that the steps can be performed in varying order. The method 200 comprises generating 201 a virtual space within a VR environment coordinate system (vx, vy, vz) in a VR controller 101 to be displayed in a wearable display device 102 for a user 107. The VR controller determines VR object coordinates (ovx, ovy, ovz) of virtual objects 103 in relation to a virtual user location (uvx, uvy, uvz) in the virtual space. The method 200 comprises communicating 202 user motion data in a room having room coordinates (x, y, z) to a feedback controller 105 being in communication with a haptic feedback device 106 to generate a tactile output. The method 200 comprises detecting 203 a probability for an interaction event, between the user 107 and a virtual object 103, based on a position and/or movement of the virtual object 103 in the virtual space and the virtual user location (uvx, uvy, uvz). The method 200 comprises accessing 204 an interaction rule (R) for the interaction event as the probability satisfies a defined probability threshold and sending 205 the interaction rule (R) to the feedback controller 105. The method 200 comprises receiving 206 the interaction rule (R) at the feedback controller 105 to compare the interaction rule (R) with the received user motion data, where the interaction rule (R) comprises a motion criterion for tactile output. The method 200 comprises sending 207 tactile output instructions to the haptic feedback device 106 when the motion data satisfies the motion criterion. The method 200 thus provides for the advantageous benefits as described above in relation to the VR interaction system 100 and Figs. 1-3. The method 200 provides low-latency user feedback in a VR interaction system 100.
A computer program product is provided comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method 200.
The present invention has been described above with reference to specific examples. However, other examples than the above described are equally possible within the scope of the invention. The different features and steps of the invention may be combined in other combinations than those described. The scope of the invention is only limited by the appended patent claims.
More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings of the present invention is/are used.

Claims
1. A virtual-reality (VR) interaction system (100) comprising a VR controller (101) configured to generate a virtual space within a VR environment coordinate system (vx,vy,vz) to be displayed in a wearable display device (102) for a user (107), wherein the VR controller determines VR object coordinates (oVx,oVy,oVz) of virtual objects (103) in relation to a virtual user location (uVx,uVy,uVz) in the virtual space, a sensor (104) configured to detect user motion in a room having room coordinates (x,y,z), a feedback controller (105) in communication with the sensor and a haptic feedback device (106) to generate a tactile output, wherein the sensor is configured to communicate user motion data to the feedback controller, wherein the VR controller is configured to detect a probability for an interaction event, between the user and a virtual object (103), based on a position and/or movement of the virtual object in the virtual space and the virtual user location, access an interaction rule (R) for the interaction event as the probability satisfies a defined probability threshold, and send the interaction rule to the feedback controller, wherein the feedback controller is configured to receive the interaction rule and compare the interaction rule with the user motion data received from the sensor, wherein the interaction rule comprises a motion criterion for tactile output, and send tactile output instructions to the haptic feedback device when the motion data satisfies the motion criterion.
2. Virtual-reality interaction system according to claim 1, wherein upon receiving the interaction rule the feedback controller is configured to autonomously send the tactile output instructions to the haptic feedback device, as the motion data satisfies the motion criterion, independent from the VR controller generating the virtual space.
3. Virtual-reality interaction system according to claim 1 or 2, wherein the motion criterion comprises a distance threshold (d) for the user’s motion, whereby the tactile output instructions are sent to the haptic feedback device when the motion data satisfies the distance threshold (d).
4. Virtual-reality interaction system according to any of claims 1 - 3, wherein the motion criterion comprises a velocity threshold and/or an acceleration threshold for the user’s motion, whereby the tactile output instructions are sent to the haptic feedback device when the motion data satisfies the velocity threshold and/or the acceleration threshold.
5. Virtual-reality interaction system according to any of claims 1 - 4, wherein the motion criterion is compared to motion data of movement from an onset coordinate (xo,yo,zo) corresponding to the user’s position when the probability threshold is satisfied and the interaction rule is received.
6. Virtual-reality interaction system according to any of claims 1 - 5, wherein the tactile output instructions comprise a defined sequence of tactile outputs for generation by the haptic feedback device.
7. Virtual-reality interaction system according to any of claims 1 - 6, wherein the tactile output instructions are determined based on a type of the virtual object.
8. Virtual-reality interaction system according to any of claims 1 - 7, wherein the tactile output instructions are determined based on the user motion data.
9. Virtual-reality interaction system according to any of claims 1 - 8, wherein the motion data comprises spatial position information of the user in the room.
10. Virtual-reality interaction system according to any of claims 1 - 9, wherein the motion data comprises a velocity of the user and/or an acceleration of the user.
11. Virtual-reality interaction system according to any of claims 1 - 10, wherein the sensor comprises a wearable sensor, such as an accelerometer.
12. Virtual-reality interaction system according to any of claims 1 - 11, wherein the sensor comprises an image sensor configured to capture image data of the user, wherein the motion data is determined based on the image data.
13. Virtual-reality interaction system according to any of claims 1 - 12, wherein the sensor communicates wirelessly with the feedback controller.
14. Virtual-reality interaction system according to any of claims 1 - 13, wherein the interaction rule depends on a type of the virtual object.
15. Virtual-reality interaction system according to any of claims 1 - 14, wherein the probability threshold is dependent on the motion data.
16. Virtual-reality interaction system according to any of claims 1 - 15, wherein the virtual space is an augmented reality (AR) space.
17. A method (200) in a virtual-reality (VR) interaction system (100) comprising generating (201) a virtual space within a VR environment coordinate system (vx,vy,vz) in a VR controller (101) to be displayed in a wearable display device (102) for a user (107), the VR controller determining VR object coordinates (oVx,oVy,oVz) of virtual objects (103) in relation to a virtual user location (uVx,uVy,uVz) in the virtual space, communicating (202) user motion data in a room having room coordinates (x,y,z) to a feedback controller (105) being in communication with a haptic feedback device (106) to generate a tactile output, detecting (203) a probability for an interaction event, between the user and a virtual object (103), based on a position and/or movement of the virtual object in the virtual space and the virtual user location, accessing (204) an interaction rule (R) for the interaction event as the probability satisfies a defined probability threshold, sending (205) the interaction rule to the feedback controller, receiving (206) the interaction rule at the feedback controller to compare the interaction rule with the received user motion data, wherein the interaction rule comprises a motion criterion for tactile output, and sending (207) tactile output instructions to the haptic feedback device when the motion data satisfies the motion criterion.
18. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the steps of the method according to claim 17.
PCT/SE2023/050457 2022-05-31 2023-05-10 A virtual-reality interaction system with haptic feedback WO2023234823A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SE2230166-7 2022-05-31
SE2230166 2022-05-31

Publications (1)

Publication Number Publication Date
WO2023234823A1 true WO2023234823A1 (en) 2023-12-07

Family

ID=86382787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE2023/050457 WO2023234823A1 (en) 2022-05-31 2023-05-10 A virtual-reality interaction system with haptic feedback

Country Status (1)

Country Link
WO (1) WO2023234823A1 (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110279249A1 (en) * 2009-05-29 2011-11-17 Microsoft Corporation Systems and methods for immersive interaction with virtual objects
US20200073482A1 (en) * 2017-03-21 2020-03-05 Pcms Holdings, Inc. Method and system for the detection and augmentation of tactile interactions in augmented reality
KR102251308B1 (en) * 2020-08-05 2021-05-12 플레이스비 주식회사 Haptic controller and system and method for providing haptic feedback using the same

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANTONAKOGLOU KONSTANTINOS ET AL: "Toward Haptic Communications Over the 5G Tactile Internet", IEEE COMMUNICATIONS SURVEYS & TUTORIALS, vol. 20, no. 4, 28 June 2018 (2018-06-28), pages 3034 - 3059, XP011698276, DOI: 10.1109/COMST.2018.2851452 *

Similar Documents

Publication Publication Date Title
Gonzalez et al. Reach+ extending the reachability of encountered-type haptics devices through dynamic redirection in vr
US10417827B2 (en) Syndication of direct and indirect interactions in a computer-mediated reality environment
US9927869B2 (en) Apparatus for outputting virtual keyboard and method of controlling the same
CN107533373B (en) Input via context-sensitive collision of hands with objects in virtual reality
US20200310561A1 (en) Input device for use in 2d and 3d environments
WO2017100406A1 (en) Context sensitive user interface activation in an augmented and/or virtual reality environment
KR101318244B1 (en) System and Method for Implemeting 3-Dimensional User Interface
US20190163266A1 (en) Interaction system and method
EP3096206A1 (en) Haptic effects based on predicted contact
US20140347329A1 (en) Pre-Button Event Stylus Position
US5982353A (en) Virtual body modeling apparatus having dual-mode motion processing
WO2007124614A1 (en) Process for controlling cursor speed in user interface
CN105138136A (en) Hand gesture recognition device, hand gesture recognition method and hand gesture recognition system
EP4307096A1 (en) Key function execution method, apparatus and device, and storage medium
CN108553892A (en) virtual object control method, device, storage medium and electronic equipment
CN112068757B (en) Target selection method and system for virtual reality
WO2023234823A1 (en) A virtual-reality interaction system with haptic feedback
CN113467625A (en) Virtual reality control device, helmet and interaction method
CN114833826B (en) Control method and device for realizing collision touch sense of robot and rehabilitation robot
Halim et al. Raycasting method using hand gesture for target selection on the occluded object in handheld augmented reality
EP3015953B1 (en) Method and system for detecting objects of interest
Schlattmann et al. 3D interaction techniques for 6 DOF markerless hand-tracking
CN105787971B (en) Information processing method and electronic equipment
CN107977071B (en) Operation method and device suitable for space system
Dam et al. A study of selection and navigation techniques using Kinect in VR

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 23724064

Country of ref document: EP

Kind code of ref document: A1