NL2025627B1 - A virtual or augmented reality training system - Google Patents

A virtual or augmented reality training system

Info

Publication number
NL2025627B1
Authority
NL
Netherlands
Prior art keywords
virtual
training system
virtual reality
dummy object
user
Prior art date
Application number
NL2025627A
Other languages
Dutch (nl)
Inventor
Luijten Johannes
Evan Shor Daniel
Gabriël Joan Den Butter Gijs
Peter Bogerd Cornelis
Lammers Max
Original Assignee
Adjuvo Motion B V
Priority date
Filing date
Publication date
Application filed by Adjuvo Motion B V filed Critical Adjuvo Motion B V
Priority to NL2025627A priority Critical patent/NL2025627B1/en
Priority to PCT/NL2021/050293 priority patent/WO2021235928A1/en
Application granted granted Critical
Publication of NL2025627B1 publication Critical patent/NL2025627B1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014Hand-worn input/output arrangements, e.g. data gloves
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/24Use of tools

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A virtual or augmented reality training system (1) comprising a processor (2), a visual representation tool (3, 4), a user's wearable (6), and a dummy object (7) which is engageable by the wearable (6), wherein the dummy object (7) corresponds to a virtual object (8) which is to be manipulated and which is generated by the system (1), wherein the system (1) comprises a sensor or camera (9) connected to the processor (2) to detect a position and/or orientation of at least one of the dummy object (7) and the user's wearable (6), wherein the system (1) is arranged to show with the visual representation tool (3, 4) a representation of the wearable (6) engaging the virtual object (8) to be manipulated, in a position and/or orientation corresponding to the position and/or orientation as detected by the sensor or camera (9).

Description

A virtual or augmented reality training system

The invention relates to a virtual or augmented reality training system.
The technological field of virtual reality is receiving increasing attention and comes in many variations. The following examples may be considered representative.
EP 1 417 547 discloses a system for providing realistic sensation within a simulation system by providing tactile (haptic) feedback to a user. The system includes an engageable practice tool that the user engages and a mechanical simulation apparatus coupled to the practice tool. An interface device is coupled to the simulation apparatus and a host computer is coupled to the interface device for implementing an application program. The application program provides signals for actuators to move cables and thereby move a mechanical linkage of the simulation apparatus.
US 2019/0258058 discloses a head-mounted device including a scene camera, a display, and one or more processors, the device: capturing, via the scene camera, a pass-through image; obtaining mixed reality (MR) content to be rendered in association with the pass-through image; performing, using the one or more processors, one or more image signal processing functions on the pass-through image; generating a display image by combining, using the one or more processors, the MR content with the processed pass-through image; and displaying, on the display, the display image.
The article "A Review on Mixed Reality: Current Trends, Challenges and Prospects" by Somaiieh Rokhsaritalemi et al., Appl. Sci. 2020, 10, 636; doi:10.3390/app10020636, available at www.mdpi.com/journal/applsci, discusses the trends in virtual reality and discusses amongst others the three features of any MR system, to note (1) combining a real-world object and a virtual object; (2) interacting in real time; and (3) mapping between the virtual object and the real object to create interactions between them.
According to the invention a virtual or augmented reality training system is proposed comprising a processor, a visual representation tool, a user's wearable, and a dummy object which is engageable by the wearable, wherein the dummy object corresponds to a virtual object which is to be manipulated and which is generated by the system, wherein the system comprises a sensor or camera connected to the processor to detect a position and/or orientation of at least one of the dummy object and the user's wearable, and wherein the system is arranged to show with the visual representation tool a representation of the wearable engaging the virtual object to be manipulated, in a position and/or orientation corresponding to the position and/or orientation as detected by the sensor or camera.
The invention thus provides a very effective virtual training and manipulation tool avoiding the need to perform training in the real world.
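By way of illustration only, the interplay between the sensor or camera, the processor and the visual representation tool can be sketched as a simple sense-and-render loop. The patent itself does not prescribe any software implementation; all class and function names below (Pose, SensorOrCamera, VisualRepresentationTool, training_loop) are hypothetical.

```python
# Illustrative sketch only; names and interfaces are hypothetical, not part of the patent.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple[float, float, float]             # x, y, z in metres
    orientation: tuple[float, float, float, float]   # quaternion (w, x, y, z)

class SensorOrCamera:
    """Stand-in for the sensor or camera (9) connected to the processor (2)."""
    def read_pose(self, target: str) -> Pose:
        # A real system would return tracked data for "dummy_object" or "wearable".
        return Pose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))

class VisualRepresentationTool:
    """Stand-in for the display (4) or the smart glasses (3)."""
    def show(self, virtual_object_pose: Pose, wearable_pose: Pose) -> None:
        print(f"render virtual object at {virtual_object_pose}, wearable at {wearable_pose}")

def training_loop(sensor: SensorOrCamera, tool: VisualRepresentationTool, frames: int = 3) -> None:
    for _ in range(frames):
        dummy_pose = sensor.read_pose("dummy_object")
        wearable_pose = sensor.read_pose("wearable")
        # The virtual object (8) is shown at the detected pose of the dummy object (7),
        # and the representation of the wearable (6) at the detected wearable pose.
        tool.show(dummy_pose, wearable_pose)

training_loop(SensorOrCamera(), VisualRepresentationTool())
```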
The visual representation tool as applied in the invention is for instance a display driven by the processor, or smart glasses worn by the user, wherein the glasses are driven by the processor.
It is preferable that the wearable itself detects a position and/or orientation of at least one of a thumb and/or fingers of the user that is provided with the wearable. Another option is that the sensor or camera is arranged to detect a position and/or orientation of at least one of a thumb and/or fingers of the user that is provided with the wearable. The wearable can for instance be a glove or a thimble or both.
A preferable aspect of the system of the invention is that the wearable is provided with a haptic feedback part or parts to provide the user force- and/or vibrotactile feedback information depending on the position and/or orientation as detected by the sensor or camera. This increases the simulated real-life experience, in particular in an embodiment wherein the force- and/or vibrotactile feedback information is supplied to at least one of the thumb and fingers of the user that is wearing the wearable.
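The patent leaves open how such position- and/or orientation-dependent feedback is computed. A minimal sketch, assuming the processor compares a detected fingertip position with a virtual contact point and derives an actuator intensity from the remaining distance, could look as follows; the 1 cm engagement radius and the function names are assumptions, not part of the patent.

```python
# Hypothetical sketch of position-dependent haptic feedback; not prescribed by the patent.
import math

def feedback_intensity(fingertip: tuple[float, float, float],
                       contact_point: tuple[float, float, float],
                       contact_radius: float = 0.01) -> float:
    """Return a 0..1 actuator intensity: full when the detected fingertip reaches
    the virtual contact point, zero outside an assumed 1 cm engagement radius."""
    d = math.dist(fingertip, contact_point)
    return max(0.0, 1.0 - d / contact_radius) if d < contact_radius else 0.0

# Example: fingertip 4 mm from the virtual contact point -> partial vibrotactile feedback.
print(feedback_intensity((0.004, 0.0, 0.0), (0.0, 0.0, 0.0)))  # ~0.6
```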
The simulated real-life experience is further promoted in an embodiment wherein the dummy object is provided with dummy control elements at positions that are representative of the positions of virtual control elements of the virtual object.
Preferably the force- and/or vibrotactile feedback information is supplied to at least one of the thumb and/or finger(s) of the user that is provided with the wearable at positions corresponding to where the wearable engages the dummy control elements of the dummy object. Suitably the dummy object is restricted to a handle only, whereas the virtual reality visual representation tool depicts a complete representation of the virtual object.
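How the feedback is routed to the correct finger is likewise not specified in the patent. One conceivable sketch, assuming a table of dummy control element positions and a simple proximity test per finger, is given below; the element names, coordinates and the 8 mm engagement distance are purely illustrative assumptions.

```python
# Hypothetical mapping of dummy control elements to per-finger feedback; illustrative only.
import math

# Assumed positions of dummy control elements (12, 13) on the dummy object, in metres.
DUMMY_CONTROLS = {"button_12": (0.02, 0.00, 0.0), "slider_13": (0.05, 0.01, 0.0)}
ENGAGE_RADIUS = 0.008  # assumed 8 mm engagement distance

def fingers_to_actuate(finger_positions: dict[str, tuple[float, float, float]]) -> dict[str, str]:
    """Return {finger: control element} for every finger close enough to a dummy control."""
    actuate = {}
    for finger, pos in finger_positions.items():
        for name, ctrl_pos in DUMMY_CONTROLS.items():
            if math.dist(pos, ctrl_pos) < ENGAGE_RADIUS:
                actuate[finger] = name
    return actuate

# Example: the index finger engages the element corresponding to dummy control 12.
print(fingers_to_actuate({"index": (0.021, 0.001, 0.0), "thumb": (0.10, 0.0, 0.0)}))
```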
More generally the dummy object is a man-machine interface device with true or virtual buttons, such as a control panel.
The location of the sensor or camera is arbitrary. In one embodiment the sensor or camera is mounted on the visual representation tool. In another embodiment the sensor or camera is implemented in or on the wearable. It depends on the application which position of the sensor or camera is deemed best.
It is also possible that the dummy object comprises a weight-sensor connected to the processor and configured to detect the weight of the dummy object and/or of an additional weight held by the dummy object. With the haptic feedback provided through the wearable, the user can thus experience the weight attributed to the virtual object that is simulated.
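As a hypothetical illustration of how such a weight signal could be exploited, the difference between the weight the user already feels from the dummy object and the weight attributed to the virtual object could be turned into a force set-point for the wearable's actuators; the gravitational scaling and the 20 N actuator limit below are assumptions, not part of the patent.

```python
# Hypothetical use of the dummy object's weight sensor; illustrative only.
def grip_force_setpoint(measured_weight_kg: float,
                        simulated_weight_kg: float,
                        max_actuator_force_n: float = 20.0) -> float:
    """Translate the weight attributed to the virtual object into a pull-wire force
    set-point, compensating for the weight the user already feels from the dummy object."""
    g = 9.81  # m/s^2
    extra_force = max(0.0, simulated_weight_kg - measured_weight_kg) * g
    return min(extra_force, max_actuator_force_n)

# Example: dummy handle weighs 0.3 kg, the virtual tool is simulated as 1.5 kg.
print(grip_force_setpoint(0.3, 1.5))  # ~11.8 N requested from the haptic actuators
```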
In a further embodiment the dummy object may comprise sensory devices for measuring a user operating the control elements of the dummy object, which sensory devices connect to the processor, so as to again provide haptic feedback to the user through the wearable.
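A minimal event-style sketch of how readings from such sensory devices could be forwarded to the processor and turned into feedback is given below; the callback interface and element names are hypothetical and serve only to illustrate the data flow described above.

```python
# Hypothetical event flow from dummy control element sensors to haptic feedback; illustrative only.
from typing import Callable

class DummyControlSensors:
    """Stand-in for sensory devices measuring the user operating elements 12, 13."""
    def __init__(self) -> None:
        self._handlers: list[Callable[[str, float], None]] = []

    def on_operation(self, handler: Callable[[str, float], None]) -> None:
        self._handlers.append(handler)

    def report(self, element: str, value: float) -> None:
        for handler in self._handlers:
            handler(element, value)

def processor_handler(element: str, value: float) -> None:
    # The processor (2) would drive the wearable's actuators at this point.
    print(f"element {element} operated with value {value:.2f} -> send vibrotactile pulse")

sensors = DummyControlSensors()
sensors.on_operation(processor_handler)
sensors.report("button_12", 1.0)  # simulated operation of dummy control element 12
```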
Finally it is to be noted that the location of the processor is also arbitrary; it may at times be preferable, for instance, that the processor is positioned in the dummy object.
The invention will hereinafter be further elucidated with reference to the drawing of an exemplary embodiment of a virtual or augmented reality training system according to the invention that is not limiting as to the appended claims.
In the drawing:
- figures 1a - 1c show a schematic representation of the system of the invention;
- figure 2 shows a user of the system of figure 1, wearing smart glasses;
- figure 3 shows a view of a user's hand holding a dummy object and of the virtual reality visual representation tool with a representation thereof;
- figures 4 and 5 show a handle to be used in the system of the invention.
Whenever in the figures the same reference numerals are applied, these numerals refer to the same parts.
Figures 1a - 1c depict a general configuration of a virtual or augmented reality training system 1 according to the invention. The system 1 comprises a processor 2 that drives a visual representation tool 3, 4. The visual representation tool can be either a display 4 or smart glasses 3 as may be worn by a user 5 as shown in figure 2. Further the system comprises a user's wearable 6, in this case a glove. Figure 3 shows that a further part of this system is a dummy object 7 which is engageable by the wearable 6. The dummy object 7 corresponds to a virtual object 8 which is to be manipulated, and which is generated by the system 1. Figure 3 shows the virtual object 8 on display 4. Alternatively the virtual object 8 can of course also be viewed with the smart glasses 3 worn by the user 5 as depicted in figure 2.
Figure 1b shows that the system 1 further comprises a sensor or camera 9 connected to the processor 2, which sensor or camera 9 is used to detect a position and/or orientation of at least one of the dummy object 7 and the user's wearable 6. Preferably the wearable 6 itself or optionally the sensor or camera 9 is arranged to detect a position and/or orientation of at least one of a thumb and/or finger(s) of the user that is provided with the wearable 6.
Figure 3 depicts that the system 1 is arranged to show with the visual representation tool (the display 4) a representation 6' of the wearable 6 engaging the virtual object 8 to be manipulated, in a position and/or orientation corresponding to the position and/or orientation as detected by the sensor or camera 9.
The system 1 is further configured to provide that the processor 2 drives the wearable 6 to supply haptic feedback to the user 5 in the form of force- and/or vibrotactile feedback information depending on the position and/or orientation as detected by the sensor or camera 9. The glove 6 is for that purpose provided with pull wires 10 and/or vibratory devices 11. Accordingly the force- and/or vibrotactile feedback information is preferably supplied to at least one of a thumb and/or finger(s) of the user that is wearing the wearable 6.
Figure 4 depicts the dummy object 7 and shows that this object 7 is provided with dummy control elements 12, 13 at positions that are representative of the positions of virtual control elements of the virtual object 8 shown in figure 3. This is however an option; it is also possible that the dummy control elements are fully dispensed with, as is depicted in figure 5, so that for instance merely a handle remains.
Particularly when the earlier-mentioned dummy control elements 12, 13 are applied, it is preferable that the force- and/or vibrotactile feedback information is supplied to at least one of the thumb and/or finger(s) of the user that is provided with the wearable 6 at positions corresponding to where the wearable 6 engages the dummy control elements 12, 13 of the dummy object 7. In figure 3 this is for instance shown for the situation that the feedback is applied to the index finger 14.
Figures 4 and 5 show an embodiment wherein the dummy object 7 is restricted to a handle only, whereas figure 3 shows that the virtual reality visual representation tool 4 depicts a complete representation of the virtual object 8 that corresponds to the handle 7. The application of a handle is however not the only option; generally speaking the application of a man-machine interface is envisaged, which can also be embodied as a tablet or control panel with true or virtual buttons representing the above-mentioned dummy control elements.
Although the invention has been discussed in the foregoing with reference to an exemplary embodiment of the system of the invention, the invention is not restricted to this particular embodiment, which can be varied in many ways without departing from the invention. It is for instance possible that:
- the sensor or camera is implemented in or on the wearable 6;
- the dummy object 7 comprises a weight-sensor connected to the processor 2 and which is configured to detect the weight of the dummy object 7 and/or an additional weight held by the dummy object 7;
- the dummy object 7 comprises sensory devices for measuring a user operating the control elements 12, 13 of the dummy object 7, which sensory devices connect to the processor 2; and/or that
- the processor 2 is positioned in the dummy object 7.
The discussed exemplary embodiment shall therefore not be used to construe the appended claims strictly in accordance therewith. On the contrary, the embodiment is merely intended to explain the wording of the appended claims without intent to limit the claims to this exemplary embodiment. The scope of protection of the invention shall therefore be construed in accordance with the appended claims only, wherein a possible ambiguity in the wording of the claims shall be resolved using this exemplary embodiment. Aspects of the invention are itemized in the following section.
1. A virtual or augmented reality training system (1) comprising a processor (2), a visual representation tool (3, 4), a user's wearable (6), and a dummy object (7) which is engageable by the wearable (6), wherein the dummy object (7) corresponds to a virtual object (8) which is to be manipulated and which is generated by the system (1), wherein the system (1) comprises a sensor or camera (9) connected to the processor (2) to detect a position and/or orientation of at least one of the dummy object (7) and the user's wearable (6), wherein the system (1) is arranged to show with the visual representation tool (3, 4) a representation of the wearable (6) engaging the virtual object (8) to be manipulated, in a position and/or orientation corresponding to the position and/or orientation as detected by the sensor or camera (9).
2. The virtual reality training system of claim 1, characterized in that the visual representation tool is one of a display (4) and smart glasses (3).
3. The virtual reality training system of claim 1 or 2, characterized in that one of the wearable (6) and the sensor or camera (9) is arranged to detect a position and/or orientation of at least one of a thumb and/or finger(s) of the user that is provided with the wearable (6).
4. The virtual reality training system of any one of claims 1 - 3, characterized in that the wearable (6) is provided with a haptic feedback part or parts (10, 11) to provide the user force- and/or vibrotactile feedback information depending on the position and/or orientation as detected by the sensor or camera (9).
5. The virtual reality training system of claims 3 and 4, characterized in that the force- and/or vibrotactile feedback information is supplied to at least one of a thumb and/or finger(s) of the user that is wearing the wearable (6).
6. The virtual reality training system of any one of claims 1 - 5, characterized in that the dummy object (7) is provided with dummy control elements (12, 13) at positions that are representative of the positions of virtual control elements of the virtual object (8).
7. The virtual reality training system of claims 5 and 6, characterized in that the force- and/or vibrotactile feedback information is supplied to at least one of the thumb and/or finger(s) of the user that is provided with the wearable (6) at positions corresponding to where the wearable (6) engages the dummy control elements (12, 13) of the dummy object (7).
8. The virtual reality training system of any one of claims 1 - 7, characterized in that the dummy object (7) is restricted to a handle only, whereas the virtual or augmented reality visual representation tool (3, 4) depicts a complete representation of the virtual object (8).
9. The virtual reality training system of any one of claims 1 - 7, characterized in that the dummy object (7) is a man-machine interface device with true or virtual buttons, such as a control panel.
10. The virtual reality training system of any one of claims 1 - 9, characterized in that the sensor or camera is implemented in or on the wearable (6).
11. The virtual reality training system of any one of claims 1 - 10, characterized in that the dummy object (7) comprises a weight-sensor connected to the processor (2) and configured to detect the weight of the dummy object (7) and/or an optional additional weight held by the dummy object (7).
12. The virtual reality training system of any one of claims 6 - 11, characterized in that the dummy object (7) comprises sensory devices for measuring a user operating the control elements (12, 13) of the dummy object (7), which sensory devices connect to the processor (2).
13. The virtual reality training system of any one of claims 1 - 12, characterized in that the processor (2) is positioned in the dummy object (7).

Claims (13)

CLAIMS (translated from the Dutch)

1. A virtual or augmented reality training system (1), comprising a processor (2), a visual representation tool (3, 4), a user's wearable device (6) and a dummy object (7) that is engageable by the wearable device (6), wherein the dummy object (7) corresponds to a virtual object (8) that is to be manipulated and that is generated by the system (1), wherein the system (1) comprises a sensor or camera (9) connected to the processor (2) to detect a position and/or orientation of at least one of the dummy object (7) and the user's wearable device (6), wherein the system (1) is arranged to show with the visual representation tool (3, 4) a representation of the wearable device (6) engaging the virtual object (8) to be manipulated, in a position and/or orientation corresponding to the position and/or orientation as detected by the sensor or camera (9).

2. The virtual reality training system according to claim 1, characterized in that the visual representation tool is one of a display (4) and smart glasses (3).

3. The virtual reality training system according to claim 1 or 2, characterized in that one of the wearable device (6) and the sensor or camera (9) is arranged to determine a position and/or orientation of at least one of a thumb and/or finger(s) of the user that is provided with the wearable device (6).

4. The virtual reality training system according to any one of claims 1 - 3, characterized in that the wearable device (6) is provided with a haptic feedback part or parts (10, 11) to provide the user with force and/or vibrotactile feedback information depending on the position and/or orientation as detected by the sensor or camera (9).

5. The virtual reality training system according to claims 3 and 4, characterized in that the force and/or vibrotactile feedback information is supplied to at least one of a thumb and/or finger(s) of the user that is provided with the wearable device (6).

6. The virtual reality training system according to any one of claims 1 - 5, characterized in that the dummy object (7) is provided with dummy control elements (12, 13) at positions that are representative of the positions of virtual control elements of the virtual object (8).

7. The virtual reality training system according to claims 5 and 6, characterized in that the force and/or vibrotactile feedback information is supplied to at least one of the thumb and/or finger(s) of the user that is provided with the wearable device (6) at positions corresponding to where the wearable device (6) engages the dummy control elements (12, 13) of the dummy object (7).

8. The virtual reality training system according to any one of claims 1 - 7, characterized in that the dummy object (7) is restricted to a handle only, whereas the virtual or augmented reality visual representation tool (3, 4) depicts a complete representation of the virtual object (8).

9. The virtual reality training system according to any one of claims 1 - 7, characterized in that the dummy object (7) is a man-machine interface device with true or virtual buttons, such as a control panel.

10. The virtual reality training system according to any one of claims 1 - 9, characterized in that the sensor or camera is implemented in or on the wearable device (6).

11. The virtual reality training system according to any one of claims 1 - 10, characterized in that the dummy object (7) comprises a weight sensor connected to the processor (2) and configured to detect the weight of the dummy object (7) and/or of an optional additional weight held by the dummy object (7).

12. The virtual reality training system according to any one of claims 6 - 11, characterized in that the dummy object (7) comprises sensory devices for measuring a user operating the control elements (12, 13) of the dummy object (7), which sensory devices connect to the processor (2).

13. The virtual reality training system according to any one of claims 1 - 12, characterized in that the processor (2) is positioned in the dummy object (7).
NL2025627A 2020-05-20 2020-05-20 A virtual or augmented reality training system NL2025627B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
NL2025627A NL2025627B1 (en) 2020-05-20 2020-05-20 A virtual or augmented reality training system
PCT/NL2021/050293 WO2021235928A1 (en) 2020-05-20 2021-05-06 A virtual or augmented reality training system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL2025627A NL2025627B1 (en) 2020-05-20 2020-05-20 A virtual or augmented reality training system

Publications (1)

Publication Number Publication Date
NL2025627B1 true NL2025627B1 (en) 2021-12-07

Family

ID=71452726

Family Applications (1)

Application Number Title Priority Date Filing Date
NL2025627A NL2025627B1 (en) 2020-05-20 2020-05-20 A virtual or augmented reality training system

Country Status (2)

Country Link
NL (1) NL2025627B1 (en)
WO (1) WO2021235928A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1417547A1 (en) 2001-07-16 2004-05-12 Immersion Corporation Interface apparatus with cable-driven force feedback and four grounded actuators
US20170025031A1 (en) * 2015-03-13 2017-01-26 Airbus Defence and Space GmbH Method and apparatus for testing a device for use in an aircraft
FR3064801A1 (en) * 2017-03-31 2018-10-05 Formation Conseil Securite FIRE EXTINGUISHING DEVICE MANIPULATION SIMULATOR
US20190133689A1 (en) * 2017-06-29 2019-05-09 Verb Surgical Inc. Virtual reality laparoscopic tools
US20190258058A1 (en) 2016-02-18 2019-08-22 Apple Inc. Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060017654A1 (en) * 2004-07-23 2006-01-26 Romo Justin R Virtual reality interactivity system and method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1417547A1 (en) 2001-07-16 2004-05-12 Immersion Corporation Interface apparatus with cable-driven force feedback and four grounded actuators
US20170025031A1 (en) * 2015-03-13 2017-01-26 Airbus Defence and Space GmbH Method and apparatus for testing a device for use in an aircraft
US20190258058A1 (en) 2016-02-18 2019-08-22 Apple Inc. Head-mounted display for virtual and mixed reality with inside-out positional, user body and environment tracking
FR3064801A1 (en) * 2017-03-31 2018-10-05 Formation Conseil Securite FIRE EXTINGUISHING DEVICE MANIPULATION SIMULATOR
US20190133689A1 (en) * 2017-06-29 2019-05-09 Verb Surgical Inc. Virtual reality laparoscopic tools

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
SOMAIIEH ROKHSARITALEMI ET AL.: "A Review on Mixed Reality: Current Trends, Challenges and Prospects", APPL. SCI., vol. 10, 2020, pages 636

Also Published As

Publication number Publication date
WO2021235928A1 (en) 2021-11-25

Similar Documents

Publication Publication Date Title
CN111638801B (en) Controller for gesture recognition and gesture recognition method thereof
Akamatsu et al. Multimodal mouse: A mouse-type device with tactile and force display
JP3290436B2 (en) Force feedback and texture pseudo interface device
Scheggi et al. Touch the virtual reality: using the leap motion controller for hand tracking and wearable tactile devices for immersive haptic rendering
US20200282302A9 (en) Exo Tendon Motion Capture Glove Device with Haptic Grip Response
TW201814447A (en) System and method for detecting hand gesture
JP2006506737A (en) Body-centric virtual interactive device and method
CN109416589A (en) Interactive system and exchange method
CN115763126A (en) Key assembly, virtual reality interface system and method
KR20130101395A (en) Cognitive rehabilitation system and method using tangible interaction
RU187548U1 (en) VIRTUAL REALITY GLOVE
RU179301U1 (en) VIRTUAL REALITY GLOVE
Ban et al. Controlling perceived stiffness of pinched objects using visual feedback of hand deformation
CN113508355A (en) Virtual reality controller
Yasui et al. Immersive virtual reality supporting content for evaluating interface using oculus rift and leap motion
NL2025627B1 (en) A virtual or augmented reality training system
RU2670649C9 (en) Method of manufacturing virtual reality gloves (options)
Romano et al. Toward tactilely transparent gloves: Collocated slip sensing and vibrotactile actuation
Normand et al. Visuo-Haptic Rendering of the Hand during 3D Manipulation in Augmented Reality
JP6428629B2 (en) Tactile sensation presentation device, information terminal, haptic presentation method, and program
RU186397U1 (en) VIRTUAL REALITY GLOVE
RU2673406C1 (en) Method of manufacturing virtual reality glove
US7145550B2 (en) Method and apparatus for reducing repetitive motion injuries in a computer user
US10537811B2 (en) Motor vehicle simulation system for simulating a virtual environment with a virtual motor vehicle and method for simulating a virtual environment
JP2001117715A (en) Touch sense/force sense presentation device and information input/output device