WO2021230834A1 - Tool-independent 3-dimensional surface haptic interface - Google Patents


Info

Publication number
WO2021230834A1
Authority
WO
WIPO (PCT)
Prior art keywords
tool
haptic
user
server
location
Prior art date
Application number
PCT/TR2020/051098
Other languages
French (fr)
Inventor
Mehmet Akif NACAR
Muhittin SOLMAZ
Mehmet Murat AYGÜN
Yusuf Çağri ÖĞÜT
Yiğit TAŞÇIOĞLU
Hulusi BAYSAL
Original Assignee
Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi
Priority date
Filing date
Publication date
Priority claimed from TR2019/22853A external-priority patent/TR201922853A1/en
Priority claimed from TR2019/22850A external-priority patent/TR201922850A1/en
Application filed by Havelsan Hava Elektronik Sanayi Ve Ticaret Anonim Sirketi
Publication of WO2021230834A1 publication Critical patent/WO2021230834A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/016: Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28: Models for scientific, medical, or mathematical purposes for medicine


Abstract

In conventional haptic systems, the force sense is provided to the user through a fixed tool that must be exchanged before each stable, scenario-specific haptic training session begins. In the offered method, the user uses his/her own tools (e.g. scalpel, bistoury, etc.) during the training. Furthermore, when the end of the haptic system makes contact at the scenario model (virtual object) with the moving tool in the inventive system, the tool and the haptic end are combined, accordingly providing the user the force sense based on the scenario. Since the user employs his/her own tools, the training may be performed with a more realistic haptic system.

Description

TOOL-INDEPENDENT 3-DIMENSIONAL SURFACE
HAPTIC INTERFACE
[1]
Technical Field
[2] The present invention relates to a tool-independent haptic system designed to operate with mixed reality systems. It is presented as a solution for performing training activities by simulating tool use by means of an image formed with mixed reality and virtual objects matched to that image.
Background Art
[3] Mixed reality is the use of real-world and virtual-world objects together in the same simulation environment. The simulation environment formed by combining reality and virtuality is called mixed reality. Applications supported with haptic systems are defined as visuo-haptic mixed reality (VHMR). The number of studies and projects focusing on VHMR has increased significantly together with the development of augmented reality technology. Considering the current studies in the literature, the haptic system has a significant effect on training.
[4] An ideal training environment should be ensured by forming a haptic-supported augmented reality simulation environment. The individual should both experience the real environment visually and experience a real physical sense of touch in the simulation environment.
[5] During the user's interaction with the mechanical system, the mechanical system should be covered by synthetic objects through the augmented reality glasses. Thus, when the user looks through the glasses, he/she sees the simulated environment, not the mechanical system. The haptic system applies the force sense and is designed such that it provides the same sense as touching the same object in the real world.
[6] Alignment problems between the haptics and the image, the most challenging factor in the current studies in the literature, are overcome by means of this method. Combining the visual system (augmented reality) with haptics brings a solution to many challenging factors found in the present studies in the literature.
[7] For example, the PCT application WO2019059938 may be cited. Even though that document aims to provide two haptic arms and a virtual reality environment, it is not considered similar to the present invention, since it does not comprise a virtual dressing on the haptics.
[8] The PCT application WO2018224847 discloses a mixed reality system, wherein said mixed reality system controls the haptic unit by means of lasers. Therefore, it is not surprising that it has not found common use. Since the inventive system provides a novel system embodiment that proposes a different control method within this scope, it is not considered similar.
[9] The PCT application WO2018090060 discloses a haptic device on which a virtual environment is dressed, wherein it does not relate to tool tracking and similar subjects. As can be seen in the example (a virtual keyboard), it discloses an arrangement prepared for planes that can be manually controlled in a simple manner. Therefore, since it does not disclose tool tracking, changes in the scenario, selection of scenarios based on the tool, and the like, it is not considered similar. Since it is not possible to manage the 3-D representation procedure by means of said apparatus, it accordingly becomes an obligation to build a novel apparatus configuration and to design a method.
[10] The US patent application US20120038639 discloses a configuration for a mechanical simulation. It is not considered similar, because the haptic structure is not employed, the virtual surface formation process is not performed, and the user has no possibility to use his/her own tool.
[11] The US patent application US20060209019 discloses a haptic system that allows for operating on a given sphere with a given tool. It is not considered similar, because the haptic structure is not employed, the virtual surface formation process is not performed, and the user has no possibility to use his/her own tool.
[12] The US patent US8716973 discloses a connected arm operating on a given computer. However, such a system cannot ensure the technical advantage provided by the use of the device that the user normally uses. Further, since this device is not used in free space, the sense of reality cannot be provided. It is not considered similar at least in these respects.
[13] The Chinese utility model document CN206322123 discloses an input device. Since this is mostly not identical to the user's own device, it cannot ensure the advantage provided by the inventive system and therefore it is not considered similar.
[14] Since no other similar system discloses virtual object generation and a process sequence performed with the user's own device, a similar document could not be detected.
Summary of Invention
[15] In conventional haptic systems, the force sense is provided to the user through a fixed tool that must be exchanged before each stable, scenario-specific haptic training session begins. In the offered method, the user uses his/her own tools (e.g. scalpel, bistoury, etc.) during the training. Furthermore, when the end of the haptic system makes contact at the scenario model (virtual object) with the moving tool in the inventive system, the tool and the haptic end are combined, accordingly providing the user the force sense based on the scenario. Since the user employs his/her own tools, the training may be performed with a more realistic haptic system.
[16] So as to achieve this, the present invention relates to haptics and image units designed to operate in accordance with the real world, and thus to an apparatus exhibiting a working sense in a 3-D environment by giving surface senses. In order to achieve this, the inventive system fundamentally comprises at least one tracking system, an image unit, a haptic system, and a server. In the processes performed after defining the toolset, the system allows for programming such that it exhibits senses as if the process were performed on a 3-D surface.
Technical Problem
[17] The inventive system offers a solution for the problem that the user employs different tools instead of his/her own tools in procedures carried out through the haptic device. Furthermore, it is ensured that surface processes are performed on the haptics by means of defining 3-D surfaces.
Solution to Problem
[18] The inventive system is comprised of a tracking system (101), an augmented reality glasses (102), and a haptic system (103). A server (100), to which the above-mentioned three components are connected, receives, processes, and distributes data. The tracking system (101) tracks tools at 6 degrees of freedom and transmits the location and angle data to the server (100) as a six-tuple. The tracking model (203) calculates the projection of the tool end on the virtual object by processing this location and angle data. This projection point is the location where interaction between the tool and the virtual object is possible; by means of this location, the haptic system is directed to be in the right position in case of interaction, and it is ensured that the force sense is provided to the tool (100) held by the user. Thus, it is ensured that the user (204) is able to interact visually and tactually with a 3-D virtual object. Moreover, the operation principle in the haptic-supported augmented reality simulation environment on the server (100) follows the steps below;
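The projection computed by the tracking model can be illustrated with a minimal Python sketch. The sphere-shaped virtual object, the six-tuple layout (x, y, z, roll, pitch, yaw), and the assumption that the tool tip lies along the tool's local z-axis are illustrative choices for this sketch, not details fixed by the invention:

```python
import math

def tool_tip(pose, tool_length):
    """Compute the tool-tip position from a six-tuple pose.

    pose = (x, y, z, roll, pitch, yaw): position of the tracked marker and
    its orientation in radians; the tip is assumed to lie tool_length along
    the tool's local z-axis (an assumption for this sketch).
    """
    x, y, z, roll, pitch, yaw = pose
    # Direction of the tool's local z-axis after a roll-pitch-yaw rotation.
    dx = math.cos(yaw) * math.sin(pitch) * math.cos(roll) + math.sin(yaw) * math.sin(roll)
    dy = math.sin(yaw) * math.sin(pitch) * math.cos(roll) - math.cos(yaw) * math.sin(roll)
    dz = math.cos(pitch) * math.cos(roll)
    return (x + tool_length * dx, y + tool_length * dy, z + tool_length * dz)

def project_on_sphere(tip, center, radius):
    """Project the tip onto a spherical virtual object; return the projection
    point and the signed distance to the surface (negative once the tip has
    penetrated, i.e. interaction is possible)."""
    vx, vy, vz = (tip[i] - center[i] for i in range(3))
    dist = math.sqrt(vx * vx + vy * vy + vz * vz)
    if dist == 0.0:
        return center, -radius
    scale = radius / dist
    projection = (center[0] + vx * scale,
                  center[1] + vy * scale,
                  center[2] + vz * scale)
    return projection, dist - radius
```

The server would direct the haptic end to the returned projection point whenever the signed distance reaches zero.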
[19] The user approaches to interact with the visual model. The location information (position and rotation) is transferred to the simulation software by means of the tracking system (101). Information transmitted to the simulation software is transferred to both the augmented reality glasses (102) and the haptic software (202). The internal cycle and external cycle models and the end effector are formed in the haptic software (202). The external cycle functions in calculating the force desired to be applied. The internal cycle is responsible for implementing the calculated force.
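The fan-out step above, where each tracking sample is forwarded to both the glasses and the haptic software, can be sketched as follows; the queue-based structure and all names are assumptions made for illustration:

```python
import queue

class SimulationServer:
    """Minimal sketch of the server's distribution step: each tracking
    sample (position and rotation) is forwarded to both the augmented
    reality glasses path and the haptic software path."""

    def __init__(self):
        self.glasses_queue = queue.Queue()  # consumed by the visual renderer
        self.haptic_queue = queue.Queue()   # consumed by the haptic software

    def on_tracking_sample(self, position, rotation):
        sample = (position, rotation)
        self.glasses_queue.put(sample)  # visual rendering path
        self.haptic_queue.put(sample)   # force rendering path
        return sample
```

In a real system the two consumers would run at different rates (rendering vs. the haptic control loop), which is why each path gets its own queue in this sketch.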
[20] The internal cycle model comprises force/torque and movement sensors. The force/torque sensor measures the force and torque values applied by the user. The movement system is a parallel manipulator. The force sense is ensured to the user by tracking the position reference of the internal cycle model. The movement system uses information from the tracking system (101) and performs the tracking process without exceeding the visual model.
[21] Said haptic system (103) follows the steps below pertaining to the operation principle in the haptic (103)-supported augmented reality simulation environment;
[22] The user employs the desired tool (100) that has been previously defined and approaches so as to interact with the visual model. Location information (position and rotation) of the tool (100) is transmitted to the server (105) by means of the tool tracking system (102) and is processed by the simulation software (106) on the server. Information transmitted to the simulation software (106) is transferred to the haptics (103), which comprises the internal cycle and external cycle models and the holding mechanism. The external cycle functions in calculating the force sense desired to be applied. The internal cycle is responsible for implementing the calculated force. The holding mechanism ensures the coupling when the tool arrives at the border of the visual model.
[23] The internal cycle model comprises force/torque and movement sensors. When the tool (101) and the holding mechanism are combined, the force/torque sensor measures the force and torque applied by the user. The movement system is a parallel manipulator. It ensures the force sense to the user by tracking the feedback reference generated by the internal cycle model. The movement system uses information from the tracking system (104) and tracks tools (100) without exceeding the visual model.
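The internal/external cycle structure described above can be sketched as two cooperating functions; the linear spring model for the external cycle and the proportional gain for the internal cycle are illustrative assumptions, not the invention's actual control law:

```python
def external_cycle_force(penetration_depth, stiffness=800.0):
    """External cycle: calculate the desired force sense from how far the
    tool tip has passed the virtual surface. A simple spring model is
    assumed; the stiffness value (N/m) is illustrative."""
    return stiffness * max(0.0, penetration_depth)

def internal_cycle_step(measured_force, desired_force, position_ref, gain=0.001):
    """Internal cycle: adjust the position reference of the parallel
    manipulator so that the measured force/torque sensor reading tracks
    the desired force (proportional correction; gain is illustrative)."""
    error = desired_force - measured_force
    return position_ref + gain * error
```

Each control period, the external cycle turns the tracked penetration into a force command and the internal cycle nudges the end effector's position reference until the sensed force matches it.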
[24] Thus, during the user's interaction with the mechanical system, the mechanical system is covered by synthetic objects through the augmented reality glasses. When the user looks through the glasses, he/she sees the simulated environment, not the mechanical system. The haptic system applies the force sense and is designed such that it provides the same sense as touching the same object in the real world.
Advantageous Effects of Invention
[25] In the present haptic-supported simulation environments, the user makes direct contact with the haptic physical interface. Users are generally not content with this, which reduces the simulation reality and training quality. The haptic interface of the inventive product does not present a direct physical interface to the user. The force sense is formed by means of any tool desired by the user. The tool's location is detected by means of camera localization of the tool in the user's hand, and the end of the haptic system (end effector) tracks the detected location in real time. Thus, a realistic force sense is formed for the user employing his/her own tools. Furthermore, it has become an obligation to use high-precision haptic technologies so as to exhibit the tactual sense properly.
[26] Since it is necessary to interact with the real and virtual objects simultaneously so as to form an environment with a high level of reality, a simulation environment with an innovative design is formed, in which interaction is possible through real objects.
[27] In the state of the art, the lacking portion of augmented reality is that the force sense cannot be provided to the user. That is, physical simulation of the real environment is of great importance to the user. The individual should both experience the real environment visually and experience a real physical sense of touch in the simulation environment.
[28] The inventive system offers multiple advantages. They may be listed as below;
[29] -Interacting with the virtual scenario formed by means of using real objects in the simulation environment with an innovative and novel design. -Increasing the training precision and quality accordingly,
[30] -Providing a simulation environment, in which the real objects to be used are used without any dependency,
[31] -Forming a realistic force sense with real tools to be used by including in the virtual scenario,
[32] -Providing the sense, as if the user operates in the real environment with the real tools (toolset), by means of an innovative approach,
[33] -Forming the realistic force sense together with the haptic interface designed,
[34] -Developing the interactive model by means of rigid and deformable objects,
[35] -Eliminating the alignment errors,
[36] -Forming a more realistic simulation environment, since the haptic system is always under the visual model,
[37] -Requiring no calibration based on the tool,
[38] -Providing a more realistic simulation environment by forming virtual objects on the real environment,
Brief Description of Drawings
[41] Disclosure of the drawings annexed to the description is as below;
[42] Figure 1 illustrates the operation mode of the inventive product.
[43] Figure 2 illustrates an exemplary embodiment of the inventive product. Accordingly, a user operating on a haptic system (103) in a mixed reality environment formed through augmented reality glasses is also monitored on an external display.
[44] Figure 3 illustrates the operational mechanics of the inventive product. Accordingly, processes performed on the server (100) are output on the haptic system (103), the haptic software (202) operating on the server (100), and the augmented reality glasses (102) functioning as a mixed reality unit.
[45] Figure 4 illustrates the operation mode of the haptic system (103).
Description of Embodiments
[46] Correspondences of the reference signs used in drawings annexed in the description are as below;
[47] 100- Server
[48] 101- Tracking system
[49] 102- Augmented reality glasses
[50] 103- Haptic system
[51] 201- Augmented reality model
[52] 202- Haptic software
[53] 203- Tracking model
[54] 204- User

Claims

[Claim 1] Haptic interface designed to provide a 3-D sense, characterized by
a. being comprised of a server and a tracking system operating on the server, an augmented reality glasses, and at least one haptic embodiment;
b. tracking the tool at 6 degrees of freedom by the tracking system; transferring the data to the server as a six-tuple;
c. calculating the projection of the tool end on the virtual object by means of the tracking data received from the server through the tracking model;
d. detecting the projection point in case of intersection of the tool end location and the virtual object surface; directing the haptic end to the related point by the server sending the location instruction to the haptics at this point;
e. giving a physical response to the user's tool by the directed haptic end, and thus providing a surface sense to the user by performing a 3-D surface recognition.
[Claim 2] Haptic interface according to Claim 1, characterized by
a. operating three motors, namely the internal cycle, external cycle, and end effector motors, while producing the reciprocation movement by means of the tool tracking model on the server;
b. receiving and processing the force/torque data applied by the user through the internal cycle;
c. calculating the desired force sense through the external cycle model;
d. instructing, by the end effector motor, a response to the user based on outputs of the internal cycle motor at the projection point;
e. providing a realistic surface sense by means of the coordinated operation of said three motors.
[Claim 3] The product according to Claim 2, characterized by comprising additional augmented reality glasses and displaying the virtual object of the augmented reality glasses.
[Claim 4] A haptic embodiment according to Claim 1, characterized by
a. comprising a tool tracking system, wherein said tool tracking system is preferably a stereo camera;
b. reading the tool location through the tool tracking system at 6 degrees of freedom;
c. transferring the read location;
d. generating the instruction comprising the movement as a response to the tool location with the pre-saved scenario on the server by means of the tool tracking model, and sending the same to the haptics;
e. following respectively the physical reaction steps to the user by the haptics receiving the location, and repeating the same at a predetermined interval.
PCT/TR2020/051098 2019-12-31 2020-11-13 Tool-independent 3-dimensional surface haptic interface WO2021230834A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
TR2019/22853A TR201922853A1 (en) 2019-12-31 2019-12-31 THREE-DIMENSIONAL SURFACE HAPTIC INTERFACE FOR MIXED REALITY SYSTEMS
TR2019/22853 2019-12-31
TR2019/22850 2019-12-31
TR2019/22850A TR201922850A1 (en) 2019-12-31 2019-12-31 VEHICLE INDEPENDENT HAPTIC INTERFACE

Publications (1)

Publication Number Publication Date
WO2021230834A1 true WO2021230834A1 (en) 2021-11-18

Family

ID=78524737

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/TR2020/051098 WO2021230834A1 (en) 2019-12-31 2020-11-13 Tool-independent 3-dimensional surface haptic interface

Country Status (1)

Country Link
WO (1) WO2021230834A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011116332A2 (en) * 2010-03-18 2011-09-22 SPI Surgical, Inc. Surgical cockpit comprising multisensory and multimodal interfaces for robotic surgery and methods related thereto
US20140336669A1 (en) * 2013-05-08 2014-11-13 Samsung Electronics Co., Ltd. Haptic gloves and surgical robot systems
US20180049622A1 (en) * 2016-08-16 2018-02-22 Insight Medical Systems, Inc. Systems and methods for sensory augmentation in medical procedures
US20180293802A1 (en) * 2017-04-07 2018-10-11 Unveil, LLC Systems and methods for mixed reality medical training



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20935490

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20935490

Country of ref document: EP

Kind code of ref document: A1