CN210378506U - Orbit operation training system based on virtual reality - Google Patents

Orbit operation training system based on virtual reality

Info

Publication number
CN210378506U
CN210378506U (application CN201920564210.1U)
Authority
CN
China
Prior art keywords
virtual reality
unit
force feedback
training system
dimensional
Prior art date
Legal status
Active
Application number
CN201920564210.1U
Other languages
Chinese (zh)
Inventor
范先群
宋雪霏
李寅炜
孙柔
周慧芳
贾仁兵
Current Assignee
Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Original Assignee
Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Priority date
Filing date
Publication date
Application filed by Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine filed Critical Ninth Peoples Hospital Shanghai Jiaotong University School of Medicine
Priority to CN201920564210.1U
Application granted
Publication of CN210378506U
Legal status: Active

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The application provides a virtual reality-based orbital surgery training system. The system comprises: a three-dimensional model library, a VR wearing unit, and an operation unit comprising a force feedback unit and a positioning unit. The system standardizes surgical training, offers a high degree of simulation and good interactivity, requires only moderately priced basic equipment, is convenient for large-scale popularization and application, and can further provide feasibility suggestions for real clinical plans.

Description

Orbit operation training system based on virtual reality
Technical Field
The present application relates to the technical field of virtual-reality-based human-computer interaction, and in particular to a virtual reality-based orbital surgery training system.
Background
Tumors, inflammation, and trauma can damage the orbital structure, causing eyeball displacement, double vision, visual deterioration, and even blindness; surgery is the main treatment for orbital disease. However, the orbital anatomy is complex, the space is narrow, important structures are densely packed within the orbit, the surgical field is poor, exposure is difficult, and operations carry high risk, great difficulty, and low accuracy. The most advanced endoscopic navigation technology available to orbital surgeons only solves the problem that the deep orbit cannot be directly viewed and localized; how to transfer the preoperative plan to the intraoperative operation accurately, stably, and safely remains an urgent clinical problem. Accumulating surgical experience is the only way to improve the accuracy of surgical path selection and of intraoperative maneuvers such as incision, osteotomy, reduction, fixation, grinding, and drilling, and thereby the accuracy and safety of orbital surgery. However, the high risk of orbital surgery and its demanding technical requirements make the learning curve of the orbital surgeon very steep; the slow accumulation of clinical experience severely limits the growth of orbital surgeons and, in turn, the treatment level of orbital disease, creating a huge social burden. Existing training systems generally suffer from one or more deficiencies, such as lack of specificity, lack of standardization, low fidelity, poor interactivity, absence of tactile feedback, absence of an alarm mechanism, absence of real-time guidance, absence of personalized training for special cases, and high manufacturing cost, which limit their application value and popularization.
Owing to insufficient clinical specialization, insufficient technical depth, and insufficient collaboration, most research and development has stopped at the level of simple graphical display, with poor usability. A reproducible and valuable training device for orbital surgeons is therefore highly desirable in the industry.
SUMMARY OF THE UTILITY MODEL
In view of the above-mentioned drawbacks of the prior art, the technical problem to be solved by the present application is to provide a virtual reality-based orbital surgery training system for solving the prior art problems.
To achieve the above and other related objects, the present application provides a virtual reality-based orbital surgery training system, comprising: a three-dimensional model library for storing one or more three-dimensional tissue and organ models; a VR wearing unit for constructing a virtual reality environment from the selected three-dimensional tissue and organ model and providing the corresponding virtual display; and an operation unit, comprising a force feedback unit and a positioning unit, for operating on the three-dimensional tissue and organ model in the virtual reality environment, the real-time three-dimensional positioning information obtained by the positioning unit serving as the basis for the force feedback unit to generate the corresponding force feedback perception. The positioning unit comprises an inductive sensor; the precise position of the operation unit is tracked in real time via the positioning unit, and the operation trajectory of the whole training process is obtained, for use in guidance prompts or evaluation/scoring.
In an embodiment of the present application, the operation unit is a force feedback operation rod or a force feedback glove.
In an embodiment of the present application, the force feedback unit includes an effector that generates vibration of corresponding intensity to convey the corresponding force feedback perception.
In an embodiment of the present application, the three-dimensional tissue and organ models stored in the three-dimensional model library are pre-labeled with the position of each step in the corresponding surgical path, to serve as a reference for guidance prompts or evaluation/scoring.
In an embodiment of the present application, the inductive sensor is an infrared sensor or a magnetic sensor.
In an embodiment of the present application, the operation unit further includes a communication unit, configured to transmit the three-dimensional positioning information and/or the control signals that cause the force feedback unit to generate the corresponding force feedback perception.
In an embodiment of the present application, the communication modes of the communication unit include any one or more of WIFI, NFC, Bluetooth, Ethernet, GSM, 4G and GPRS.
In an embodiment of the present application, the VR wearing unit is VR glasses or a VR helmet.
In an embodiment of the present application, the three-dimensional model library is further used for storing a surgical environment model and a surgical instrument model.
As described above, the present application provides a virtual reality-based orbital surgery training system. The system comprises: a three-dimensional model library for storing one or more three-dimensional tissue and organ models; a VR wearing unit for constructing a virtual reality environment from the selected three-dimensional tissue and organ model and providing the corresponding virtual display; and an operation unit, comprising a force feedback unit and a positioning unit, for operating on the three-dimensional tissue and organ model in the virtual reality environment, the real-time three-dimensional positioning information obtained by the positioning unit serving as the basis for the force feedback unit to generate the corresponding force feedback perception.
The following beneficial effects are achieved:
The application standardizes surgical training, offers a high degree of simulation and good interactivity, requires only moderately priced basic equipment, is convenient for large-scale popularization and application, and can further provide feasibility suggestions for real clinical plans.
Drawings
Fig. 1 is a schematic view of a virtual reality-based orbital surgery training system in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of a virtual reality-based orbital surgery training system in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application is provided by way of specific examples, and other advantages and effects of the present application will be readily apparent to those skilled in the art from the disclosure herein. The present application is capable of other and different embodiments and its several details are capable of modifications and/or changes in various respects, all without departing from the spirit of the present application. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings so that those skilled in the art to which the present application pertains can easily carry out the present application. The present application may be embodied in many different forms and is not limited to the embodiments described herein.
In order to clearly explain the present application, components that are not related to the description are omitted, and the same reference numerals are given to the same or similar components throughout the specification.
Throughout the specification, when a component is referred to as being "connected" to another component, this includes not only the case of being "directly connected" but also the case of being "indirectly connected" with another element interposed therebetween. In addition, when a component is referred to as "including" a certain constituent element, unless otherwise stated, it means that the component may include other constituent elements, without excluding other constituent elements.
When an element is referred to as being "on" another element, it can be directly on the other element, or intervening elements may also be present. When a component is referred to as being "directly on" another component, there are no intervening components present.
Although the terms first, second, etc. may be used herein to describe various elements in some instances, these elements should not be limited by these terms. These terms are only used to distinguish one element from another; for example, a first interface and a second interface. Also, as used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means "any of the following: A; B; C; A and B; A and C; B and C; A, B and C". An exception to this definition will occur only when a combination of elements, functions, steps or operations is inherently mutually exclusive in some way.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" include plural forms as long as the words do not expressly indicate a contrary meaning. The terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of other features, regions, integers, steps, operations, elements, and/or components.
Terms indicating relative spatial position, such as "lower" and "upper", may be used to more easily describe the relationship of one component to another as illustrated in the drawings. Such terms are intended to cover not only the orientations shown in the drawings but also other orientations of the device in use. For example, if the device in the figures is turned over, elements described as "below" other elements would then be oriented "above" them; the exemplary terms "under" and "beneath" can thus encompass both above and below. The device may be rotated 90° or to other angles, and the spatially relative terms are to be interpreted accordingly.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. Terms defined in commonly used dictionaries should additionally be interpreted as having meanings consistent with related technical documents and the content of the present disclosure, and must not be over-interpreted as having idealized or overly formal meanings unless so defined.
Practice is the ideal form of training for both resident physicians and medical students. However, opportunities to participate in surgery are scarce and cannot meet the large demand. Before the advent of virtual reality, benchtop simulators, video trainers, and the like were the best tools for trainees to improve their surgical skills. Virtual reality technology has the advantage of providing an immersive experience, letting the trainee learn in a "near-real" artificial environment. Several VR simulators have been introduced for robotic surgery, such as surgical education platforms, robotic surgery systems, the da Vinci Skills Simulator, and the recently introduced RobotiX Mentor. These simulators have gradually been put into use in urology, cardiovascular surgery, neurosurgery, and other fields.
However, because of its particularity, ophthalmology requires that an ophthalmic surgical robot provide high precision within an anatomically confined environment. Traditional surgical robots have seen limited application in ophthalmology: existing research on ophthalmic robots is confined to ocular-surface and intraocular surgery, no surgical robot has been applied to orbital disease, and no corresponding training system exists. Moreover, training orbital surgeons is difficult and lengthy, which greatly limits the cultivation of specialized talent. A training system for orbital surgery is therefore urgently needed to overcome this problem.
As mentioned above, orbital surgery involves complex operations and offers few surgical opportunities, so doctors grow slowly and accumulate experience with difficulty. In addition, physical simulation models cost hundreds of thousands and cannot be used for daily practice. To help doctors and medical students rapidly build clinical experience, the present application provides a virtual reality-based orbital surgery training system.
For ease of understanding, fig. 1 is a schematic view of a virtual reality-based orbital surgery training system according to an embodiment of the present application. As shown, the system mainly comprises: a computer 1, a virtual reality (VR) wearable device 2, and a force feedback operation device 3.
In brief, the computer 1 mainly stores the three-dimensional tissue and organ models, which can be constructed from CT or MRI scan data of the orbit.
For example, three-dimensional tissue and organ models covering different cases or disease conditions can be constructed from collected data of actual patients.
In addition, surgical environment models, such as an operating room, an operating table, and a hospital bed, may also be stored, as may models of different (virtual) surgical instruments, such as a scalpel, a saw, forceps, and a suture needle.
It should be noted that the main task of the computer 1 is to hold the various models rather than to build them; that is, the computer 1 stores models that have already been constructed.
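As an illustrative sketch only (the application treats model construction as offline prior art): segmenting bone voxels from a CT volume by density thresholding is typically the first step before meshing a tissue model. The function name and the toy volume below are assumptions, not part of the application:

```python
# Toy 3-D "CT volume" as nested lists of densities; bone voxels exceed a threshold.
def extract_bone_voxels(volume, threshold):
    """Return (x, y, z) coordinates of voxels whose density is >= threshold."""
    return [(x, y, z)
            for x, plane in enumerate(volume)
            for y, row in enumerate(plane)
            for z, density in enumerate(row)
            if density >= threshold]

# A 2x2x2 stand-in for real scan data; two voxels are dense enough to be "bone".
volume = [[[0, 0], [0, 900]], [[850, 0], [0, 0]]]
print(extract_bone_voxels(volume, 800))  # [(0, 1, 1), (1, 0, 0)]
```

In practice the segmented voxels would be meshed (e.g. by marching cubes) into the triangle models the library stores.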
The virtual reality (VR) wearable device 2 may be a VR helmet or VR glasses. The three-dimensional tissue and organ model (optionally together with the surgical environment and surgical instrument models) built and stored by the computer 1 is imported into the device 2, which then provides the virtual display; through the device 2, the surgical site (the orbit) can be observed from any angle.
The force feedback operation device 3 may be a force feedback operation rod, as shown in the figure, or a force feedback operation glove. A force feedback unit and a positioning unit located by an inductive sensor are arranged on the device 3, and by manipulating it an operator performs the corresponding operation in the virtual environment.
The computer 1 then obtains the real-time three-dimensional position of the force feedback operation device 3 and combines it with the virtual scene. When the three-dimensional position of the device 3 (or of a virtual or real surgical instrument) collides with the three-dimensional tissue and organ model in the virtual scene, the force that would actually be experienced and the corresponding deformation of the tissue and organ model are calculated. The corresponding force feedback perception is delivered through the device 3, enhancing the operator's sense of touch, while the corresponding deformation of the model provides near-real visual perception. The immersive experience is thereby greatly improved.
Specifically, the force feedback operation device 3 is fused into the virtual reality scene provided by the VR wearable device 2. The computer 1 performs the corresponding deformation and force calculations using prior-art collision detection techniques for three-dimensional scene models, and the force calculation result is converted into an instruction signal for the device 3, so that the device generates the corresponding force feedback perception.
Collision detection in current virtual reality scenes mainly uses two relatively efficient methods, spatial decomposition and hierarchical bounding boxes; it should be stated that the collision detection techniques employed are prior art.
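As an illustration of the hierarchical bounding-box idea, here is a minimal sketch of the axis-aligned bounding-box (AABB) overlap test that such hierarchies are built from. All class and variable names are hypothetical; a real system would test boxes arranged in a tree around the tissue model:

```python
class AABB:
    """Axis-aligned bounding box, the building block of a bounding-box hierarchy."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi  # (x, y, z) min and max corners

    def intersects(self, other):
        # Boxes overlap iff their extents overlap on every axis.
        return all(self.lo[i] <= other.hi[i] and other.lo[i] <= self.hi[i]
                   for i in range(3))

# Box around the tool tip vs. a box around part of the orbital bone model.
tool = AABB((9.5, 0.0, 0.0), (10.5, 1.0, 1.0))
bone = AABB((10.0, -5.0, -5.0), (20.0, 5.0, 5.0))
print(tool.intersects(bone))  # True: a hit triggers the force computation
```

Only when coarse boxes overlap would a real implementation descend the hierarchy to finer boxes and, finally, exact triangle tests.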
The present application emphasizes the technical effect achieved by the cooperation of the functions of each device or structure, rather than the specific interactive processing procedures, for which prior art is adopted.
Fig. 2 is a schematic diagram of a virtual reality-based orbital surgery training system according to an embodiment of the present application. As shown, the system 200 includes:
a three-dimensional model library 210 for constructing and storing one or more three-dimensional tissue/organ models according to the orbital scan data;
the VR wearing unit 220 is used for constructing a virtual reality environment according to the selected three-dimensional tissue organ model and providing corresponding virtual display;
an operation unit 230, comprising a force feedback unit 231 and a positioning unit 232, for operating on the three-dimensional tissue and organ model in the virtual reality environment; the real-time three-dimensional positioning information obtained by the positioning unit 232 serves as the basis for the force feedback unit 231 to generate the corresponding force feedback perception.
The operation unit 230 is preferably a force feedback operation rod (such as the force feedback operation device 3 shown in fig. 1) or a force feedback glove; the VR wearing unit 220 (the virtual reality wearing device 2 shown in fig. 1) is preferably VR glasses or a VR helmet.
In an embodiment of the present application, the force feedback unit 231 includes an effector, which generates vibration of corresponding intensity to convey the corresponding force feedback perception.
For example, the computer or processor may determine, from the real-time three-dimensional positioning information obtained by the positioning unit 232, whether the operation unit 230 has collided with the three-dimensional tissue and organ model. The determination may employ prior-art collision detection techniques for three-dimensional scene models to perform the corresponding deformation and force calculations; instruction information corresponding to the force calculation result is then transmitted to the effector of the force feedback unit 231, so that the effector vibrates to a corresponding degree and the operator feels the corresponding force feedback perception, thereby enhancing realism.
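The mapping from a computed contact force to an effector command could be sketched as follows. The 12 N ceiling echoes the continuous-force figure given in this description, while the 0-255 command range is purely an assumed device interface:

```python
def vibration_command(force_newtons, max_force=12.0, levels=255):
    """Map a computed contact force to a discrete effector vibration level.

    max_force matches the 12 N continuous force mentioned in the text;
    the 0-255 command range is an assumed device interface, not specified
    in the application.
    """
    clamped = max(0.0, min(force_newtons, max_force))
    return round(clamped / max_force * levels)

print(vibration_command(3.0))   # 64
print(vibration_command(20.0))  # 255 (saturates at the device maximum)
```

A real controller would likely also shape the signal over time (onset ramps, texture patterns) rather than emit a single static level.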
It should be noted that the present application emphasizes the technical effect of the cooperation of the functions of the devices or structures, and does not emphasize the specific interactive processing procedures, which are the prior art adopted by the interactive processing procedures.
In an embodiment of the present application, the positioning unit 232 includes an inductive sensor; the precise position of the operation unit 230 is tracked in real time via the positioning unit 232, and the operation trajectory of the whole training process is obtained, for use in guidance prompts or evaluation/scoring.
In an embodiment of the present application, the inductive sensor is preferably an infrared sensor or a magnetic sensor.
In addition, the inductive sensor may be a multifunctional sensor, or an infrared emitting and receiving assembly: markers that facilitate infrared sensing are provided on the operation unit 230 (rod or glove), and the sensing signal is received in real time, so that the real-time three-dimensional position coordinates of the operation unit 230 are detected.
In this embodiment, the positioning unit 232 tracks the precise position of the operation unit 230 in real time, and by integrating and analyzing these data the operation trajectory of the whole training process can be obtained.
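The trajectory acquisition described above can be sketched as a simple recorder that accumulates timestamped positions and derives a basic metric (path length) for later analysis; all names are illustrative, not part of the application:

```python
import time

class TrajectoryRecorder:
    """Accumulates timestamped 3-D positions reported by the positioning unit."""
    def __init__(self):
        self.samples = []

    def record(self, position, timestamp=None):
        # Use the supplied timestamp (e.g. from the sensor) or the wall clock.
        self.samples.append(
            (timestamp if timestamp is not None else time.time(), position))

    def path_length(self):
        """Total distance travelled, one simple input to later scoring."""
        pts = [p for _, p in self.samples]
        return sum(sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
                   for p, q in zip(pts, pts[1:]))

rec = TrajectoryRecorder()
for t, pos in enumerate([(0, 0, 0), (3, 4, 0), (3, 4, 12)]):
    rec.record(pos, timestamp=t)
print(rec.path_length())  # 17.0 (a 5-unit move followed by a 12-unit move)
```

Richer metrics (velocity profiles, time inside danger zones) could be computed from the same sample list.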
In an embodiment of the present application, the three-dimensional tissue and organ models stored in the three-dimensional model library 210 are pre-labeled with the position of each step in the corresponding surgical path, to serve as a reference for guidance prompts or evaluation/scoring.
That is, by labeling on the stored models the position of each step in the preset surgical path, various kinds of surgical planning training can be formed, serving as references for guidance prompts or evaluation/scoring.
For example, pre-labeling the surgical path on the three-dimensional tissue and organ model (down to the specific travel path of each pass) provides a basis for training guidance, judgment, and alarms on erroneous operation. The real-time three-dimensional position information acquired by the positioning unit 232 of the operation unit 230 is used to judge whether the operator's actions conform to the plan, so as to give timely guidance prompts or serve as a reference for evaluation/scoring.
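A minimal sketch of how deviation from the pre-labeled path might be judged for guidance prompts or alarms; the tolerance value and all names are assumptions, not figures from the application:

```python
def deviation_from_path(position, labeled_path):
    """Distance from the tool position to the nearest pre-labeled path point."""
    return min(sum((a - b) ** 2 for a, b in zip(position, point)) ** 0.5
               for point in labeled_path)

# Pre-set position labels along a planned surgical path (illustrative values).
path = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.5, 0.0)]

TOLERANCE_MM = 0.5  # assumed alarm threshold
pos = (1.0, 1.2, 0.0)
if deviation_from_path(pos, path) > TOLERANCE_MM:
    print("warning: tool has left the planned surgical corridor")
```

A scoring variant could accumulate these deviations over the whole recorded trajectory instead of alarming on a single sample.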
For example, classical surgical planning training may include surgical approaches such as: anterior orbitotomy, lateral orbitotomy, medial orbitotomy via the ethmoid sinus, combined lateral and medial orbitotomy, and transcranial orbitotomy; and, for specific surgical conditions, approaches to anterosuperior orbital lesions, mid-superior lesions, inferior lesions, mid-posterior lesions, lesions of and near the optic nerve, lesions of the inner orbit, lesions near the superior orbital fissure, and lesions within the orbit and its surrounding regions. This greatly enriches the content of surgical training and improves the practical experience of doctors or students.
It should be noted that haptic human-computer interaction is an important component of interaction in virtual reality. Precise force feedback elements can simulate various tactile phenomena and, after careful tuning, are expected to reproduce the real feedback of the various maneuvers in orbital surgery, thereby achieving practical training.
In the present embodiment, the operation unit 230 meets the specific requirements of orbital surgery simulation: a translation range of no less than 100 mm, rotation of no less than 180° in each direction, a continuous force of 12 N, and a grasping force of ±8 N; a linear displacement resolution below 0.01 mm, a rotational displacement resolution below 0.09°, and a grasping displacement resolution below 0.006 mm.
In this embodiment, the operation unit 230 may further include a communication unit to improve the accuracy and stability of transmitting the three-dimensional positioning information (the position information provided by the positioning unit 232) and/or the control signals that cause the force feedback unit to generate the corresponding force feedback perception (the effector vibration signals sent to the force feedback unit 231).
Specifically, the communication method of the communication unit includes: any one or more of WIFI, NFC, Bluetooth, Ethernet, GSM, 4G and GPRS.
In this embodiment, the network over which the communication unit operates may include any one or more of: the Internet, an intranet, a wide area network (WAN), a local area network (LAN), a wireless network, a digital subscriber line (DSL) network, a frame relay network, an asynchronous transfer mode (ATM) network, a virtual private network (VPN), and/or any other suitable communication network.
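As a sketch of how a positioning sample might be serialized for transmission over any of these channels, here is a round-trip through a compact binary format; the wire format (a little-endian sequence number plus three 32-bit floats) is an assumption, not part of the application:

```python
import struct

def pack_positioning_message(x, y, z, seq):
    """Serialize one positioning sample (assumed wire format: <Ifff)."""
    return struct.pack("<Ifff", seq, x, y, z)

def unpack_positioning_message(payload):
    """Inverse of pack_positioning_message."""
    seq, x, y, z = struct.unpack("<Ifff", payload)
    return seq, (x, y, z)

# Values chosen to be exactly representable as 32-bit floats.
msg = pack_positioning_message(1.5, -2.0, 0.25, seq=7)
print(unpack_positioning_message(msg))  # (7, (1.5, -2.0, 0.25))
```

The sequence number lets the receiver detect dropped or reordered samples on lossy links, which matters for the stability the text mentions.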
In an embodiment of the present application, the three-dimensional model library 210 is further used for storing a surgical environment model and a surgical instrument model.
In this embodiment, to make the virtual training environment more realistic, surgical environment models and surgical instrument models may be provided alongside the three-dimensional tissue and organ models.
For example, the surgical environment model includes an operating room, an operating table, a patient bed, and the like.
As another example, the surgical instruments may include: traction instruments, retractors, strippers, osteotomes, curettes, sinus forceps, rongeurs, Stryker saws, power systems, bone hammers, surgical microscopes and loupes, meningeal scissors, electric knives, electrocoagulators, electric drills, suction devices, ophthalmic scissors, ophthalmic tweezers, strabismus hooks, brain clamps, vascular forceps, needle holders, eyelid retractors, thyroid retractors, bone wax, knife handles, blades, and the like.
It should be noted that the surgical instruments here may be virtually constructed three-dimensional models, or actual surgical instruments or physical molds of surgical instruments, on which inductive sensors are disposed to obtain three-dimensional position information during operation.
In this embodiment, the three-dimensional model library 210 may include a Random Access Memory (RAM) or may further include a non-volatile memory (non-volatile memory), such as at least one disk memory.
In summary, the present application provides a training system for orbital surgery based on virtual reality. The system comprises: a three-dimensional model library for storing one or more three-dimensional tissue and organ models; the VR wearing unit is used for constructing a virtual reality environment according to the selected three-dimensional tissue organ model and providing corresponding virtual display; the operation unit comprises a force feedback unit and a positioning unit and is used for operating the three-dimensional tissue organ model in the virtual reality environment; and the real-time three-dimensional positioning information obtained by the positioning unit is used as a basis for enabling the force feedback unit to generate corresponding force feedback perception.
In summary, the present application effectively overcomes various disadvantages of the prior art and has a high industrial utility value.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (9)

1. A virtual reality based orbital surgery training system, the system comprising:
a three-dimensional model library for storing one or more three-dimensional tissue and organ models;
the VR wearing unit is used for constructing a virtual reality environment according to the selected three-dimensional tissue organ model and providing corresponding virtual display;
the operation unit comprises a force feedback unit and a positioning unit and is used for operating the three-dimensional tissue organ model in the virtual reality environment; the real-time three-dimensional positioning information obtained by the positioning unit is used as a basis for enabling the force feedback unit to generate corresponding force feedback perception;
the positioning unit comprises an induction sensor; and tracking the accurate position of the operation unit in real time according to the positioning unit and obtaining the operation track of the whole operation training process so as to guide prompt or evaluate/score.
2. The virtual reality-based orbital surgery training system of claim 1, wherein the operation unit is a force feedback joystick or a force feedback glove.
3. The virtual reality-based orbital surgery training system of claim 2, wherein the force feedback unit comprises an effector that generates a vibration of corresponding intensity to represent the corresponding force feedback sensation.
4. The virtual reality-based orbital surgery training system of claim 1, wherein the three-dimensional tissue and organ models stored in the three-dimensional model library carry preset position labels for each step of the corresponding surgical path, which serve as references for guidance prompts or evaluation/scoring.
5. The virtual reality-based orbital surgery training system of claim 1, wherein the inductive sensor is an infrared sensor or a magnetic sensor.
6. The virtual reality-based orbital surgery training system of claim 2, wherein the operation unit further comprises a communication unit for transmitting the three-dimensional positioning information and/or the control signals that cause the force feedback unit to generate the corresponding force feedback sensation.
7. The virtual reality-based orbital surgery training system of claim 6, wherein the communication means of the communication unit comprises any one or more of Wi-Fi, NFC, Bluetooth, Ethernet, GSM, 4G and GPRS.
8. The virtual reality-based orbital surgery training system of claim 2, wherein the VR wearing unit is a pair of VR glasses or a VR helmet.
9. The virtual reality-based orbital surgery training system of claim 1, wherein the three-dimensional model library is further configured to store a surgical environment model and a surgical instrument model.
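Claims 1 and 4 describe evaluating the recorded operation trajectory against position labels preset along the surgical path. The claims do not specify a scoring formula; the sketch below illustrates one plausible approach, where the mean deviation between each preset label and the closest trajectory point is mapped linearly onto a 0-100 score. The label coordinates, tolerance, and formula are illustrative assumptions only.

```python
import math

def score_trajectory(trajectory, labels, tolerance=5.0):
    """Score a recorded trajectory (list of (x, y, z) points, in mm)
    against preset position labels along the surgical path.

    Full marks (100) when each label is hit exactly; zero when the mean
    deviation reaches ``tolerance`` mm. Purely a hypothetical formula.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # For each preset label, find the closest recorded trajectory point.
    mean_dev = sum(min(dist(lab, p) for p in trajectory)
                   for lab in labels) / len(labels)
    return max(0.0, 100.0 * (1.0 - mean_dev / tolerance))

labels = [(0, 0, 0), (10, 0, 0)]
exact = [(0, 0, 0), (5, 0, 0), (10, 0, 0)]   # passes through both labels
offset = [(0, 1, 0), (10, 1, 0)]             # 1 mm off at each label
```

The same deviation signal could also drive the guidance prompts mentioned in the claims, e.g. warning the trainee when the instantaneous distance to the next label exceeds a threshold.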
CN201920564210.1U 2019-04-23 2019-04-23 Orbit operation training system based on virtual reality Active CN210378506U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201920564210.1U CN210378506U (en) 2019-04-23 2019-04-23 Orbit operation training system based on virtual reality


Publications (1)

Publication Number Publication Date
CN210378506U true CN210378506U (en) 2020-04-21

Family

ID=70258648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201920564210.1U Active CN210378506U (en) 2019-04-23 2019-04-23 Orbit operation training system based on virtual reality

Country Status (1)

Country Link
CN (1) CN210378506U (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109979600A (en) * 2019-04-23 2019-07-05 上海交通大学医学院附属第九人民医院 Orbital Surgery training method, system and storage medium based on virtual reality
CN112509410A (en) * 2020-12-08 2021-03-16 中日友好医院(中日友好临床医学研究所) Virtual reality-based auxiliary teaching system for hip arthroscopy operation
CN115294824A (en) * 2022-07-29 2022-11-04 上海市普陀区中心医院 Training method and system for ophthalmologic operation


Similar Documents

Publication Publication Date Title
US10437339B2 (en) Haptic augmented and virtual reality system for simulation of surgical procedures
US11642179B2 (en) Artificial intelligence guidance system for robotic surgery
US9563266B2 (en) Haptic augmented and virtual reality system for simulation of surgical procedures
US10601950B2 (en) Reality-augmented morphological procedure
Hunter et al. Ophthalmic microsurgical robot and associated virtual environment
Morris et al. A collaborative virtual environment for the simulation of temporal bone surgery
CN109979600A (en) Orbital Surgery training method, system and storage medium based on virtual reality
CN210378506U (en) Orbit operation training system based on virtual reality
Fiorini et al. Concepts and trends in autonomy for robot-assisted surgery
AU2022268383B2 (en) A system and method for interaction and definition of tool pathways for a robotic cutting tool
EP2277441A1 (en) Method for generating images of a human body zone undergoing a surgical operation by means of an apparatus for minimally invasive surgical procedures
EP3991014A1 (en) Virtual reality surgical training systems with advanced haptic feedback
EP3414753A1 (en) Autonomic goals-based training and assessment system for laparoscopic surgery
Wei et al. Augmented optometry training simulator with multi-point haptics
Cavusoglu Telesurgery and surgical simulation: Design, modeling, and evaluation of haptic interfaces to real and virtual surgical environments
Wang et al. Intelligent HMI in orthopedic navigation
Soldozy et al. Transsphenoidal surgery using robotics to approach the sella turcica: Integrative use of artificial intelligence, realistic motion tracking and telesurgery
KR20200080534A (en) System for estimating otorhinolaryngology and neurosurgery surgery based on simulator of virtual reality
US20220354579A1 (en) Systems and methods for planning and simulation of minimally invasive therapy
JP2004348091A (en) Entity model and operation support system using the same
CN112509410A (en) Virtual reality-based auxiliary teaching system for hip arthroscopy operation
Bao et al. Virtual Surgical Instruments and Surgical Simulation
Hao et al. Development of a multi-modal interactive system for Endoscopic Endonasal Approach surgery simulation
Wang et al. A simulation and training system of robot assisted surgery based on virtual reality
Jo et al. Development of Tissue-Tool Interaction Simulation Algorithms for Rotator Cuff Surgery Scenario in Arthroscopic Surgery Training Simulator

Legal Events

Date Code Title Description
GR01 Patent grant