CN114520040A - Operation teaching auxiliary system applying virtual reality and method thereof

Info

Publication number
CN114520040A
CN114520040A (application number CN202011292855.8A)
Authority
CN
China
Prior art keywords
dimensional
surgical
training
virtual reality
time point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011292855.8A
Other languages
Chinese (zh)
Inventor
刘伟民
陈菁徽
李宇倢
沈易达
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Taipei Medical University TMU
Original Assignee
Taipei Medical University TMU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Taipei Medical University TMU filed Critical Taipei Medical University TMU
Priority to CN202011292855.8A
Publication of CN114520040A
Legal status: Pending


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance, relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 Electrically-operated educational appliances
    • G09B5/02 Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Urology & Nephrology (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Educational Technology (AREA)
  • Surgery (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An operation teaching auxiliary system applying virtual reality, and a method thereof, establish a virtual reality surgical environment by three-dimensional reconstruction from a two-dimensional surgical image, with the corresponding operation steps, instrument-use time points and reference information preset.

Description

Operation teaching auxiliary system applying virtual reality and method thereof
Technical Field
The invention relates to an operation teaching auxiliary system and a method thereof, in particular to an operation teaching auxiliary system applying virtual reality and a method thereof.
Background
In recent years, with the popularization and rapid development of virtual reality, applications built on it have sprung up everywhere. Beyond its familiar use in games, virtual reality is now commonly applied in teaching, for example in simulation-based training for surgery, aviation flight, and the like.
Traditionally, surgical training or teaching is carried out on cadavers or animals, an approach that is not only costly but also irreversible: a specimen cannot be restored once operated on. Manufacturers have therefore applied virtual reality to surgical teaching, for example by simulating human organs or simply demonstrating the surgical procedure in virtual reality. However, the organs simulated at present are still not realistic enough for teaching or training, and the freedom of operation that virtual reality provides is likewise insufficient, so its effect in surgical teaching remains limited.
Accordingly, manufacturers have proposed techniques that enhance fidelity and freedom of operation to deepen the immersive experience. Yet high fidelity and high freedom alone do not improve learning outcomes: a trainee who does not know what to do next gains little, however realistic and unconstrained the operating environment may be. If the system instead actively detects whether assistance is needed and intervenes with timely support and guidance, a genuine improvement in surgical learning can be expected.
In view of the above, the prior art has long suffered from poor learning effectiveness in virtual reality surgical training, and an improved technical means is needed to solve this problem.
Disclosure of Invention
The invention discloses an operation teaching auxiliary system applying virtual reality and a method thereof.
First, the present invention discloses an operation teaching assistance system using virtual reality, which comprises: an operation image database, an establishing module, a training module and a support module. The operation image database stores two-dimensional surgical images, each corresponding to an operation type name, operation steps, instrument-use time points and reference information. The establishing module is connected with the operation image database; before surgical training begins, it receives the operation type name to select and load the corresponding two-dimensional surgical image, and establishes a virtual reality surgical environment from the loaded image by three-dimensional reconstruction. The training module is connected with the establishing module; it allows surgical training once the virtual reality surgical environment is established, continuously detects the surgical operation behavior during the training, and generates a corresponding time point message according to the progress of the training when the behavior is abnormally suspended or delayed. The support module is connected with the training module; it plays the loaded two-dimensional surgical image according to the generated time point message, and synchronously displays the operation steps, instrument-use time points and reference information for auxiliary support and guidance.
In addition, the invention also discloses an operation teaching assistance method using virtual reality, comprising the following steps: providing a plurality of two-dimensional surgical images, each corresponding to an operation type name, operation steps, instrument-use time points and reference information; before surgical training, receiving the operation type name to select and load the corresponding two-dimensional surgical image, and establishing a virtual reality surgical environment from the loaded image by three-dimensional reconstruction; after the virtual reality surgical environment is established, allowing surgical training, continuously detecting the surgical operation behavior during the training, and generating a corresponding time point message according to the progress of the training when the behavior is abnormally suspended or delayed; and playing the loaded two-dimensional surgical image according to the generated time point message, and synchronously displaying the operation steps, instrument-use time points and reference information for auxiliary support and guidance.
The system and the method disclosed by the invention differ from the prior art in that a virtual reality surgical environment is established by three-dimensional reconstruction from a two-dimensional surgical image, with the corresponding operation steps, instrument-use time points and reference information preset; the surgical operation behavior is continuously detected during training in virtual reality, and when an abnormal suspension or delay of the behavior is detected, the corresponding auxiliary support and guidance are triggered according to the progress of the training.
Through this technical means, the invention achieves the technical effect of improving the effectiveness of surgical learning in virtual reality.
Drawings
Fig. 1 is a system block diagram of an operation teaching assistance system using virtual reality according to the present invention.
Fig. 2A and 2B are flow charts of the method of the surgical teaching assistance method using virtual reality according to the present invention.
Fig. 3 is a schematic view of a virtual reality surgical operation using the present invention.
FIG. 4 is a schematic diagram illustrating the operation of detecting abnormal behavior or delay of surgical operation according to the present invention.
The reference numbers are as follows:
110 database of surgical images
120 building module
130 training module
140 support module
300 graphical user interface
301 first display area
302 warning graphic representation
303 second display area
304 time point display block
310a,310b controller
311a,311b virtual surgical instrument
320 surgical instrument icon
Step 210, providing a plurality of two-dimensional surgical images, each corresponding to an operation type name, operation steps, instrument-use time points and reference information
Step 220, before performing surgical training, receiving the operation type name to select and load the corresponding two-dimensional surgical image, and establishing a virtual reality surgical environment from the loaded image by three-dimensional reconstruction
Step 230, after the virtual reality surgical environment is established, allowing surgical training, continuously detecting a surgical operation behavior during the training, and generating a corresponding time point message according to the progress of the training when the behavior is abnormally suspended or delayed
Step 231, after the virtual reality surgical environment is established, allowing surgical training and continuously detecting the surgical operation behavior during the training
Step 232, inputting the detected surgical operation behavior into a machine learning model trained in advance, so as to identify through the model whether the behavior is abnormally suspended or delayed, the trained model being generated by continuously feeding surgical operation behaviors free of suspension or delay into an untrained machine learning model
Step 240, playing the loaded two-dimensional surgical image according to the generated time point message, and synchronously displaying the operation steps, instrument-use time points and reference information for auxiliary support and guidance
Detailed Description
The embodiments of the present invention will be described in detail with reference to the drawings and examples, so that how to implement the technical means for solving the technical problems and achieving the technical effects of the present invention can be fully understood and implemented.
First, before describing the operation teaching assistance system and method disclosed in the present invention, the environment in which the invention is applied is explained. The invention is applied to virtual reality: a computer-simulated virtual world that provides the user with sensory (chiefly visual) simulation, so that the user feels immersed in it and can observe objects in the three-dimensional space in real time and without restriction.
Referring to fig. 1, fig. 1 is a block diagram of a surgical teaching assistance system using virtual reality, and the system comprises: a surgical image database 110, an establishing module 120, a training module 130 and a support module 140. The surgical image database 110 stores two-dimensional surgical images, each corresponding to an operation type name, operation steps, instrument-use time points and reference information. For example, a two-dimensional surgical image corresponding to the operation type name "hysteromyomectomy" also corresponds to its own operation steps, instrument-use time points and reference information. The operation steps are the execution steps of the surgical procedure; the instrument-use time points record the various surgical instruments the procedure requires and the times at which each of them is used; and the reference information records the various matters that require attention during the procedure.
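The database record just described can be sketched as a simple data structure. All names, field layouts and the linear-search lookup below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass

@dataclass
class SurgicalImageRecord:
    """One entry of the surgical image database: a two-dimensional
    surgical video plus the metadata the patent associates with it."""
    procedure_name: str    # operation type name, e.g. "hysteromyomectomy"
    operation_steps: list  # ordered textual execution steps
    instrument_use: list   # (time_point_seconds, instrument_name) pairs
    reference_info: str    # cautions, normal physiological data, etc.
    video_path: str = ""   # where the 2-D video itself is stored

def find_record(database, procedure_name):
    """Select the record matching the entered operation type name
    (the selection performed before training in step 220)."""
    for record in database:
        if record.procedure_name == procedure_name:
            return record
    return None
```

With records of this shape, loading the image for training reduces to `find_record(db, "hysteromyomectomy")` followed by reading `video_path`.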
The establishing module 120 is connected to the surgical image database 110; before surgical training begins, it receives the operation type name to select and load the corresponding two-dimensional surgical image, and establishes a virtual reality surgical environment from the loaded image by three-dimensional reconstruction. In practice, three-dimensional reconstruction (3D Reconstruction) refers to the mathematical process and computer technology of recovering the three-dimensional information (shape, etc.) of an object from its two-dimensional projections or images; it can be implemented by photometric, geometric or deep-learning algorithms, thereby converting the two-dimensional surgical image into three-dimensional information for virtual reality.
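The patent names photometric, geometric and deep-learning algorithms without fixing one. Purely as a toy illustration of the geometric family, the sketch below performs shape-from-silhouette voxel carving, one classic way of recovering a 3-D volume from 2-D images; it is not the patent's actual reconstruction method, and the axis-aligned silhouette encoding is an assumption for brevity:

```python
import numpy as np

def voxel_carve(silhouettes, size=16):
    """Minimal shape-from-silhouette carving. `silhouettes` maps an
    axis index (0, 1 or 2) to a 2-D boolean mask; a voxel survives
    only if its projection along every given axis lies inside the
    corresponding mask."""
    volume = np.ones((size, size, size), dtype=bool)
    for axis, mask in silhouettes.items():
        # Broadcast the 2-D mask along the projection axis and
        # intersect it with the current volume.
        proj = np.expand_dims(mask, axis=axis)
        volume &= np.broadcast_to(proj, volume.shape)
    return volume
```

Carving a 2x2 square from three orthogonal views yields a 2x2x2 cube of surviving voxels, which is the expected intersection of the three silhouette cones.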
The training module 130 is connected to the establishing module 120; it allows surgical training once the virtual reality surgical environment is established, continuously detects the surgical operation behavior during the training, and generates a corresponding time point message according to the progress of the training when the behavior is abnormally suspended or delayed. In practice, surgical operation behaviors free of abnormal suspension or delay can first be fed into a machine learning model for training; once training is complete, the detected surgical operation behavior is input into the model to identify whether it is abnormally suspended or delayed. In addition, the surgical operation behavior can be detected through at least one of a motion sensor, a wrist-worn device and a hand dynamic input device.
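The patent leaves the machine learning model unspecified beyond the fact that it is trained only on normal (non-suspended, non-delayed) behavior, which is a one-class, anomaly-detection setup. The sketch below substitutes a deliberately simple statistical stand-in: it learns the distribution of inter-action gaps from normal training runs and flags a gap as a delay when it exceeds the mean plus k standard deviations. The class name, the gap-based feature and the thresholding rule are all assumptions, not the patent's model:

```python
import statistics

class DelayDetector:
    """One-class stand-in for step 232: fit on gaps (seconds between
    successive surgical actions) observed in runs with no suspension
    or delay, then flag an unusually long gap as abnormal."""

    def __init__(self, k=3.0):
        self.k = k          # sensitivity: threshold = mean + k * stdev
        self.mean = None
        self.stdev = None

    def fit(self, normal_gaps):
        """Train only on normal behavior, as the patent describes."""
        self.mean = statistics.fmean(normal_gaps)
        self.stdev = statistics.pstdev(normal_gaps)

    def is_delayed(self, gap):
        """True if this gap looks like an abnormal suspension/delay."""
        return gap > self.mean + self.k * self.stdev
```

A production system would likely use a richer model (e.g. over instrument trajectories), but the train-on-normal-only structure is the same.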
The support module 140 is connected to the training module 130; it plays the loaded two-dimensional surgical image according to the generated time point message, and synchronously displays the operation steps, instrument-use time points and reference information for auxiliary support and guidance. For example, if the generated time point message is "ten minutes and thirty seconds", the support module 140 starts playing the loaded two-dimensional surgical image from ten minutes and thirty seconds, and synchronously displays the corresponding operation steps, instrument-use time points and reference information. In other words, when the user's surgical operation behavior is abnormally suspended or delayed, the support module 140 displays that time point together with the subsequent operation steps and instrument-use time points, and provides the normal physiological data, points of attention and so on at that time point to the user as reference information.
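Assuming the time point message encodes minutes and seconds (the patent's example is "ten minutes and thirty seconds"), the playback seek and the synchronous step display could be sketched as below. The "MM:SS" encoding and both function names are illustrative assumptions:

```python
def seek_offset(timepoint_message):
    """Convert an 'MM:SS' time point message (e.g. '10:30' for ten
    minutes and thirty seconds) into the second offset from which the
    support module starts replaying the two-dimensional surgical video."""
    minutes, seconds = timepoint_message.split(":")
    return int(minutes) * 60 + int(seconds)

def steps_from(timed_steps, offset):
    """Keep the (time_point_seconds, step_text) entries at or after the
    offset, i.e. the operation steps still ahead of the trainee, for
    synchronous display next to the replayed video."""
    return [step for t, step in sorted(timed_steps) if t >= offset]
```

For the patent's example, `seek_offset("10:30")` gives the 630-second mark from which playback resumes.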
In practice, all the modules described in the present invention can be implemented in various manners, including software, hardware or any combination thereof. The invention can also be implemented partially or completely in hardware, for example by realizing one or more modules of the system as an integrated circuit chip, a System on Chip (SoC), a Complex Programmable Logic Device (CPLD), a Field Programmable Gate Array (FPGA), and the like. The present invention may be a system, a method and/or a computer program. The computer program may include a computer readable storage medium bearing computer readable program instructions that cause a processor to implement various aspects of the present invention. The computer readable storage medium may be a tangible device that holds and stores the instructions for use by an instruction execution device; it may be, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) include: a hard disk, random access memory, read-only memory, flash memory, a compact disk, a floppy disk, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical signals through a fiber optic cable), or electrical signals transmitted through a wire.
Additionally, the computer readable program instructions described herein may be downloaded to the respective computing/processing devices from a computer readable storage medium, or over a network, for example the internet, a local area network, a wide area network and/or a wireless network, to an external computer device or external storage device. The network may comprise copper transmission cables, optical fiber transmission, wireless transmission, routers, firewalls, switches, hubs and/or gateways. A network card or network interface in each computing/processing device receives the computer readable program instructions from the network and forwards them for storage in a computer readable storage medium within that device. The computer program instructions that carry out the operations of the present invention may be assembly language instructions, instruction set architecture instructions, machine-dependent instructions, micro-instructions, firmware instructions, or object code written in any combination of one or more programming languages, including object-oriented languages such as Common Lisp, Python, C++, Objective-C, Smalltalk, Delphi, Java, Swift, C#, Perl, Ruby and PHP, as well as conventional procedural languages such as C or similar. The computer program instructions may execute entirely on the user's computer, partly on the user's computer, as stand-alone software, partly on a client computer and partly on a remote computer, or entirely on a remote computer or server.
Please refer to fig. 2A and fig. 2B, which are flowcharts of the surgical teaching assistance method using virtual reality according to the present invention. The method comprises the steps of: providing a plurality of two-dimensional surgical images, each corresponding to an operation type name, operation steps, instrument-use time points and reference information (step 210); before surgical training, receiving the operation type name to select and load the corresponding two-dimensional surgical image, and establishing a virtual reality surgical environment from the loaded image by three-dimensional reconstruction (step 220); after the virtual reality surgical environment is established, allowing surgical training, continuously detecting the surgical operation behavior during the training, and generating a corresponding time point message according to the progress of the training when the behavior is abnormally suspended or delayed (step 230); and playing the loaded two-dimensional surgical image according to the generated time point message, and synchronously displaying the operation steps, instrument-use time points and reference information for auxiliary support and guidance (step 240). Through these steps, a virtual reality surgical environment is established by three-dimensional reconstruction from a two-dimensional surgical image, with the corresponding operation steps, instrument-use time points and reference information preset; the surgical operation behavior is continuously detected during training in virtual reality, and when an abnormal suspension or delay is detected, the corresponding auxiliary support and guidance are triggered according to the progress of the training.
In addition, as illustrated in fig. 2B, step 230 can be further divided into two steps: after the virtual reality surgical environment is established, allowing surgical training and continuously detecting the surgical operation behavior during the training (step 231); and inputting the detected surgical operation behavior into a machine learning model trained in advance, so as to identify through the model whether the behavior is abnormally suspended or delayed, the trained model being generated by continuously feeding surgical operation behaviors free of suspension or delay into an untrained machine learning model (step 232).
The following description is made by way of example with reference to fig. 3 and 4. As shown in fig. 3, fig. 3 is a schematic view of a virtual reality surgical operation applying the present invention. Before performing the virtual reality operation, the user may input the operation type name through the controller 310b, such as "hysteromyomectomy", to load the corresponding two-dimensional surgical image from the surgical image database 110. The establishing module 120 then builds a virtual reality surgical environment from the loaded image by three-dimensional reconstruction; once it is built, the user can browse the virtual surgical environment through the first display area 301 of the graphical user interface 300 and control the virtual surgical instruments (311a, 311b) through the controllers (310a, 310b) to perform surgical training. Meanwhile, the training module 130 continuously detects the surgical operation behavior, and when the behavior is abnormally suspended or delayed it generates a corresponding time point message according to the progress of the training, so that the support module 140 can play the loaded two-dimensional surgical image according to that message. For example, if the surgical training has proceeded for ten minutes and thirty seconds when the abnormal suspension or delay is detected, the time point message is "ten minutes and thirty seconds"; the support module 140 then plays the loaded two-dimensional surgical image from ten minutes and thirty seconds and synchronously displays the operation steps, instrument-use time points and reference information for auxiliary support and guidance.
Please refer to fig. 4, which is a schematic diagram illustrating the detection of an abnormal suspension or delay of the surgical operation behavior according to the present invention. In practice, when the training module 130 detects the abnormal suspension or delay, as illustrated in fig. 4, the warning icon 302 is displayed, and the first display area 301, the second display area 303 and the time point display block 304 are shown simultaneously. The second display area 303 displays the loaded two-dimensional surgical image according to the time point message, synchronously showing the operation steps, instrument-use time points and reference information for auxiliary support and guidance; the time point display block 304 shows the time point message, for example fifteen minutes displayed as "15:00". In practice, the icons 320 of the surgical instruments to be used can also be displayed in the order of their instrument-use time points, so that the user is mentally prepared for switching instruments. In other words, when the training module 130 detects that the surgical operation behavior is abnormally suspended or delayed, it does not merely keep showing the first display area 301 and the virtual surgical environment within it; it supports and guides the user with various messages: the warning icon 302 alerts the user that the current operation behavior is abnormally suspended or delayed, the second display area 303 plays the two-dimensional surgical image corresponding to the time point of the suspension or delay, the time point display block 304 shows that time point, and the surgical instrument icons 320 still to be used are displayed in sequence from that time point onward.
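The sequential display of instrument icons 320 amounts to sorting the instrument-use time points and keeping those still ahead of the trainee. A minimal sketch, assuming the instrument-use time points are stored as (time_point_seconds, instrument_name) pairs; the function name and pair encoding are illustrative:

```python
def upcoming_instruments(instrument_use, current_time):
    """Sort (time_point_seconds, instrument_name) pairs and keep the
    instruments whose use time is still ahead of the trainee, giving
    the order in which their icons should be queued on screen."""
    return [name for t, name in sorted(instrument_use) if t >= current_time]
```

The UI would then render the icon for each returned name left to right, so the next instrument to pick up always appears first.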
Therefore, when the user is unsure of, or simply does not know, the subsequent steps, assistance and guidance can intervene in real time, greatly improving the effectiveness of surgical learning in virtual reality.
In summary, the present invention differs from the prior art in that a virtual reality surgical environment is established by three-dimensional reconstruction from a two-dimensional surgical image, with the corresponding operation steps, instrument-use time points and reference information preset; the surgical operation behavior is continuously detected during training in virtual reality, and when an abnormal suspension or delay of the behavior is detected, the corresponding auxiliary support and guidance are triggered according to the progress of the training.
Although the present invention has been described with reference to the foregoing embodiments, it should be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. A surgical teaching assistance system using virtual reality, comprising:
an operation image database for storing a plurality of two-dimensional operation images, wherein each two-dimensional operation image corresponds to an operation type name, an operation step, an instrument-use time point and a reference message;
the establishing module is connected with the operation image database and used for receiving the operation type name to select and load one of the corresponding two-dimensional operation images before operation training is carried out, and establishing a virtual reality operation environment in a three-dimensional reconstruction mode according to the loaded two-dimensional operation images;
the training module is connected with the establishing module and used for allowing the operation training after the virtual reality operation environment is established, continuously detecting an operation behavior during the operation training, and generating a corresponding time point message according to the progress of the operation training when the operation behavior is abnormally suspended or delayed; and
and the support module is connected with the training module and used for playing the loaded two-dimensional operation image according to the generated time point information and synchronously displaying the operation step, the time point of using the instrument and the reference information so as to perform auxiliary support and guidance.
2. The system of claim 1, wherein the training module inputs the detected surgical operation behavior into a machine learning model trained in advance, so as to identify through the machine learning model whether the detected behavior is abnormally suspended or delayed.
3. The surgical teaching assistance system according to claim 2, wherein the trained machine learning model is generated by continuously inputting surgical operation behaviors free of abnormal suspension or delay into the untrained machine learning model for training in advance.
4. The system of claim 1, wherein the three-dimensional reconstruction is computed by at least one of photometric, geometric and deep learning methods, so as to convert the two-dimensional surgical image into a three-dimensional virtual reality message.
5. The system of claim 1, wherein the surgical operation behavior is detected by at least one of a motion sensor, a wrist-worn device, and a hand-dynamic input device.
6. A surgical teaching assistance method using virtual reality, comprising the following steps:
providing a plurality of two-dimensional surgical images, wherein each two-dimensional surgical image corresponds to a surgery name, a surgical step, an instrument-use time point, and a reference message;
before surgical training, receiving the surgery name to select and load the corresponding one of the two-dimensional surgical images, and establishing a virtual reality surgical environment through three-dimensional reconstruction according to the loaded two-dimensional surgical image;
after the virtual reality surgical environment is established, allowing surgical training, continuously detecting a surgical operation behavior during the surgical training, and generating a corresponding time point message according to the progress of the surgical training when the surgical operation behavior is abnormally suspended or delayed; and
playing the loaded two-dimensional surgical image according to the generated time point message, and synchronously displaying the surgical step, the instrument-use time point, and the reference message to provide auxiliary support and guidance.
7. The method of claim 6, further comprising the step of inputting the detected surgical operation behavior into a pre-trained machine learning model, so that the machine learning model identifies whether the detected surgical operation behavior is abnormally suspended or delayed.
8. The method of claim 7, wherein the trained machine learning model is generated by inputting, in advance, surgical operation behaviors without abnormal suspension or delay into an untrained machine learning model for continuous training.
9. The method of claim 6, wherein the three-dimensional reconstruction converts the two-dimensional surgical image into a three-dimensional virtual reality message by computing with at least one of a photometric method, a geometric method, and deep learning.
10. The method of claim 6, wherein the surgical operation behavior is detected by at least one of a motion sensor, a wrist-worn device, and a hand-motion input device.
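The final support step of claim 6 — playing the loaded two-dimensional surgical image according to the generated time point message while synchronously displaying the step, instrument-use time point, and reference message — can be sketched as a lookup over an annotation track. The track contents, timings, and field names below are hypothetical placeholders.

```python
import bisect

# Hypothetical annotation track for one loaded two-dimensional surgical
# image: each entry is (start_second, surgical_step, instrument, reference).
ANNOTATIONS = [
    (0,   "disinfection", "swab",          "prep checklist"),
    (120, "incision",     "scalpel",       "incision depth guide"),
    (480, "suturing",     "needle holder", "suture pattern chart"),
]

def support_at(time_point):
    """Given the time point message generated when training stalls, return
    the playback offset plus the step, instrument, and reference message
    to display synchronously alongside the video."""
    starts = [start for start, *_ in ANNOTATIONS]
    i = bisect.bisect_right(starts, time_point) - 1  # last segment begun
    start, step, instrument, reference = ANNOTATIONS[max(i, 0)]
    return {"seek_to": start, "step": step,
            "instrument": instrument, "reference": reference}
```

Seeking to the start of the segment containing the stall point, rather than to the stall point itself, lets the trainee replay the whole problematic step with its matching guidance.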
CN202011292855.8A 2020-11-18 2020-11-18 Operation teaching auxiliary system applying virtual reality and method thereof Pending CN114520040A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011292855.8A CN114520040A (en) 2020-11-18 2020-11-18 Operation teaching auxiliary system applying virtual reality and method thereof

Publications (1)

Publication Number Publication Date
CN114520040A true CN114520040A (en) 2022-05-20

Family

ID=81595343

Country Status (1)

Country Link
CN (1) CN114520040A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114300093A (en) * 2021-12-27 2022-04-08 北京众森信和科技有限公司 Interactive operation teaching method and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105788390A (en) * 2016-04-29 2016-07-20 吉林医药学院 Medical anatomy auxiliary teaching system based on augmented reality
CN205569066U (en) * 2016-03-07 2016-09-14 上海盟云移软网络科技股份有限公司 Virtual reality medical system
CN106203626A (en) * 2016-06-30 2016-12-07 北京奇虎科技有限公司 Vehicle driving behavior detection method and device, and vehicle
CN106901834A (en) * 2016-12-29 2017-06-30 陕西联邦义齿有限公司 Preoperative planning and surgical virtual reality simulation method for minimally invasive cardiac surgery
CN110400620A (en) * 2019-07-25 2019-11-01 上海交通大学医学院附属上海儿童医学中心 Cardiac three-dimensional model construction method and simulated cardiac surgery guidance system
CN110796739A (en) * 2019-09-27 2020-02-14 哈雷医用(广州)智能技术有限公司 Virtual reality simulation method and system for craniocerebral operation

Similar Documents

Publication Publication Date Title
EP3992918A1 (en) Method for generating 3d expression base, voice interactive method, apparatus and medium
JP2021103574A (en) Method, device, and electronic apparatus for training facial fusion model
CN113946211A (en) Method for interacting multiple objects based on metauniverse and related equipment
US20090311655A1 (en) Surgical procedure capture, modelling, and editing interactive playback
JP2022088529A (en) Sight tracking method and apparatus, model training method and apparatus, terminal device, computer-readable storage medium and computer program
US11647983B2 (en) Automating ultrasound examination of a vascular system
CN110070076B (en) Method and device for selecting training samples
CN110458441A (en) Checking method, device, system and the storage medium of quality inspection
CN113761796A (en) Simulation system and method of heart hemorrhage and hemostasis model based on virtual reality
CN114520040A (en) Operation teaching auxiliary system applying virtual reality and method thereof
CN112991208B (en) Image processing method and device, computer readable medium and electronic equipment
CN115761855B (en) Face key point information generation, neural network training and three-dimensional face reconstruction method
TWI765370B (en) Surgery teaching auxiliary system using virtual reality (vr) and method thereof
KR20230056004A (en) Apparatus and Method for Providing a Surgical Environment based on a Virtual Reality
US20230334998A1 (en) Surgical teaching auxiliary system using virtual reality and method thereof
CN110135583A (en) The generation method of markup information, the generating means of markup information and electronic equipment
CN110209751A (en) Route sharing method and device suitable for Driving Test application
WO2022063030A1 (en) Audio-visual interaction with implanted devices
US20150082244A1 (en) Character input device using event-related potential and control method thereof
CN115564897A (en) Intelligent magnetic resonance holographic imaging method and system
CN115575931A (en) Calibration method, calibration device, electronic equipment and storage medium
CN113805704A (en) Vision treatment method and system based on VR technology
TWI765369B (en) Decision support system for surgical based on augmented reality (ar) and method thereof
US20230329806A1 (en) Surgical decision support system based on augmented reality (ar) and method thereof
CN114550876A (en) Operation decision support system and method based on augmented reality

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination