CN107067856B - Medical simulation training system and method - Google Patents

Medical simulation training system and method

Info

Publication number
CN107067856B
CN107067856B
Authority
CN
China
Prior art keywords
data
dimensional
user
model
dimensional model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611266712.3A
Other languages
Chinese (zh)
Other versions
CN107067856A (en)
Inventor
马一方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201611266712.3A priority Critical patent/CN107067856B/en
Publication of CN107067856A publication Critical patent/CN107067856A/en
Application granted granted Critical
Publication of CN107067856B publication Critical patent/CN107067856B/en
Legal status: Active

Classifications

    • G — PHYSICS
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 — Simulators for teaching or training purposes
    • G09B23/00 — Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/28 — Models for scientific, medical, or mathematical purposes for medicine
    • G09B5/00 — Electrically-operated educational appliances
    • G09B5/02 — Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip


Abstract

The invention discloses a medical simulation training system and method. The system comprises: an object model construction module that collects three-dimensional model data of a specific part of the human body and constructs a corresponding three-dimensional virtual model object from the collected data; a model database that stores the three-dimensional virtual model object data; a sensor tracking module that collects the user's operation data on the physical three-dimensional model; a data processing module that processes the collected data into shape data of the three-dimensional model and the user's operation data on the model, and outputs both; an image generation module that generates a three-dimensional virtual image containing the simulated operation result from the shape data and the operation data; and a display module that displays and outputs the three-dimensional virtual image so the user can view the simulated operation result. The system and method of the embodiments offer strong immersion, reproduce real medical scenarios as faithfully as possible, and reduce the cost of clinical teaching and training for surgical operations, intravenous injection, and similar procedures.

Description

Medical simulation training system and method
Technical Field
The invention relates to the technical field of virtual reality, in particular to a medical simulation training system and method.
Background
In the medical field, traditional medical simulation training is rudimentary. Surgical practice teaching, for example, rarely reaches a clinically useful level: some medical institutions and hospitals teach with simulated rubber organs, and teaching activities such as surgical training, operation rehearsal, clinical diagnosis, remote surgery, and rehabilitation therapy rely on explanation with real organ specimens, autopsy teaching, observation of expert operations, and the like.
These approaches have several disadvantages. 1. In rubber-organ teaching, the rubber material ages easily and must be replaced after repeated use; since rubber organs are comparatively expensive, frequent replacement of teaching aids greatly increases teaching cost. 2. Autopsy teaching, in which students learn by dissecting cadavers in a laboratory, is very costly, and the large differences between cadavers and living bodies prevent students from learning intuitively or practicing hands-on. 3. In expert-observation teaching, trainees learn actual surgical operations through sight, hearing, touch, and other senses; but young doctors and students cannot operate in person or compare their own technique with the expert's in time, so errors and shortcomings in their technique go undetected. Training young medical personnel this way consumes large amounts of money and time without achieving a good training effect, and training efficiency suffers severely.
In summary, existing medical simulation training schemes suffer from high training cost, poor training effect, and low efficiency.
Disclosure of Invention
The invention provides a medical simulation training system and method, aiming to solve the technical problems of high training cost, poor training effect, and low efficiency in existing medical simulation training schemes.
According to an aspect of the present invention, there is provided a medical simulation training system, the system comprising:
the object model building module is used for collecting three-dimensional model data of a specific part of a human body, building a corresponding three-dimensional virtual model object based on the collected three-dimensional model data, and sending the built three-dimensional virtual model object to a model database;
a model database for storing three-dimensional virtual model object data;
a sensor tracking module for collecting real-time position data and pressure data of a hand-held instrument through sensors arranged on the instrument, and for tracking position data and pressure data of the three-dimensional model through sensors embedded in the three-dimensional model on which the instrument acts;
the data processing module is used for processing the position data and the pressure data acquired by the sensor tracking module to obtain shape data of the three-dimensional model and operation data of a user on the three-dimensional model, and outputting the shape data of the three-dimensional model and the operation data of the user on the three-dimensional model to the image generation module;
the image generation module is used for generating a three-dimensional virtual image comprising simulation operation result information according to the shape data of the three-dimensional model and the operation data of the three-dimensional model by the user and sending the generated three-dimensional virtual image to the display module;
and the display module is used for displaying and outputting the three-dimensional virtual image for a user to check the simulation operation result.
According to another aspect of the present invention, there is provided a medical simulation training method, the method comprising:
acquiring three-dimensional model data of a specific part of a human body, and constructing a corresponding three-dimensional virtual model object based on the acquired three-dimensional model data;
storing the constructed three-dimensional virtual model object;
collecting real-time position data and pressure data of a hand-held instrument through sensors arranged on the instrument, and tracking position data and pressure data of the three-dimensional model through sensors embedded in the three-dimensional model on which the instrument acts;
processing the collected position data and pressure data to obtain shape data of the three-dimensional model and operation data of a user on the three-dimensional model;
generating a three-dimensional virtual image comprising simulation operation result information according to the shape data of the three-dimensional model and the operation data of the three-dimensional model by a user;
and displaying and outputting the three-dimensional virtual image for a user to check the simulation operation result.
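The method steps above can be sketched end to end in code. The following is a minimal illustrative sketch, not the patent's implementation: the sensor sample fields, the 0.1 "stiffness" constant relating pressure to deformation depth, and the string-based render stand-in are all assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    instrument_pos: tuple       # (x, y, z) position of the held instrument
    instrument_pressure: float  # pressure reading from the instrument sensor
    model_pos: tuple            # contact point reported by the model's sensors
    model_pressure: float       # pressure reading inside the model

def process_sample(sample: SensorSample) -> dict:
    # Hypothetical processing rule: deformation depth proportional to pressure.
    depth = sample.model_pressure * 0.1  # assumed stiffness constant
    return {
        "shape": {"deform_at": sample.model_pos, "depth": depth},
        "operation": {"tool_at": sample.instrument_pos,
                      "force": sample.instrument_pressure},
    }

def render_frame(processed: dict) -> str:
    # Stand-in for the image-generation module: a textual frame description
    # instead of an actual 3D render.
    shape = processed["shape"]
    return f"frame: deform depth={shape['depth']:.2f} at {shape['deform_at']}"

sample = SensorSample((0.10, 0.20, 0.30), 2.5, (0.10, 0.20, 0.28), 2.4)
frame = render_frame(process_sample(sample))
```

The pipeline mirrors the claimed steps: collect, process into shape and operation data, generate an image, display.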
The embodiments of the invention have the following beneficial effects. The medical simulation training system and method use virtual reality technology to collect three-dimensional model data of a specific part of the human body and construct a virtual model object, receive the user's operation data on the three-dimensional model, generate a corresponding virtual reality image from that operation data, and present the generated image on a virtual reality display device so the user can conveniently view the operation result. By placing sensors on the instruments and objects operated in real space, the system collects the user's operation data in real time and produces a corresponding three-dimensional virtual reality image, providing simulated tactile and visual perception; this improves the user's sense of immersion and makes the interaction highly responsive. Operation data can be collected and updated directly in real time, and operation results are fed back realistically, improving the training effect. The scheme meets the needs of medical teaching and surgical training; it is simple, efficient, and easy to operate, is convenient to promote clinically, and has high application value.
Drawings
FIG. 1 is a block diagram of a medical simulation training system according to an embodiment of the present invention;
FIG. 2 is a block diagram of an intravenous injection simulation training system according to an embodiment of the present invention;
FIG. 3 is a block diagram of a surgical simulation training system in accordance with one embodiment of the present invention;
FIG. 4 is a flow chart of a medical simulation training method according to an embodiment of the present invention.
Detailed Description
Virtual Reality (VR) technology is a new technology that has developed rapidly in recent years. Virtual reality uses computer simulation to generate a virtual world in three-dimensional space, provides the user with simulated visual, auditory, tactile, and other sensory input, and lets the user observe objects in the three-dimensional space freely and in real time, as if physically present. The technology integrates a series of computer sciences such as computer graphics, human-computer interaction, and 3D modeling; by simulating real space, sensing the user's behavior in the virtual environment in real time (such as body movement and head rotation), and feeding that behavior back promptly, it reproduces real scenes as faithfully as possible.
The design concept of the invention is as follows: virtual reality technology is applied to medical simulation training. A three-dimensional image of a virtual space is generated with virtual reality technology, the user's operation data on a real-world three-dimensional model of a specific human body part is collected in real time, operation result data is computed from this real-time operation data, and the result data is superimposed on the virtual model object; a virtual image containing the operation result data is thereby generated, displayed, and output for the user to view. This provides simulated visual, tactile, and other sensory feedback, lets the user observe the model object in three-dimensional space freely and in real time as if physically present, and improves the medical training effect without requiring the purchase of expensive training equipment, thus saving cost.
Example one
Fig. 1 is a block diagram of a medical simulation training system according to an embodiment of the present invention, and referring to fig. 1, a medical simulation training system 10 according to the embodiment includes:
an object model constructing module 101, configured to collect three-dimensional model data of a specific part of a human body, construct a corresponding three-dimensional virtual model object based on the collected three-dimensional model data, and send the constructed three-dimensional virtual model object to a model database 104;
a model database 104 for storing three-dimensional virtual model object data;
the sensor tracking module 103 is used for acquiring real-time position data and pressure data of a held instrument through a sensor arranged on the instrument held by a user, and acquiring position data and pressure data of a three-dimensional model acted by the instrument through sensor tracking in the three-dimensional model;
the data processing module 102 is configured to process the acquired data to obtain shape data of the three-dimensional model and operation data of the user on the three-dimensional model, and output the shape data of the three-dimensional model and the operation data of the user on the three-dimensional model to the image generation module 105;
an image generation module 105 for generating a three-dimensional virtual image including simulation operation result information from shape data of the three-dimensional model and operation data of the three-dimensional model by a user, and transmitting the generated three-dimensional virtual image to a display module 106;
and the display module 106 is configured to display and output a three-dimensional virtual image for a user to view a simulation operation result.
As shown in fig. 1, the virtual reality-based medical simulation training system of this embodiment collects three-dimensional model data of a specific part of the human body, constructs a corresponding three-dimensional virtual model object from the collected data, and stores the constructed object in a model database, from which a user can select a three-dimensional virtual model object. Using virtual reality technology, when the user manipulates an instrument such as a scalpel in the real world to operate on the three-dimensional model, the operation data of each step is collected by the sensors on the instrument and the sensors on the three-dimensional model. The collected data is sent to the data processing module, which computes and identifies the operation data (for example, operation force data, operation direction data, and data on the operated part) and sends it to the image generation unit. The image generation unit generates a corresponding three-dimensional image of the operation result, and the user can then see, on the stereoscopic display, the result of each operation step performed on the virtual model object. This improves the medical training effect, reduces training cost, and facilitates large-scale popularization and application.
In an embodiment of the present invention, the image generating module 105 is specifically configured to recognize, in a preset model database, the three-dimensional model corresponding to the shape data of the three-dimensional model, and to call a three-dimensional image processing library according to the user's operation data on the model, thereby generating a three-dimensional image containing the simulated operation result information.
In an embodiment of the present invention, the object model constructing module 101 is configured to collect three-dimensional stereo model data of a human arm and internal organs, and construct a three-dimensional virtual model object corresponding to the human arm and a three-dimensional virtual model object corresponding to each internal organ based on the collected three-dimensional stereo model data of the human arm and the internal organs.
In one embodiment of the invention, the instrument held by the user is a virtual reality scalpel or a virtual reality injector. The sensor tracking module 103 is configured to collect real-time position data of the instrument held by the user through a position sensor, and to collect real-time operation position data of the three-dimensional model acted on by the instrument through a position sensor; and to collect real-time pressure data of the held instrument through a pressure sensor, and real-time pressure data of the three-dimensional model acted on by the instrument through a pressure sensor.
In one embodiment of the present invention, the display module 106 is provided with a camera for tracking the user's eye movement, or with an acceleration sensor for collecting the user's head movement data in real time; the user's viewpoint is determined in real time from the eye movement or head movement.
When the three-dimensional virtual image sent by the image generation module 105 is received, the display output position of the three-dimensional virtual image is adjusted according to the viewpoint determined in real time. In one embodiment of the invention, the display module 106 is a pair of virtual reality glasses or a virtual reality helmet.
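A naive sketch of how head-movement data from an acceleration sensor could drive the display output position, as described above. This is a single-axis illustration under stated assumptions: the double-integration scheme and the pixels-per-radian sensitivity constant are hypothetical, and real headsets fuse gyroscope and accelerometer data rather than integrating raw acceleration.

```python
def update_viewpoint(head_accel: float, dt: float,
                     prev_velocity: float, prev_angle: float):
    # Integrate one accelerometer reading into a head-angle estimate
    # (naive Euler integration, single axis, for illustration only).
    velocity = prev_velocity + head_accel * dt
    angle = prev_angle + velocity * dt
    return velocity, angle

def place_image(image_center: tuple, view_angle: float,
                sensitivity: float = 100.0) -> tuple:
    # Shift the rendered image horizontally to follow the viewpoint.
    # `sensitivity` (pixels per radian) is an assumed constant.
    return (image_center[0] + view_angle * sensitivity, image_center[1])

velocity, angle = update_viewpoint(head_accel=0.5, dt=0.1,
                                   prev_velocity=0.0, prev_angle=0.0)
shifted = place_image((960.0, 540.0), angle)
```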
In one embodiment of the present invention, the image generation module 105 is further configured to perform distortion correction on the three-dimensional virtual image before sending the three-dimensional virtual image to the virtual reality glasses or virtual reality helmet according to the optical characteristics of the lenses in the virtual reality glasses or virtual reality helmet.
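The lens-dependent distortion correction mentioned above is commonly implemented by pre-distorting the image with a radial polynomial so that the lens's own pincushion distortion cancels it out. The sketch below follows that common approach as an assumption; the coefficients `k1` and `k2` are illustrative placeholders, not values from the patent.

```python
def predistort(x: float, y: float, k1: float = 0.22, k2: float = 0.24):
    """Radially scale a normalized image coordinate (origin at lens center)
    so the headset lens's pincushion distortion is cancelled.
    k1, k2 are illustrative lens coefficients."""
    r2 = x * x + y * y                      # squared radial distance
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # radial polynomial
    return x * scale, y * scale

center = predistort(0.0, 0.0)   # center of the lens is unchanged
edge = predistort(0.5, 0.0)     # points are pushed outward with radius
```

In practice the coefficients come from the lens's optical characterization, which is what "according to the optical characteristics of the lenses" refers to.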
The following describes the structure of the virtual reality-based medical simulation training system according to the present invention with reference to two specific application scenarios. The two specific application scenarios are respectively: simulation training of surgical operations and nursing intravenous injection simulation training.
Example two
The embodiment provides nursing intravenous injection simulation training based on virtual reality, and specifically, the intravenous injection simulation training system based on virtual reality needs to use the following tools when applied:
performing intravenous injection simulation training on a three-dimensional model of an arm, namely, an object operated by a user;
syringes, i.e., instruments held by the user;
the computer is used for model data storage and data processing, image generation, data transmission and the like;
and the VR display equipment (such as VR glasses) is used for displaying the virtual reality three-dimensional image.
The virtual reality-based intravenous injection simulation training system works on the following principle. Sensors are arranged in the simulated blood vessels (or intramuscular injection areas) of the three-dimensional arm model, dynamic vessel-course data of the hand is collected, and the collected data is stored centrally. In this embodiment, the dynamic vessel-course data can be classified by age group, and the collected data is stored centrally to build a complete arm physiological-metabolism model database. Sensors are arranged in the intramuscular injection areas and the major blood vessels of the model; whenever any sensor is touched, the collected data is transmitted to the VR glasses. The object model of the virtual injection operation is then realized in software, and the operator, wearing 3D glasses, performs simulated intravenous injection on the arm model in the virtual reality space. In addition, the thickness of the skin's fatty layer and of the blood vessels can be switched freely in the model database, and the effect of an intravenous (or intramuscular) injection is fed back realistically to the VR glasses through the tactile sensing connected to the sensors. The technical scheme of this embodiment is strongly immersive, highly interactive, simple, efficient, and easy to operate.
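A model database classified by age group, with switchable fat-layer and vessel thickness, could be organized as sketched below. The age-group keys, field names, and millimeter values are all illustrative assumptions; the patent only states that parameters vary by age group and can be switched freely.

```python
# Illustrative arm model database keyed by age group.
# Values are placeholders, not data from the patent.
ARM_MODELS = {
    "child":  {"fat_mm": 4.0, "vein_mm": 1.8},
    "adult":  {"fat_mm": 7.0, "vein_mm": 2.5},
    "senior": {"fat_mm": 5.5, "vein_mm": 2.0},
}

def select_arm_model(age_group: str, fat_override: float = None,
                     vein_override: float = None) -> dict:
    """Fetch an arm model and optionally switch the skin-fat and vein
    thickness, as the database in this embodiment allows."""
    model = dict(ARM_MODELS[age_group])  # copy so the database is untouched
    if fat_override is not None:
        model["fat_mm"] = fat_override
    if vein_override is not None:
        model["vein_mm"] = vein_override
    return model

custom = select_arm_model("adult", vein_override=3.0)
```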
Fig. 2 is a block diagram of an intravenous injection simulation training system according to an embodiment of the present invention, and referring to fig. 2, an intravenous injection simulation training system 20 according to the embodiment includes:
the object model constructing unit 201 collects arm three-dimensional model data by using a virtual reality technology, sets a virtual three-dimensional arm model according with the situation, and stores the virtual three-dimensional arm model into a computer operated by the object model constructing unit 201 for simulation operation. In addition, the 3D model can be provided to a 3D application program interface in the practical application process for designing an intravenous (intramuscular) injection scheme.
The data processing unit 202 is configured to process the data transmitted from the data transmission unit 204 and extract the information it contains, for example: the user's viewpoint information, the position information of the operation object, and motion information. The recognized information is transmitted to the image generating unit 203, which generates a three-dimensional virtual image and produces a virtual scene corresponding to the injection operation.
The viewpoint information can be collected as follows: the VR head-mounted device is provided with a camera that tracks the user's eye movement, and the user's viewpoint is determined from the eye movement.
When a user wears the VR headset and is in an intravenous injection simulation training virtual scene, the action of the user is identified by utilizing the sensing of the sensor on the human arm model and the sensing of the sensor on the injector, and the action instruction is transmitted to the data processing unit of the computer.
The image generating unit 203 receives the viewpoint information, position information, motion information, and so on transmitted from the data processing unit 202, processes this information, extracts the shape (object contour) information of the three-dimensional model operated by the user, recognizes the corresponding object model data in a preset model database from the shape information, calls a 3D image processing library (e.g., OpenGL or Direct3D), constructs the three-dimensional model, and generates and renders the three-dimensional image.
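The "recognize the model from shape information" step above could be a nearest-match lookup against the model database. A minimal sketch under a strong simplifying assumption: each contour is reduced to a single scalar signature, and the tolerance value is hypothetical. A real system would compare full 3D contours.

```python
def match_model(contour: float, model_db: dict, tolerance: float = 0.05):
    """Return the name of the stored model whose contour signature is
    closest to the extracted one, or None if nothing is close enough.
    Contours are reduced to scalar signatures purely for illustration."""
    best_name, best_err = None, float("inf")
    for name, stored_signature in model_db.items():
        err = abs(stored_signature - contour)
        if err < best_err:
            best_name, best_err = name, err
    return best_name if best_err <= tolerance else None

# Hypothetical database of shape signatures.
db = {"arm": 1.0, "heart": 0.4}
matched = match_model(0.98, db)      # close to the arm signature
unmatched = match_model(2.0, db)     # nothing within tolerance
```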
OpenGL (Open Graphics Library) is a cross-language, cross-platform API specification for rendering three-dimensional (and two-dimensional) graphics; it is a powerful low-level graphics library that is convenient to call.
Direct3D is a 3D graphics API based on Microsoft's Component Object Model (COM); it can bypass the Graphics Device Interface (GDI) and operate directly on any hardware that supports the API, increasing rendering speed. It is suitable for broad, practical 3D graphics computation such as multimedia, entertainment, and real-time 3D animation.
The data transmission unit 204 receives information sent in real time by the object sensor tracking unit 206 and passes it to the data processing unit 202 for processing. It also tracks the operation information of the injector (the injector is a device in the device access unit: it must be connected to the computer and communicate with the data transmission unit so that its sensor data can be transferred to the data transmission unit, which then forwards it to the data processing unit) and transmits that operation information to the data processing unit for processing, so that a three-dimensional virtual image can be generated.
The data processing unit may be provided in a computer that offers a device access interface, facilitating data transfer between the injector and the object sensor tracking unit 206 embedded in the three-dimensional model.
The object sensor tracking unit 206 is arranged in the three-dimensional arm model and the injector. In actual operation, since multiple sensors are distributed over the arm model and the injector, once the injector is inserted into a blood vessel of the arm model, the data processing unit determines from the information provided by the injector's sensors and the arm model's sensors whether the user's current operation is correct. If the operation is incorrect, the current operation data is transmitted to the VR display device, which immediately presents a three-dimensional virtual image of vessel rupture, bleeding, and redness, achieving a highly realistic injection simulation effect. In addition, arm simulation-effect data of various forms can be stored in the computer's model database for use during the user's exercises.
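The correctness decision described above — did the needle touch a vein sensor, and was the pressure acceptable — can be sketched as a simple rule. The sensor IDs, the pressure threshold, and the specific failure categories are hypothetical choices for illustration, not rules taken from the patent.

```python
def evaluate_injection(contact_sensor_id: str, pressure: float,
                       vein_sensors: set, max_pressure: float = 4.0) -> str:
    """Decide whether the needle hit a vein correctly or an error image
    (rupture/bleeding/redness) should be shown. IDs and the pressure
    threshold are illustrative."""
    if contact_sensor_id not in vein_sensors:
        return "error: missed vein"      # triggers the error image
    if pressure > max_pressure:
        return "error: vessel rupture"   # over-pressure also fails
    return "ok"

vein_sensors = {"v1", "v2"}              # hypothetical vein sensor IDs
good = evaluate_injection("v1", 2.0, vein_sensors)
missed = evaluate_injection("v9", 2.0, vein_sensors)
ruptured = evaluate_injection("v1", 5.0, vein_sensors)
```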
The sensor here includes: the device comprises a position sensor and a pressure sensor, wherein the position sensor is used for acquiring real-time position data of an instrument held by a user during actual application, and the position sensor is used for acquiring real-time operation position data of a three-dimensional model acted by the instrument; and acquiring real-time pressure data of an instrument held by a user through the pressure sensor, and acquiring real-time pressure data of a three-dimensional model acted by the instrument through the pressure sensor.
The display unit 207 is configured to receive the 3D image data generated by the image generation unit 203 and present it on a display device such as VR glasses; by wearing the VR glasses, the user can view the simulated operation effect on the virtual reality interactive operation platform. The VR glasses may be replaced with a VR helmet, which is not limited here.
Optionally, the virtual reality-based intravenous injection simulation training system of this embodiment further includes a verification unit 205, which records the number of user operation errors from the user's operation data and then scores and grades the user's operation according to the weight assignments and quantitative evaluation information set in the model database, for later review and reference, helping to improve the operation success rate.
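The weighted scoring the verification unit performs could look like the sketch below. The error categories, per-error weights, base score, and pass threshold are all illustrative assumptions; the patent only says weights and quantitative evaluation information are set in the model database.

```python
def score_session(error_counts: dict, weights: dict,
                  total_ops: int, base: float = 100.0) -> dict:
    """Score a practice session from recorded errors using per-error-type
    weights (illustrative values; the real weights live in the database)."""
    penalty = sum(weights[kind] * n for kind, n in error_counts.items())
    score = max(0.0, base - penalty)
    grade = "pass" if score >= 60 else "fail"  # assumed pass threshold
    return {"score": score, "grade": grade, "ops": total_ops}

# Hypothetical session: 12 operations, 2 missed veins, 1 rupture.
result = score_session({"missed_vein": 2, "rupture": 1},
                       {"missed_vein": 10, "rupture": 25},
                       total_ops=12)
```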
The specific working process of the intravenous injection simulation training system based on the virtual reality technology comprises the following steps:
The virtual reality-based intravenous injection simulation training system is started and initialized, and the user selects an operation object (that is, a specific human arm model).
Step 201: the user picks up the syringe to operate; the object sensor tracking unit 206 senses in real time and transmits position information and motion information, the data processing unit 202 performs gesture recognition and related work, and the image generating unit 203 is notified to generate a three-dimensional image corresponding to the recognized operation.
Step 202: during the injection operation, the object sensor tracking unit 206 and the data transmission unit 204 interactively track the injection operation, obtain its information, and transmit it to the data processing unit 202 for processing. In addition, the collected data may be periodically synchronized with the object sensor tracking unit 206 to obtain its latest data, so that the user's action instructions are captured and the user's operation information is collected accurately.
Step 203: the data processing unit 202 processes the data transmitted from the data transmission unit 204. After the user performs the injection operation on the three-dimensional arm model in real space, the image generation unit 203 starts a virtual-environment subroutine in the computer and calls the data of the arm model, so that the three-dimensional model responds promptly to the operator's action instructions and the result of the simulated injection operation is presented to the user.
Step 204: after the simulated injection operation is finished, the operation is evaluated according to the recorded total number of operations and the number of operation errors, and a reference score is output.
Compared with existing intravenous injection models, this scheme offers high reusability, low cost, highly realistic vein simulation, and an intuitive venipuncture simulation result; it addresses the problems of scarce learning opportunities and unskilled venipuncture operation among nurses and other medical staff. It reduces teaching cost and medical risk, saves training resources, and improves teaching quality; it is simple, efficient, and easy to operate, is convenient to promote clinically, and has considerable application value and significance.
EXAMPLE III
The embodiment provides a surgical operation simulation training system based on virtual reality. Specifically, the following tools are required to be used when the virtual reality-based surgical operation simulation training system is applied:
(1) A virtual reality (VR) scalpel (optionally together with surgical gloves). The VR scalpel contains sensors used for data acquisition and motion sensing, so that gestures can be recognized and the user's operating actions identified. The sensors include a position sensor and a pressure sensor: the position sensor acquires real-time position data of the instrument held by the user and real-time operating-position data of the three-dimensional model the instrument acts on; the pressure sensor acquires real-time pressure data of the held instrument and of the three-dimensional model it acts on.
(2) A three-dimensional human-body model. The system can preset three-dimensional human-body model data so that the user can select a three-dimensional organ model as the surgical object, and a plurality of sensors are distributed at the main positions of the model organ and along each major blood vessel. In this way, during the user's operation, once the scalpel touches a blood vessel, the corresponding sensor sends its sensing information to the data processing unit for further analysis and processing.
(3) The computer is used for realizing the functions of model data storage, data processing, image generation, data transmission and the like;
(4) A VR display device (e.g., VR glasses or a VR headset) for displaying the virtual reality three-dimensional image.
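The paired instrument-side and model-side readings described in item (1) can be sketched as a small data structure. The field names, the `touched` helper, and the pressure threshold are assumptions made for illustration; the patent specifies only that both the held instrument and the physical model carry position and pressure sensors.

```python
from dataclasses import dataclass


@dataclass
class SensorSample:
    """One paired reading from the VR scalpel and the model it acts on.

    Field names and the touch threshold are illustrative assumptions.
    """
    instrument_pos: tuple        # position sensor on the held instrument
    instrument_pressure: float   # pressure sensor on the held instrument
    model_pos: tuple             # operated position on the organ model
    model_pressure: float        # pressure sensed inside the organ model

    def touched(self, threshold: float = 0.05) -> bool:
        # Register a touch when the model-side pressure exceeds the threshold.
        return self.model_pressure > threshold
```

Such a record would be produced each sampling cycle and forwarded to the data processing unit.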
The invention applies virtual reality technology to surgical simulation training, which has three advantages. First, the dynamic process of physiological metabolism is presented realistically: organ metabolic rates can be adjusted for different age groups, and the surgical site is simulated with high fidelity. Second, surgical progress is fed back in real time: the operator's actions are sensed through the model's sensors and the operation process is fed back faithfully, which builds sound practical skills and improves the surgical success rate. Third, the teaching cost of clinical surgery is reduced.
Fig. 3 is a block diagram showing a configuration of a surgical simulation training system according to an embodiment of the present invention, and referring to fig. 3, the surgical simulation training system 30 includes:
A viewpoint determining unit 301, which includes two sub-units: an eye-tracking viewpoint sub-unit and a head-tracking viewpoint sub-unit. A camera is arranged in the virtual reality display device, for example a VR headset, and performs eye tracking, determining the user's current viewpoint from real-time movement data of the user's eyes. Alternatively, a camera and a dedicated sensor (e.g., an acceleration sensor) in the VR headset track the position of the head and determine the viewpoint from head movement. It should be noted that viewpoint determination in this embodiment can be implemented with prior-art techniques, so a more detailed implementation is not repeated here; reference may be made to the related descriptions in the prior art.
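The two sub-units of unit 301 can be sketched as a simple selection policy. The function name, argument names, and the rule that eye tracking takes precedence are assumptions for the sketch; the patent describes the two tracking modes but leaves the selection policy to prior art.

```python
def determine_viewpoint(eye_dir=None, head_pose=None):
    """Pick the current viewpoint from whichever tracker reports data.

    Assumed policy: prefer eye tracking, fall back to head tracking,
    then to a straight-ahead default.
    """
    if eye_dir is not None:
        return ("eye", eye_dir)      # eye-tracking sub-unit supplied data
    if head_pose is not None:
        return ("head", head_pose)   # head-tracking sub-unit supplied data
    return ("default", (0.0, 0.0, 1.0))  # look straight ahead
```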
The surgical object model 302 is generated by the image processing unit in the computer according to the three-dimensional human-organ model selected by the user. For example, if the user selects a lung model in real space as the surgical object, the image generation unit of the virtual-reality-based surgical simulation training system of this embodiment selects the corresponding lung model data from the surgical model database, constructs a three-dimensional virtual model, and presents it in the virtual reality display device.
The hand motion collecting unit 303 collects the user's hand motions using the sensors on the scalpel or surgical gloves, for example the current position of the user's hand and the direction and force of the operation, and determines the operated position on the surgical object model from this information together with the information collected by the sensors in the three-dimensional model of the specific body part. Continuing the previous example, when the user operates on a lung organ, the sensors on the three-dimensional lung model in real space detect whether the scalpel held by the user contacts a blood vessel on the lung. If it does, the sensor immediately captures this and sends the operation data to the data processing unit, which forwards it to the image generation unit; the image generation unit then generates the corresponding virtual reality three-dimensional image of the user's operation on the blood vessel and displays it before the user's eyes.
The surgical model database 304 stores organ model data of various specific parts of the human body for the user to select.
A data processing unit 305, which analyzes the data transmitted by the sensors distributed in the organ (e.g., lung) model and on the scalpel. Once the scalpel touches the position of a sensor, whether the current operation is correct is determined from the information that sensor provides, and the sensed data is transmitted to the data processing unit for analysis. For example, once a lung blood vessel is touched, the data processing unit acquires the operation data of the touched vessel and sends it to the image generation unit, which generates a three-dimensional image of the ruptured vessel and sends it to the display unit for presentation.
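The vessel-touch path through unit 305 can be sketched as a small event handler. The event keys (`kind`, `vessel_id`) and the returned instruction shape are assumptions; the patent says only that a touched vessel causes operation data to be forwarded so the image generation unit can render the ruptured vessel.

```python
def process_sensor_event(event: dict) -> dict:
    """Turn a raw sensor event into an instruction for the image generator.

    Keys and return shape are illustrative assumptions, not an API
    defined by the source document.
    """
    if event.get("kind") == "vessel_touch":
        # Touching a vessel is an operation error; render the rupture.
        return {"render": "ruptured_vessel",
                "vessel_id": event["vessel_id"],
                "error": True}
    # Any other sensed contact is treated as an ordinary incision here.
    return {"render": "incision", "error": False}
```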
An image generating unit 306 for generating a three-dimensional virtual organ model and generating and rendering a corresponding three-dimensional virtual image from the data transmitted from the data processing unit 305.
In addition, because of the optical characteristics of the lenses in the VR headset (the lenses introduce image distortion during display), the image generation unit 306 in this embodiment applies distortion processing to the generated image; that is, it performs distortion correction on the three-dimensional virtual image before sending it to the display unit, so as to achieve the desired display effect.
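The pre-distortion step can be sketched with the classic radial-polynomial warp used for head-mounted displays: the lens distorts the panel image, so the renderer applies the inverse radial warp first. The polynomial form and the `k1`/`k2` coefficient values below are illustrative defaults, not values taken from the patent.

```python
def predistort(x: float, y: float, k1: float = 0.22, k2: float = 0.24):
    """Radially warp one normalized image coordinate (centre at 0,0).

    Assumed two-coefficient radial model; real headsets calibrate
    their own coefficients per lens.
    """
    r2 = x * x + y * y                    # squared distance from the centre
    scale = 1.0 + k1 * r2 + k2 * r2 * r2  # radial scaling factor
    return x * scale, y * scale
```

The centre point is left unchanged while points farther from the centre are displaced more strongly, compensating the lens distortion.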
A virtual reality display unit 307 for displaying the three-dimensional virtual image generated by the image generation unit 306; by wearing the virtual reality display unit 307, the user sees the simulated effect of a real operation within the VR surgical scene.
The specific working process of the surgical operation simulation training system based on the virtual reality technology comprises the following steps:
step 301: the user wears the VR head-mounted device and picks up the scalpel.
Step 302: an image processing unit in the computer generates a virtual surgical model image corresponding to the operation object in real space and transmits it to the VR display device.
Step 303: the user performs the operation, and the operation actions are recognized.
Step 304: the VR headset determines the position of the user's head or tracks the user's eyes to determine the point of view.
Step 305: during the operation, the sensors on the scalpel and on the three-dimensional human-body model sense the cutting point of the user's scalpel on the operation object and provide feedback.
Step 306: a three-dimensional image is generated from the feedback information and transmitted to the VR headset for display.
Therefore, the virtual-reality-based surgical simulation training system of this embodiment provides visual and tactile simulated perception through virtual reality technology, simulates real surgery, and is highly interactive. It reduces teaching cost and medical risk, saves medical training resources, and improves teaching quality; the method is simple, efficient, easy to operate, and convenient to popularize.
Example four
FIG. 4 is a flow chart of a medical simulation training method according to an embodiment of the present invention. Referring to fig. 4, the virtual reality-based medical simulation training method of the present embodiment includes the following steps:
step S401, collecting three-dimensional model data of a specific part of a human body, constructing a corresponding three-dimensional virtual model object based on the collected three-dimensional model data, and storing the constructed three-dimensional virtual model object;
step S402, acquiring real-time position data and pressure data of a held instrument through a sensor arranged on the instrument held by a user and tracking and acquiring position data and pressure data of a three-dimensional model acted by the instrument through a sensor in the three-dimensional model;
step S403, processing the acquired position data and pressure data to obtain shape data of the three-dimensional model and operation data of the user on the three-dimensional model;
step S404, generating a three-dimensional virtual image comprising simulation operation result information according to the shape data of the three-dimensional model and the operation data of the three-dimensional model by the user;
step S405, displaying and outputting the three-dimensional virtual image for the user to view the simulation result.
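Steps S401 to S405 can be sketched as one pass through a minimal pipeline. The four collaborator objects and their method names (`load`, `read`, `apply`, `render`, `show`) are assumptions made for the sketch; the patent defines the steps but not a programming interface.

```python
def run_training_step(model_db, sensors, renderer, display):
    """One pass through the assumed S401-S405 pipeline."""
    model = model_db.load()                       # S401: stored 3D model object
    position, pressure = sensors.read()           # S402: instrument/model sensor data
    shape, ops = model.apply(position, pressure)  # S403: shape + operation data
    image = renderer.render(shape, ops)           # S404: image with simulated result
    display.show(image)                           # S405: present to the user
    return image
```

In a real system this loop would run continuously, one iteration per sensor sampling cycle.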
In one embodiment of the present invention, the step S404 of generating a three-dimensional virtual image including simulation operation result information from shape data of the three-dimensional stereoscopic model and operation data of the user includes:
according to the shape data of the three-dimensional model, the three-dimensional model corresponding to the shape data is recognized in a preset model database, and according to the operation data of the three-dimensional model by a user, a three-dimensional image processing library is called to generate a three-dimensional image comprising simulation operation result information.
In an embodiment of the present invention, the step S401 of acquiring three-dimensional stereo model data of a specific part of a human body includes:
three-dimensional model data of human arms and internal organs are collected,
the user-held instrument includes: a virtual reality scalpel or a virtual reality injector,
the method for tracking and collecting the operation data of the user on the three-dimensional model through the sensor arranged on the instrument held by the user and the sensor in the three-dimensional model acted by the instrument comprises the following steps:
acquiring real-time position data of an instrument held by a user through a position sensor, and acquiring real-time operation position data of a three-dimensional model acted by the instrument through the position sensor; and acquiring real-time pressure data of an instrument held by a user through the pressure sensor, and acquiring real-time pressure data of a three-dimensional model acted by the instrument through the pressure sensor.
In one embodiment of the present invention, the method shown in fig. 4 further comprises: tracking the eyeball movement of the user, or acquiring head movement data of the user in real time to track the head movement of the user, determining the viewpoint of the user in real time according to the eyeball movement or the head movement, and adjusting the display output position of the three-dimensional virtual image according to the viewpoint of the user determined in real time when the three-dimensional virtual image is received.
Displaying the output three-dimensional virtual image includes displaying the output three-dimensional virtual image through virtual reality glasses or a virtual reality helmet.
In step S404 of one embodiment of the present invention, generating a three-dimensional virtual image including simulation operation result information from shape data of the three-dimensional stereoscopic model and operation data of the user further includes: according to the optical characteristics of the lenses in the virtual reality glasses or the virtual reality helmet, distortion correction is performed on the three-dimensional virtual image before the three-dimensional virtual image is sent to the virtual reality glasses or the virtual reality helmet.
In summary, the medical simulation training method of this embodiment uses virtual reality technology to acquire three-dimensional model data of a specific part of the human body and construct a virtual model object, receives the user's operation data on the three-dimensional model, generates the corresponding virtual reality image from that data, and presents the generated image in the virtual reality display device so that the user can conveniently view the operation result. By arranging sensors on the instruments and operation objects in real space, the user's operation data is collected in real time and the corresponding three-dimensional virtual reality image is produced, providing simulated tactile and visual perception, improving the user's sense of immersion, and offering strong interactivity. Operation data can be collected and updated in real time and the operation result fed back faithfully, which improves the training effect. The method meets the needs of medical teaching and surgical training, is simple, efficient, and easy to operate, is convenient for clinical popularization, and has high application value.
While the foregoing is directed to embodiments of the present invention, other modifications and variations may be devised by those skilled in the art in light of the above teachings. It should be understood that the foregoing detailed description is intended to illustrate rather than limit the invention, and that the scope of the invention is defined by the claims.

Claims (10)

1. A medical simulation training system, the system comprising:
the object model building module is used for collecting three-dimensional model data of a specific part of a human body, building a corresponding three-dimensional virtual model object based on the collected three-dimensional model data, and sending the built three-dimensional virtual model object to a model database;
a model database for storing three-dimensional virtual model object data;
the sensor tracking module is used for acquiring real-time position data and pressure data of the held instrument through a sensor arranged on the instrument held by a user, and acquiring position data and pressure data of a three-dimensional model through the sensor in the three-dimensional model acted by the instrument in a tracking manner;
the data processing module is used for processing the position data and the pressure data acquired by the sensor tracking module to obtain shape data of the three-dimensional model and operation data of a user on the three-dimensional model, and outputting the shape data of the three-dimensional model and the operation data of the user on the three-dimensional model to the image generation module;
the image generation module is used for generating a three-dimensional virtual image comprising simulation operation result information according to the shape data of the three-dimensional model and the operation data of the three-dimensional model by the user and sending the generated three-dimensional virtual image to the display module;
the display module is used for displaying and outputting a three-dimensional virtual image for a user to check a simulation operation result;
the verification module is used for recording the times of user operation errors according to the operation data of the user and grading the operation of the user;
the sensor tracking module is specifically used for acquiring real-time position data of an instrument held by a user through a position sensor and acquiring real-time position data of a three-dimensional model acted by the instrument through the position sensor; and acquiring real-time pressure data of an instrument held by a user through the pressure sensor, and acquiring real-time pressure data of a three-dimensional model acted by the instrument through the pressure sensor.
2. The system according to claim 1, wherein the image generating module is specifically configured to identify a three-dimensional stereo model corresponding to the shape data in the preset model database according to the shape data of the three-dimensional stereo model, and invoke a three-dimensional image processing library according to operation data of a user on the three-dimensional stereo model to generate a three-dimensional image including simulation operation result information.
3. The system according to claim 1, wherein the object model constructing module is specifically configured to collect three-dimensional model data of the human arm and the internal organs, and construct a three-dimensional virtual model object corresponding to the human arm and a three-dimensional virtual model object corresponding to each internal organ based on the collected three-dimensional model data of the human arm and the internal organs.
4. The system of claim 1, wherein the user-held instrument comprises: a virtual reality scalpel or a virtual reality injector.
5. The system according to claim 1, wherein the display module is provided with a camera for tracking the eye movement of the user, or the display module is provided with an acceleration sensor for collecting the head movement data of the user in real time to track the head movement of the user and determining the viewpoint of the user in real time according to the eye movement or the head movement,
and when the three-dimensional virtual image sent by the image generation module is received, adjusting the display output position of the three-dimensional virtual image according to the viewpoint of the user determined in real time.
6. The system of claim 5, wherein the display module is virtual reality glasses or a virtual reality helmet.
7. The system of claim 6, wherein the image generation module is further configured to perform distortion correction processing on the three-dimensional virtual image before sending the three-dimensional virtual image to the virtual reality glasses or virtual reality helmet according to optical characteristics of lenses in the virtual reality glasses or virtual reality helmet.
8. A method of medical simulation training, the method comprising:
acquiring three-dimensional model data of a specific part of a human body, constructing a corresponding three-dimensional virtual model object based on the acquired three-dimensional model data, and storing the constructed three-dimensional virtual model object;
acquiring real-time position data and pressure data of a held instrument through a sensor arranged on the instrument held by a user and tracking and acquiring the position data and the pressure data of a three-dimensional model through a sensor in the three-dimensional model acted by the instrument;
processing the collected position data and pressure data to obtain shape data of the three-dimensional model and operation data of a user on the three-dimensional model;
generating a three-dimensional virtual image comprising simulation operation result information according to the shape data of the three-dimensional model and the operation data of the three-dimensional model by a user;
displaying and outputting a three-dimensional virtual image for a user to check a simulation operation result;
recording the times of user operation errors according to the operation data of the user, and grading the operation of the user;
the method for acquiring real-time position data and pressure data of the instrument held by a user through a sensor arranged on the instrument held by the user and acquiring position data and pressure data of a three-dimensional model acted by the instrument through a sensor in the three-dimensional model in a tracking manner comprises the following steps:
acquiring real-time position data of an instrument held by a user through a position sensor, and acquiring real-time position data of a three-dimensional model acted by the instrument through the position sensor; and acquiring real-time pressure data of an instrument held by a user through the pressure sensor, and acquiring real-time pressure data of a three-dimensional model acted by the instrument through the pressure sensor.
9. The method according to claim 8, wherein the generating a three-dimensional virtual image including simulation operation result information from shape data of the three-dimensional stereoscopic model and operation data of a user comprises:
according to the shape data of the three-dimensional model, the three-dimensional model corresponding to the shape data is recognized in a preset model database, and according to the operation data of the three-dimensional model by a user, a three-dimensional image processing library is called to generate a three-dimensional image comprising simulation operation result information.
10. The method of claim 8, wherein the acquiring three-dimensional stereo model data of a specific part of a human body comprises:
three-dimensional model data of human arms and internal organs are collected,
the user-held instrument includes: a virtual reality scalpel or a virtual reality injector.
CN201611266712.3A 2016-12-31 2016-12-31 Medical simulation training system and method Active CN107067856B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611266712.3A CN107067856B (en) 2016-12-31 2016-12-31 Medical simulation training system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611266712.3A CN107067856B (en) 2016-12-31 2016-12-31 Medical simulation training system and method

Publications (2)

Publication Number Publication Date
CN107067856A CN107067856A (en) 2017-08-18
CN107067856B true CN107067856B (en) 2020-03-27

Family

ID=59624651

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611266712.3A Active CN107067856B (en) 2016-12-31 2016-12-31 Medical simulation training system and method

Country Status (1)

Country Link
CN (1) CN107067856B (en)

Families Citing this family (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2915157B1 (en) 2012-10-30 2019-05-08 Truinject Corp. System for injection training
EP3111438B1 (en) 2014-01-17 2018-12-19 Truinject Medical Corp. Injection site training system
US10290231B2 (en) 2014-03-13 2019-05-14 Truinject Corp. Automated detection of performance characteristics in an injection training system
KR20170102233A (en) 2014-12-01 2017-09-08 트루인젝트 코프 Injection training tool emitting omnidirectional light
US10500340B2 (en) 2015-10-20 2019-12-10 Truinject Corp. Injection system
WO2017151441A2 (en) 2016-02-29 2017-09-08 Truinject Medical Corp. Cosmetic and therapeutic injection safety systems, methods, and devices
WO2017151963A1 (en) 2016-03-02 2017-09-08 Truinject Madical Corp. Sensory enhanced environments for injection aid and social training
US10648790B2 (en) 2016-03-02 2020-05-12 Truinject Corp. System for determining a three-dimensional position of a testing tool
CN107066082B (en) * 2016-12-30 2018-10-02 百度在线网络技术(北京)有限公司 Display methods and device
WO2018136901A1 (en) 2017-01-23 2018-07-26 Truinject Corp. Syringe dose and position measuring apparatus
CN107393390A (en) * 2017-08-25 2017-11-24 江苏嘉纳宝医疗科技有限公司 Virtual reality first aid training model person and training system
CN107591074A (en) * 2017-09-05 2018-01-16 杭州市第三人民医院 A kind of Flexible ureteroscope simulation learning method made based on 3D printing
CN107527542B (en) * 2017-09-18 2020-09-08 南京梦宇三维技术有限公司 Percussion training system based on motion capture
CN107633724B (en) * 2017-09-18 2021-04-20 南京梦宇三维技术有限公司 Auscultation training system based on motion capture
CN107657881B (en) * 2017-10-17 2020-07-03 天津大学 Near-distance particle implantation operation training method based on virtual reality
CN109841135A (en) * 2017-11-29 2019-06-04 深圳市掌网科技股份有限公司 A kind of operative training method and system based on virtual reality
CN107862966A (en) * 2017-12-14 2018-03-30 江苏农牧科技职业学院 Acupoint animal model and its component
CN108053709A (en) * 2017-12-29 2018-05-18 六盘水市人民医院 A kind of department of cardiac surgery deep suture operation training system and analog imaging method
CN108335599B (en) * 2018-01-19 2020-02-04 聊城市人民医院 Operation model training method based on three-dimensional modeling image technology
CN108320645B (en) * 2018-01-19 2020-02-07 中南大学湘雅二医院 Medical simulation training method
CN108510939A (en) * 2018-04-24 2018-09-07 中航大(天津)模拟机工程技术有限公司 Simulation imaging system based on flexible O L ED display screen
CN108538095A (en) * 2018-04-25 2018-09-14 惠州卫生职业技术学院 Medical teaching system and method based on virtual reality technology
US20210228276A1 (en) * 2018-04-27 2021-07-29 Crisalix S.A. Medical Platform
CN110610632A (en) * 2018-06-15 2019-12-24 刘军 Virtual in-vivo navigation system for vascular intervention operation
CN109064817A (en) * 2018-07-18 2018-12-21 杭州行开科技有限公司 Surgery simulation system based on CT Three-dimension Reconstruction Model
CN109064819B (en) * 2018-07-24 2020-12-18 贝利尔科技发展南京有限公司 Virtual teaching method
CN108922300A (en) * 2018-07-24 2018-11-30 杭州行开科技有限公司 Surgical simulation 3D system based on digitized humans
CN109065147A (en) * 2018-07-30 2018-12-21 广州狄卡视觉科技有限公司 Medical Digital 3D model human body surgical simulation human-computer interaction system and method
CN109061892A (en) * 2018-09-27 2018-12-21 广州狄卡视觉科技有限公司 Plastic surgery medical image Model Reconstruction interacts naked-eye stereoscopic display system and method
CN109410721A (en) * 2018-10-30 2019-03-01 深圳市墨优科技开发有限公司 A kind of emergency care training method and terminal
CN109935313A (en) * 2019-03-04 2019-06-25 苏州达辰医疗科技有限公司 The medical application of three-dimensional visualization and mixed reality
CN109767681A (en) * 2019-03-28 2019-05-17 福建水立方三维数字科技有限公司 Heimlich first aid Operations Simulation Facility
CN110335516B (en) * 2019-06-27 2021-06-25 王寅 Method for performing VR cardiac surgery simulation by adopting VR cardiac surgery simulation system
CN110353771B (en) * 2019-07-10 2020-07-10 解涛 VR simulation training scalpel for brain surgery
EP3771449A1 (en) 2019-07-31 2021-02-03 Siemens Healthcare GmbH Method for deformation simulation and device
CN110556031A (en) * 2019-09-06 2019-12-10 重庆大学附属肿瘤医院 Medical guidance system for ostomy patient
CN110930804A (en) * 2019-12-19 2020-03-27 郑州工业应用技术学院 Clinical medicine education system based on cloud platform
CN111127991B (en) * 2020-02-27 2024-06-04 无锡市妇幼保健院 3D medical teaching system
CN111243413B (en) * 2020-03-06 2021-07-02 吉林大学 Modeling method and teaching system for facial anatomy teaching
CN111613122A (en) * 2020-05-19 2020-09-01 威爱医疗科技(中山)有限公司 Virtual-actual fused vascular interventional operation simulation system
CN112120908A (en) * 2020-08-12 2020-12-25 厦门立方幻境科技有限公司 Multi-mode interactive joint loosening training equipment and training method
CN112535533B (en) * 2020-12-06 2022-08-05 西安交通大学 Supporting device for helping remote operation by 3D printing focus model
CN112951014A (en) * 2021-01-29 2021-06-11 中南大学湘雅三医院 Intelligent medical education system, method, terminal and medium integrating teaching and practical training
CN112820376A (en) * 2021-01-30 2021-05-18 天津体育学院 Hemiplegia rehabilitation training system based on virtual reality technology rowing motion
CN113269995B (en) * 2021-05-28 2023-05-02 黑龙江拓盟科技有限公司 MR-based simulation training system for peritoneal dialysis catheterization by using B ultrasonic guided puncture method
CN113257387B (en) * 2021-06-07 2023-01-31 上海圻峰智能科技有限公司 Wearable device for rehabilitation training, rehabilitation training method and system
CN113823172A (en) * 2021-09-18 2021-12-21 安徽声谷智能科技有限公司 Machine vision system application experiment development platform
CN117253409A (en) * 2023-11-16 2023-12-19 成都泰盟软件有限公司 Haimarick first aid teaching system

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120251991A1 (en) * 2004-11-30 2012-10-04 Eric Savitsky Peripheral Probe with Six Degrees of Freedom Plus Compressive Force Feedback
CN101923799A (en) * 2010-09-26 2010-12-22 天津信仁科技发展有限公司 Chinese medicinal diagnosis and treatment and acupoint massage teaching training system for head and neck
US8992230B2 (en) * 2012-01-23 2015-03-31 Virtamed Ag Medical training systems and methods
CN103106348A (en) * 2013-03-08 2013-05-15 上海交通大学医学院附属第九人民医院 Virtual surgery simulation method and device thereof
CN204029245U (en) * 2014-08-01 2014-12-17 卓思生命科技有限公司 A kind of surgery simulation system
CN105395252A (en) * 2015-12-10 2016-03-16 哈尔滨工业大学 Wearable three-dimensional image navigation device for vascular intervention operation and realizing man-machine interaction
CN106157776A (en) * 2016-08-12 2016-11-23 北京大学人民医院 Interactive virtual fractional curettage art teaching system and teaching method

Also Published As

Publication number Publication date
CN107067856A (en) 2017-08-18

Similar Documents

Publication Publication Date Title
CN107067856B (en) Medical simulation training system and method
Coles et al. Integrating haptics with augmented reality in a femoral palpation and needle insertion training simulation
AU762444B2 (en) Endoscopic tutorial system
CN106327983A (en) Acupuncture acupoint determination auxiliary teaching system
CN110335516B (en) Method for performing VR cardiac surgery simulation by adopting VR cardiac surgery simulation system
EP3423972A1 (en) Sensory enhanced environments for injection aid and social training
CN104778894A (en) Virtual simulation bone-setting manipulation training system and establishment method thereof
CN106293082A (en) A kind of human dissection interactive system based on virtual reality
CN110400620B (en) Heart three-dimensional model construction method and simulated heart operation guidance system
Mathew et al. Role of immersive (XR) technologies in improving healthcare competencies: a review
CN110826835A (en) Glove-based acupuncture training method, system, platform and storage medium
CN107361843A (en) Immersion neurosurgery analogy method with true sense of touch
Li et al. Design and evaluation of personalized percutaneous coronary intervention surgery simulation system
Soto et al. AR stereoscopic 3D human eye examination app
Greenleaf Developing the tools for practical VR applications [Medicine]
Camporesi et al. The effects of avatars, stereo vision and display size on reaching and motion reproduction
CN111276022A (en) Gastroscope simulation operation system based on VR technique
Wagner et al. Intraocular surgery on a virtual eye
CN116631252A (en) Physical examination simulation system and method based on mixed reality technology
Coles Investigating augmented reality visio-haptic techniques for medical training
KR20030091343A (en) Oriental medicine acupuncture system in virtual reality environment and method of training using it
KR20200080534A (en) System for estimating otorhinolaryngology and neurosurgery surgery based on simulator of virtual reality
KR102444058B1 (en) System for mixed-reality acupuncture training with dummy and acupuncture controller
Sun et al. Design and development of a mixed reality acupuncture training system
EP4243001A1 (en) Simulator system for the safe training of medical personnel

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201015

Address after: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronic office building)

Patentee after: GoerTek Optical Technology Co.,Ltd.

Address before: 266104 Laoshan Qingdao District North House Street investment service center room, Room 308, Shandong

Patentee before: GOERTEK TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221215

Address after: 266104 No. 500, Songling Road, Laoshan District, Qingdao, Shandong

Patentee after: GOERTEK TECHNOLOGY Co.,Ltd.

Address before: 261031 north of Yuqing street, east of Dongming Road, high tech Zone, Weifang City, Shandong Province (Room 502, Geer electronics office building)

Patentee before: GoerTek Optical Technology Co.,Ltd.

TR01 Transfer of patent right