CN111588999A - Surgical guide model, head-mounted wearable device-assisted surgical navigation system and method


Info

Publication number: CN111588999A (application CN202010446648.7A); granted publication: CN111588999B
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 李硕 (Li Shuo), 孙兴华 (Sun Xinghua)
Applicant and assignee (original and current): Beijing Unicorn Science And Technology Ltd
Prior art keywords: surgical, model, image data, patient, image
Legal status: Granted; Active

Classifications

    • A61B 34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/101: Computer-aided simulation of surgical operations
    • A61B 2034/102: Modelling of surgical devices, implants or prosthesis
    • A61B 2034/107: Visualisation of planned trajectories or target regions
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/2065: Tracking using image or pattern recognition
    • A61N 5/1001: X-ray, gamma-ray or particle-irradiation therapy using radiation sources introduced into or applied onto the body; brachytherapy
    • A61N 5/103: Treatment planning systems
    • A61N 5/1039: Treatment planning systems using functional images, e.g. PET or MRI
    • A61N 5/1048: Monitoring, verifying, controlling systems and methods
    • A61N 5/1049: Monitoring, verifying, controlling systems and methods for verifying the position of the patient with respect to the radiation beam
    • A61N 2005/105: Verifying patient position with respect to the radiation beam using a laser alignment system
    • A61N 2005/1055: Verifying patient position with respect to the radiation beam using magnetic resonance imaging [MRI]

Abstract

The invention provides a surgical guide model and a head-mounted wearable device-assisted surgical navigation system and method. The head-mounted wearable device comprises: an image acquisition module for acquiring image data of a surgical image of a patient through an image acquisition device, the image data including a surgical guide model placed at the patient's surgical site; an image processing module for identifying, through image recognition, the position of the surgical guide model and the position of a surgical tool held by the doctor in the image data, and for displaying preset three-dimensional models of the surgical guide model and the surgical tool together with the image data to the doctor according to those positions; and a positioning navigation module for determining whether the relative position of the surgical guide model and the surgical tool meets a preset condition and, if so, feeding navigation completion information back to the doctor.

Description

Surgical guide model, head-mounted wearable device-assisted surgical navigation system and method
Technical Field
The invention relates to the technical field of wearable medical equipment, and in particular to a surgical guide model and a head-mounted wearable device-assisted surgical navigation system and method.
Background
Radiation therapy (radiotherapy) is one of the principal treatments for tumors. To minimize the side effects of radiotherapy and its damage to normal tissue, a radioactive particle (seed) implantation technique (hereinafter "particle implantation") has been developed.
With the development of medical technology, in current clinical practice of particle implantation, medical scan data such as CT or MRI scans can be obtained by scanning the patient with a medical imaging device. From the scan data the doctor can delineate the tumor focus region and the target region for the implanted particles and formulate a treatment plan. The treatment plan can include the puncture position, depth and angle of each puncture needle and the number of particles to implant; a surgical guide model can be designed from the plan and produced from its three-dimensional data by 3D printing or another forming technique. Guiding the puncture and particle implantation with the surgical guide model improves the efficiency and accuracy of this form of radiotherapy compared with inserting the puncture needle by experience alone.
However, the surgical guide model can only guide the doctor's needle placement. The doctor cannot directly observe whether the puncture needle is in place and must still judge this by experience, or by CT scanning after the particles are implanted, which reduces the accuracy and efficiency of the particle implantation operation.
Disclosure of Invention
It is an object of the present invention to provide a head-mounted wearable device-assisted surgical navigation system that tracks the doctor's operation during a particle implantation procedure and, through virtual reality techniques, ensures that the treatment plan is carried out correctly. Further objects of the invention are to provide a surgical guide model, a head-mounted wearable device-assisted surgical navigation method, a computer device, and a computer-readable medium.
To achieve the above objects, one aspect of the invention discloses a head-mounted wearable device-assisted surgical navigation system, the head-mounted wearable device comprising:
an image acquisition module for acquiring image data of a surgical image of a patient through an image acquisition device, the image data including a surgical guide model placed at the patient's surgical site;
an image processing module for identifying, through image recognition, the position of the surgical guide model and the position of a surgical tool held by the doctor in the image data, and for displaying preset three-dimensional models of the surgical guide model and the surgical tool together with the image data to the doctor according to those positions;
and a positioning navigation module for determining whether the relative position of the surgical guide model and the surgical tool meets a preset condition and, if so, feeding navigation completion information back to the doctor.
Preferably, the system further comprises a positioning calibration module, which comprises:
a laser generating device for forming a laser marking line projected onto the patient's body surface;
and a positioning mark, to be placed on the laser marking line on the patient's body surface, for marking the position of the surgical guide model.
Preferably, the surgical tool is provided with a moving mark;
the image processing module is specifically configured to identify the positions of the positioning mark and the moving mark in the image data through image recognition and to use them, respectively, as the position of the surgical guide model and the position of the surgical tool.
Preferably, the surgical guide model is formed with a positioning line at least partially identical to the laser marking line, and a positioning hole that mates and nests with the positioning mark.
Preferably, the surgical guide model matches the shape of the patient's surgical site, the surgical tool is a puncture needle, at least one guide post is provided on the surgical guide model, and a hollow channel through which the puncture needle can be inserted is formed at the center of the guide post.
Preferably, the system further comprises a surgical data acquisition device for acquiring patient scan data, forming a treatment plan from surgical information input by the doctor, and forming preset surgical data from the treatment plan, the preset surgical data including three-dimensional model data of the surgical guide model and the surgical tool as well as the preset condition.
Preferably, the image processing module is further configured to form a three-dimensional model of the surgical site from the patient scan data and to display it together with the image data to the doctor according to the position of the surgical guide model.
Preferably, the image processing module is further configured to display the image data in a preset display mode according to model confirmation information input by the doctor;
the preset display mode includes at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
The invention also discloses a surgical guide model whose shape matches the patient's surgical site. At least one guide post is provided on the surgical guide model, and a hollow channel through which a puncture needle can be inserted is formed at the center of the guide post;
the surgical guide model is formed with a positioning line at least partially identical to the laser marking line projected onto the patient's body surface, and a positioning hole that mates and nests with a positioning mark placed on the laser marking line, so as to position the surgical guide model.
The invention also discloses a head-mounted wearable device-assisted surgical navigation method, comprising:
acquiring image data of a surgical image of a patient through an image acquisition device, the image data including a surgical guide model placed at the patient's surgical site;
identifying, through image recognition, the position of the surgical guide model and the position of a surgical tool held by the doctor in the image data, and displaying preset three-dimensional models of the surgical guide model and the surgical tool together with the image data to the doctor according to those positions;
and determining whether the relative position of the surgical guide model and the surgical tool meets a preset condition and, if so, feeding navigation completion information back to the doctor.
Preferably, before acquiring the image data of the surgical image of the patient through the image acquisition device, the method further comprises:
forming a laser marking line and projecting it onto the patient's body surface;
and placing, on the laser marking line on the patient's body surface, a positioning mark for positioning the surgical guide model.
Preferably, the surgical tool is provided with a moving mark;
identifying the position of the surgical guide model and the position of the surgical tool held by the doctor in the image data through image recognition specifically comprises:
identifying the positions of the positioning mark and the moving mark in the image data through image recognition and using them, respectively, as the position of the surgical guide model and the position of the surgical tool.
Preferably, the surgical guide model is formed with a positioning line at least partially identical to the laser marking line, and a positioning hole that mates and nests with the positioning mark.
Preferably, the surgical guide model matches the shape of the patient's surgical site, the surgical tool is a puncture needle, at least one guide post is provided on the surgical guide model, and a hollow channel through which the puncture needle can be inserted is formed at the center of the guide post.
Preferably, before acquiring the image data of the surgical image of the patient through the image acquisition device, the method further comprises:
acquiring patient scan data;
forming a treatment plan from surgical information input by the doctor;
and forming preset surgical data from the treatment plan, the preset surgical data including three-dimensional model data of the surgical guide model and the surgical tool as well as the preset condition.
Preferably, the method further comprises:
forming a three-dimensional model of the surgical site from the patient scan data and displaying it together with the image data to the doctor according to the position of the surgical guide model.
Preferably, the method further comprises:
displaying the image data in a preset display mode according to model confirmation information input by the doctor, the preset display mode including at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
The invention also discloses a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor; when the processor executes the program, the method described above is implemented.
The invention also discloses a computer-readable medium on which a computer program is stored; when executed by a processor, the program implements the method described above.
The invention acquires image data of a surgical image of a patient through the head-mounted wearable device. A surgical guide model is placed at the site where the particles are to be implanted, so the acquired image data includes an image of the surgical guide model. The positions of the surgical guide model and of the surgical tool held by the doctor are identified in the image data through image recognition; preset three-dimensional models of the surgical guide model and the surgical tool are superimposed on the image data and displayed to the doctor together with it, so the doctor can directly observe the relative position of the surgical tool and the surgical guide model. Whether this relative position meets the preset condition is judged in real time; if so, navigation completion information is fed back to the doctor, indicating that the surgical tool has reached the correct position and the next operation can proceed. This ensures that the doctor's surgical procedure is consistent with the preset plan and improves the accuracy and efficiency of the surgery.
Drawings
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a first block diagram of an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 2 is a second block diagram of an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 3 is a schematic diagram of the laser marking line and positioning mark in an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 4 is a schematic diagram of the surgical guide model in an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 5 is a third block diagram of an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 6 is a schematic diagram of the surgical-site three-dimensional model, the surgical guide model, and the surgical-tool three-dimensional model in an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 7 is a schematic diagram of an application of an embodiment of a head-mounted wearable device-assisted surgical navigation system;
FIG. 8 is a first flowchart of an embodiment of a head-mounted wearable device-assisted surgical navigation method;
FIG. 9 is a second flowchart of an embodiment of a head-mounted wearable device-assisted surgical navigation method;
FIG. 10 is a third flowchart of an embodiment of a head-mounted wearable device-assisted surgical navigation method;
FIG. 11 is a schematic block diagram of a computer device suitable for implementing embodiments of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
With the development of medical technology, in current particle implantation surgery, medical scan data can be obtained by scanning the patient with a medical imaging device in order to improve the efficiency and accuracy of the operation. From the scan data the doctor can delineate the tumor focus region and the target region for the implanted particles and formulate a treatment plan; a surgical guide model can be designed from the plan and produced from its three-dimensional data by 3D printing or another forming technique. Guiding the puncture and particle implantation with the surgical guide model improves the efficiency and accuracy of this form of radiotherapy compared with inserting the puncture needle by experience alone. However, the surgical guide model can only guide the doctor's needle placement: the doctor cannot directly observe or determine whether the puncture needle is in place and must still judge by experience, which reduces the accuracy and efficiency of the particle implantation operation.
Meanwhile, with developments in computing, head-mounted wearable devices can superimpose virtual objects on the real environment in real time using virtual reality, augmented reality, and mixed reality techniques. To address the problems in the prior art, the invention introduces the head-mounted wearable device into tumor treatment surgery: during treatment the device displays the internal condition of the surgical site and the position of the surgical guide model in real time, tracks the position of the surgical tool held by the doctor, determines on the device whether the surgical tool has reached the planned position, and feeds information back to the doctor. This tracking and feedback mechanism ensures the accuracy of the doctor's operation.
According to one aspect of the invention, this embodiment discloses a head-mounted wearable device-assisted surgical navigation system. As shown in FIGS. 1 to 7, in this embodiment the head-mounted wearable device includes an image acquisition module 11, an image processing module 12, and a positioning navigation module 13.
The image acquisition module 11 is configured to acquire image data of a surgical image of the patient through an image acquisition device, the image data including a surgical guide model 10 placed at the patient's surgical site.
The image processing module 12 is configured to identify, through image recognition, the position of the surgical guide model 10 and the position of the surgical tool held by the doctor in the image data, and to display preset three-dimensional models of the surgical guide model 10 and the surgical tool together with the image data to the doctor according to those positions.
The positioning navigation module 13 is configured to determine whether the relative position of the surgical guide model 10 and the surgical tool meets a preset condition and, if so, to feed navigation completion information back to the doctor.
The invention acquires image data of a surgical image of the patient through the head-mounted wearable device. The surgical guide model 10 is placed at the site where the particles are to be implanted, so the acquired image data includes an image of the surgical guide model 10. The positions of the surgical guide model 10 and of the surgical tool held by the doctor are identified in the image data through image recognition; the preset three-dimensional models of the surgical guide model 10 and the surgical tool are superimposed on the image data and displayed to the doctor together with it, so the doctor can directly observe their relative position. Whether that relative position meets the preset condition is judged in real time; if so, navigation completion information is fed back to the doctor, indicating that the surgical tool has reached the correct position and the next operation can proceed. This ensures that the doctor's procedure is consistent with the preset plan and improves the accuracy and efficiency of the surgery.
In a preferred embodiment, as shown in FIG. 2, the system further includes a positioning calibration module 14, which includes a laser generating device and a positioning mark 200.
The laser generating device is used to form a laser marking line 100 projected onto the patient's body surface. In a preferred embodiment, the laser generating device may be a CT device, the laser marking line 100 being formed by the laser positioning function of the CT device.
The positioning mark 200 is placed on the laser marking line 100 on the patient's body surface so as to mark the position of the surgical guide model 10.
It can be understood that, in this preferred embodiment, as shown in FIG. 3, an operator such as the doctor can control the laser generating device to form a laser marking line 100 projected onto the patient's body surface, initially delimiting where the surgical guide model 10 will be placed. The doctor places the positioning mark 200 on the laser marking line 100 according to the actual operation region; once the surgical guide model 10 is placed at the corresponding position on the body surface, the positioning mark 200 coincides with the model's position, so the positioning mark 200 can be used to determine the position of the surgical guide model 10.
In a preferred embodiment, the surgical tool is provided with a moving mark. The image processing module 12 is specifically configured to identify the positions of the positioning mark 200 and the moving mark in the image data through image recognition and to use them, respectively, as the position of the surgical guide model 10 and the position of the surgical tool.
It can be understood that the positioning calibration module 14 includes the positioning mark 200, which coincides with the position of the surgical guide model 10; by identifying the positioning mark 200 in the image data, the position of the surgical guide model 10 can be determined from the positional relationship between the two. The surgical tool held by the doctor carries the moving mark: during the operation the tool appears within the capture range of the image acquisition device, the moving mark is identified in the acquired image data to determine the tool's position, and surgical navigation then proceeds according to the preset condition and the position of the surgical guide model 10.
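The patent does not specify what the positioning mark 200 or the moving mark look like. As a hedged sketch, if they were printed fiducials such as ArUco markers, the recognition step could look like the following (this uses the OpenCV 4.x aruco API prior to 4.7; the marker ids 0 and 1 are an assumed assignment):

```python
import cv2

ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

def locate_markers(frame):
    """Return the pixel centers of the positioning mark (assumed id 0) and
    the moving mark on the surgical tool (assumed id 1), if visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = cv2.aruco.detectMarkers(gray, ARUCO_DICT)
    centers = {}
    if ids is not None:
        for quad, marker_id in zip(corners, ids.flatten()):
            centers[int(marker_id)] = quad[0].mean(axis=0)  # (x, y) center
    return centers.get(0), centers.get(1)  # guide-model position, tool position
```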
In a preferred embodiment, the surgical guide model 10 is formed with a positioning line 300 at least partially identical to the laser marking line 100, and a positioning hole 400 that mates and nests with the positioning mark 200.
It will be appreciated that, as shown in FIG. 4, the surgical guide model 10 is formed with a positioning line 300 at least partially identical to the laser marking line 100. By adjusting the position of the surgical guide model 10 so that the positioning line 300 coincides with the matching portion of the laser marking line 100, the preliminary position of the surgical guide model 10 is established. The positioning hole 400 on the surgical guide model 10 then nests onto the positioning mark 200, locating the surgical guide model 10 still more precisely.
In a preferred embodiment, as shown in FIG. 4, the surgical guide model 10 matches the shape of the patient's surgical site, the surgical tool is a puncture needle 700, at least one guide post 500 is provided on the surgical guide model 10, and a hollow channel 600 through which the puncture needle 700 can be inserted is formed at the center of the guide post 500.
It can be understood that because the shape of the surgical guide model 10 matches the surgical site, the model fits closely against the patient's body surface, which improves stability. At least one guide post 500 is formed on the surface of the surgical guide model 10, with a hollow channel 600 at the center of each. The puncture needle 700 is inserted through the hollow channel 600, and the angle and position of each guide post 500 are set so as to guide the needle to the position and angle specified in the treatment plan. The surgical guide model 10 thus performs its guiding function and prevents the error that arises when the doctor inserts the puncture needle 700 by experience alone.
In a preferred embodiment, as shown in FIG. 5, the system further comprises a surgical data acquisition device 15, configured to acquire patient scan data, form a treatment plan from surgical information input by the doctor, and form preset surgical data from the treatment plan, the preset surgical data including the three-dimensional model data of the surgical guide model 10 and the surgical tool as well as the preset condition.
It will be appreciated that the surgical data acquisition device 15 may acquire medical scan data from a scan of the patient by a medical imaging device. Commonly used medical imaging devices include CT and MRI equipment; other medical imaging devices may also be used, and the invention is not limited in this respect. The doctor determines the surgical information in advance according to the patient's condition and inputs it; the surgical data acquisition device 15 receives this input and forms the treatment plan. The treatment plan may include the puncture position, depth and angle of each particle-implanting puncture needle 700 and the number of particles to implant. Three-dimensional data of the surgical guide model 10 can be formed from the plan, and the model can then be produced by 3D printing or another forming technique such as injection molding, the invention likewise not being limited in this respect.
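The treatment-plan contents listed above (puncture position, depth, angle, particle count) map naturally onto a small data structure. The sketch below is illustrative only; every field name and default tolerance is an assumption, since the patent enumerates only the kinds of data the preset surgical data contains.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class NeedlePlan:
    entry_point: Tuple[float, float, float]  # planned puncture position (mm)
    direction: Tuple[float, float, float]    # unit vector of the planned needle axis
    depth_mm: float                          # planned puncture depth
    particle_count: int                      # particles to implant along this track

@dataclass
class PresetSurgicalData:
    needle_plans: List[NeedlePlan] = field(default_factory=list)
    guide_model_mesh: str = "guide_model.stl"  # guide-model 3D file (assumed format)
    tool_model_mesh: str = "needle.stl"        # puncture-needle 3D file (assumed format)
    position_tolerance_mm: float = 2.0         # part of the "preset condition" (assumed value)
    angle_tolerance_deg: float = 3.0           # part of the "preset condition" (assumed value)
```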
In a preferred embodiment, as shown in FIG. 6, the image processing module 12 is further configured to form a three-dimensional model of the surgical site from the patient scan data and to display it together with the image data to the doctor according to the position of the surgical guide model 10.
It can be understood that the image acquisition device can only capture an image of the patient's surface and cannot observe the internal state of the operation, such as the position of the puncture needle 700 inside the patient. In this preferred embodiment, a three-dimensional model of the surgical site is formed from the medical scan data obtained by scanning the patient with the medical imaging device; this model shows the surgical site clearly, and the position and state of the surgical tool inside the patient can be simulated through the displayed three-dimensional model of the tool. Superimposing the three-dimensional model of the surgical site on the image data and displaying them together lets the doctor visually observe the simulated state of the real operation. At the same time, by comparing the acquired image data with the displayed three-dimensional models of the surgical site, the surgical guide model 10 and the surgical tool, the doctor can confirm whether the positions of the displayed models are accurate and thus whether the system is operating normally.
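One plausible way to build such a surgical-site model from CT data is iso-surface extraction; a minimal sketch using scikit-image's marching cubes follows. The Hounsfield threshold and the library choice are assumptions, not something the patent specifies.

```python
import numpy as np
from skimage import measure

def surgical_site_mesh(ct_volume: np.ndarray, spacing, hu_threshold: float = 150.0):
    """Extract a triangle mesh of the surgical site from a CT volume.

    ct_volume: 3D array of Hounsfield units, ordered (slice, row, col).
    spacing: voxel size in mm, e.g. (slice_thickness, row_spacing, col_spacing).
    hu_threshold: iso-surface level; 150 HU is only an illustrative value.
    """
    verts, faces, normals, _values = measure.marching_cubes(
        ct_volume, level=hu_threshold, spacing=spacing)
    return verts, faces, normals  # vertices come out in mm thanks to `spacing`
```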
In a preferred embodiment, the image processing module 12 is further configured to display the image data in a preset display mode according to model confirmation information input by the doctor, the preset display mode including at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
It can be understood that, once the doctor has compared the displayed image data with the three-dimensional models of the surgical site, the surgical guide model 10 and the surgical tool and confirmed that each displayed model corresponds to the actual object in the image data, the doctor may input model confirmation information through the touch display screen or keys of the head-mounted wearable device. On receiving it, the image processing module 12 displays the image data in the preset display mode. Deleting or hiding the image data means displaying only the three-dimensional model data of the surgical site (which may come from CT data) and the surgical guide model 10, and the doctor can then perform the particle implantation directly according to the displayed models. Alternatively, a preset transparency can be set so that the live image data of the patient is displayed at a certain transparency together with the preset surgical data, such as the surgical site, the surgical guide model 10 and the surgical tool; the actual image data and the preset surgical data are then shown together without occluding each other, and the doctor can visually check the position of the surgical guide model 10, the implantation state of the particles, and so on, as shown in FIG. 6.
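The three display modes reduce to a simple compositing rule. The sketch below assumes the rendered model layer and the live image are same-size RGB arrays and that 0.4 is the preset transparency; both assumptions are illustrative.

```python
from enum import Enum

import numpy as np

class DisplayMode(Enum):
    DELETE = "delete"            # discard the live image entirely
    HIDE = "hide"                # keep it, but do not render it
    TRANSPARENT = "transparent"  # blend it in at a preset transparency

def compose(model_layer, live_image, mode, alpha=0.4):
    """Combine the rendered model layer with the live camera image.

    DELETE and HIDE both show only the models, differing upstream in
    whether the image data is retained; alpha is the assumed default
    transparency of the live image.
    """
    if mode in (DisplayMode.DELETE, DisplayMode.HIDE):
        return model_layer
    blended = (alpha * live_image.astype(np.float32)
               + (1.0 - alpha) * model_layer.astype(np.float32))
    return blended.astype(np.uint8)
```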
The invention is further illustrated by a specific example. The doctor adjusts the patient's body position in advance according to the tumor location, projects a CT laser line (laser marking line 100) onto the patient's body surface from a laser generating device such as a CT scanner (CT medical imaging device), and places the positioning mark 200 on the CT laser line. To preserve the function of the laser marking line 100, a line coincident with the CT laser line may be drawn on the patient's body surface with a marker pen so that the position of the line is retained.
Meanwhile, the doctor performs a preoperative scan with the CT scanner, preferably at CT slice thicknesses of 5 mm, 2.5 mm and 1 mm, to obtain the medical scan data. The surgical data acquisition device 15 acquires the DICOM-format scan data produced by the CT scanner and displays it to the doctor, who determines the tumor target region and the surrounding critical organs from the scan data and decides the related surgical information. From the surgical information input by the doctor, the surgical data acquisition device 15 determines the puncture position, depth and angle of each particle-implanting puncture needle 700 and the number of particles to implant, forms the treatment plan, generates the three-dimensional data of the surgical guide model 10, and produces a physical surgical guide model 10 with a 3D printer.
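Reading the CT scanner's DICOM output into a volume is routine; a minimal pydicom sketch is shown below. It assumes a single axial series in one directory, and the directory layout and helper name are illustrative.

```python
from pathlib import Path

import numpy as np
import pydicom

def load_ct_series(series_dir):
    """Read a directory of DICOM slices into a Hounsfield-unit volume
    plus voxel spacing in mm, ordered (slice, row, col)."""
    slices = [pydicom.dcmread(p) for p in sorted(Path(series_dir).glob("*.dcm"))]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))  # order along z
    volume = np.stack([s.pixel_array.astype(np.float32) for s in slices])
    # Convert stored pixel values to Hounsfield units.
    volume = volume * float(slices[0].RescaleSlope) + float(slices[0].RescaleIntercept)
    spacing = (float(slices[0].SliceThickness),
               float(slices[0].PixelSpacing[0]),
               float(slices[0].PixelSpacing[1]))
    return volume, spacing
```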
Then a positioning line 300 partially identical to the laser marking line 100 and a positioning hole 400 that nests with the positioning mark 200 are formed on the surgical guide model 10. The surgical guide model 10 is placed on the patient's body surface via the positioning hole 400 and the positioning mark 200 so that its positioning line 300 coincides with the laser marking line 100, putting the model at the surgical site. Two or three puncture needles 700 are then inserted to a certain depth, and the CT scanner is used to check whether their angles and positions are consistent with the treatment plan, in order to verify that the surgical guide model 10 is placed accurately; if not, the model is adjusted.
According to the treatment plan, the surgical data acquisition device 15 forms the preset surgical data, including the three-dimensional model data of the surgical guide model 10 and the surgical tool and the preset condition, and sends the preset surgical data together with the scan data to the head-mounted wearable device.
As shown in FIG. 7, after the doctor puts on the head-mounted wearable device, its image acquisition module 11 captures the surgical image of the patient through a camera (the image acquisition device); the positions of the surgical guide model 10 and of the surgical tool held by the doctor are recognized in the image through image recognition; the three-dimensional models of the surgical guide model 10 and the surgical tool are superimposed on the image according to those positions; and the three-dimensional model of the surgical site is superimposed on the image according to the position of the surgical guide model 10. After confirming that the model positions are accurate, the doctor can input model confirmation information to delete the patient's surgical image from the display, hide it, or display it at a preset transparency.
The positioning navigation module 13 of the wearable device determines in real time whether the relative position of the surgical guide model 10 and the surgical tool meets the preset condition and, if so, feeds navigation completion information back to the doctor. Completion of navigation indicates that the puncture of needle 700 is complete, and the doctor can proceed with the preoperative plan, for example implanting the radioactive particles into the interior of the tumor.
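The real-time check performed by the positioning navigation module 13 can be pictured as a tolerance test on the tracked needle pose against the planned track, using the NeedlePlan sketched earlier. The tolerances and coordinate conventions are assumptions; the patent only requires "a preset condition".

```python
import numpy as np

def navigation_complete(tool_tip, tool_dir, plan,
                        pos_tol_mm=2.0, ang_tol_deg=3.0):
    """Return True when the tracked needle satisfies the preset condition.

    tool_tip: (3,) tracked needle-tip position in guide-model coordinates (mm).
    tool_dir: (3,) unit vector of the tracked needle axis.
    plan: a NeedlePlan with entry_point, direction and depth_mm.
    """
    entry = np.asarray(plan.entry_point, dtype=float)
    axis = np.asarray(plan.direction, dtype=float)
    target = entry + plan.depth_mm * axis            # planned tip position
    pos_err = np.linalg.norm(np.asarray(tool_tip, dtype=float) - target)
    cos_ang = np.clip(np.dot(np.asarray(tool_dir, dtype=float), axis), -1.0, 1.0)
    ang_err = np.degrees(np.arccos(cos_ang))         # axis deviation in degrees
    return pos_err <= pos_tol_mm and ang_err <= ang_tol_deg
```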
Based on the same principle, this embodiment also discloses a surgical guide model 10. Its shape matches the patient's surgical site; at least one guide post 500 is provided on the model, and a hollow channel 600 through which the puncture needle 700 can be inserted is formed at the center of the guide post 500.
The surgical guide model 10 is formed with a positioning line 300 at least partially identical to the laser marking line 100 projected onto the patient's body surface, and a positioning hole 400 that mates and nests with a positioning mark 200 placed on the laser marking line 100 so as to position the surgical guide model 10.
Since the surgical guide model 10 solves the problem on a principle similar to that of the system above, its implementation can refer to the implementation of the system and is not repeated here.
Based on the same principle, this embodiment also discloses a head-mounted wearable device-assisted surgical navigation method. As shown in FIG. 8, in this embodiment the method includes:
S100: acquiring image data of a surgical image of the patient through an image acquisition device, the image data including a surgical guide model 10 placed at the patient's surgical site.
S200: identifying, through image recognition, the position of the surgical guide model 10 and the position of the surgical tool held by the doctor in the image data, and displaying preset three-dimensional models of the surgical guide model 10 and the surgical tool together with the image data to the doctor according to those positions.
S300: determining whether the relative position of the surgical guide model 10 and the surgical tool meets a preset condition and, if so, feeding navigation completion information back to the doctor.
In a preferred embodiment, as shown in FIG. 9, the method further comprises, before S100:
S011: forming a laser marking line 100 and projecting it onto the patient's body surface.
S012: placing, on the laser marking line 100 on the patient's body surface, a positioning mark 200 for positioning the surgical guide model 10.
In a preferred embodiment, the surgical tool is provided with a moving mark.
In S200, identifying the position of the surgical guide model 10 and the position of the surgical tool held by the doctor in the image data through image recognition specifically comprises:
S210: identifying the positions of the positioning mark 200 and the moving mark in the image data through image recognition and using them, respectively, as the position of the surgical guide model 10 and the position of the surgical tool.
In a preferred embodiment, the surgical guide model 10 is formed with a positioning line 300 at least partially identical to the laser marking line 100, and a positioning hole 400 that mates and nests with the positioning mark 200.
In a preferred embodiment, the surgical guide model 10 matches the shape of the patient's surgical site, the surgical tool is a puncture needle 700, at least one guide post 500 is provided on the surgical guide model 10, and a hollow channel 600 through which the puncture needle 700 can be inserted is formed at the center of the guide post 500.
In a preferred embodiment, as shown in FIG. 10, the method further comprises, before S100:
S021: acquiring patient scan data.
S022: forming a treatment plan from surgical information input by the doctor.
S023: forming preset surgical data from the treatment plan, the preset surgical data including the three-dimensional model data of the surgical guide model 10 and the surgical tool as well as the preset condition.
In a preferred embodiment, the method further comprises:
S400: forming a three-dimensional model of the surgical site from the patient scan data and displaying it together with the image data to the doctor according to the position of the surgical guide model 10.
In a preferred embodiment, the method further comprises:
S500: displaying the image data in a preset display mode according to model confirmation information input by the doctor, the preset display mode including at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
Since the method solves the problem on a principle similar to that of the system, its implementation can refer to the implementation of the system and is not described again here.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. A typical implementation device is a computer device, which may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
In a typical example, the computer device comprises a memory, a processor, and a computer program stored on the memory and executable on the processor; when the program is executed by the processor, the method described above is implemented.
Referring now to FIG. 11, shown is a schematic diagram of a computer device 600 suitable for use in implementing embodiments of the present application.
As shown in FIG. 11, the computer device 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage section 608 into a random access memory (RAM) 603. The RAM 603 also stores the various programs and data needed for the operation of the system 600. The CPU 601, ROM 602 and RAM 603 are connected to one another via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse and the like; an output section 607 including a cathode ray tube (CRT) or liquid crystal display (LCD), a speaker and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, optical disk, magneto-optical disk or semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to an embodiment of the present invention, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the invention includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the methods illustrated in the flowcharts. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609 and/or installed from the removable medium 611.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functionality of the units may be implemented in one or more software and/or hardware when implementing the present application.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should also be noted that the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (19)

1. A head-worn wearable device-assisted surgical navigation system, the head-worn wearable device comprising:
an image acquisition module configured to acquire, via an image acquisition device, image data of a surgical image of a patient, wherein the image data includes a surgical guide model arranged at a surgical site of the patient;
an image processing module configured to identify, by image recognition, the position of the surgical guide model in the image data and the position of a surgical tool held by a doctor, and to display preset three-dimensional models of the surgical guide model and the surgical tool, together with the image data, to the doctor according to the position of the surgical guide model and the position of the surgical tool; and
a positioning navigation module configured to determine whether the relative position of the surgical guide model and the surgical tool satisfies a preset condition and, if so, to feed navigation completion information back to the doctor.
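Neither the claims nor the description pins down the "preset condition" in computational terms. Purely as an illustration, the following Python sketch assumes the condition is a distance-and-angle tolerance between the tool and the planned insertion axis defined by the guide model; the function name, tolerances, and pose representation are all hypothetical, not taken from the patent.

import numpy as np

def navigation_complete(entry_point, planned_axis, tool_tip, tool_axis,
                        max_offset_mm=2.0, max_angle_deg=3.0):
    # True when the tool tip lies within max_offset_mm of the guide
    # model's entry point and the tool axis deviates from the planned
    # insertion axis by no more than max_angle_deg.
    offset = np.linalg.norm(np.asarray(tool_tip, float) - np.asarray(entry_point, float))
    a = np.asarray(planned_axis, float)
    b = np.asarray(tool_axis, float)
    cos_angle = np.clip(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)), -1.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_angle))
    return offset <= max_offset_mm and angle_deg <= max_angle_deg

# Example: tool tip 1 mm off the entry point, axis tilted about 2 degrees.
print(navigation_complete([0, 0, 0], [0, 0, 1], [1, 0, 0], [0, 0.035, 1]))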
2. The assisted surgical navigation system of claim 1, further comprising a positioning calibration module, the positioning calibration module comprising:
a laser generating device configured to form a laser marking line projected onto the body surface of the patient; and
a positioning marker configured to be arranged on the laser marking line on the body surface of the patient so as to mark the position of the surgical guide model.
3. The assisted surgical navigation system of claim 2, wherein the surgical tool is provided with a movement marker; and
the image processing module is specifically configured to identify, by image recognition, the positions of the positioning marker and the movement marker in the image data, and to use them as the position of the surgical guide model and the position of the surgical tool, respectively.
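The patent does not say which image recognition technique locates the positioning marker and the movement marker. One common approach for this kind of fiducial tracking is ArUco markers; the sketch below assumes OpenCV's ArUco module (opencv-contrib-python 4.7 or later) and invented marker IDs, so it is a plausible stand-in rather than the patented method.

import cv2

POSITIONING_MARKER_ID = 0  # hypothetical ID printed on the surgical guide model
MOVEMENT_MARKER_ID = 1     # hypothetical ID attached to the surgical tool

def locate_markers(frame):
    # Detect ArUco markers in a camera frame and return the pixel-space
    # centre of each role we care about, keyed as guide model vs. tool.
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary)
    corners, ids, _rejected = detector.detectMarkers(frame)
    positions = {}
    if ids is not None:
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            centre = marker_corners[0].mean(axis=0)  # mean of the 4 corners
            if marker_id == POSITIONING_MARKER_ID:
                positions["guide_model"] = centre
            elif marker_id == MOVEMENT_MARKER_ID:
                positions["surgical_tool"] = centre
    return positions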
4. The assisted surgical navigation system of claim 2, wherein the surgical guide model is formed with a positioning line that at least partially coincides with the laser marking line, and with a positioning hole that mates and nests with the positioning marker.
5. The assisted surgical navigation system of claim 1, wherein the shape of the surgical guide model matches the surgical site of the patient, the surgical tool is a puncture needle, the surgical guide model is further provided with at least one guide post, and a hollow portion for passage of the puncture needle is formed in the center of the guide post.
6. The assisted surgical navigation system of claim 1, further comprising a surgical data acquisition device configured to acquire scan data of the patient, form a treatment plan according to surgical information input by a doctor, and form preset surgical data according to the treatment plan, wherein the preset surgical data includes three-dimensional model data of the surgical guide model and of the surgical tool, as well as the preset condition.
7. The assisted surgical navigation system of claim 6, wherein the image processing module is further configured to form a three-dimensional model of the surgical site from the patient scan data, and to display the three-dimensional model of the surgical site together with the image data to the doctor according to the position of the surgical guide model.
8. The assisted surgical navigation system of claim 7, wherein the image processing module is further configured to display the image data in a preset display mode according to model confirmation information input by the doctor;
the preset display mode includes at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
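Claim 8's three display modes amount to a small rendering switch. A minimal sketch, assuming the camera image data and the rendered model overlay are equally sized RGB arrays; the enum and function names are illustrative only.

import numpy as np
from enum import Enum

class DisplayMode(Enum):
    DELETE = "delete"            # discard the camera image data
    HIDE = "hide"                # keep the data but do not render it
    TRANSPARENT = "transparent"  # blend it in at a preset opacity

def compose_view(camera_frame, model_overlay, mode, alpha=0.4):
    # Combine the camera image data with the rendered three-dimensional
    # model overlay according to the doctor-selected display mode.
    if mode in (DisplayMode.DELETE, DisplayMode.HIDE):
        return model_overlay  # only the virtual models remain visible
    blended = alpha * camera_frame.astype(float) + (1.0 - alpha) * model_overlay.astype(float)
    return blended.astype(np.uint8)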
9. A surgical guide model, wherein the shape of the surgical guide model matches the surgical site of a patient, at least one guide post is provided on the surgical guide model, and a hollow portion for passage of a puncture needle is formed in the center of the guide post; and
the surgical guide model is formed with a positioning line that at least partially coincides with a laser marking line projected onto the body surface of the patient, and with a positioning hole that mates and nests with a positioning marker arranged on the laser marking line to locate the surgical guide model.
10. A head-worn wearable device-assisted surgical navigation method, comprising:
acquiring, via an image acquisition device, image data of a surgical image of a patient, wherein the image data includes a surgical guide model arranged at a surgical site of the patient;
identifying, by image recognition, the position of the surgical guide model in the image data and the position of a surgical tool held by a doctor, and displaying preset three-dimensional models of the surgical guide model and the surgical tool, together with the image data, to the doctor according to the position of the surgical guide model and the position of the surgical tool; and
determining whether the relative position of the surgical guide model and the surgical tool satisfies a preset condition and, if so, feeding navigation completion information back to the doctor.
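Read as software, the method of claim 10 is a three-step loop: acquire image data, identify the two positions, and test the preset condition. The skeleton below makes that control flow explicit with injected callables, so it stays hardware-agnostic; every name in it is an assumption made for illustration.

def navigation_loop(acquire_frame, identify_positions, condition_met, notify):
    # One run of the claim-10 method: acquire image data, identify the
    # guide-model and tool positions, test the preset condition, and
    # feed completion information back to the doctor. The four callables
    # are injected so the loop itself depends on no particular hardware.
    while True:
        frame = acquire_frame()                # step 1: image acquisition
        positions = identify_positions(frame)  # step 2: image recognition
        if "guide_model" in positions and "surgical_tool" in positions:
            if condition_met(positions["guide_model"], positions["surgical_tool"]):
                notify("navigation complete")  # step 3: feedback
                return

# Demo with trivial stand-ins: one frame, positions already aligned.
navigation_loop(
    acquire_frame=lambda: None,
    identify_positions=lambda _frame: {"guide_model": (0, 0), "surgical_tool": (0, 0)},
    condition_met=lambda guide, tool: guide == tool,
    notify=print,
)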
11. The assisted surgical navigation method of claim 10, further comprising, before acquiring the image data of the surgical image of the patient via the image acquisition device:
forming a laser marking line and projecting it onto the body surface of the patient; and
arranging, on the laser marking line on the body surface of the patient, a positioning marker for marking the position of the surgical guide model.
12. The assisted surgical navigation method of claim 11, wherein the surgical tool is provided with a movement marker; and
the identifying, by image recognition, of the position of the surgical guide model in the image data and of the position of the surgical tool held by the doctor specifically comprises:
identifying, by image recognition, the positions of the positioning marker and the movement marker in the image data, and using them as the position of the surgical guide model and the position of the surgical tool, respectively.
13. The assisted surgical navigation method of claim 11, wherein the surgical guide model is formed with a positioning line that at least partially coincides with the laser marking line, and with a positioning hole that mates and nests with the positioning marker.
14. The assisted surgical navigation method of claim 10, wherein the shape of the surgical guide model matches the surgical site of the patient, the surgical tool is a puncture needle, the surgical guide model is further provided with at least one guide post, and a hollow portion for passage of the puncture needle is formed in the center of the guide post.
15. The assisted surgical navigation method of claim 10, further comprising, before acquiring the image data of the surgical image of the patient via the image acquisition device:
acquiring scan data of the patient;
forming a treatment plan according to surgical information input by a doctor; and
forming preset surgical data according to the treatment plan, wherein the preset surgical data includes three-dimensional model data of the surgical guide model and of the surgical tool, as well as the preset condition.
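Claim 15's "preset surgical data" bundles the two three-dimensional models with the completion tolerance. One way such a record might be laid out in code (the field names and file paths are purely illustrative assumptions):

from dataclasses import dataclass

@dataclass
class PresetCondition:
    max_offset_mm: float   # allowed tip distance from the planned entry point
    max_angle_deg: float   # allowed deviation from the planned insertion axis

@dataclass
class PresetSurgicalData:
    # Preset surgical data derived from the treatment plan.
    guide_model_mesh: str  # e.g. path to the guide model's 3D model file
    tool_model_mesh: str   # e.g. path to the surgical tool's 3D model file
    condition: PresetCondition

plan = PresetSurgicalData(
    guide_model_mesh="guide_model.stl",     # hypothetical file names
    tool_model_mesh="puncture_needle.stl",
    condition=PresetCondition(max_offset_mm=2.0, max_angle_deg=3.0),
)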
16. The assisted surgical navigation method of claim 15, further comprising:
forming a three-dimensional model of the surgical site from the patient scan data, and displaying the three-dimensional model of the surgical site together with the image data to the doctor according to the position of the surgical guide model.
17. The assisted surgical navigation method of claim 16, further comprising:
displaying the image data in a preset display mode according to model confirmation information input by the doctor, wherein the preset display mode includes at least one of deleting the image data, hiding the image data, and displaying the image data at a preset transparency.
18. A computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor,
wherein the processor, when executing the program, implements the method of any one of claims 10-17.
19. A computer-readable medium having stored thereon a computer program,
wherein the program, when executed by a processor, implements the method of any one of claims 10-17.
CN202010446648.7A 2020-05-25 2020-05-25 Operation guide model and head-wearing wearable equipment-assisted operation navigation system Active CN111588999B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010446648.7A CN111588999B (en) 2020-05-25 2020-05-25 Operation guide model and head-wearing wearable equipment-assisted operation navigation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010446648.7A CN111588999B (en) 2020-05-25 2020-05-25 Operation guide model and head-wearing wearable equipment-assisted operation navigation system

Publications (2)

Publication Number Publication Date
CN111588999A (en) 2020-08-28
CN111588999B (en) 2022-07-08

Family

ID=72180952

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010446648.7A Active CN111588999B (en) 2020-05-25 2020-05-25 Operation guide model and head-wearing wearable equipment-assisted operation navigation system

Country Status (1)

Country Link
CN (1) CN111588999B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105963002A (en) * 2016-08-01 2016-09-28 北京启麟科技有限公司 Three-dimensional printed minimally invasive guide template and making method thereof
CN107028646A (en) * 2017-05-25 2017-08-11 张昊 Combined type intervention operation guiding die plate
CN107374729A (en) * 2017-08-21 2017-11-24 上海霖晏医疗科技有限公司 Operation guiding system and method based on AR technologies
US20190231443A1 (en) * 2017-10-02 2019-08-01 Mcginley Engineered Solutions, Llc Surgical instrument with real time navigation assistance
WO2019127449A1 (en) * 2017-12-29 2019-07-04 威朋(苏州)医疗器械有限公司 Surgical navigation method and system
CN109758229A (en) * 2018-12-25 2019-05-17 襄阳市第一人民医院(襄阳市肿瘤医院) A kind of neurosurgery control system and method based on virtual reality
CN109674536A (en) * 2019-01-25 2019-04-26 上海交通大学医学院附属第九人民医院 Operation guiding system and its equipment, method and storage medium based on laser
CN110584779A (en) * 2019-08-20 2019-12-20 周苹 Head-mounted visual surgical site navigation system and operation method thereof
CN110584783A (en) * 2019-10-14 2019-12-20 中国科学技术大学 Surgical navigation system
CN110711030A (en) * 2019-10-21 2020-01-21 北京国润健康医学投资有限公司 Femoral head necrosis minimally invasive surgery navigation system and surgery method based on AR technology

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113143457A (en) * 2021-02-09 2021-07-23 席庆 Maxillofacial operation auxiliary system and method based on MR head-mounted equipment

Also Published As

Publication number Publication date
CN111588999B (en) 2022-07-08

Similar Documents

Publication Publication Date Title
US6549802B2 (en) Seed localization system and method in ultrasound by fluoroscopy and ultrasound fusion
CN109419524B (en) Control of medical imaging system
US6675032B2 (en) Video-based surgical targeting system
US20190142359A1 (en) Surgical positioning system and positioning method
US20030029464A1 (en) Video-based surgical targeting system
JP6867486B2 (en) Custom-made surgery guide, custom-made surgery guide generation method and generation program
US20110009748A1 (en) Transperineal prostate biopsy system and methods
WO2016201341A1 (en) Systems and methods for guiding tissue resection
CN103892912B X-ray-assisted needle localization method and system
EP1596701B1 (en) Seed localization system for use in an ultrasound system
CN106821496A Accurate planning system and method for percutaneous transforaminal endoscopic surgery
JP7111680B2 (en) Visualization and Manipulation of Results from Device-to-Image Registration Algorithms
CN110680470B (en) Laser guide positioning device of automatic tumor puncture machine
CN109173087A Method for realizing radioactive seed implantation using laser aiming
CN106408652B Screw path positioning method and system for antegrade anterior column screws of the acetabulum
CN108852400B (en) Method and device for realizing position verification of treatment center
CN110537985A (en) Spine space coordinate system positioning device and method for augmented reality surgery system
CN111588999B (en) Operation guide model and head-wearing wearable equipment-assisted operation navigation system
CN109152929B (en) Image-guided treatment delivery
Zhou et al. Surgical navigation system for brachytherapy based on mixed reality using a novel stereo registration method
CN111728695B (en) Light beam auxiliary positioning system for craniotomy
CN207804782U Cranial seed implantation guidance system based on personalized 3D printing
CN115349955A (en) Holographic image virtual template-based robot system for guiding particle implantation tumor surgery
Otal et al. A method to incorporate interstitial components into the TPS gynecologic rigid applicator library
JP6526346B2 (en) Brachytherapy system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PP01 Preservation of patent right

Effective date of registration: 20230811

Granted publication date: 20220708