US20170215969A1 - Human organ movement monitoring method, surgical navigation system and computer readable medium - Google Patents
- Publication number
- US20170215969A1 (application US15/106,746; US201415106746A)
- Authority
- US
- United States
- Prior art keywords
- coordinate system
- movement monitoring
- image
- orientation
- movement
- Prior art date
- Legal status
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0004—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by the type of physiological signal transmitted
- A61B5/0013—Medical image data
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0015—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network characterised by features of the telemetry system
- A61B5/002—Monitoring the patient using a local or closed circuit, e.g. in a room or building
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1107—Measuring contraction of parts of the body, e.g. organ, muscle
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
-
- G06F19/321—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T7/248—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/20—ICT specially adapted for the handling or processing of medical images for handling medical images, e.g. DICOM, HL7 or PACS
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00681—Aspects not otherwise provided for
- A61B2017/00694—Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B2017/00681—Aspects not otherwise provided for
- A61B2017/00694—Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
- A61B2017/00699—Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body correcting for movement caused by respiration, e.g. by triggering
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2074—Interface software
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3954—Markers, e.g. radio-opaque or breast lesions markers magnetic, e.g. NMR or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2090/3966—Radiopaque markers visible in an X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30061—Lung
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- the present disclosure generally relates to a surgical navigation system and method, and in particular relates to a human organ movement monitoring method for monitoring human organ movement in a medical process in real time, a surgical navigation system and a computer readable medium.
- An interventional operation is a technique of modern surgery. It differs from traditional surgery in that no open surgical operation is needed: a special surgical instrument, such as a catheter, a cryoablation needle, a radio-frequency ablation needle or a guide wire, is introduced to a focus location or a surgical target location in vivo through a very small wound, and the treatment is then achieved by a variety of physical/chemical effects. This solves problems such as tumor excision, biopsy and artificial-device placement that previously could be addressed only by open surgery.
- Surgical navigation is used for guiding the execution of clinical surgery in real time by means of a medical image of the patient and a three-dimensional model generated through reconstruction of the medical image.
- the medical image data of the patient are obtained, for example, by means of CT (computerized tomography) or MRI (magnetic resonance imaging) scanning.
- a surgical navigation system associates the preoperative medical image data with the intraoperative surgical site of the patient through a positioning device, and can accurately display the anatomical structure of the patient and the three-dimensional spatial details near the focus in a software interface.
- when the surgical instrument points to any location in the patient's body, its coordinate information is obtained by the navigation system in real time and displayed on the three-dimensional model of the patient. In this case, even before any incision is made, the doctor can see the relative position of the surgical instrument and the tumor focus in real time.
- the surgical navigation has the following advantages:
- the focus can be reconstructed in real time through three-dimensional techniques, and the structural features surrounding the surgical field are displayed;
- the most suitable surgical approach can be selected through a preoperative surgical plan design;
- the position and orientation of the surgical instrument can be accurately calculated and displayed in real time;
- the surgical approach can be adjusted during the surgery in real time so as to reach the focus location more accurately.
- a CT image is used as the example of the medical image below, but obviously other images, such as an MRI image, can also be used.
- Real-time surgical navigation relies on a positioning device or tracking system; for example, an electromagnetic (EM) tracker is used for tracking the distal tip of a surgical tool, such as a puncture needle, so as to associate the position of the surgical tool with the preoperative medical image (for example, the CT image) and display a fused image to the clinician.
- a registration process based on reference marks is generally carried out before navigation.
- These reference marks are identified in the CT image and are then touched by a calibrated tracking probe to obtain their coordinates in the tracker space, which is called the positioning coordinate system below.
- point-based registration is then executed to find a coordinate transformation matrix between the CT space and the tracker space.
- a registration matrix is obtained from the coordinate transformation matrix and is used for aligning the surgical tool with the preoperative medical image, so that, based on the position information of the surgical tool in the positioning system, the image of the surgical tool can be accurately fused with the CT image in the CT coordinate system and the fused image can be visually displayed to the clinician.
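The point-based registration described above can be sketched as follows. This is a minimal illustration only: the patent does not prescribe a particular fitting algorithm, so the common SVD-based least-squares (Kabsch) method is assumed here, and the function names are illustrative.

```python
import numpy as np

def register_points(src, dst):
    """Estimate the rigid transform (R, t) that best maps the fiducial
    coordinates `src` (e.g. in the tracker space) onto `dst` (the same
    fiducials in the CT image space), in the least-squares sense, using
    the SVD-based Kabsch method."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_dst - R @ c_src
    return R, t

def registration_error(src, dst, R, t):
    """Root-mean-square fiducial registration error after applying (R, t)."""
    mapped = (R @ np.asarray(src, float).T).T + t
    return float(np.sqrt(np.mean(np.sum((mapped - np.asarray(dst, float)) ** 2, axis=1))))
```

With noise-free, consistent point sets the residual error is essentially zero; with real tracked fiducials the residual is the fiducial registration error reported by navigation systems.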
- pleuroperitoneal-cavity tissue movement (including tumor tissue and normal tissue) caused by breathing can generally reach a range of 2-4 cm, while the accuracy required of clinical surgery is generally at the millimeter level; this error makes the currently mainstream surgical navigation systems difficult to use safely and stably in the clinical practice of pleuroperitoneal-cavity interventional surgery.
- a human organ movement monitoring method for monitoring human organ movement in a surgical process in real time, wherein a three-dimensional medical image of a treatment site of a patient fixed with two or more movement monitoring tools is obtained by scanning prior to the surgery, the three-dimensional medical image having an associated image coordinate system, and the human organ movement monitoring method includes the following steps: obtaining a first position and orientation of the movement monitoring tools in the image coordinate system, identified from the preoperative three-dimensional medical image; in a state in which the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, determining a second position and orientation of the movement monitoring tools in a positioning coordinate system in real time, wherein the positioning coordinate system is the coordinate system referenced in the process of positioning the position and orientation of a surgical tool; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system; and monitoring the movement state of the human organ based on the change over time of the overall transformation error corresponding to the optimal coordinate transformation relation.
- a surgical navigation system including: a positioning device, used for tracking the position and orientation of a surgical tool and of movement monitoring tools in a positioning coordinate system; two or more movement monitoring tools, which are fixed on the patient body and whose position and orientation in the positioning coordinate system are capable of being tracked by the positioning device; a three-dimensional medical image obtaining unit, used for obtaining a preoperative three-dimensional medical image of a treatment site of a patient fixed with the movement monitoring tools, the three-dimensional medical image having an associated image coordinate system; and a surgical navigation workstation, used for registering and combining the preoperative three-dimensional medical image with an intraoperative surgical tool image and visually displaying the same on a connected display device to guide the surgical operation of a surgeon; wherein the surgical navigation workstation further monitors the movement state of a human organ through the following operations: identifying a first position and orientation of the movement monitoring tools in the image coordinate system from the preoperative three-dimensional medical image; determining a second position and orientation of the movement monitoring tools in the positioning coordinate system in real time during the surgery; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system based on the first and second positions and orientations; and monitoring the movement state of the human organ based on the change over time of the corresponding overall transformation error.
- a computer readable medium on which a computer program is recorded, the computer program being used in combination with a surgical navigation system and executing the following operations when executed by a processing device: obtaining a first position and orientation of movement monitoring tools in an image coordinate system, identified from a preoperative three-dimensional medical image, wherein the preoperative three-dimensional medical image is obtained by scanning a treatment site of a patient fixed with two or more movement monitoring tools prior to the surgery, the three-dimensional medical image having an associated image coordinate system; in a state in which the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, determining a second position and orientation of the movement monitoring tools in a positioning coordinate system in real time, wherein the positioning coordinate system is the coordinate system referenced in the process of positioning the position and orientation of a surgical tool; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system; and monitoring the movement state of the human organ based on the change over time of the corresponding overall transformation error.
- the surgical navigation system and the human organ movement monitoring method provided by the embodiments of the present disclosure can solve or relieve the problem in pleuroperitoneal-cavity surgical navigation that tumor displacement caused by the respiratory movement of the human body reduces the navigation precision, and can also simplify the surgical navigation procedure to facilitate the use and popularization of surgical navigation in a plurality of fields.
- FIG. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present disclosure.
- FIG. 2 shows an exemplary structural diagram of a breath monitoring tool 120 according to an embodiment of the present disclosure.
- FIG. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present disclosure.
- FIG. 4 shows a schematic diagram of a part of a time change curve of an overall error of coordinate transformation in a respiratory movement process.
- a preoperative CT image is collected at a certain breath state moment or breath state stage of a patient.
- the position and orientation of reference marks in a positioning coordinate system are, in general, inconvenient to track.
- a CT scanning moment and an actual surgical moment may be separated by several days.
- the patient is constantly in a periodic respiratory movement; theoretically, only when the breath state of the patient at a certain moment is consistent with or close to that at the preoperative CT scanning moment or stage do the position and orientation of the reference marks in the positioning coordinate system correspond to the position and orientation of the reference marks previously identified in the image coordinate system from the CT image. Only at such a moment is the obtained registration matrix correct, so that the position of a surgical tool can be accurately transformed into the CT space and the visual display of the fused three-dimensional model provided to the clinician is accurate.
- lung and related tissue movement caused by respiration is used as an example of a human organ movement, but the present disclosure is not limited thereto and is also suitable for monitoring other dynamic internal movements of the human body, for example, the heartbeat.
- Embodiments described herein can be preferably used for helping to position the lung or tissues close to the lung, but the tissue can also be located at other positions, for example, the heart, digestive organs, blood vessels, the kidney, etc.
- the position and orientation can be expressed by six numerical values, namely three coordinates expressing the three-dimensional position and three angle values expressing the orientation; this is called a 6D pose in some traditional technologies.
- the position and orientation can be described by fewer values as necessary.
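The 6D pose described above can be turned into a homogeneous transform for coordinate calculations. The sketch below assumes a Z-Y-X Euler-angle convention for the three orientation angles; real trackers may use a different convention, and the function name is illustrative.

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    """Convert a 6D pose (three position coordinates plus three rotation
    angles in radians, here interpreted as Z-Y-X Euler angles) into a
    4x4 homogeneous transformation matrix."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    T[:3, 3] = [x, y, z]       # translation
    return T
```

Such a matrix lets a point expressed in one coordinate system (in homogeneous form) be mapped into another by a single matrix product.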
- registration indicates establishing a mapping relation or a coordinate transformation relation between two coordinate systems
- registration between the positioning coordinate system and the image coordinate system indicates establishing a mapping relation between the positioning coordinate system and the image coordinate system or figuring out a coordinate transformation matrix therebetween
- the positioning coordinate system herein is a coordinate system which is referenced by a positioning device when positioning the surgical tool and movement monitoring tools.
- FIG. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present disclosure.
- the surgical navigation system 100 includes a positioning device 110 , two or more movement monitoring tools 120 , a three-dimensional medical image obtaining unit 130 and a surgical navigation workstation 140 .
- FIG. 1 further shows a control console 210 and a medical tool 220 , which cooperatively work with the surgical navigation system 100 .
- the positioning device 110 is configured to track position and orientation of the movement monitoring tools in the positioning coordinate system.
- positioning sensors 112 , for example magnetic induction positioning sensors, can be placed on each movement monitoring tool.
- the positioning device 110 can track the position and orientation of the movement monitoring tools in the positioning coordinate system by means of the magnetic induction positioning sensors.
- the positioning device 110 is further configured to track the position and orientation of the surgical tool in a second positioning coordinate system.
- the surgical tool is a puncture needle
- the positioning sensor 112 is installed on the puncture needle
- the positioning device 110 can track the position and orientation of the puncture needle by means of the positioning sensor.
- one positioning device can track the signals of 8 positioning sensors at the same time.
- the positioning device 110 can adopt one of electromagnetic (EM) tracking technology, optical tracking technology and fiber grating tracking technology.
- for example, the Aurora system of the Canadian company NDI, the DriveBay of the American company Ascension, the PercuNav system of the Dutch company Philips, the InstaTrak 3500 plus system of the American company GE, the StealthStation AxiEM system of the American company Medtronic and the Cygnus FPS system of the American company Compass are available.
- the two or more movement monitoring tools 120 are configured to be fixed on a patient body at the same position and orientation before the surgery (specifically, before CT scanning) and in the surgery, but before the intervention of the surgical tool.
- a description is given below by taking it as an example that the monitored movement is a respiratory movement.
- the movement monitoring tools 120 may be called breath monitoring tools below.
- the wearing positions of the breath monitoring tools on the patient body are preferably places strongly influenced by breathing, so that the relative positions of the several breath monitoring tools change to a certain extent with the respiratory movement.
- each movement monitoring tool is provided with at least four mark points capable of being tracked by a positioning device, no three of which are collinear; the first position and orientation of the movement monitoring tools in the image coordinate system are identified by identifying the mark points of each movement monitoring tool in the three-dimensional medical image, and the second position and orientation of the movement monitoring tools in the positioning coordinate system are determined by tracking the mark points of each movement monitoring tool during the surgery through the positioning device.
- at least four mark points are arranged on one movement monitoring tool, no three of which are collinear, so that a coordinate transformation matrix from the positioning coordinate system to the image coordinate system can be computed based on the positions and orientations of the at least four mark points in the image coordinate system and in the positioning coordinate system.
- the mark points are, for example, metal balls; the image parameters of the metal balls in the CT are obviously different from those of human tissue, so the mark points can easily be identified manually, or automatically through image processing.
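Automatic identification of such markers can be sketched by thresholding the CT volume on a high intensity value and clustering the bright voxels. The threshold, the clustering radius, and the function name below are illustrative assumptions; the patent only states that the markers' image parameters differ obviously from those of tissue.

```python
import numpy as np

def find_marker_centroids(volume, threshold=2000.0, merge_radius=3.0):
    """Locate metal-ball markers in a CT volume: keep voxels brighter
    than `threshold` (dense metal far exceeds tissue intensities) and
    greedily group nearby bright voxels, returning one centroid per
    group. Suitable only for small, well-separated markers."""
    bright = np.argwhere(volume > threshold).astype(float)
    clusters = []
    for v in bright:
        for c in clusters:
            # attach the voxel to a cluster whose running mean is close
            if np.linalg.norm(v - np.mean(c, axis=0)) <= merge_radius:
                c.append(v)
                break
        else:
            clusters.append([v])
    return np.array([np.mean(c, axis=0) for c in clusters])
```

The returned centroids (in voxel coordinates) would then be converted to millimeters using the scan's voxel spacing to serve as marker positions in the image coordinate system.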
- for a single breath monitoring tool, a unique coordinate transformation matrix can be determined from its first position and orientation in the image coordinate system and its second position and orientation in the positioning coordinate system, and this coordinate transformation matrix has zero transformation error.
- however, when a single coordinate transformation matrix is used to transform every one of the plurality of breath monitoring tools from its first position and orientation in the image coordinate system to its second position and orientation in the positioning coordinate system, a transformation error will be generated.
- the main concept of the present disclosure is to monitor the difference between the poses of the breath monitoring tools and their poses at the preoperative CT scanning moment by calculating an optimal uniform transformation matrix at each moment together with the corresponding overall transformation error, thereby monitoring the difference between the state of the human organ at each moment and its state at the preoperative CT scanning moment, and thus achieving the purpose of monitoring the human organ movement.
- each breath monitoring tool is not necessarily provided with four or more mark points.
- the transformation matrix and the overall transformation error between the two coordinate systems can be figured out on the basis of the positions of all mark points of all breath monitoring tools in the positioning coordinate system and the image coordinate system.
- for example, with two breath monitoring tools each having three mark points, an optimal coordinate transformation matrix and the corresponding transformation error can similarly be computed from the positions of the six mark points in the positioning coordinate system and in the image coordinate system.
- with four breath monitoring tools each having two mark points, the optimal coordinate transformation matrix and the corresponding transformation error can similarly be computed from the positions of the eight mark points in the two coordinate systems.
- with five breath monitoring tools each having one mark point, the optimal coordinate transformation matrix and the corresponding transformation error can similarly be computed from the positions of the five mark points in the two coordinate systems.
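The optimal uniform transformation and overall transformation error discussed above can be computed jointly over all mark points of all tools. The sketch below again assumes an SVD-based least-squares rigid fit (the disclosure does not prescribe a specific optimization method), and the function name is illustrative.

```python
import numpy as np

def overall_transform_error(pts_image, pts_tracker):
    """Fit a single rigid transform mapping ALL marker points of all
    breath monitoring tools from the positioning (tracker) coordinate
    system to the image coordinate system, and return (R, t, rms).
    A small residual rms means the tools' relative poses still match
    the preoperative scan; a large one means they have moved relative
    to one another (e.g. with breathing)."""
    P = np.asarray(pts_tracker, float)   # N x 3, intraoperative positions
    Q = np.asarray(pts_image, float)     # N x 3, positions from the CT image
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    U, _, Vt = np.linalg.svd((P - cP).T @ (Q - cQ))
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    resid = (R @ P.T).T + t - Q
    rms = float(np.sqrt(np.mean(np.sum(resid ** 2, axis=1))))
    return R, t, rms
```

Evaluated repeatedly during the operation, the residual traces out a curve over time like the overall-error curve of FIG. 4.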
- FIG. 2 shows an exemplary structural diagram of a breath monitoring tool 120 according to an embodiment of the present disclosure.
- the breath monitoring tool 120 can be composed of a plastic framework and objects to be identified: for example, four image-identifiable markers such as metal balls made of a lead material, whose image parameters (mainly CT values and MRI values) are obviously different from those of human tissue, so that the markers can be extracted using a known method.
- adhesive electrodes of the kind used in ECG practice can be adopted.
- Identifiable markers can be automatically extracted through subsequent processing of the CT image or the MRI image, and the position and orientation of the tool (for convenience of description, called the pose below), including the coordinate and direction information of the tool center, can then be calculated.
- a magnetic induction positioning device can be placed on the facade of the exemplary breath monitoring tool 120 in FIG. 2 , and a round hole 122 at the middle and a concave structure at the left side are mainly used for facilitating the disassembly of the magnetic induction positioning device.
- markers 121 capable of being identified by CT or MR can be placed on the rods, and the pose information can be calculated after the markers are identified.
- a round hole 123 in a small cross bar at the distal tip of each straight rod can be used for recording position information on the human body surface with a marking pen; in this way, after the breath monitoring tool 120 is removed from the surface of the patient, it can be accurately adhered to the human body surface again according to the recorded pose information.
- the structure and shape of the breath monitoring tool as shown in FIG. 2 are merely exemplary, and the configuration thereof can be changed as needed.
- the utilization process of the breath monitoring tools can be divided into a CT scanning stage and an intraoperative stage. Specifically, before CT scanning is carried out, two or more breath monitoring tools are fixed at the chest position of the patient, the wearing pose information is marked on the human body surface, for example, by using the marking pen, then the patient is pushed into a CT room to carry out the CT scanning to obtain a CT scanning data set, and the position and orientation of the breath monitoring tools can be identified from such CT scanning data set.
- a three-dimensional model of a related treatment site of the patient can be reconstructed from such CT scanning data set via image processing, and such three-dimensional model can be visualized on the display device and is fused with the image of the tracked surgical tool for display to assist the surgery of the doctor.
- the breath monitoring tools are worn according to the aforementioned pose information recorded on the body surface of the patient, the poses of the breath monitoring tools in the positioning coordinate system are monitored in real time, the coordinate transformation matrix and the corresponding overall errors of all breath monitoring tools are calculated by optimization, and the human organ movement is monitored based on the change of the overall errors over time.
- the three-dimensional medical image obtaining unit 130 can be a part of the surgical navigation workstation 140 or a separate device.
- the three-dimensional medical image obtaining unit 130 is used for obtaining a preoperative three-dimensional medical image of a treatment site of the patient fixed with the movement monitoring tools, the three-dimensional medical image having an associated image coordinate system.
- a three-dimensional data set obtained by preoperative CT scanning is stored in a database
- the three-dimensional medical image obtaining unit 130 can be operatively coupled to the database, read a specific three-dimensional data set from the database and visually display the specific three-dimensional data set on the display device through image processing, such as rendering and the like.
- the three-dimensional medical image obtaining unit 130 can simply read and display a reconstructed three-dimensional model and can also read the non-reconstructed CT scanning data set and reconstruct the CT scanning data set to obtain and display the three-dimensional model.
- for the method of reconstructing the three-dimensional model from the CT or MRI data set, as well as the related coordinate transformation and error calculation, reference can be made to the third and fourth chapters of the doctoral dissertation entitled “RESEARCH ON KEY TECHNOLOGY OF COMPUTER ASSISTED INTERVENTIONAL OPERATION UNDER IMAGE GUIDANCE” by Zhai Weiming, the entire contents of which are herein incorporated by reference.
- the control console 210 can be connected to the surgical navigation workstation 140 or is a part of the surgical navigation workstation 140 .
- the control console 210 can receive the pose information of the breath monitoring tools and the pose information of the surgical tool from the positioning device 110 .
- the control console 210 can also send an instruction of operating the puncture needle according to the indication of the clinician.
- the surgical navigation workstation 140 is configured to register and combine the preoperative three-dimensional medical image with an intraoperative surgical tool image and visually display the same on the connected display device to guide the surgical operation of a surgeon.
- the surgical navigation workstation 140 can include a processor 141 and a memory 142 , and an application program 1421 and an operating system 1422 can be stored in the memory 142 .
- the application program 1421 can include a program code, and the program code, when being executed by the processor 141 , can execute the human organ movement monitoring method that will be described in detail below.
- the processor here is a generalized concept and can be implemented by a single dedicated processor, a single shared processor, or a plurality of individual processors, one of which can be shared. In addition, the word “processor” or “controller” should not be interpreted as exclusively referring to hardware capable of executing software; it can implicitly include, without limitation, digital signal processor (DSP) hardware, a read-only memory (ROM) for storing software, and a nonvolatile memory.
- the surgical navigation workstation 140 can further include an interface used for real-time interaction with a display 150 .
- the display 150 can display the orientation, shape and/or position or the like of a medical target tissue under the control of the surgical navigation workstation 140 , and can also display the fusion of an image of the three-dimensional model of a related position of the patient and an image of the surgical tool.
- the surgical navigation workstation 140 can further include man-machine interaction interfaces, for example, being connected to such peripheral devices as a keyboard, a mouse, a touch screen system, etc.
- the surgical navigation workstation 140 can further include an interface used for interacting with the three-dimensional medical image obtaining unit 130 and an interface used for interacting with the control console 210 , etc.
- the clinical surgical procedure can include preoperative preparation, preoperative image scanning, preoperative surgical plan, intraoperative surgical navigation, postoperative surgical evaluation, etc.
- the preoperative preparation includes: preparing devices in a navigation operating room, fixing the patient by a vacuum pad, and fixing the movement monitoring tools (or adhering mark points and the like) on the body surface of the patient.
- the preoperative image scanning includes carrying out preoperative CT scanning or MRI scanning on the patient.
- the preoperative surgical plan can include three-dimensional visualization of a medical image, three-dimensional reconstruction of an organ and focus, interactive surgical approach plan and surgical effect simulation and the like, wherein for example, the surgical approach plan can include selection of a needle entry point and a needle entry angle of the puncture needle; the surgical effect simulation can include thermal field simulation, damage field model reconstruction, etc.
- the intraoperative surgical navigation includes puncturing according to an approach in the surgical plan, etc.
- the postoperative surgical evaluation can include re-scanning a CT image or an MRI image at a specified time after the surgery, measuring the size of an actual injury area on the CT data and comparing the size with data calculated in the surgical plan to measure the accuracy and effect of the surgery.
- the human organ movement monitoring method according to the embodiment of the present disclosure can be used in an intraoperative surgical navigation process, for example.
- FIG. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present disclosure.
- in a step S210, two or more movement monitoring tools are fixed on the patient body before the surgery.
- for example, 2-4 breath monitoring tools are placed on the body surface of the patient, preferably at positions strongly influenced by breathing, so that the relative positions of the several tools change to a certain extent with the respiratory movement; by tracking the poses of the breath monitoring tools, the properties of the respiratory movement can be better represented, and thus the properties of a target treatment site (focus) that moves with the respiratory movement can be better described.
- the number of breath monitoring tools is merely exemplary, and more breath monitoring tools can be selected as needed, for example, 5, 6 or more.
- in a step S220, a three-dimensional medical image of the treatment site of the patient fixed with the movement monitoring tools is obtained by preoperative scanning, the three-dimensional medical image having an associated image coordinate system.
- the preoperative scanning can be CT scanning or MRI scanning.
- the three-dimensional medical image is obtained by the three-dimensional medical image obtaining unit 130 as shown in FIG. 1 by reconstruction based on a CT scanning image data set.
- the three-dimensional medical image of the treatment site of the patient fixed with the movement monitoring tools can be visually displayed on the display device, and the clinician can plan a surgical approach with reference to the three-dimensional medical image by means of a navigation software system, for example.
- in a step S230, for example, the surgical navigation workstation 140 identifies first position and orientation of the movement monitoring tools in the image coordinate system from the preoperative three-dimensional medical image.
- mark points can be arranged on the movement monitoring tools, the mark points have image parameters which are obviously different from those of a human tissue in a CT image, so that the mark points can be manually or automatically identified from the CT image, and accordingly, the position and orientation of the movement monitoring tools in the image coordinate system can be identified.
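As a hedged sketch of such automatic identification (the brightness threshold and the greedy proximity clustering are illustrative assumptions; a production system might instead use connected-component labeling):

```python
import numpy as np

def find_marker_centroids(ct_volume, threshold=2000.0, merge_dist=3.0):
    """Extract candidate marker positions from a CT volume. Metal mark
    points are far brighter than human tissue, so voxels above `threshold`
    are greedily clustered by proximity; each cluster centroid is one
    marker position in image (voxel) coordinates."""
    voxels = np.argwhere(ct_volume > threshold).astype(float)
    clusters = []
    for v in voxels:
        for c in clusters:
            if np.linalg.norm(v - np.mean(c, axis=0)) <= merge_dist:
                c.append(v)          # voxel belongs to an existing marker
                break
        else:
            clusters.append([v])     # start a new marker cluster
    return np.array([np.mean(c, axis=0) for c in clusters])
```

The resulting centroids (converted from voxel to physical coordinates) would play the role of the first position of the mark points in the image coordinate system.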
- the step S220 and the step S230 are connected by a dashed line arrow, which indicates that the two steps can be separated by a relatively long time, for example, several hours or several days.
- in a step S240, in the surgery, in a state in which the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, second position and orientation of the movement monitoring tools in a positioning coordinate system are determined in real time, wherein the positioning coordinate system is a coordinate system which is referenced in a process of tracking the position and orientation of a surgical tool by the positioning device.
- the second position and orientation of the movement monitoring tools in the positioning coordinate system are obtained by the positioning device and the positioning sensors on the movement monitoring tools as shown in FIG. 1 and are transmitted to the surgical navigation workstation 140 .
- in a step S250, the surgical navigation workstation calculates an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and calculates overall errors of the movement monitoring tools during coordinate transformation from the positioning coordinate system to the image coordinate system based on the optimal coordinate transformation relation.
- in a step S260, movement degrees of a human organ at various moments relative to a preoperative scanning moment are evaluated based on the real-time determined overall errors of coordinate transformation of the movement monitoring tools at the various moments.
- when the breath state of the patient at a certain moment in the surgery is uniform with or close to the breath state at the preoperative scanning moment, the mapping of the breath monitoring tool at that moment into the image coordinate system is closest to the pose of the breath monitoring tool in the image coordinate system obtained by the preoperative CT scanning, and thus the coordinate transformation error corresponding to the coordinate transformation matrix obtained by optimization at that moment in the surgery is the smallest.
- FIG. 4 shows a schematic diagram of a part of a time change curve of an overall error of coordinate transformation in a respiratory movement process, wherein a horizontal axis represents time and a vertical axis represents an overall error of coordinate transformation.
- at moments when the overall error of coordinate transformation is small, the breath state is close to the breath state at the preoperative scanning moment; the breath states at other moments deviate from the breath state at the preoperative scanning moment to different degrees, and the overall error of coordinate transformation changes accordingly.
- the difference degrees of the states of the human organ at the various moments relative to the preoperative scanning moment can be evaluated.
- a human organ movement monitoring result in the embodiment of the present disclosure can be used in a variety of forms.
- the real-time obtained time change curve of the overall error of coordinate transformation can be visually displayed on the display device in the form as shown in FIG. 4 , for example.
- the clinician can determine the timing of, for example, puncturing the puncture needle in the patient by observing the time change curve of the overall error.
- the surgical navigation workstation in FIG. 1 determines the timing for the surgical tool to intervene in a human focus based on the evaluated movement degree of the human organ at the various moments relative to the preoperative three-dimensional scanning moment, and reminds a surgeon of the timing.
- for example, when the overall error of coordinate transformation at a certain moment is 50% of the average overall error of coordinate transformation over a certain time period and the change trend is decreasing, that moment is determined as the timing for the surgical tool to intervene in the human focus, and the timing information is then displayed on the display device, for example in a short message form, or is output from a loudspeaker in a voice form to remind the clinician.
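A minimal sketch of such a timing rule follows; the window length and the exact form of the decreasing-trend test are illustrative assumptions, not a prescribed implementation:

```python
import numpy as np

def is_intervention_timing(errors, window=200, ratio=0.5):
    """Return True when the latest overall transformation error is below
    `ratio` (e.g. 50%) of its average over the last `window` samples and
    is still decreasing, mirroring the rule described above."""
    errors = np.asarray(errors, float)
    if errors.size < max(window, 2):
        return False                      # not enough history yet
    current, previous = errors[-1], errors[-2]
    recent_average = errors[-window:].mean()
    return bool(current <= ratio * recent_average and current < previous)
```

Fed with the real-time error stream, such a predicate could drive the on-screen or voice reminder to the clinician.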
- a moment when the pose of the human organ is basically consistent with the pose at the preoperative CT scanning moment can be determined on the basis of the evaluated movement (or difference) degrees of the pose of the human organ at various moments relative to the pose at the preoperative CT scanning moment; the real-time tracked position of the surgical tool is transformed into the image coordinate system by using the coordinate transformation matrix from the positioning coordinate system to the image coordinate system calculated at that moment and is fused with the three-dimensional medical image, and the fused image is then visually displayed on the display device.
- the human organ movement monitoring method may further include: determining a moment when the overall error of coordinate transformation is smaller than a preset threshold; determining a coordinate transformation matrix from the positioning coordinate system to the image coordinate system at the moment; and determining the position of the surgical tool in the image coordinate system based on the coordinate transformation matrix, and combining the image of the surgical tool with the three-dimensional medical image at the position and displaying the combined image on a display device.
- the registration from the positioning coordinate system where the surgical tool is located to the image coordinate system is carried out, the image of the surgical tool is fused with the human three-dimensional model based on the position of the surgical tool tracked by the positioning device 110 for example, and is visually displayed on the display device to assist the surgery of the clinician.
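The threshold-gated fusion described above could be sketched as follows; the function name, the 4x4 homogeneous-matrix representation and the None return for "wait" are illustrative assumptions:

```python
import numpy as np

def fuse_tool_when_stable(error, threshold, T_pos_to_img, tool_tip):
    """When the overall transformation error is below the preset threshold,
    map the tracked tool tip from the positioning coordinate system into
    the image coordinate system with the 4x4 homogeneous matrix
    `T_pos_to_img`; otherwise return None to signal that fusion should
    wait for a better-matching breath state."""
    if error >= threshold:
        return None
    tip_h = np.append(np.asarray(tool_tip, float), 1.0)   # homogeneous coords
    return (T_pos_to_img @ tip_h)[:3]
```

The returned point in image coordinates is where the tool image would be drawn on the three-dimensional model.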
- the breath monitoring process can be continuously carried out in the surgery.
- a rule obtained by breath monitoring, for example, the information of the time interval (as shown in FIG. 4 ) between a former moment having the smallest overall error and the next moment having the smallest overall error, can also be provided to the clinician, to enable the clinician to make psychological preparation and operation preparation before the next needle puncture.
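One possible way to extract the interval between successive smallest-error moments from the error curve (the strict local-minimum detection is an illustrative assumption):

```python
import numpy as np

def minima_interval(errors, times):
    """Estimate the mean time interval between successive local minima of
    the overall-error curve (roughly one breath period), so the clinician
    can anticipate the next low-error window."""
    errors = np.asarray(errors, float)
    inner = errors[1:-1]
    # strict local minima: lower than both neighbours
    idx = np.where((inner < errors[:-2]) & (inner < errors[2:]))[0] + 1
    if idx.size < 2:
        return None                   # fewer than two minima observed so far
    return float(np.mean(np.diff(np.asarray(times, float)[idx])))
```

In practice some smoothing of the raw error stream would be needed before detecting minima; the sketch assumes a clean curve.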
- the movement degree (or difference degree) of the state of the human organ relative to the state at the CT scanning moment can be monitored without using a special human organ monitor; for example, for the respiratory movement, no special respiratory gating device is needed. However, the present disclosure does not exclude use in cooperation with such a special human organ monitor.
- the surgical navigation system and the human organ movement monitoring method in the embodiment of the present disclosure do not simply treat the human body as a rigid body when registering the preoperative positioning coordinate system with the image coordinate system, but take the internal movement of the human body into account by monitoring the difference degree of the state of the human organ relative to the state at the CT scanning moment, so that the doctor can be reminded of a proper timing for operation, the preoperative surgical plan can be executed more accurately, and the dependence on the individual experience of the clinician is reduced.
- the embodiment of the disclosure can take the form of a computer program product.
- the computer program product can be accessed via a computer readable medium, for use by or in combination with a computer or any instruction execution system.
- the computer readable medium can include a device for storing, exchanging, transmitting or sending a program.
- the computer readable medium can be an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or device or apparatus) or a transmission medium. Examples of the computer readable medium include a semiconductor or a solid-state memory, a magnetic tape, a detachable computer floppy disk, a random access memory (RAM), a read-only memory (ROM), a magnetic disk or an optical disk.
Abstract
Description
- The present disclosure generally relates to a surgical navigation system and method, and in particular relates to a human organ movement monitoring method for monitoring human organ movement in a medical process in real time, a surgical navigation system and a computer readable medium.
- An interventional operation is a method developed in modern surgery. It differs from traditional surgery in that no open surgical operation is needed: a special surgical instrument, such as a catheter, a frozen needle, a radio frequency ablation needle, a guide wire or the like, can be penetrated into a focus location or a surgical target location in vivo through a very small wound, and a treatment purpose is then achieved by a variety of physical/chemical effects, so as to solve problems such as tumor excision, biopsy and artificial equipment placement that previously could be solved only by open surgery.
- In an interventional operation process, a doctor cannot directly observe a treatment site or focus in a patient body and needs to rely on computer-assisted surgery navigation technology guided by medical images. The computer-assisted surgery navigation technology is a cross research topic integrating computer science, artificial intelligence, automatic control, image processing, three-dimensional graphics, virtual reality, clinical treatment and other aspects. Multiple modes of medical images are adopted in the surgery navigation technology to assist the doctor to directly penetrate the surgical instrument into the focus for local treatment, so as to improve the surgery quality, reduce surgical trauma and reduce the pain of patients.
- Surgical navigation is used for guiding the execution of clinical surgery in real time by means of a medical image of the patient and a three-dimensional model generated through reconstruction of the medical image. The medical image data of the patient are obtained, for example, by means of CT (computerized tomography) or MRI (magnetic resonance imaging) scanning. A surgical navigation system associates the preoperative medical image data with the intraoperative surgical site of the patient through a positioning device, and can accurately display an anatomical structure of the patient and the details of a three-dimensional spatial position near the focus in a software interface. When the surgical instrument points to any location in the patient body, its coordinate information will be obtained by the navigation system in real time and displayed on the three-dimensional model of the patient. In this case, even before any incision is made on the patient, the doctor can get the relative position relation of the surgical instrument and the tumor focus in real time.
- Compared with the traditional surgical means, the surgical navigation has the following advantages:
- 1, the focus can be reconstructed in real time through the three-dimensional technology, and the structural features surrounding the surgical field can be displayed;
- 2, the most suitable surgical approach can be selected through a preoperative surgical plan design;
- 3, tissue structures possibly encountered on the surgical approach can be displayed;
- 4, important tissue structures that should be evaded, for example, blood vessels, nerves, bones and the like can be displayed;
- 5, a range for treatment of the focus can be displayed;
- 6, the position and orientation of the surgical instrument can be accurately calculated and displayed in real time;
- 7, the spatial position relation of the surgical instrument and the focus can be displayed, and the advance direction of the surgical instrument can be indicated; and
- 8, the surgical approach can be adjusted in the surgery in real time so as to arrive at the focus location more accurately.
- For the convenience of description, a CT image is used as an example of the medical image below, but obviously other images, such as an MRI image, can also be used as the example of the medical image.
- Real-time surgical navigation relates to the use of a positioning device or a tracking system; for example, an electromagnetic (EM) tracker is used for tracking the distal tip of a surgical tool, such as a puncture needle, so as to associate the position of the surgical tool with the preoperative medical image, for example the CT image, and display a fused image to a clinician. To achieve data integration between a CT space and a tracker space, a registration process based on reference marks is generally carried out before navigation. These reference marks (external reference marks or internal anatomical reference marks) are identified from the CT image, and are touched by a calibrated tracking probe to obtain their coordinates in the tracker space, which is called a positioning coordinate system below. Thereafter, point-based registration is executed to find a coordinate transformation matrix between the CT space and the tracker space. A registration matrix is obtained through the coordinate transformation matrix and is used for aligning the surgical tool with the preoperative medical image, so that the image of the surgical tool can be accurately fused with the CT image in a CT coordinate system based on the position information of the surgical tool in the positioning system, and the fused image is visually displayed to the clinician.
- Currently, the surgical navigation technology has been widely used in neurosurgery surgical navigation and orthopedic surgical navigation. However, these surgeries all assume that the human body is a completely stationary rigid body, and that this stationary state is maintained from the preoperative CT scanning until the end of the entire surgical process. Although this assumption approximately holds in most neurosurgeries and orthopedic surgeries, it hardly holds in the field of pleuroperitoneal cavity interventional surgeries. For example, a pleuroperitoneal cavity tissue movement (including a tumor tissue and a normal tissue) caused by breath can generally reach a range of 2-4 cm, while the accuracy of clinical surgery is generally at a millimeter level, and such an error means that the currently mainstream surgical navigation systems can hardly be used safely and stably in the clinical practice of pleuroperitoneal cavity interventional surgeries.
- In view of the above situation, the present disclosure is proposed.
- According to one aspect of the present disclosure, a human organ movement monitoring method for monitoring human organ movement in a surgical process in real time is provided, wherein a three-dimensional medical image of a treatment site of a patient fixed with two or more movement monitoring tools is obtained by scanning prior to the surgery, the three-dimensional medical image having an associated image coordinate system, and the human organ movement monitoring method includes the following steps: obtaining first position and orientation of the movement monitoring tools in the image coordinate system identified from the preoperative three-dimensional medical image; in a state that the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, determining second position and orientation of the movement monitoring tools in a positioning coordinate system in real time, wherein the positioning coordinate system is a coordinate system which is referenced in a process of positioning the position and orientation of a surgical tool; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and calculating overall errors of coordinate transformation of the movement monitoring tools from the positioning coordinate system to the image coordinate system based on the optimal coordinate transformation relation; and evaluating movement degree of a human organ at various moments relative to a preoperative scanning moment based on the real-time determined overall errors of coordinate transformation of the movement monitoring tools at the various moments.
- According to another aspect of the present disclosure, a surgical navigation system is provided, including: a positioning device, used for tracking position and orientation of a surgical tool and movement monitoring tools in a positioning coordinate system; two or more movement monitoring tools, which are fixed on a patient body, and the position and orientation of which in the positioning coordinate system are capable of being tracked by the positioning device; a three-dimensional medical image obtaining unit, used for obtaining a preoperative three-dimensional medical image of a treatment site of a patient fixed with the movement monitoring tools, the three-dimensional medical image having an associated image coordinate system; a surgical navigation workstation, used for registering and combining the preoperative three-dimensional medical image with an intraoperative surgical tool image and visually displaying the same on a connected display device to guide the surgical operation of a surgeon; wherein the surgical navigation workstation further monitors the movement state of a human organ through the following operations: identifying first position and orientation of the movement monitoring tools in the image coordinate system from the preoperative three-dimensional medical image; determining second position and orientation of the movement monitoring tools in the positioning coordinate system in the surgery in real time; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and calculating overall errors of coordinate transformation of the movement monitoring tools from the positioning coordinate system to the image coordinate system; and evaluating movement degree of the human 
organ at various moments relative to a preoperative scanning moment based on the real-time determined overall errors of coordinate transformation at the various moments.
- According to a further aspect of the present disclosure, a computer readable medium is provided, on which a computer program is recorded, the computer program being used in combination with a surgical navigation system and executing the following operations when being executed by a processing device: obtaining first position and orientation of movement monitoring tools in an image coordinate system identified from a preoperative three-dimensional medical image, wherein the preoperative three-dimensional medical image is obtained by scanning a treatment site of a patient fixed with two or more movement monitoring tools prior to the surgery, the three-dimensional medical image having an associated image coordinate system; in a state that the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, determining second position and orientation of the movement monitoring tools in a positioning coordinate system in real time, wherein the positioning coordinate system is a coordinate system which is referenced in a process of positioning the position and orientation of a surgical tool; calculating an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and calculating overall errors of coordinate transformation of the movement monitoring tools from the positioning coordinate system to the image coordinate system based on the optimal coordinate transformation relation; and evaluating movement degree of a human organ at various moments relative to a preoperative scanning moment based on the real-time determined overall errors of coordinate transformation of the movement monitoring tools at the various moments.
- The surgical navigation system and the human organ movement monitoring method provided by the embodiments of the present disclosure can be used for solving or alleviating the problem in pleuroperitoneal cavity surgical navigation that tumor displacement caused by the respiratory movement of the human body reduces the surgical navigation precision, and can also simplify the surgical navigation procedures to facilitate the use and popularization of surgical navigation in a plurality of fields.
- These and/or other aspects and advantages of the present disclosure will become clearer and more understandable from the following detailed description of the embodiments of the present disclosure in combination with the accompanying drawings, wherein:
-
FIG. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present disclosure. -
FIG. 2 shows an exemplary structural diagram of a breath monitoring tool 120 according to an embodiment of the present disclosure. -
FIG. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present disclosure. -
FIG. 4 shows a schematic diagram of a part of a time change curve of an overall error of coordinate transformation in a respiratory movement process. - In order that those skilled in the art can better understand the present disclosure, a further detailed illustration of the present disclosure will be given below in combination with the accompanying drawings and specific embodiments.
- Before the embodiments of the present disclosure are described in detail below, to facilitate those skilled in the art in understanding and mastering the present disclosure, the main concept of the present disclosure is described below.
- A preoperative CT image is collected at a certain breath state moment or breath state stage of a patient. In a CT scanning process, the position and orientation of reference marks in a positioning coordinate system are in general inconvenient to track. A CT scanning moment and an actual surgical moment may be separated by several days.
- In a subsequent surgical process, the patient is continuously in a periodic respiratory movement. Theoretically, only when the breath state of the patient at a certain moment is uniform with or close to that at the preoperative CT scanning moment or stage do the position and orientation of the reference marks in the positioning coordinate system at that moment correspond to the position and orientation of the reference marks in the image coordinate system identified previously from the CT image; only then is the registration matrix obtained at that time correct, so that the position of a surgical tool can be accurately transformed into the CT space at that moment, and the visual display of a fused three-dimensional model provided for a clinician is accurate.
- To this end, the inventor realized that the position and orientation (orientations or azimuth angles) of monitoring tools (having reference marks thereon) in the positioning coordinate system can be monitored, a uniform coordinate transformation matrix from the positioning coordinate system to a three-dimensional image coordinate system can be optimized, and the sum of errors (sometimes referred to below as the overall error) of transformation of the monitoring tools from the positioning coordinate system to the three-dimensional image coordinate system can be determined; a moment with a smaller error indicates that the position and orientation of a human organ at that time are close to the position and orientation of the human organ at the CT scanning moment, and this information is visually or aurally provided to the clinician to assist the surgery.
- It should be noted that, in the following description, movement of the lung and related tissues caused by the respiratory movement is used as an example of a human organ movement, but the present disclosure is not limited thereto and is suitable for monitoring other dynamic internal movements of a human body, for example, the heartbeat movement. Embodiments described herein can preferably be used for helping to position the lung or tissues close to the lung, but the tissue can also be located at other positions, for example, the heart, digestive organs, blood vessels, the kidney, etc.
- In the following description, the position and orientation can be expressed by 6 numerical values, namely three position coordinates expressing the three-dimensional position and three angle values expressing the orientation; this is called a 6D pose in some traditional technologies. However, the position and orientation can be described by fewer values as necessary.
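For illustration only, such a 6D pose can be packed into a 4×4 homogeneous transformation matrix. The sketch below assumes a Z-Y-X Euler-angle convention with angles in radians; actual tracking devices use varying conventions, and the function name is hypothetical:

```python
import numpy as np

def pose_to_matrix(x, y, z, rx, ry, rz):
    """Build a 4x4 homogeneous transform from a 6D pose.

    (x, y, z) is the three-dimensional position; (rx, ry, rz) are
    Euler angles in radians, applied in Z-Y-X order. The angle
    convention is an illustrative assumption.
    """
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation part
    T[:3, 3] = [x, y, z]       # translation part
    return T
```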
- Herein, “registration” indicates establishing a mapping relation or a coordinate transformation relation between two coordinate systems, for example, “registration between the positioning coordinate system and the image coordinate system” indicates establishing a mapping relation between the positioning coordinate system and the image coordinate system or figuring out a coordinate transformation matrix therebetween, and the positioning coordinate system herein is a coordinate system which is referenced by a positioning device when positioning the surgical tool and movement monitoring tools.
-
FIG. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present disclosure. - As shown in
FIG. 1 , the surgical navigation system 100 includes a positioning device 110, two or more movement monitoring tools 120, a three-dimensional medical image obtaining unit 130 and a surgical navigation workstation 140. To conveniently describe the working process of the surgical navigation system 100, FIG. 1 further shows a control console 210 and a medical tool 220, which cooperatively work with the surgical navigation system 100. - The
positioning device 110 is configured to track the position and orientation of the movement monitoring tools in the positioning coordinate system. As shown in the figure, positioning sensors 112, for example, magnetic induction positioning sensors, can be placed on each movement monitoring tool. The positioning device 110 can track the position and orientation of the movement monitoring tools in the positioning coordinate system by means of the magnetic induction positioning sensors. - The
positioning device 110 is further configured to track the position and orientation of the surgical tool in the positioning coordinate system. For example, if the surgical tool is a puncture needle, a positioning sensor 112 is installed on the puncture needle, and the positioning device 110 can then track the position and orientation of the puncture needle by means of the positioning sensor. In an example, one positioning device can track the signals of 8 positioning sensors at the same time. - The
positioning device 110 can adopt one of electromagnetic (EM) tracking technology, optical tracking technology and fiber grating tracking technology. As examples of commercially available electromagnetic positioning systems, the Aurora system of the Canadian NDI company, the DriveBay of the American Ascension company, the PercuNav system of the Dutch Philips company, the InstaTrak 3500 plus system of the American GE company, the StealthStation AxiEM system of the American Medtronic company and the Cygnus FPS system of the American Compass company are available. - The two or more
movement monitoring tools 120 are configured to be fixed on the patient body at the same position and orientation before the surgery (specifically, before CT scanning) and in the surgery, but before the intervention of the surgical tool. A description is given below taking as an example that the monitored movement is a respiratory movement. For the convenience of description, the movement monitoring tools 120 may be called breath monitoring tools below. The wearing positions of the breath monitoring tools on the patient body are preferably places strongly influenced by breathing, so that the relative positions of the several breath monitoring tools change to a certain extent with the respiratory movement. - In an example, each movement monitoring tool is provided with at least four mark points capable of being tracked by a positioning device, among which any three mark points are non-collinear, wherein the first position and orientation of the movement monitoring tools in the image coordinate system are identified by identifying the mark points of each movement monitoring tool in the three-dimensional medical image; and the second position and orientation of the movement monitoring tools in the positioning coordinate system are determined by tracking the mark points of each movement monitoring tool in the surgery through the positioning device. Here, at least four mark points are arranged on one movement monitoring tool, and any three of them are non-collinear, so that a coordinate transformation matrix from the positioning coordinate system to the image coordinate system can be figured out based on the positions and orientations of the at least four mark points in the image coordinate system and the positioning coordinate system. 
In an example, the mark points are metal balls, whose image parameters in the CT image are obviously different from the image parameters of human body material, so that they can easily be identified manually or automatically through image processing. For each such breath monitoring tool, a unique coordinate transformation matrix can be determined according to the first position and orientation of the breath monitoring tool in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and this coordinate transformation matrix guarantees zero transformation error for that tool alone. However, since a plurality of breath monitoring tools are provided and a single coordinate transformation matrix is used for transforming every one of them from the first position and orientation in the image coordinate system to the second position and orientation in the positioning coordinate system, a transformation error will be generated. As mentioned above, the main concept of the present disclosure is to monitor the difference of the poses of the breath monitoring tools relative to the poses at the preoperative CT scanning moment by calculating optimal uniform transformation matrixes at various moments and the corresponding overall transformation errors, so as to monitor the difference of the states of the human organ at various moments relative to the state at the preoperative CT scanning moment, thereby achieving the purpose of monitoring the human organ movement.
- It should be noted that each breath monitoring tool is not necessarily provided with four or more mark points. On the contrary, any configuration is acceptable as long as the transformation matrix and the overall transformation error between the two coordinate systems can be figured out on the basis of the positions of all mark points of all breath monitoring tools in the positioning coordinate system and the image coordinate system. For example, two breath monitoring tools can be provided, each having three mark points; in this way, an optimal coordinate transformation matrix and the corresponding transformation error can similarly be figured out from the positions of the 6 mark points in the positioning coordinate system and the positions thereof in the image coordinate system. As another example, four breath monitoring tools can be provided, each having two mark points, so that the optimal coordinate transformation matrix and the corresponding transformation error can similarly be figured out from the positions of the eight mark points in the two coordinate systems. As a more extreme example, 5 breath monitoring tools can be provided, each having one mark point, and the optimal coordinate transformation matrix and the corresponding transformation error can similarly be figured out from the positions of the 5 mark points in the two coordinate systems.
-
FIG. 2 shows an exemplary structural diagram of a breath monitoring tool 120 according to an embodiment of the present disclosure. The breath monitoring tool 120 can be composed of a plastic framework and objects to be identified. In the plastic framework, 4 image-identifiable markers (for example, metal balls, for example, made of a lead material) are placed at positions indicated by a mark number 121; the image parameters (mainly including CT values and MRI values) of these markers in a CT or MRI image have obvious distinguishing features in comparison with the image parameters of the peripheral material, so that good visibility can be obtained in a CT machine, and the markers can be extracted using a known method, for example. As another example of the markers, adhesive electrodes of the type adopted in ECG examination can be adopted. Identifiable markers can be automatically extracted from the image through subsequent processing of the CT image or the MRI image, and the information of the position and orientation (for the convenience of description, called pose below) of the tool (including coordinate information and direction information of the tool center) can be further calculated. - A magnetic induction positioning device can be placed on the facade of the exemplary
breath monitoring tool 120 in FIG. 2 , and a round hole 122 at the middle and a concave structure at the left side are mainly used for facilitating the disassembly of the magnetic induction positioning device. - Three extended straight rods are mainly used for facilitating attachment on a human body surface; meanwhile,
markers 121 capable of being identified by CT or MR can be placed on the rods, and the pose information can be calculated after the markers are identified. A round hole 123 in a small cross bar at the distal tip of each straight rod can be used for recording position information on the human body surface with a marking pen; in this way, after the breath monitoring tool 120 is removed from the surface of the patient, the breath monitoring tool 120 can be accurately adhered on the human body surface again according to the recorded pose information. - The structure and shape of the breath monitoring tool as shown in
FIG. 2 are merely exemplary, and the configuration thereof can be changed as needed. - As an example, the utilization process of the breath monitoring tools can be divided into a CT scanning stage and an intraoperative stage. Specifically, before CT scanning is carried out, two or more breath monitoring tools are fixed at the chest position of the patient and the wearing pose information is marked on the human body surface, for example, by using the marking pen; then the patient is pushed into a CT room for CT scanning to obtain a CT scanning data set, from which the position and orientation of the breath monitoring tools can be identified. Incidentally, a three-dimensional model of the related treatment site of the patient can be reconstructed from such a CT scanning data set via image processing, and such a three-dimensional model can be visualized on the display device and fused with the image of the tracked surgical tool for display to assist the surgery of the doctor. In the surgery and before the intervention of the surgical tool in vivo, the breath monitoring tools are worn according to the aforementioned pose information recorded on the body surface of the patient, the poses of the breath monitoring tools in the positioning coordinate system are monitored in real time, the optimal coordinate transformation matrix and the corresponding overall error for all breath monitoring tools are figured out, and the human organ movement is monitored based on the change of the overall error over time.
- The three-dimensional medical
image obtaining unit 130 can be a part of the surgical navigation workstation 140 or a separate device. The three-dimensional medical image obtaining unit 130 is used for obtaining a preoperative three-dimensional medical image of a treatment site of the patient fixed with the movement monitoring tools, the three-dimensional medical image having an associated image coordinate system. For example, a three-dimensional data set obtained by preoperative CT scanning is stored in a database; the three-dimensional medical image obtaining unit 130 can be operatively coupled to the database, read a specific three-dimensional data set from the database and visually display it on the display device through image processing, such as rendering and the like. The three-dimensional medical image obtaining unit 130 can simply read and display a reconstructed three-dimensional model, or can read a non-reconstructed CT scanning data set and reconstruct it to obtain and display the three-dimensional model. As to the method for reconstructing the three-dimensional model from the CT or MRI data set, reference can be made, for example, to the introductions in the third and fourth chapters of the doctoral dissertation entitled “RESEARCH ON KEY TECHNOLOGY OF COMPUTER ASSISTED INTERVENTIONAL OPERATION UNDER IMAGE GUIDANCE” of Zhai Weiming, the entire contents of which are herein incorporated by reference. - The
control console 210 can be connected to the surgical navigation workstation 140 or is a part of the surgical navigation workstation 140. The control console 210 can receive the pose information of the breath monitoring tools and the pose information of the surgical tool from the positioning device 110. The control console 210 can also send an instruction for operating the puncture needle according to the indication of the clinician. - The
surgical navigation workstation 140 is configured to register and combine the preoperative three-dimensional medical image with an intraoperative surgical tool image and visually display the same on the connected display device to guide the surgical operation of a surgeon. - The
surgical navigation workstation 140 can include a processor 141 and a memory 142, and an application program 1421 and an operating system 1422 can be stored in the memory 142. The application program 1421 can include program code which, when executed by the processor 141, executes the human organ movement monitoring method that will be described in detail below. The processor here is a generalized concept and can be implemented by a single dedicated processor, a single shared processor, or a plurality of individual processors, one of which can be shared; in addition, the word “processor” or “controller” should not be interpreted as exclusively referring to hardware capable of executing software, but can implicitly include, without limitation, digital signal processor (DSP) hardware, a read-only memory (ROM) for storing software and a nonvolatile memory. The surgical navigation workstation 140 can further include an interface for real-time interaction with a display 150. - The
display 150 can display the orientation, shape and/or position or the like of a medical target tissue under the control of the surgical navigation workstation 140, and can also display the fusion of an image of the three-dimensional model of a related position of the patient and an image of the surgical tool. - The
surgical navigation workstation 140 can further include man-machine interaction interfaces, for example, connections to such peripheral devices as a keyboard, a mouse, a touch screen system, etc. The surgical navigation workstation 140 can further include an interface for interacting with the three-dimensional medical image obtaining unit 130 and an interface for interacting with the control console 210, etc. - For the convenience of understanding, a clinical surgical procedure will be described below briefly. In general, the clinical surgical procedure can include preoperative preparation, preoperative image scanning, preoperative surgical planning, intraoperative surgical navigation, postoperative surgical evaluation, etc. For example, the preoperative preparation includes: preparing devices in a navigation operating room, fixing the patient by a vacuum pad, and fixing the movement monitoring tools (or adhering mark points and the like) on the body surface of the patient. For example, the preoperative image scanning includes carrying out preoperative CT scanning or MRI scanning on the patient. The preoperative surgical plan can include three-dimensional visualization of a medical image, three-dimensional reconstruction of an organ and focus, interactive surgical approach planning, surgical effect simulation and the like, wherein, for example, the surgical approach plan can include selection of a needle entry point and a needle entry angle of the puncture needle, and the surgical effect simulation can include thermal field simulation, damage field model reconstruction, etc. The intraoperative surgical navigation includes puncturing according to an approach in the surgical plan, etc. 
The postoperative surgical evaluation can include re-scanning a CT image or an MRI image at a specified time after the surgery, measuring the size of the actual injury area on the CT data and comparing the size with the data calculated in the surgical plan to measure the accuracy and effect of the surgery.
- The human organ movement monitoring method according to the embodiment of the present disclosure can be used in an intraoperative surgical navigation process, for example.
- An exemplary surgical navigation process carried out in combination with the surgical navigation system will be described below with reference to
FIG. 3 . -
FIG. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present disclosure. - In a step S210, two or more movement monitoring tools are fixed on a patient body before the surgery.
- In an example, 2-4 breath monitoring tools are placed on the body surface of the patient, and the wearing positions are preferably places strongly influenced by breathing, so that the relative positions of the several tools change to a certain extent with the respiratory movement; the properties of the respiratory movement can then be better represented by tracking the poses of the breath monitoring tools, and thus the properties of a target treatment part (focus) which moves with the respiratory movement can be better described.
- However, the number of the breath monitoring tools is merely exemplary, and more breath monitoring tools can be selected as needed, for example, 5, 6 or more, etc.
- In a step S220, a three-dimensional medical image of the treatment site of the patient fixed with the movement monitoring tools is obtained by preoperative scanning, the three-dimensional medical image having an associated image coordinate system. For example, the preoperative scanning can be CT scanning or MRI scanning. For example, the three-dimensional medical image is obtained by the three-dimensional medical
image obtaining unit 130 as shown in FIG. 1 by reconstruction based on a CT scanning image data set. - After the three-dimensional medical image of the treatment site of the patient fixed with the movement monitoring tools is obtained by preoperative scanning, the three-dimensional medical image can be visually displayed on the display device, and the clinician can plan a surgical approach with reference to the three-dimensional medical image by means of a navigation software system, for example.
- In a step S230, for example, the
surgical navigation workstation 140 identifies first position and orientation of the movement monitoring tools in the image coordinate system from the preoperative three-dimensional medical image. - As mentioned above, mark points can be arranged on the movement monitoring tools; the mark points have image parameters which are obviously different from those of human tissue in a CT image, so that the mark points can be manually or automatically identified from the CT image, and accordingly, the position and orientation of the movement monitoring tools in the image coordinate system can be identified.
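As an illustrative sketch of such automatic identification (not a prescribed implementation), metal mark points can be segmented from a CT volume by simple intensity thresholding followed by connected-component analysis, since metal lies far above any tissue on the Hounsfield scale. The 2000 HU threshold and the function name below are assumptions for illustration:

```python
import numpy as np
from scipy import ndimage

def find_marker_centroids(ct_volume, hu_threshold=2000.0):
    """Locate metal marker balls in a CT volume by intensity.

    Voxels above `hu_threshold` are assumed to belong to markers;
    connected-component labeling then yields one centroid (in voxel
    coordinates) per marker. The threshold value is illustrative.
    """
    mask = ct_volume > hu_threshold
    labels, n = ndimage.label(mask)                    # one label per marker
    centroids = ndimage.center_of_mass(mask, labels, range(1, n + 1))
    return np.array(centroids)
```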
- It should be noted that the step S220 and the step S230 are connected by a dashed line arrow, which indicates that the two steps can be separated for a relatively long time, for example, several hours or several days.
- In a step S240, in the surgery, in a state that the same movement monitoring tools are fixed on the body of the patient at the same position and orientation as in the preoperative scanning, second position and orientation of the movement monitoring tools in a positioning coordinate system are determined in real time, wherein the positioning coordinate system is a coordinate system which is referenced in a process of tracking the position and orientation of a surgical tool by the positioning device. For example, the second position and orientation of the movement monitoring tools in the positioning coordinate system are obtained by the positioning device and the positioning sensors on the movement monitoring tools as shown in
FIG. 1 and are transmitted to the surgical navigation workstation 140. - In a step S250, the surgical navigation workstation calculates an optimal coordinate transformation relation between the positioning coordinate system and the image coordinate system in real time based on the first position and orientation of the movement monitoring tools in the image coordinate system and the second position and orientation thereof in the positioning coordinate system, and calculates overall errors of the movement monitoring tools during coordinate transformation from the positioning coordinate system to the image coordinate system based on the optimal coordinate transformation relation.
- For introduction to coordinate transformation and error calculation, reference can be made to the instruction in the Section 6.4 in the doctoral dissertation entitled “RESEARCH ON KEY TECHNOLOGY OF COMPUTER ASSISTED INTERVENTIONAL OPERATION UNDER IMAGE GUIDANCE” of Zhai Weiming, the entire contents of which are herein incorporated by reference.
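One standard way to compute such an optimal rigid transformation and its overall error is a least-squares fit via singular value decomposition (the Horn/Kabsch method). The sketch below is illustrative and is not taken from the cited dissertation; the function name and the RMS definition of the overall error are assumptions:

```python
import numpy as np

def register_rigid(p_positioning, p_image):
    """Least-squares rigid registration of corresponding point sets.

    p_positioning, p_image: (N, 3) arrays of corresponding mark-point
    coordinates in the positioning and image coordinate systems
    (N >= 3, not all collinear). Returns (R, t, overall_error), where
    overall_error is the RMS residual of mapping every mark point
    through the single optimal transform.
    """
    p = np.asarray(p_positioning, dtype=float)
    q = np.asarray(p_image, dtype=float)
    pc, qc = p.mean(axis=0), q.mean(axis=0)            # centroids
    H = (p - pc).T @ (q - qc)                          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = qc - R @ pc
    residuals = (R @ p.T).T + t - q
    overall_error = np.sqrt((residuals ** 2).sum(axis=1).mean())
    return R, t, overall_error
```

Pooling the mark points of all breath monitoring tools into the two input arrays yields the single uniform transformation matrix and the overall error that is monitored over time in step S250.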
- In a step S260, the movement degree of a human organ at various moments relative to a preoperative scanning moment is evaluated based on the real-time determined overall errors of coordinate transformation of the movement monitoring tools at the various moments.
- As mentioned above, in theory, in various stages of the respiratory movement cycle of the lung, if the movement state at a certain moment is closest to the movement state at the CT scanning moment, the mapping of the breath monitoring tool at the moment in the image coordinate system is closest to the pose of the breath monitoring tool in the image coordinate system obtained by the preoperative CT scanning, and thus the coordinate transformation error corresponding to the coordinate transformation matrix obtained by optimization at the moment in the surgery is the smallest.
-
FIG. 4 shows a schematic diagram of a part of a time change curve of an overall error of coordinate transformation in a respiratory movement process, wherein the horizontal axis represents time and the vertical axis represents the overall error of coordinate transformation. As shown in FIG. 4 , at moments t1, t2, t3, the overall error of coordinate transformation is small, and at these times the breath state is close to the breath state at the preoperative scanning moment; the breath states at other moments deviate from the breath state at the preoperative scanning moment to different degrees, and the overall error of coordinate transformation changes accordingly. - By observing the change of the overall error of coordinate transformation with time in the surgical process, the difference degrees of the states of the human organ at the various moments relative to the preoperative scanning moment can be evaluated.
- A human organ movement monitoring result in the embodiment of the present disclosure can be adopted in a variety of forms.
- In an example, the real-time obtained time change curve of the overall error of coordinate transformation can be visually displayed on the display device in the form as shown in
FIG. 4 , for example. The clinician can determine the timing of, for example, puncturing the puncture needle into the patient by observing the time change curve of the overall error. - In an example, the surgical navigation workstation in
FIG. 1 determines the timing for the surgical tool to intervene in a human focus based on the evaluated movement degree of the human organ at the various moments relative to the preoperative three-dimensional scanning moment, and reminds a surgeon of the timing. In an example, when the overall error of coordinate transformation at a certain moment falls below 50% of the average overall error of coordinate transformation over a certain time period and the change trend is decreasing, the certain moment is determined as the timing for the surgical tool to intervene in the human focus, and the timing information is then displayed on the display device, for example, in a short message form, or is output from a loudspeaker in a voice form to remind the clinician. - In an example, since the positioning coordinate system referred to in the description of the poses of the breath monitoring tools is the coordinate system which is referenced in the process of positioning the position and orientation of the surgical tool, a moment when the pose of the human organ is basically uniform with the pose at the preoperative CT scanning moment can be determined on the basis of the evaluated movement (or difference) degrees of the pose of the human organ at various moments relative to the pose at the preoperative CT scanning moment, and the real-time tracked position of the surgical tool can be transformed into the image coordinate system and fused with the three-dimensional medical image by using the coordinate transformation matrix from the positioning coordinate system to the image coordinate system calculated at that moment, and the fused image is then visually displayed on the display device. 
Specifically, the human organ movement monitoring method may further include: determining a moment when the overall error of coordinate transformation is smaller than a preset threshold; determining a coordinate transformation matrix from the positioning coordinate system to the image coordinate system at the moment; and determining the position of the surgical tool in the image coordinate system based on the coordinate transformation matrix, and combining the image of the surgical tool with the three-dimensional medical image at the position and displaying the combined image on a display device. Namely, at the moment when the overall error of coordinate transformation is smaller than the preset threshold, the registration from the positioning coordinate system where the surgical tool is located to the image coordinate system is carried out, the image of the surgical tool is fused with the human three-dimensional model based on the position of the surgical tool tracked by the
positioning device 110 for example, and is visually displayed on the display device to assist the surgery of the clinician. - It should be noted that the breath monitoring process can be continuously carried out in the surgery.
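The example timing rule described above (current overall error below 50% of the average over a recent period, with a decreasing trend) can be sketched as follows; the window handling and function name are illustrative assumptions, not part of the disclosed embodiments:

```python
def is_intervention_timing(error_history, ratio=0.5):
    """Decide whether the latest moment is a suitable timing for the
    surgical tool to intervene, per the example rule: the current
    overall error of coordinate transformation is below `ratio` (50%)
    of the average error over the observed period, and the error is
    still decreasing. `error_history` is the sequence of overall
    errors sampled so far, most recent last.
    """
    if len(error_history) < 2:
        return False
    current, previous = error_history[-1], error_history[-2]
    period_average = sum(error_history) / len(error_history)
    return current < ratio * period_average and current < previous
```

In practice the workstation would evaluate this rule on each new overall-error sample and, when it fires, remind the clinician on the display or through the loudspeaker.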
- In addition, a rule obtained by breath monitoring, for example, the information of a time interval (as shown in
FIG. 4 ) between a former moment having the smallest overall error and a subsequent moment having the smallest overall error can also be provided for the clinician, to enable the clinician to make psychological and operational preparation before puncturing the needle the next time. - By adopting the surgical navigation system and the human organ movement monitoring method, the movement degree (or difference degree) of the state of the human organ relative to the state at the CT scanning moment can be monitored without using a special human organ monitor; for example, for the respiratory movement, no special respiratory gating device is needed. However, the present disclosure does not exclude use in cooperation with such a special human organ monitor.
- The surgical navigation system and the human organ movement monitoring method of the embodiments of the present disclosure do not simply treat the human body as a rigid body when registering the preoperative positioning coordinate system to the image coordinate system. Instead, they account for the internal movement of the body by monitoring how far the organ's state differs from its state at the CT scanning moment. The doctor can thus be reminded of a proper timing for operation and can execute the preoperative surgical plan more accurately, reducing dependence on the individual experience of the clinician.
- In addition, embodiments of the disclosure can take the form of a computer program product. The computer program product can be accessed via a computer readable medium, for use by or in connection with a computer or any instruction execution system. The computer readable medium can include a device for storing, exchanging, transmitting or sending a program, and can be an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or device or apparatus), or a transmission medium. Examples of the computer readable medium include a semiconductor or solid-state memory, a magnetic tape, a removable computer floppy disk, a random access memory (RAM), a read-only memory (ROM), a magnetic disk and an optical disk.
- Various embodiments of the present disclosure have been described above. The foregoing description is exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the illustrated embodiments. The terms used herein were chosen to best explain the principles of the embodiments, their practical application, or improvements over technologies on the market, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (20)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410259145.3A CN104055520B (en) | 2014-06-11 | 2014-06-11 | Human organ motion monitoring method and operation guiding system |
CN201410259145.3 | 2014-06-11 | ||
CN201410259145 | 2014-06-11 | ||
PCT/CN2014/080237 WO2015188393A1 (en) | 2014-06-11 | 2014-06-18 | Human organ motion monitoring method, surgical navigation system, and computer-readable media |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170215969A1 true US20170215969A1 (en) | 2017-08-03 |
US10258413B2 US10258413B2 (en) | 2019-04-16 |
Family
ID=51543654
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/106,746 Expired - Fee Related US10258413B2 (en) | 2014-06-11 | 2014-06-18 | Human organ movement monitoring method, surgical navigation system and computer readable medium |
Country Status (3)
Country | Link |
---|---|
US (1) | US10258413B2 (en) |
CN (1) | CN104055520B (en) |
WO (1) | WO2015188393A1 (en) |
Cited By (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190183577A1 (en) * | 2017-12-15 | 2019-06-20 | Medtronic, Inc. | Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools |
US10482599B2 (en) | 2015-09-18 | 2019-11-19 | Auris Health, Inc. | Navigation of tubular networks |
US10492741B2 (en) | 2013-03-13 | 2019-12-03 | Auris Health, Inc. | Reducing incremental measurement sensor error |
US10524866B2 (en) | 2018-03-28 | 2020-01-07 | Auris Health, Inc. | Systems and methods for registration of location sensors |
US10531864B2 (en) | 2013-03-15 | 2020-01-14 | Auris Health, Inc. | System and methods for tracking robotically controlled medical instruments |
US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
CN111223556A (en) * | 2018-11-23 | 2020-06-02 | 西门子医疗有限公司 | Integrated medical image visualization and exploration |
CN111265299A (en) * | 2020-02-19 | 2020-06-12 | 上海理工大学 | Operation navigation method based on optical fiber shape sensing |
CN111588464A (en) * | 2019-02-20 | 2020-08-28 | 忞惪医疗机器人(苏州)有限公司 | Operation navigation method and system |
CN111770716A (en) * | 2018-02-21 | 2020-10-13 | 奥林巴斯株式会社 | Medical system and control method for medical system |
US10806535B2 (en) * | 2015-11-30 | 2020-10-20 | Auris Health, Inc. | Robot-assisted driving systems and methods |
US10827913B2 (en) | 2018-03-28 | 2020-11-10 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
CN112057165A (en) * | 2020-09-22 | 2020-12-11 | 上海联影医疗科技股份有限公司 | Path planning method, device, equipment and medium |
US10898286B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Path-based navigation of tubular networks |
US10898275B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Image-based airway analysis and mapping |
US10905499B2 (en) | 2018-05-30 | 2021-02-02 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
CN112513996A (en) * | 2018-07-09 | 2021-03-16 | 阿斯卡拉波股份有限公司 | Medical technical equipment and method |
CN112707340A (en) * | 2020-12-10 | 2021-04-27 | 安徽有光图像科技有限公司 | Equipment control signal generation method and device based on visual identification and forklift |
US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
CN113017852A (en) * | 2021-03-03 | 2021-06-25 | 上海联影医疗科技股份有限公司 | Interactive experience method, equipment and electronic device for medical imaging scanning process |
US11051681B2 (en) | 2010-06-24 | 2021-07-06 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
US11147633B2 (en) | 2019-08-30 | 2021-10-19 | Auris Health, Inc. | Instrument image reliability systems and methods |
US11160615B2 (en) | 2017-12-18 | 2021-11-02 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
CN113662663A (en) * | 2021-08-20 | 2021-11-19 | 中国人民解放军陆军军医大学第二附属医院 | Coordinate system conversion method, device and system of AR holographic surgery navigation system |
CN113764076A (en) * | 2021-07-26 | 2021-12-07 | 北京天智航医疗科技股份有限公司 | Method and device for detecting mark points in medical perspective image and electronic equipment |
US11207141B2 (en) | 2019-08-30 | 2021-12-28 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
CN113855240A (en) * | 2021-09-30 | 2021-12-31 | 上海寻是科技有限公司 | Medical image registration system and method based on magnetic navigation |
CN114073581A (en) * | 2021-06-29 | 2022-02-22 | 成都科莱弗生命科技有限公司 | Bronchus electromagnetic navigation system |
US11278357B2 (en) | 2017-06-23 | 2022-03-22 | Auris Health, Inc. | Robotic systems for determining an angular degree of freedom of a medical device in luminal networks |
US11298195B2 (en) | 2019-12-31 | 2022-04-12 | Auris Health, Inc. | Anatomical feature identification and targeting |
US11298051B2 (en) * | 2019-03-20 | 2022-04-12 | Stryker European Holdings I, Llc | Technique for processing patient-specific image data for computer-assisted surgical navigation |
CN114681058A (en) * | 2022-03-02 | 2022-07-01 | 北京长木谷医疗科技有限公司 | Navigation positioning system precision verification method and device for joint replacement |
CN114813798A (en) * | 2022-05-18 | 2022-07-29 | 中国工程物理研究院化工材料研究所 | CT detection device and imaging method for representing internal structure and composition of material |
US11426095B2 (en) | 2013-03-15 | 2022-08-30 | Auris Health, Inc. | Flexible instrument localization from both remote and elongation sensors |
CN115089294A (en) * | 2022-08-24 | 2022-09-23 | 北京思创贯宇科技开发有限公司 | Interventional operation navigation method |
WO2022206434A1 (en) * | 2021-04-01 | 2022-10-06 | 上海复拓知达医疗科技有限公司 | Interactive alignment system and method for surgical navigation, electronic device, and readable storage medium |
US11471217B2 (en) * | 2017-12-11 | 2022-10-18 | Covidien Lp | Systems, methods, and computer-readable media for improved predictive modeling and navigation |
US11488313B2 (en) | 2019-01-31 | 2022-11-01 | Siemens Healthcare Gmbh | Generating a motion-compensated image or video |
US11490782B2 (en) | 2017-03-31 | 2022-11-08 | Auris Health, Inc. | Robotic systems for navigation of luminal networks that compensate for physiological noise |
US11503986B2 (en) | 2018-05-31 | 2022-11-22 | Auris Health, Inc. | Robotic systems and methods for navigation of luminal network that detect physiological noise |
US11504187B2 (en) | 2013-03-15 | 2022-11-22 | Auris Health, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
US11510736B2 (en) | 2017-12-14 | 2022-11-29 | Auris Health, Inc. | System and method for estimating instrument location |
US11602372B2 (en) | 2019-12-31 | 2023-03-14 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
US20230096691A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods and Systems for Controlling Cooperative Surgical Instruments |
US11660147B2 (en) | 2019-12-31 | 2023-05-30 | Auris Health, Inc. | Alignment techniques for percutaneous access |
US11771309B2 (en) | 2016-12-28 | 2023-10-03 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
US11950898B2 (en) | 2020-11-06 | 2024-04-09 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
Families Citing this family (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105395252A (en) * | 2015-12-10 | 2016-03-16 | 哈尔滨工业大学 | Wearable three-dimensional image navigation device for vascular intervention operation and realizing man-machine interaction |
CN105616003B (en) * | 2015-12-24 | 2017-11-21 | 电子科技大学 | A kind of soft tissue 3D vision tracking based on radial direction spline interpolation |
CN106073898B (en) * | 2016-08-17 | 2019-06-14 | 北京柏惠维康医疗机器人科技有限公司 | Abdominal cavity interventional operation system |
CN106109016A (en) * | 2016-08-17 | 2016-11-16 | 北京柏惠维康医疗机器人科技有限公司 | Abdominal minimally invasive surgery system and method for determining the needle insertion moment therein |
CN106236258B (en) * | 2016-08-17 | 2019-03-12 | 北京柏惠维康科技有限公司 | The method and device for planning of abdominal-cavity minimal-invasion surgery puncture path |
CN106420056B (en) * | 2016-11-03 | 2023-11-03 | 中国人民解放军总医院 | Instrument, positioning and guiding device of instrument and method thereof |
CN107028659B (en) * | 2017-01-23 | 2023-11-28 | 新博医疗技术有限公司 | Surgical navigation system and navigation method under guidance of CT image |
CN106859742B (en) * | 2017-03-21 | 2023-11-10 | 北京阳光易帮医疗科技有限公司 | Puncture operation navigation positioning system and method |
CN107578443B (en) * | 2017-07-26 | 2020-11-06 | 北京理工大学 | Needle knife display method and device |
CN108175502B (en) * | 2017-11-29 | 2021-08-17 | 苏州朗开医疗技术有限公司 | Bronchoscope electromagnetic navigation system |
CN108682048B (en) * | 2018-05-15 | 2022-05-17 | 杭州三坛医疗科技有限公司 | Method, device and system for displaying posture of guide channel and readable storage medium |
CN108937987B (en) * | 2018-05-22 | 2021-07-02 | 上海联影医疗科技股份有限公司 | Method and system for determining position of marker in motif |
CN109247915B (en) * | 2018-08-30 | 2022-02-18 | 北京连心医疗科技有限公司 | Detection label for skin surface deformation and real-time detection method |
US11944388B2 (en) | 2018-09-28 | 2024-04-02 | Covidien Lp | Systems and methods for magnetic interference correction |
CN109192034A (en) * | 2018-10-26 | 2019-01-11 | 刘林 | A kind of controllable spinal cord injury model device and making and use method |
CN109620201B (en) * | 2018-12-07 | 2023-04-14 | 南京国科精准医学科技有限公司 | Flexible multi-lead cap type magnetoencephalography instrument and high-precision imaging method thereof |
CN109767469B (en) * | 2018-12-29 | 2021-01-29 | 北京诺亦腾科技有限公司 | Method and system for calibrating installation relationship and storage medium |
CN112996435A (en) * | 2018-12-29 | 2021-06-18 | 深圳迈瑞生物医疗电子股份有限公司 | Monitoring method based on physiological parameters, monitoring equipment and computer storage medium |
CN109464196B (en) * | 2019-01-07 | 2021-04-20 | 北京和华瑞博医疗科技有限公司 | Surgical navigation system adopting structured light image registration and registration signal acquisition method |
WO2021007803A1 (en) | 2019-07-17 | 2021-01-21 | 杭州三坛医疗科技有限公司 | Positioning and navigation method for fracture reduction and closure surgery, and positioning device for use in method |
CN110613519B (en) * | 2019-09-20 | 2020-09-15 | 真健康(北京)医疗科技有限公司 | Dynamic registration positioning device and method |
CN110731821B (en) * | 2019-09-30 | 2021-06-01 | 艾瑞迈迪医疗科技(北京)有限公司 | Method and guide bracket for minimally invasive tumor ablation based on CT/MRI |
CN112489745A (en) * | 2019-11-27 | 2021-03-12 | 上海联影智能医疗科技有限公司 | Sensing device for medical facility and implementation method |
CN111513850B (en) * | 2020-04-30 | 2022-05-06 | 京东方科技集团股份有限公司 | Guide device, puncture needle adjustment method, storage medium, and electronic apparatus |
CN112348878B (en) * | 2020-10-23 | 2023-03-21 | 歌尔科技有限公司 | Positioning test method and device and electronic equipment |
CN112515767B (en) * | 2020-11-13 | 2021-11-16 | 中国科学院深圳先进技术研究院 | Surgical navigation device, surgical navigation apparatus, and computer-readable storage medium |
CN113499138B (en) * | 2021-07-07 | 2022-08-09 | 南开大学 | Active navigation system for surgical operation and control method thereof |
CN113425411B (en) * | 2021-08-04 | 2022-05-10 | 成都科莱弗生命科技有限公司 | Device of pathological change location navigation |
CN113729945B (en) * | 2021-08-24 | 2022-04-15 | 真健康(北京)医疗科技有限公司 | Registration method of body surface positioning device, puncture guiding method and equipment |
CN114451992B (en) * | 2021-10-11 | 2023-08-15 | 佗道医疗科技有限公司 | Post-operation nail placement precision evaluation method |
CN113940756B (en) * | 2021-11-09 | 2022-06-07 | 广州柏视医疗科技有限公司 | Operation navigation system based on mobile DR image |
CN114271909A (en) * | 2021-12-13 | 2022-04-05 | 杭州堃博生物科技有限公司 | Information processing method, device, system, equipment and medium for chest puncture |
CN114176777B (en) * | 2021-12-20 | 2022-07-01 | 北京诺亦腾科技有限公司 | Precision detection method, device, equipment and medium of operation-assisted navigation system |
CN114343845B (en) * | 2022-01-11 | 2023-12-12 | 上海睿触科技有限公司 | Focus position dynamic tracking method for auxiliary puncture system |
CN114587533B (en) * | 2022-03-25 | 2023-04-28 | 深圳市华屹医疗科技有限公司 | Puncture location guidance method, puncture location guidance device, puncture location guidance apparatus, puncture location guidance program, and puncture location guidance program |
CN115414121B (en) * | 2022-11-07 | 2023-03-24 | 中南大学 | Surgical operation navigation system based on radio frequency positioning chip |
CN116942317B (en) * | 2023-09-21 | 2023-12-26 | 中南大学 | Surgical navigation positioning system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130223702A1 (en) * | 2012-02-22 | 2013-08-29 | Veran Medical Technologies, Inc. | Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040034297A1 (en) * | 2002-08-13 | 2004-02-19 | General Electric Company | Medical device positioning system and method |
CN2748042Y (en) | 2004-10-22 | 2005-12-28 | 上海导向医疗系统有限公司 | Computer aided navigational system for viscera refrigeration intervention surgery |
EP1795142B1 (en) | 2005-11-24 | 2008-06-11 | BrainLAB AG | Medical tracking system using a gamma camera |
CN100493471C (en) | 2006-01-26 | 2009-06-03 | 清华大学深圳研究生院 | Puncture guiding system of computer aided PCNL |
CN101327148A (en) | 2008-07-25 | 2008-12-24 | 清华大学 | Instrument recognizing method for passive optical operation navigation |
CN201353203Y (en) * | 2009-02-09 | 2009-12-02 | 李晴航 | Computer aided surgery intraoperative positioning system |
CN102266250B (en) * | 2011-07-19 | 2013-11-13 | 中国科学院深圳先进技术研究院 | Ultrasonic operation navigation system and ultrasonic operation navigation method |
CN102949240B (en) * | 2011-08-26 | 2014-11-26 | 高欣 | Image-guided lung interventional operation system |
CN202437059U (en) | 2011-11-29 | 2012-09-19 | 北京集翔多维信息技术有限公司 | Bronchoscopy electromagnetic navigation system |
CN102525662B (en) * | 2012-02-28 | 2013-09-04 | 中国科学院深圳先进技术研究院 | Three-dimensional visual tissue organ operation navigation system |
CN103356284B (en) * | 2012-04-01 | 2015-09-30 | 中国科学院深圳先进技术研究院 | Operation piloting method and system |
2014
- 2014-06-11 CN CN201410259145.3A patent/CN104055520B/en active Active
- 2014-06-18 US US15/106,746 patent/US10258413B2/en not_active Expired - Fee Related
- 2014-06-18 WO PCT/CN2014/080237 patent/WO2015188393A1/en active Application Filing
Cited By (71)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11857156B2 (en) | 2010-06-24 | 2024-01-02 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
US11051681B2 (en) | 2010-06-24 | 2021-07-06 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
US11241203B2 (en) | 2013-03-13 | 2022-02-08 | Auris Health, Inc. | Reducing measurement sensor error |
US10492741B2 (en) | 2013-03-13 | 2019-12-03 | Auris Health, Inc. | Reducing incremental measurement sensor error |
US10531864B2 (en) | 2013-03-15 | 2020-01-14 | Auris Health, Inc. | System and methods for tracking robotically controlled medical instruments |
US11426095B2 (en) | 2013-03-15 | 2022-08-30 | Auris Health, Inc. | Flexible instrument localization from both remote and elongation sensors |
US11129602B2 (en) | 2013-03-15 | 2021-09-28 | Auris Health, Inc. | Systems and methods for tracking robotically controlled medical instruments |
US11504187B2 (en) | 2013-03-15 | 2022-11-22 | Auris Health, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
US10796432B2 (en) | 2015-09-18 | 2020-10-06 | Auris Health, Inc. | Navigation of tubular networks |
US10482599B2 (en) | 2015-09-18 | 2019-11-19 | Auris Health, Inc. | Navigation of tubular networks |
US11403759B2 (en) | 2015-09-18 | 2022-08-02 | Auris Health, Inc. | Navigation of tubular networks |
US10806535B2 (en) * | 2015-11-30 | 2020-10-20 | Auris Health, Inc. | Robot-assisted driving systems and methods |
US10813711B2 (en) | 2015-11-30 | 2020-10-27 | Auris Health, Inc. | Robot-assisted driving systems and methods |
US11464591B2 (en) | 2015-11-30 | 2022-10-11 | Auris Health, Inc. | Robot-assisted driving systems and methods |
US11771309B2 (en) | 2016-12-28 | 2023-10-03 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
US11490782B2 (en) | 2017-03-31 | 2022-11-08 | Auris Health, Inc. | Robotic systems for navigation of luminal networks that compensate for physiological noise |
US11278357B2 (en) | 2017-06-23 | 2022-03-22 | Auris Health, Inc. | Robotic systems for determining an angular degree of freedom of a medical device in luminal networks |
US11759266B2 (en) | 2017-06-23 | 2023-09-19 | Auris Health, Inc. | Robotic systems for determining a roll of a medical device in luminal networks |
US11850008B2 (en) | 2017-10-13 | 2023-12-26 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
US11471217B2 (en) * | 2017-12-11 | 2022-10-18 | Covidien Lp | Systems, methods, and computer-readable media for improved predictive modeling and navigation |
US11510736B2 (en) | 2017-12-14 | 2022-11-29 | Auris Health, Inc. | System and method for estimating instrument location |
US11135017B2 (en) * | 2017-12-15 | 2021-10-05 | Medtronic, Inc. | Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools |
US20190183577A1 (en) * | 2017-12-15 | 2019-06-20 | Medtronic, Inc. | Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools |
US10413363B2 (en) * | 2017-12-15 | 2019-09-17 | Medtronic, Inc. | Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools |
US20200060765A1 (en) * | 2017-12-15 | 2020-02-27 | Medtronic, Inc. | Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools |
US11160615B2 (en) | 2017-12-18 | 2021-11-02 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
CN111770716A (en) * | 2018-02-21 | 2020-10-13 | 奥林巴斯株式会社 | Medical system and control method for medical system |
US11576730B2 (en) | 2018-03-28 | 2023-02-14 | Auris Health, Inc. | Systems and methods for registration of location sensors |
US10827913B2 (en) | 2018-03-28 | 2020-11-10 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
US11712173B2 (en) | 2018-03-28 | 2023-08-01 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
US10524866B2 (en) | 2018-03-28 | 2020-01-07 | Auris Health, Inc. | Systems and methods for registration of location sensors |
US10898277B2 (en) | 2018-03-28 | 2021-01-26 | Auris Health, Inc. | Systems and methods for registration of location sensors |
US10905499B2 (en) | 2018-05-30 | 2021-02-02 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
US11793580B2 (en) | 2018-05-30 | 2023-10-24 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
US10898275B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Image-based airway analysis and mapping |
US11759090B2 (en) | 2018-05-31 | 2023-09-19 | Auris Health, Inc. | Image-based airway analysis and mapping |
US11503986B2 (en) | 2018-05-31 | 2022-11-22 | Auris Health, Inc. | Robotic systems and methods for navigation of luminal network that detect physiological noise |
US11864850B2 (en) | 2018-05-31 | 2024-01-09 | Auris Health, Inc. | Path-based navigation of tubular networks |
US10898286B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Path-based navigation of tubular networks |
CN112513996A (en) * | 2018-07-09 | 2021-03-16 | 阿斯卡拉波股份有限公司 | Medical technical equipment and method |
CN111223556A (en) * | 2018-11-23 | 2020-06-02 | 西门子医疗有限公司 | Integrated medical image visualization and exploration |
US11488313B2 (en) | 2019-01-31 | 2022-11-01 | Siemens Healthcare Gmbh | Generating a motion-compensated image or video |
CN111588464A (en) * | 2019-02-20 | 2020-08-28 | 忞惪医疗机器人(苏州)有限公司 | Operation navigation method and system |
US11911149B2 (en) | 2019-03-20 | 2024-02-27 | Stryker European Operations Holdings Llc | Technique for processing patient-specific image data for computer-assisted surgical navigation |
US11298051B2 (en) * | 2019-03-20 | 2022-04-12 | Stryker European Holdings I, Llc | Technique for processing patient-specific image data for computer-assisted surgical navigation |
US11207141B2 (en) | 2019-08-30 | 2021-12-28 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
US11147633B2 (en) | 2019-08-30 | 2021-10-19 | Auris Health, Inc. | Instrument image reliability systems and methods |
US11944422B2 (en) | 2019-08-30 | 2024-04-02 | Auris Health, Inc. | Image reliability determination for instrument localization |
US11602372B2 (en) | 2019-12-31 | 2023-03-14 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
US11298195B2 (en) | 2019-12-31 | 2022-04-12 | Auris Health, Inc. | Anatomical feature identification and targeting |
US11660147B2 (en) | 2019-12-31 | 2023-05-30 | Auris Health, Inc. | Alignment techniques for percutaneous access |
CN111265299A (en) * | 2020-02-19 | 2020-06-12 | 上海理工大学 | Operation navigation method based on optical fiber shape sensing |
CN112057165A (en) * | 2020-09-22 | 2020-12-11 | 上海联影医疗科技股份有限公司 | Path planning method, device, equipment and medium |
US11950898B2 (en) | 2020-11-06 | 2024-04-09 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
CN112707340A (en) * | 2020-12-10 | 2021-04-27 | 安徽有光图像科技有限公司 | Equipment control signal generation method and device based on visual identification and forklift |
CN113017852A (en) * | 2021-03-03 | 2021-06-25 | 上海联影医疗科技股份有限公司 | Interactive experience method, equipment and electronic device for medical imaging scanning process |
WO2022206434A1 (en) * | 2021-04-01 | 2022-10-06 | 上海复拓知达医疗科技有限公司 | Interactive alignment system and method for surgical navigation, electronic device, and readable storage medium |
CN114073581A (en) * | 2021-06-29 | 2022-02-22 | 成都科莱弗生命科技有限公司 | Bronchus electromagnetic navigation system |
CN113764076A (en) * | 2021-07-26 | 2021-12-07 | 北京天智航医疗科技股份有限公司 | Method and device for detecting mark points in medical perspective image and electronic equipment |
CN113662663A (en) * | 2021-08-20 | 2021-11-19 | 中国人民解放军陆军军医大学第二附属医院 | Coordinate system conversion method, device and system of AR holographic surgery navigation system |
US20230101714A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods and Systems for Controlling Cooperative Surgical Instruments |
US20230093972A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods and Systems for Controlling Cooperative Surgical Instruments |
US20230096691A1 (en) * | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Methods and Systems for Controlling Cooperative Surgical Instruments |
CN113855240A (en) * | 2021-09-30 | 2021-12-31 | 上海寻是科技有限公司 | Medical image registration system and method based on magnetic navigation |
US11957421B2 (en) * | 2021-10-22 | 2024-04-16 | Cilag Gmbh International | Methods and systems for controlling cooperative surgical instruments |
CN114681058A (en) * | 2022-03-02 | 2022-07-01 | 北京长木谷医疗科技有限公司 | Navigation positioning system precision verification method and device for joint replacement |
CN114813798A (en) * | 2022-05-18 | 2022-07-29 | 中国工程物理研究院化工材料研究所 | CT detection device and imaging method for representing internal structure and composition of material |
CN115089294A (en) * | 2022-08-24 | 2022-09-23 | 北京思创贯宇科技开发有限公司 | Interventional operation navigation method |
Also Published As
Publication number | Publication date |
---|---|
WO2015188393A1 (en) | 2015-12-17 |
US10258413B2 (en) | 2019-04-16 |
CN104055520A (en) | 2014-09-24 |
CN104055520B (en) | 2016-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10258413B2 (en) | Human organ movement monitoring method, surgical navigation system and computer readable medium | |
US11529070B2 (en) | System and methods for guiding a medical instrument | |
TWI615126B (en) | An image guided augmented reality method and a surgical navigation of wearable glasses using the same | |
EP2654559B1 (en) | System to guide a rigid instrument | |
CN112220557B (en) | Operation navigation and robot arm device for craniocerebral puncture and positioning method | |
EP2584990B1 (en) | Focused prostate cancer treatment system | |
CN103479430A (en) | Image guiding intervention operation navigation system | |
JP2016517288A | Planning, guidance and simulation system and method for minimally invasive therapy | |
JP2007531553A (en) | Intraoperative targeting system and method | |
JP2002526188A (en) | System and method for determining the position of a catheter during a medical procedure inside the body | |
US20200214772A1 (en) | Method for recovering patient registration | |
CN109758233A (en) | A kind of diagnosis and treatment integrated operation robot system and its navigation locating method | |
EP3184036A1 (en) | Registration between coordinate systems for visualizing a tool | |
CN110537980A (en) | puncture surgery navigation method based on motion capture and mixed reality technology | |
CN109864820A (en) | One kind mapping mixed positioning navigation system based on CT threedimensional model | |
US20050228251A1 (en) | System and method for displaying a three-dimensional image of an organ or structure inside the body | |
EP3184035A1 (en) | Ascertaining a position and orientation for visualizing a tool | |
Zheng et al. | Development status and application of neuronavigation system | |
WO2022240790A1 (en) | Medical instrument guidance systems and associated methods | |
CN113907883A (en) | 3D visualization operation navigation system and method for ear-side skull-base surgery | |
Wu et al. | Process analysis and application summary of surgical navigation system | |
CN113940756B (en) | Operation navigation system based on mobile DR image | |
CN113598909B (en) | Method for measuring distance from focus point to percutaneous point in body | |
US20230240750A1 (en) | Systems for evaluating registerability of anatomic models and associated methods | |
US20240090866A1 (en) | System and method for displaying ablation zone progression |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TSINGHUA UNIVERSITY, CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ZHAI, WEIMING;SONG, YIXU;REEL/FRAME:038976/0316 Effective date: 20160517 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20230416 |