WO2015188393A1 - Human organ motion monitoring method, surgical navigation system and computer-readable medium - Google Patents

Human organ motion monitoring method, surgical navigation system and computer-readable medium

Info

Publication number
WO2015188393A1
WO2015188393A1 (application PCT/CN2014/080237)
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
image
motion monitoring
tool
positioning
Prior art date
Application number
PCT/CN2014/080237
Other languages
English (en)
French (fr)
Inventor
翟伟明 (Zhai Weiming)
宋亦旭 (Song Yixu)
Original Assignee
清华大学 (Tsinghua University)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University (清华大学)
Priority to US 15/106,746 (granted as US10258413B2)
Publication of WO2015188393A1

Classifications

    • A61B34/10: Computer-aided planning, simulation or modelling of surgical operations
    • A61B34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B5/0013: Medical image data (remote monitoring of patients using telemetry)
    • A61B5/002: Monitoring the patient using a local or closed circuit, e.g. in a room or building
    • A61B5/08: Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/1107: Measuring contraction of parts of the body, e.g. organ, muscle
    • A61B90/00: Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by groups A61B1/00 to A61B50/00
    • G06T7/248: Analysis of motion using feature-based methods involving reference images or patches
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G16H20/40: ICT specially adapted for therapies relating to mechanical, radiation or invasive therapies, e.g. surgery
    • G16H30/20: ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H30/40: ICT specially adapted for processing medical images, e.g. editing
    • A61B2017/00694: Means correcting for movement of or for synchronisation with the body
    • A61B2017/00699: Correcting for movement caused by respiration, e.g. by triggering
    • A61B2034/102: Modelling of surgical devices, implants or prostheses
    • A61B2034/105: Modelling of the patient, e.g. for ligaments or bones
    • A61B2034/2051: Electromagnetic tracking systems
    • A61B2034/2055: Optical tracking systems
    • A61B2034/2061: Tracking using shape-sensors, e.g. fibre shape sensors with Bragg gratings
    • A61B2034/2065: Tracking using image or pattern recognition
    • A61B2034/2074: Interface software
    • A61B2090/3954: Magnetic markers, e.g. NMR or MRI
    • A61B2090/3966: Radiopaque markers visible in an X-ray image
    • G06T2207/20221: Image fusion; image merging
    • G06T2207/30061: Lung
    • G06T2207/30204: Marker

Definitions

  • The present invention generally relates to surgical navigation systems and methods, and more particularly to a human organ motion monitoring method, a surgical navigation system, and a computer-readable medium for real-time monitoring of the movement of a human organ during a medical procedure.
  • Interventional surgery is a development of modern surgery. Unlike traditional open surgery, it requires no large incision: through only a small wound, special surgical instruments such as catheters, cryo-needles, radiofrequency ablation electrodes, and guide wires are advanced to the lesion or surgical target inside the human body, where treatment is achieved through various physical or chemical effects. This addresses problems, such as tumor resection, tissue biopsy, and placement of artificial devices, that in the past could only be solved by open surgery.
  • Computer-assisted surgical navigation technology is a cross-disciplinary research project integrating computer science, artificial intelligence, automatic control, image processing, 3D graphics, virtual reality and clinical treatment.
  • Surgical navigation technology uses a variety of modal medical images to assist doctors in puncturing surgical instruments directly into the lesion for local treatment, thereby improving the quality of surgery, reducing surgical trauma, and reducing patient suffering.
  • Surgical navigation uses the patient's medical image and the three-dimensional model generated by its reconstruction to guide the implementation of the clinical procedure in real time.
  • The patient's medical image data set is obtained, for example, by CT (computed tomography) or MRI (magnetic resonance imaging) scanning.
  • the surgical navigation system connects the patient's preoperative medical image data with the intraoperative surgical site through the positioning device, and can accurately display the patient's anatomy and details of the three-dimensional spatial position near the lesion in the software interface.
  • the surgical instrument is pointed at any part of the patient's body, its coordinate information is acquired by the navigation system in real time and displayed on the patient's 3D model. This allows the doctor to know the relative positional relationship between the surgical instrument and the tumor lesion in real time, even without having to open the patient.
  • surgical navigation has the following advantages:
  • the most appropriate surgical path can be selected through preoperative surgical planning; and
  • the surgical approach can be adjusted in real time during surgery to reach the lesion more accurately.
  • CT images are exemplified below as medical images, but it is obvious that medical images may also be other images such as MRI images.
  • Real-time surgical navigation involves the use of a positioning device or tracking system, for example an electromagnetic (EM) tracker, to track the end of a surgical tool such as a puncture needle, to correlate the position of the surgical tool with a preoperative medical image such as a CT image, and to display the fused image to the clinician.
  • A reference-mark-based registration process is typically performed prior to navigation. These reference marks (external reference marks or internal anatomical reference marks) are identified in the CT image and touched with a calibrated tracking probe to obtain their coordinates in the tracker space, hereinafter referred to as the positioning coordinate system. After that, point-based registration is performed to find the coordinate transformation matrix between the CT space and the tracker space.
  • A registration matrix is obtained from the coordinate transformation matrix; it aligns the surgical tool with the preoperative CT image, so that the image of the surgical tool and the CT image can be fused in the CT coordinate system based on the position information of the surgical tool in the positioning system.
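The point-based registration step described above can be sketched as follows. This is a minimal illustration using the standard least-squares (SVD/Kabsch) solution for a rigid transform between corresponding point sets, not the patent's specific implementation; the function name and the example coordinates are ours.

```python
import numpy as np

def register_points(src, dst):
    """Least-squares rigid transform (R, t) such that dst ~ R @ src + t.

    src, dst: (N, 3) arrays of corresponding marker coordinates, e.g. the
    reference marks in tracker (positioning) space and in CT (image) space.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                 # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                                  # proper rotation, det = +1
    t = dst_c - R @ src_c
    return R, t

# Four non-collinear markers in the tracker space ...
tracker_pts = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.], [0., 0., 10.]])
# ... and the same markers as identified in the CT image (here simulated by a
# known rotation and translation; in practice they come from CT segmentation).
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.],
               [np.sin(theta),  np.cos(theta), 0.],
               [0., 0., 1.]])
ct_pts = tracker_pts @ Rz.T + np.array([5., -3., 2.])

R, t = register_points(tracker_pts, ct_pts)
max_err = np.linalg.norm(tracker_pts @ R.T + t - ct_pts, axis=1).max()
print(max_err)  # essentially zero for an exactly rigid correspondence
```

For real data the recovered transform will not be exact; the size of the residual is precisely the conversion error that the method described later exploits.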
  • the present invention has been made in view of the above circumstances.
  • According to one aspect, a method is provided for monitoring the movement of a human organ in real time during a surgical procedure, where a preoperative three-dimensional medical image of a patient fixed with two or more motion monitoring tools has been obtained by scanning before surgery.
  • The human organ motion monitoring method comprises the steps of: obtaining the first position and posture, in the image coordinate system, of each motion monitoring tool identified from the preoperative three-dimensional medical image; then, with the same motion monitoring tools fixed on the patient's body in the same positions and postures as during the preoperative scan: determining in real time the second position and posture of each motion monitoring tool in the positioning coordinate system, the positioning coordinate system being the coordinate system referenced when positioning the position and posture of the surgical tool; and, based on the first position and posture of each motion monitoring tool in the image coordinate system and its second position and posture in the positioning coordinate system, calculating in real time the optimal coordinate transformation relationship from the positioning coordinate system to the image coordinate system, together with the corresponding overall conversion error.
  • According to another aspect, a surgical navigation system is provided, comprising: a positioning device for tracking the position and posture of a surgical tool and of motion monitoring tools in a positioning coordinate system; two or more motion monitoring tools, fixed on the patient's body, whose positions and postures in the positioning coordinate system can be tracked by the positioning device; a three-dimensional medical image obtaining unit for obtaining, before surgery, a three-dimensional medical image of the relevant part of the patient with the motion monitoring tools attached, the image having an associated image coordinate system; and a surgical navigation workstation for registering and combining the preoperative three-dimensional medical image with the surgical tool image during surgery and visualizing the result on a connected display device to guide the surgeon's operation; wherein the surgical navigation workstation also monitors the motion state of the human organ by: identifying the first position and posture of each motion monitoring tool in the image coordinate system from the preoperative three-dimensional medical image; determining in real time during surgery the second position and posture of each motion monitoring tool in the positioning coordinate system; and calculating in real time the optimal coordinate transformation relationship from the positioning coordinate system to the image coordinate system, together with the corresponding overall conversion error.
  • According to a further aspect, a computer-readable medium is provided having recorded thereon a computer program for use in conjunction with a surgical navigation system which, when executed by a processing device, performs the following operations: obtaining the first position and posture, in the image coordinate system, of each motion monitoring tool identified in the preoperative three-dimensional medical image, the preoperative three-dimensional medical image being obtained by scanning, before surgery, the medical site of a patient fixed with two or more motion monitoring tools, and having an associated image coordinate system; with the same motion monitoring tools fixed to the patient's body in the same positions and postures as during the preoperative scan: determining in real time the second position and posture of each motion monitoring tool in the positioning coordinate system, the positioning coordinate system being the coordinate system referenced when positioning the position and posture of the surgical tool; and, based on the first position and posture of each motion monitoring tool in the image coordinate system and its second position and posture in the positioning coordinate system, calculating in real time the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, together with the corresponding overall conversion error.
  • The surgical navigation system and the human organ motion monitoring method according to embodiments of the present invention can solve or alleviate the problem that, in thoracic and abdominal surgical navigation, respiratory motion may displace the tumor and thereby reduce the accuracy of the navigation; this facilitates the use and promotion of surgical navigation in multiple fields.
  • FIG. 1 shows a block diagram of the configuration of an exemplary surgical navigation system 100 in accordance with an embodiment of the present invention.
  • FIG. 2 shows an exemplary structural diagram of a respiratory monitoring tool 120 in accordance with an embodiment of the present invention.
  • FIG. 3 illustrates an exemplary process of a surgical navigation method 200 in accordance with an embodiment of the present invention.
  • Figure 4 shows a schematic diagram of a portion of the overall error versus time curve of the coordinate transformation during respiratory motion.

Detailed Description

  • The preoperative CT image is collected at a certain respiratory state of the patient. It is often inconvenient to track the position and posture of the reference marks in the positioning coordinate system during the CT scan, and the CT scan may take place several days before the actual surgery.
  • Meanwhile, the patient remains in the periodic motion of breathing.
  • Only at moments when the position and posture of the reference marks in the positioning coordinate system correspond to the position and posture, in the image coordinate system, of the reference marks previously recognized from the CT image is the registration matrix correct; only then can the position of the surgical tool be accurately transformed into the CT space, and the visual display of the fused 3D volume model provided to the clinician be accurate.
  • The inventors realized that it is possible to monitor the position and posture (orientation) of each monitoring tool (with its reference marks) in the positioning coordinate system, to optimize a single unified coordinate transformation matrix from the positioning coordinate system to the three-dimensional image coordinate system, and to determine the total of the conversion errors of all monitoring tools from the positioning coordinate system to the three-dimensional image coordinate system (hereinafter sometimes referred to as the overall error). A moment at which this error is small indicates that the position and posture of the human organ at that time are close to those at the time of the CT scan; this information can be provided to the clinician in visual or audible form to assist the doctor in performing the operation.
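The monitoring idea described above can be sketched as follows, under assumptions of our own (the residual metric, the simulated marker layout, and the threshold value are illustrative, not taken from the patent): at each tracking frame, fit the best single rigid transform over all monitoring-tool markers and take the root-mean-square residual as the overall error; moments when the error is small are close to the respiratory state of the preoperative scan.

```python
import numpy as np

def overall_error(tracker_pts, image_pts):
    """RMS residual of the best single rigid transform tracker -> image space."""
    sc, dc = tracker_pts.mean(axis=0), image_pts.mean(axis=0)
    U, _, Vt = np.linalg.svd((tracker_pts - sc).T @ (image_pts - dc))
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dc - R @ sc
    res = tracker_pts @ R.T + t - image_pts
    return float(np.sqrt((res ** 2).sum(axis=1).mean()))

# Marker positions of two monitoring tools (three markers each) as identified
# in the preoperative CT image, in millimetres.
image_pts = np.array([[0., 0., 0.], [3., 0., 0.], [0., 3., 0.],
                      [20., 0., 5.], [23., 0., 5.], [20., 3., 5.]])

ERROR_THRESHOLD = 0.5  # mm; an assumed tolerance, to be tuned clinically

# Simulated breathing: the second tool drifts sinusoidally relative to the
# first, so no single rigid transform fits all markers except where the drift
# returns to (near) zero, i.e. when the chest is back in the scanned state.
for phase in np.linspace(0.0, 2.0 * np.pi, 9):
    tracker_pts = image_pts.copy()
    tracker_pts[3:] += np.array([2.0 * np.sin(phase), 0.0, 0.0])
    err = overall_error(tracker_pts, image_pts)
    state = "close to scan state" if err < ERROR_THRESHOLD else "organ displaced"
    print(f"phase {phase:4.2f} rad: overall error {err:5.2f} mm ({state})")
```

In a real system the error curve would be driven by live tracker frames rather than a simulation, matching the error-versus-time curve shown schematically in Figure 4.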
  • In the following, the movement of the lungs and related tissues caused by respiratory motion is taken as the example of human organ movement, but the present invention is not limited thereto; it is applicable to monitoring other dynamic internal motions of the human body, for example the heartbeat. The embodiments described herein may preferably be used to assist in positioning the lung or tissue adjacent to the lung, although the tissue may also be located elsewhere, such as the heart, digestive organs, blood vessels, or kidneys.
  • The position and posture can be expressed by six numerical values: three positional coordinates representing the three-dimensional position and three angular values representing the orientation, as is also done in some conventional techniques.
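Such a six-value pose is commonly packed into a 4x4 homogeneous transformation matrix; a minimal sketch follows (the ZYX Euler-angle convention used here is our assumption, since the text does not fix one):

```python
import numpy as np

def pose_to_matrix(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a six-value pose.

    (x, y, z) is the position; (roll, pitch, yaw) are orientation angles in
    radians, composed as Rz(yaw) @ Ry(pitch) @ Rx(roll) (ZYX convention,
    an assumption made for this sketch).
    """
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx
    T[:3, 3] = [x, y, z]
    return T

# A tool centred at (10, 20, 30) mm and rotated 90 degrees about the z axis:
T = pose_to_matrix(10, 20, 30, 0.0, 0.0, np.pi / 2)
p_tool = np.array([1.0, 0.0, 0.0, 1.0])  # a point in tool coordinates (homogeneous)
print(T @ p_tool)                         # the same point in reference coordinates
```

Chaining such matrices is how a point tracked in the positioning coordinate system is carried over into the image coordinate system.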
  • Registration means establishing a mapping relationship, or a coordinate transformation relationship, between two coordinate systems. For example, "registration between a positioning coordinate system and an image coordinate system" means that a mapping relationship is established between the positioning coordinate system and the image coordinate system, or that a coordinate transformation matrix between the two is obtained.
  • The positioning coordinate system here is the coordinate system referenced by the positioning device when positioning the surgical tool and the motion monitoring tools.
  • FIG. 1 shows a block diagram of the configuration of an exemplary surgical navigation system 100 in accordance with an embodiment of the present invention.
  • surgical navigation system 100 includes a positioning device 110, two or more motion monitoring tools 120, a three-dimensional medical image acquisition unit 130, and a surgical navigation workstation 140.
  • FIG. 1 also shows a console 210 and a medical tool 220 that cooperate with the surgical navigation system 100.
  • Positioning device 110 is configured to track the position and attitude of the motion monitoring tool in the positioning coordinate system.
  • A positioning sensor, such as a magnetic induction sensor 112, can be placed on the motion monitoring tool 120.
  • the positioning device 110 is able to track the position and attitude of the individual motion monitoring tools in the positioning coordinate system.
  • The positioning device 110 is also configured to track the position and posture of the surgical tool in the positioning coordinate system.
  • the surgical tool is, for example, a puncture needle on which a positioning sensor 112 is mounted, whereby the positioning device 110 can track the position and posture of the puncture needle by means of the positioning sensor.
  • a positioning device can track the signals of eight positioning sensors simultaneously.
  • the positioning device 110 can employ one of electromagnetic (EM) tracking technology, optical tracking technology, and fiber grating tracking technology.
  • As examples of commercially available electromagnetic positioning systems, there are the Aurora system of NDI of Canada, the DriveBay of Ascension of the United States, the PercuNav system of Philips of the Netherlands, the InstaTrak 3500 Plus system of GE of the United States, the StealthStation AxiEM system of Medtronic of the United States, and the Cygnus FPS system of Compass of the United States.
  • Two or more motion monitoring tools 120 are configured to be secured to the patient's body, each in the same position and posture, both before surgery (specifically, before the CT scan) and during surgery prior to the intervention of the surgical tool.
  • the motion monitoring tool 120 may hereinafter be referred to as a respiratory monitoring tool.
  • The positions of the respiratory monitoring tools on the patient's body surface are best chosen where they are affected by breathing, so that the relative positions between the several respiratory monitoring tools change to a certain extent with the respiratory movement.
  • Each motion monitoring tool has at least four marker points that can be tracked by the positioning device, no three of which are collinear. The first position and posture of each motion monitoring tool in the image coordinate system are identified by recognizing its marker points in the three-dimensional medical image; during surgery, the positioning device determines the second position and posture of each motion monitoring tool in the positioning coordinate system by tracking its marker points.
  • At least four marker points, no three of which are collinear, are set on a motion monitoring tool so that the coordinate transformation matrix from the positioning coordinate system to the image coordinate system can be solved from the positions and postures of these marker points in the image coordinate system and in the positioning coordinate system.
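The non-collinearity condition can be checked numerically; a small sketch under naming of our own (the helper function is illustrative): three points are collinear exactly when the triangle they span has (numerically) zero area.

```python
import numpy as np
from itertools import combinations

def no_three_collinear(points, tol=1e-9):
    """True if no three of the given 3-D marker points are collinear."""
    pts = np.asarray(points, dtype=float)
    for a, b, c in combinations(pts, 3):
        # The triangle area is half the cross-product norm; (near-)zero
        # area means the three points lie on one line.
        if np.linalg.norm(np.cross(b - a, c - a)) < tol:
            return False
    return True

good_layout = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]  # valid layout
bad_layout = [[0, 0, 0], [5, 0, 0], [10, 0, 0], [0, 10, 0]]    # first three collinear
print(no_three_collinear(good_layout), no_three_collinear(bad_layout))  # True False
```

A layout passing this check guarantees that the marker positions alone pin down the tool's full pose, which is why the text requires it.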
  • The marker point is, for example, a metal ball, whose image parameters in CT differ significantly from those of human tissue, so that it is easily recognized manually or by image processing.
  • For a single respiratory monitoring tool, a unique coordinate transformation matrix can be determined from its first position and posture in the image coordinate system and its second position and posture in the positioning coordinate system, and this coordinate transformation matrix can make the conversion error zero.
  • In contrast, when a single coordinate transformation matrix is used to convert, for several respiratory monitoring tools at once, between the second positions and postures in the positioning coordinate system and the first positions and postures in the image coordinate system, conversion errors will in general arise.
  • The main idea of the present invention is to monitor the difference between the poses of the respiratory monitoring tools at each moment and their poses at the preoperative CT scan time by calculating, at each moment, the optimal unified transformation matrix and the corresponding overall conversion error, thereby monitoring the difference between the state of the human organ at each moment and its state at the preoperative CT scan time, which achieves the purpose of monitoring the movement of the human organ.
  • Moreover, each respiratory monitoring tool does not necessarily need four or more marker points. Rather, the optimal transformation matrix and the overall conversion error between the two coordinate systems can be computed from the positions, in the positioning coordinate system and in the image coordinate system, of all marker points of all respiratory monitoring tools.
  • For example, there may be two respiratory monitoring tools, each with three marker points, so that the optimal coordinate transformation matrix and the corresponding conversion error can be solved from the positions of a total of six marker points in the positioning coordinate system and in the image coordinate system. There may be four respiratory monitoring tools, each with two marker points, so that the solution uses a total of eight marker points. There may also be five respiratory monitoring tools, each with a single marker point, in which case the optimal coordinate transformation matrix and the corresponding conversion error can likewise be solved from the positions of a total of five marker points in the two coordinate systems.
  • FIG. 2 shows an exemplary structural schematic of a respiratory monitoring tool 120 in accordance with an embodiment of the present invention.
  • The respiratory monitoring tool 120 can be composed of a plastic frame and identifiable objects, for example four image-recognizable markers such as metal pellets (e.g. lead balls).
  • the image parameters of these markers in CT or MRI images and surrounding materials mainly The CT value and the MRI value have obvious distinguishing characteristics, so that, for example, a better development can be obtained in a CT machine. It can be easily extracted by known methods.
  • an electrode sticker for use in electrocardiography can be used. Subsequent processing of CT images or MRI images can automatically extract identifiable markers from the image, and further calculate the position and posture of the tool (hereinafter, for convenience of description, it can be simply called pose) information (including tool center) Coordinate information and direction information).
  • a magnetic induction positioning device can be placed on the front surface of the respiratory monitoring tool 120 illustrated in Fig. 2; the circular hole 122 in the middle and the concave structure on the left side mainly facilitate detaching the magnetic induction positioning device.
  • the three extended straight rods mainly facilitate attachment to the surface of the human body; markers recognizable by CT or MR can be placed on the rods, and the pose information can be calculated after recognition.
  • through the circular holes 123 on the small crossbars at the ends of the straight rods, position information can be recorded on the surface of the human body with a marker pen, so that after the respiratory monitoring tool 120 is detached from the patient's body surface, it can be accurately re-attached to the body surface according to the recorded pose information.
  • the structure and shape of the respiratory monitoring tool shown in Fig. 2 are merely examples, and the configuration can be changed as needed.
  • the utilization of the respiratory monitoring tool can be divided into a CT scan phase and an intraoperative phase.
  • specifically, before the CT scan, two or more respiratory monitoring tools are fixed at the patient's chest, the pose in which each tool is worn is recorded on the surface of the human body, for example with a marker pen, and the patient is then moved into the CT room for scanning. The CT scan yields a CT scan data set from which the position and attitude of each respiratory monitoring tool can be identified.
  • incidentally, a three-dimensional model of the patient's relevant medical site can be reconstructed from such a CT scan data set by image processing, visualized on the display device, and displayed fused with the image of the tracked medical tool to assist the doctor during the operation.
  • during the operation, before the surgical tool is inserted into the human body, the respiratory monitoring tools are worn according to the pose information recorded on the patient's body surface as described above, the pose of each respiratory monitoring tool in the positioning coordinate system is monitored in real time, the coordinate transformation matrix for all respiratory monitoring tools and the corresponding total error are obtained by optimization, and the movement of the human organs is monitored based on the change of the total error over time.
  • the three-dimensional medical image acquisition unit 130 may be part of the surgical navigation workstation 140 or a separate device.
  • the three-dimensional medical image obtaining unit 130 obtains a three-dimensional medical image of a medical site of a patient to which the motion monitoring tool is fixed before surgery, the stereoscopic medical image having an associated image coordinate system.
  • for example, a three-dimensional data set obtained by a pre-operative CT scan is stored in a database; the three-dimensional medical image obtaining unit 130 is operatively coupled to the database, reads a specific three-dimensional data set from it, and visually displays it on the display device through image processing such as rendering.
  • the three-dimensional medical image obtaining unit 130 may simply read and display an already reconstructed three-dimensional model, or may read the pre-reconstruction CT image data set and perform reconstruction processing to obtain and display the three-dimensional model.
  • for methods of reconstructing a three-dimensional model from a CT or MRI scan data set, as well as for the coordinate transformation and error calculation, reference may be made, for example, to chapters 3 and 4 of the author Zhai Weiming's doctoral thesis entitled "Research on key techniques of image-guided computer-assisted interventional surgical navigation", which content is hereby incorporated by reference.
  • the console 210 can be connected to the surgical navigation workstation 140 or be a part of the surgical navigation workstation 140.
  • the console 210 can receive, from the positioning device 110, the pose information of the respiratory monitoring tools and of the surgical tool.
  • the console 210 can also issue instructions to operate the puncture needle in accordance with the clinician's directions.
  • the surgical navigation workstation 140 is configured to register and combine pre-operative three-dimensional medical images with surgical tool images during surgery and visually display them on the connected display device to guide the surgical procedure of the surgeon.
  • the surgical navigation workstation 140 can include a processor 141 and a memory 142 in which an application 1421 and an operating system 1422 can be stored.
  • the application 1421 may include program code that, when executed by the processor 141, performs the human organ motion monitoring method described in detail below.
  • the term "processor" here is a broad concept; it can be implemented by a single dedicated processor, a single shared processor, or multiple individual processors, one of which may be shared. Moreover, the use of the words "processor" or "controller" should not be interpreted as referring exclusively to hardware capable of executing software; it can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM") and non-volatile storage.
  • the surgical navigation workstation 140 can also include an interface for interacting with the display 150 in real time.
  • the display 150 can, under the control of the surgical navigation workstation 140, display the orientation, shape and/or position of the medical target tissue, and can also display the fusion of the three-dimensional model of the patient's relevant site with the image of the surgical tool. Peripherals such as a mouse and a touch-screen system may also be connected.
  • the surgical navigation workstation 140 may also include an interface for interacting with the three-dimensional medical image acquisition unit 130, an interface for interacting with the console 210, and the like.
  • clinical procedures may include preoperative preparation, preoperative image scanning, preoperative surgical planning, intraoperative surgical navigation, postoperative surgical evaluation, and other processing.
  • preoperative preparation includes, for example, preparing the navigation operating room equipment, fixing the patient with a vacuum pad, and fixing the motion monitoring tools (or marker points, etc.) to the patient's body surface.
  • Preoperative image scans include, for example, preoperative CT scans or MRI scans of the patient.
  • the preoperative surgical planning may include three-dimensional visualization of medical images, three-dimensional reconstruction of organs and lesions, interactive surgical path planning, and simulation of surgical effects, etc., wherein the surgical path plan includes, for example, a needle insertion point and a needle insertion angle selection; Surgical effect simulations may include thermal field simulation, damage field model reconstruction, and the like.
  • Intraoperative surgical navigation includes completing the puncture according to the path in the surgical plan.
  • postoperative surgical evaluation may include re-scanning CT or MRI images at a scheduled time after surgery, measuring the size of the actual damaged region on the CT data, and comparing it with the data calculated in the surgical plan, to assess the accuracy and effect of the surgery.
  • the human organ motion monitoring method according to an embodiment of the present invention can be used, for example, in an intraoperative surgical navigation process.
  • FIG. 3 illustrates an exemplary process of a surgical navigation method 200 in accordance with an embodiment of the present invention.
  • in step S210, two or more motion monitoring tools are attached to the patient's body prior to surgery.
  • in one example, 2-4 respiratory monitoring tools are placed on the surface of the patient's body, preferably at positions strongly affected by breathing, so that the relative positions of the tools change to some extent with the respiratory movement.
  • tracking of the respiratory monitoring tool poses is then more representative of the characteristics of the respiratory movement and better describes the target medical site (lesion) that moves with it.
  • the above numbers of respiratory monitoring tools are only examples; a larger number, for example five, six or even more, can be chosen as needed.
  • in step S220, a scan is performed before surgery to obtain a three-dimensional medical image of the medical site of the patient to whom the motion monitoring tools are attached, the three-dimensional medical image having an associated image coordinate system.
  • the pre-operative scan is, for example, a CT scan or an MRI scan.
  • the three-dimensional medical image is reconstructed based on the CT scan image data set, for example, by the three-dimensional medical image obtaining unit 130 shown in Fig. 1.
  • after the pre-operative scan is performed and the three-dimensional medical image of the medical site of the patient with the motion monitoring tools attached is obtained, such a stereoscopic medical image can be visually displayed on the display device, and the clinician can plan the surgical path with reference to it, for example with the help of the navigation software system.
  • in step S230, the first position and attitude of each motion monitoring tool in the image coordinate system is identified from the pre-operative three-dimensional medical image, for example by the surgical navigation workstation 140. As described above, each motion monitoring tool may be provided with marker points whose image parameters in the CT image differ significantly from those of human tissue, so that each marker point, and thus the position and attitude of each motion monitoring tool in the image coordinate system, can be identified manually or automatically from the CT image.
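As a rough sketch of such automatic extraction, a marker centroid could be located in a CT sub-volume by simple intensity thresholding, since lead pellets image far brighter than soft tissue (around -100..100 HU) or bone (roughly 300..1900 HU). The threshold value and function name below are illustrative assumptions, not values given in the patent.

```python
import numpy as np

def marker_centroid(ct_patch, hu_threshold=2000.0):
    """Centroid (in voxel indices) of the bright marker voxels inside a
    CT sub-volume cropped around a single marker. The 2000 HU threshold
    is an illustrative value for metallic markers."""
    idx = np.argwhere(ct_patch > hu_threshold)
    if idx.size == 0:
        raise ValueError("no marker voxels above threshold")
    return idx.mean(axis=0)
```

In practice one centroid would be computed per marker, and the voxel indices then converted to physical image coordinates with the scan's voxel spacing and origin.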
  • it should be noted that steps S220 and S230 are connected by a dotted arrow, indicating that a long time, for example several hours or days, may elapse between them.
  • in step S240, during surgery, with the same motion monitoring tools fixed to the patient's body in the same positions and postures as during the pre-operative scan, the second position and attitude of each motion monitoring tool in the positioning coordinate system is determined in real time.
  • the positioning coordinate system is a coordinate system associated with the positioning device tracking the position and posture of the surgical tool.
  • for example, the positioning device shown in Figure 1 and the positioning sensors on the motion monitoring tools are used to obtain the second position and attitude of each motion monitoring tool in the positioning coordinate system, which is communicated to the surgical navigation workstation 140.
  • in step S250, the surgical navigation workstation calculates in real time, based on the first position and attitude of each motion monitoring tool in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculates, based on this optimal transformation relationship, the overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system.
  • in step S260, the degree of motion of the human organs at each moment relative to the pre-operative scan time is evaluated based on the overall coordinate transformation error of the motion monitoring tools determined in real time at each moment.
  • as noted above, in theory, if the motion state at some moment of the respiratory cycle is closest to the motion state at the CT scan time, then the mapping of the respiratory monitoring tools into the image coordinate system at that moment should be closest to the poses of the respiratory monitoring tools obtained from the pre-operative CT scan, so the coordinate conversion error corresponding to the transformation matrix optimized at that moment during surgery should be minimal.
  • Figure 4 shows a schematic diagram of a portion of the overall error versus time curve of coordinate transformation during respiratory motion.
  • the horizontal axis represents time and the vertical axis represents total coordinate conversion error.
  • at moments t1, t2 and t3, the overall error of the coordinate transformation is small, and the breathing state at these moments is relatively close to the breathing state at the pre-operative scan time, whereas the breathing state at other moments deviates from it to varying degrees and the overall error of the coordinate transformation changes accordingly. By examining how the overall coordinate conversion error changes over time during surgery, the degree of difference between the state of the human organs at each moment and their state at the pre-operative scan time can be evaluated.
  • the human organ motion monitoring results of the embodiments of the present invention can be utilized in various forms.
  • the curve of the overall coordinate conversion error over time, obtained in real time, can be visually displayed on the display device, for example in the form of Figure 4.
  • by observing this curve, the clinician can determine, for example, the timing at which the puncture needle should enter the patient.
  • in one example, the surgical navigation workstation of FIG. 1 determines the timing for the surgical tool to intervene in the human lesion based on the evaluated degree of motion of the human organs at each moment relative to the pre-operative three-dimensional scan time, and prompts the operator performing the surgery with that timing. For example, when the overall coordinate conversion error at a certain moment is 50% of the average overall error over a certain period and the trend is decreasing, the moment is determined to be suitable for the surgical tool to intervene in the human lesion; the clinician is then alerted, for example by a short message displayed on the display device or by speech output from a loudspeaker.
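The timing rule in this example could be sketched as follows; the window length, the 0.5 ratio and the class name are illustrative assumptions of this sketch, since the patent does not prescribe a concrete implementation.

```python
from collections import deque

class NeedleTimingMonitor:
    """Flags moments suitable for needle insertion: the current overall
    conversion error has dropped to 50% of the recent average and is
    still decreasing."""

    def __init__(self, window=200, ratio=0.5):
        self.errors = deque(maxlen=window)  # rolling error history
        self.ratio = ratio

    def update(self, error):
        prev = self.errors[-1] if self.errors else None
        self.errors.append(error)
        if len(self.errors) < self.errors.maxlen:
            return False                    # not enough history yet
        avg = sum(self.errors) / len(self.errors)
        decreasing = prev is not None and error < prev
        return decreasing and error <= self.ratio * avg
```

Feeding the monitor one error value per tracking frame yields a boolean stream; a `True` marks a candidate insertion moment like t1, t2 or t3 in Fig. 4.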
  • in one example, because the positioning coordinate system in which the poses of the respiratory monitoring tools are described is the same coordinate system referenced when locating the position and attitude of the surgical tool during surgery, the moment at which the pose of the human organs substantially coincides with their pose at the CT scan time can be determined from the evaluated degree of motion (or difference) of the human organs at each moment relative to the pre-operative CT scan time; the coordinate transformation matrix from the positioning coordinate system to the image coordinate system calculated at that moment is then used to transform the position of the surgical tool tracked in real time into the image coordinate system, fuse it with the stereoscopic medical image, and visually display it on the display device.
  • specifically, the human organ motion monitoring method may further include: determining a moment at which the overall error of the coordinate transformation is less than a predetermined threshold; determining the coordinate transformation matrix from the positioning coordinate system to the image coordinate system at that moment; and determining, based on this matrix, the position of the surgical tool in the image coordinate system and displaying the image of the surgical tool at that position on the display device in combination with the stereoscopic medical image. That is, at a moment when the overall error of the coordinate transformation is below the predetermined threshold, the registration from the positioning coordinate system in which the surgical tool is located to the image coordinate system is performed, and, based on the position of the surgical tool tracked, for example, by the positioning device 110, the image of the surgical tool is fused with the three-dimensional model of the human body and visually displayed on the display device to assist the clinician in the operation.
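Once a low-error moment is found, applying the transformation to the tracked tool tip is a single matrix multiplication. The sketch below assumes the rotation and translation are packed into a 4x4 homogeneous matrix; the helper names are illustrative, not part of the patent.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation R and translation t (3,) into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def tool_tip_in_image(T_loc_to_img, tip_loc):
    """Map the tracked tool tip from the positioning coordinate system
    into the image coordinate system for fusion display."""
    p = np.append(tip_loc, 1.0)      # homogeneous coordinates
    return (T_loc_to_img @ p)[:3]
```

The resulting image-space point is what the workstation would hand to the renderer to draw the needle over the reconstructed three-dimensional model.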
  • the respiratory monitoring process can be continuously performed during surgery.
  • in addition, regularities obtained from the respiratory monitoring, such as the time interval between the previous minimum-error moment and the current minimum-error moment (as shown in Fig. 4), can also be provided to the clinician, allowing the clinician to prepare mentally and operationally before the next needle insertion.
  • using the surgical navigation system and human organ motion monitoring method described above, the degree of motion (or difference) of the state of the human organs relative to their state at the CT scan time can be monitored without resorting to specialized organ-monitoring instruments; for respiratory motion, for example, no dedicated respiratory gating device is needed. The present invention does not, however, exclude use in conjunction with such specialized instruments.
  • with the surgical navigation system and human organ motion monitoring method of the embodiments of the present invention, the human body is not simply treated as a rigid body for registering the pre-operative positioning coordinate system to the image coordinate system; instead, the internal motion of the human body is taken into account to monitor the degree of difference between the state of the human organs and their state at the CT scan time, so that the doctor can be prompted with the appropriate operating moment, the preoperative surgical plan can be executed more accurately, and dependence on the clinician's personal experience is reduced.
  • embodiments of the invention may take the form of a computer program product.
  • the computer program product can be accessed via a computer readable medium for use by or in connection with a computer or any instruction execution system.
  • a computer readable medium can be any device that can contain, store, communicate, propagate or transport the program.
  • the computer readable medium can be an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or device or apparatus) or a propagation medium. Examples of a computer readable medium include a semiconductor or solid-state memory, magnetic tape, a removable computer diskette, a random access memory (RAM), a read-only memory (ROM), a magnetic disk or an optical disk.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physiology (AREA)
  • Robotics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Urology & Nephrology (AREA)
  • Pulmonology (AREA)
  • Multimedia (AREA)
  • Dentistry (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

A human organ motion monitoring method and a surgical navigation system that monitor the motion of human organs in real time during surgery. The human organ motion monitoring method comprises: obtaining the first position and attitude, in an image coordinate system, of each motion monitoring tool (120) identified from a pre-operative three-dimensional medical image; determining in real time during surgery the second position and attitude of each motion monitoring tool (120) in a positioning coordinate system; calculating in real time, based on the first position and attitude of each motion monitoring tool (120) in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculating, based on this optimal transformation relationship, the overall error of the coordinate transformation of each motion monitoring tool (120) from the positioning coordinate system to the image coordinate system; and evaluating, based on the overall coordinate transformation error of the motion monitoring tools (120) determined in real time at each moment, the degree of motion of the human organs at each such moment relative to the pre-operative scan time.

Description

Human organ motion monitoring method, surgical navigation system and computer-readable medium

Technical Field

The present invention relates generally to surgical navigation systems and methods, and more particularly to a human organ motion monitoring method, a surgical navigation system and a computer-readable medium for monitoring the motion of human organs in real time during a medical procedure.

Background Art
Interventional surgery is the direction in which modern surgery is developing. Unlike traditional open surgery, it requires no incision: through only a very small wound, specialized surgical instruments such as catheters, cryoprobes, radio-frequency ablation needles and guide wires are inserted into a lesion or surgical target inside the human body, where treatment is achieved through various physical and chemical effects. Problems that formerly required open surgery, such as tumor resection, tissue biopsy and placement of artificial devices, can thus be solved.

During an interventional procedure the doctor cannot directly observe the treatment site or lesion inside the patient and must rely on computer-assisted surgical navigation guided by medical images. Computer-assisted surgical navigation is an interdisciplinary research subject combining computer science, artificial intelligence, automatic control, image processing, three-dimensional graphics, virtual reality and clinical treatment. Surgical navigation uses medical images of multiple modalities to help the doctor puncture directly to the lesion with the surgical instrument for local treatment, thereby improving the quality of surgery, reducing surgical trauma and lessening the patient's suffering.

Surgical navigation uses the patient's medical images and the three-dimensional model reconstructed from them to guide the clinical procedure in real time. The patient's medical image data set is obtained, for example, by CT (computerized tomography) or MRI (magnetic resonance imaging) scanning. The surgical navigation system links the patient's pre-operative medical image data with the intraoperative surgical site through a positioning device, and can accurately display in the software interface the patient's anatomy and the details of the three-dimensional space near the lesion. When a surgical instrument points at any part inside the patient's body, its coordinate information is acquired by the navigation system in real time and displayed on the patient's three-dimensional model. Thus, even without opening the patient, the doctor can know in real time the relative position of the surgical instrument and the tumor lesion. Compared with traditional surgical means, surgical navigation has the following advantages:
1. The lesion can be reconstructed in real time by three-dimensional visualization, showing the structural features around the surgical field;

2. The most suitable surgical path can be selected through pre-operative surgical planning;

3. The tissue structures that may be encountered along the surgical path can be displayed;

4. Important structures to be avoided, such as blood vessels, nerves and bones, can be displayed;

5. The extent of the lesion requiring treatment can be displayed;

6. The position and attitude of the surgical instrument can be accurately calculated and displayed in real time;

7. The spatial relationship between the surgical instrument and the lesion can be displayed, indicating the direction in which the instrument advances;

8. The surgical approach can be adjusted in real time during the operation so as to reach the lesion more precisely.
For convenience of description, CT images are used below as the example of medical images, although the medical images may obviously also be other images such as MRI images.

Real-time surgical navigation involves using a positioning device or tracking system, for example an electromagnetic (EM) tracker, to track the tip of a surgical tool such as a puncture needle, so as to relate the position of the surgical tool to pre-operative medical images such as CT images and display the fused image to the clinician. To integrate data between the CT space and the tracker space, a registration process based on reference markers is usually performed before navigation. These reference markers (external reference markers or internal anatomical reference markers) are identified in the CT images and touched with a calibrated tracking probe, yielding their coordinates in the tracker space, referred to below as the positioning coordinate system. A point-based registration is then performed to find the coordinate transformation matrix between the CT space and the tracker space. From the coordinate transformation matrix a registration matrix is obtained which aligns the surgical tool with the pre-operative CT image, so that, based on the position information of the surgical tool in the positioning system, the image of the surgical tool can be accurately fused with the CT image in the CT coordinate system and visually displayed to the clinician.

Summary of the Invention
Surgical navigation technology is currently widely used in neurosurgical navigation and orthopedic navigation. These procedures, however, all assume that the human body is a completely static rigid body and that this static state persists from the initial CT scan to the end of the entire operation. Although this assumption approximately holds in most neurosurgical and orthopedic operations, it is difficult to sustain in thoraco-abdominal interventional surgery. The motion of thoraco-abdominal tissue (including tumor tissue and normal tissue) caused, for example, by respiration can often reach a range of 2-4 cm, whereas the precision required in clinical surgery is typically at the millimeter level. Such errors make it difficult for current mainstream surgical navigation systems to be applied safely and reliably in the clinical practice of thoraco-abdominal interventional surgery.

The present invention has been made in view of the above situation.
According to one aspect of the present invention, there is provided a human organ motion monitoring method for monitoring the motion of human organs in real time during surgery, wherein a three-dimensional medical image of the medical site of a patient to whom two or more motion monitoring tools are fixed has been obtained by scanning before surgery, the three-dimensional medical image having an associated image coordinate system, the method comprising the steps of: obtaining the first position and attitude, in the image coordinate system, of each motion monitoring tool identified from the pre-operative three-dimensional medical image; with the same motion monitoring tools fixed on the patient's body in the same positions and attitudes as during the pre-operative scan: determining in real time the second position and attitude of each motion monitoring tool in a positioning coordinate system, the positioning coordinate system being the coordinate system referenced when locating the position and attitude of the surgical tool; calculating in real time, based on the first position and attitude of each motion monitoring tool in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculating, based on this optimal transformation relationship, the overall error of the coordinate transformation of each motion monitoring tool from the positioning coordinate system to the image coordinate system; and evaluating, based on the overall coordinate transformation error of each motion monitoring tool determined in real time at each moment, the degree of motion of the human organs at each such moment relative to the pre-operative scan time.

According to another aspect of the present invention, there is provided a surgical navigation system comprising: a positioning device for tracking the position and attitude of a surgical tool and of motion monitoring tools in a positioning coordinate system; two or more motion monitoring tools fixed on the patient's body, whose position and attitude in the positioning coordinate system can be tracked by the positioning device; a three-dimensional medical image obtaining unit for obtaining a pre-operative three-dimensional medical image of the medical site of the patient to whom the motion monitoring tools are fixed, the stereoscopic medical image having an associated image coordinate system; and a surgical navigation workstation for registering and combining the pre-operative three-dimensional medical image with the intraoperative image of the surgical tool and visually displaying them on a connected display device to guide the operator's surgical procedure; wherein the surgical navigation workstation further monitors the motion state of the human organs by: identifying, from the pre-operative three-dimensional medical image, the first position and attitude of each motion monitoring tool in the image coordinate system; determining in real time during surgery the second position and attitude of each motion monitoring tool in the positioning coordinate system; calculating in real time, based on the first position and attitude of each motion monitoring tool in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculating the overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system; and evaluating, based on the overall coordinate transformation error determined in real time at each moment, the degree of motion of the human organs at each such moment relative to the pre-operative scan time.

According to a further aspect of the present invention, there is provided a computer-readable medium on which a computer program is recorded, the computer program being used in conjunction with a surgical navigation system and, when executed by a processing device, performing the following operations: obtaining the first position and attitude, in an image coordinate system, of each motion monitoring tool identified from a pre-operative three-dimensional medical image, the pre-operative three-dimensional medical image having been obtained by scanning, before surgery, the medical site of a patient to whom two or more motion monitoring tools are fixed, the three-dimensional medical image having an associated image coordinate system; with the same motion monitoring tools fixed on the patient's body in the same positions and attitudes as during the pre-operative scan: determining in real time the second position and attitude of each motion monitoring tool in a positioning coordinate system, the positioning coordinate system being the coordinate system referenced when locating the position and attitude of the surgical tool; calculating in real time, based on the first position and attitude of each motion monitoring tool in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculating, based on this optimal transformation relationship, the overall error of the coordinate transformation of each motion monitoring tool from the positioning coordinate system to the image coordinate system; and evaluating, based on the overall coordinate transformation error of each motion monitoring tool determined in real time at each moment, the degree of motion of the human organs at each such moment relative to the pre-operative scan time.
With the surgical navigation system and human organ motion monitoring method of the embodiments of the present invention, the problem in thoraco-abdominal surgical navigation that respiratory motion displaces the tumor and thereby degrades navigation accuracy can be solved or alleviated; the surgical navigation workflow can also be simplified, facilitating the use and spread of surgical navigation in many fields.

Brief Description of the Drawings

These and/or other aspects and advantages of the present invention will become clearer and easier to understand from the following detailed description of embodiments of the present invention in conjunction with the accompanying drawings, in which:

Fig. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present invention. Fig. 2 shows an exemplary structural schematic of a respiratory monitoring tool 120 according to an embodiment of the present invention. Fig. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present invention.

Fig. 4 shows a schematic diagram of a portion of the curve of the overall coordinate transformation error over time during respiratory motion.

Detailed Description
To enable those skilled in the art to better understand the present invention, it is described below in further detail with reference to the drawings and specific embodiments.

Before the embodiments are described in detail, the main idea of the present invention is set out below to help those skilled in the art understand and grasp it.

The pre-operative CT image is acquired at a certain moment or phase of the patient's respiratory cycle. During the CT scan it is usually inconvenient to track the position and attitude of the reference markers in the positioning coordinate system, and the CT scan and the actual operation may be days apart.

During the subsequent operation the patient remains in the periodic motion of respiration. In theory, only when the patient's respiratory state at some moment coincides with or is close to that at the pre-operative CT scan moment or phase do the positions and attitudes of the reference markers in the positioning coordinate system correspond to those previously identified from the CT image in the image coordinate system; only then is the registration matrix obtained at that moment correct, only then can the position of the surgical tool be accurately transformed into the CT space at that moment, and only then is the visual display of the fused three-dimensional model provided to the clinician accurate.

The inventors therefore conceived of monitoring the position and attitude (orientation or azimuth) of each monitoring tool (bearing reference markers) in the positioning coordinate system, optimizing a single unified coordinate transformation matrix from the positioning coordinate system to the three-dimensional image coordinate system, and determining the total of the errors of transforming each monitoring tool from the positioning coordinate system to the three-dimensional image coordinate system (sometimes simply called the overall error below). A moment of small error indicates that the position and attitude of the human organs at that moment are close to those at the CT scan time; this information is provided to the clinician in visual or audible form to assist the doctor in performing the operation.

It should be noted that in the following description the motion of the lungs and related tissue caused by respiration is used as the example of human organ motion; the present invention is not limited thereto, however, but is applicable to monitoring other processes involving dynamic internal motion of the human body, such as the heartbeat. The embodiments described herein are preferably used to assist in locating the lungs or tissue close to the lungs, but the tissue may also be located elsewhere, for example the heart, digestive organs, blood vessels or kidneys.
In the following description, position and attitude (orientation) can be represented by six values, namely position coordinates representing the three-dimensional position and three angle values representing the orientation; in some conventional art this is also called a 6D pose. Depending on requirements, however, the position and attitude may be described by fewer values.
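For illustration only, such a 6D pose (three position coordinates plus three orientation angles) can be expanded into a 4x4 homogeneous matrix. The Z-Y-X Euler-angle convention used below is an assumption of this sketch; actual tracking systems document their own angle conventions.

```python
import numpy as np

def pose6d_to_matrix(x, y, z, rx, ry, rz):
    """Expand a 6D pose (position plus Z-Y-X Euler angles, in radians)
    into a 4x4 homogeneous transformation matrix."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx      # Z-Y-X (yaw-pitch-roll) convention
    T[:3, 3] = [x, y, z]
    return T
```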
Herein, "registration" means establishing a mapping relationship or a coordinate transformation relationship between two coordinate systems; for example, "registration between the positioning coordinate system and the image coordinate system" means establishing a mapping between the two, or finding the coordinate transformation matrix between them, the positioning coordinate system here being the coordinate system referenced by the positioning device when locating the surgical tool and the motion monitoring tools.
Fig. 1 shows a configuration block diagram of an exemplary surgical navigation system 100 according to an embodiment of the present invention. As shown in Fig. 1, the surgical navigation system 100 comprises a positioning device 110, two or more motion monitoring tools 120, a three-dimensional medical image obtaining unit 130 and a surgical navigation workstation 140. To facilitate describing the operation of the surgical navigation system 100, Fig. 1 also shows a console 210 and a medical tool 220 that cooperate with the surgical navigation system 100.

The positioning device 110 is configured to track the position and attitude of the motion monitoring tools in the positioning coordinate system. As shown, a positioning sensor 112, for example a magnetic induction positioning sensor, can be placed on each motion monitoring tool. By means of this magnetic induction positioning arrangement, the positioning device 110 can track the position and attitude of each motion monitoring tool in the positioning coordinate system.

The positioning device 110 is also configured to track the position and attitude of the surgical tool in the positioning coordinate system. The surgical tool is, for example, a puncture needle on which a positioning sensor 112 is mounted, whereby the positioning device 110 can track the position and attitude of the puncture needle by means of that sensor. In one example, one positioning device can simultaneously track the signals of eight positioning sensors.

The positioning device 110 may employ one of electromagnetic (EM) tracking, optical tracking and fiber Bragg grating tracking. Examples of commercially available electromagnetic positioning systems include the Aurora system of NDI (Canada), DriveBay of Ascension (USA), the PercuNav system of Philips (Netherlands), the InstaTrak 3500 Plus system of GE (USA), the StealthStation AxiEM system of Medtronic (USA) and the Cygnus FPS system of Compass (USA).

The two or more motion monitoring tools 120 are configured to be fixed on the patient's body in the same positions and attitudes before surgery (specifically, before the CT scan) and during surgery but before the surgical tool intervenes. The monitored motion is described below taking respiratory motion as an example; for convenience, the motion monitoring tool 120 may be called a respiratory monitoring tool below. The positions where the respiratory monitoring tools are worn on the patient's body surface are preferably chosen where the influence of respiration is large, so that the relative positions of the several respiratory monitoring tools may change to some extent with the respiratory motion.
In one example, each motion monitoring tool has at least four marker points that can be tracked by the positioning device, no three of which are collinear; the first position and attitude of each motion monitoring tool in the image coordinate system is identified by identifying its marker points in the three-dimensional medical image, and during surgery the positioning device determines the second position and attitude of each motion monitoring tool in the positioning coordinate system by tracking its marker points. At least four marker points, no three collinear, are provided on one motion monitoring tool so that the coordinate transformation matrix from the positioning coordinate system to the image coordinate system can be solved from the positions and attitudes of these points in the image coordinate system and the positioning coordinate system. In one example, the marker points are small metal balls whose image parameters in CT differ markedly from those of human tissue, so that they can easily be identified manually or automatically by image processing. For each such respiratory monitoring tool, a unique coordinate transformation matrix can be determined from its first position and attitude in the image coordinate system and its second position and attitude in the positioning coordinate system, and this matrix can make the conversion error zero. But because there are several respiratory monitoring tools and a single coordinate transformation matrix must convert between the first position and attitude in the image coordinate system and the second position and attitude in the positioning coordinate system for every one of them, a conversion error arises. As mentioned above, the main idea of the present invention is to monitor the difference between the pose of the respiratory monitoring tools at each moment and their pose at the pre-operative CT scan time by computing, at each moment, the optimal unified transformation matrix and the corresponding overall conversion error, thereby monitoring the difference between the state of the human organs at each moment and their state at the pre-operative CT scan time and achieving the purpose of monitoring the motion of the human organs.

It should be noted that each respiratory monitoring tool need not have four or more marker points; it suffices that the transformation matrix between the two coordinate systems and the overall conversion error can be optimized from the positions, in the positioning coordinate system and the image coordinate system, of all marker points of all respiratory monitoring tools. For example, there may be two respiratory monitoring tools with three marker points each, so that the optimal coordinate transformation matrix and the corresponding conversion error can likewise be solved from the positions of six marker points in total in the two coordinate systems. As another example, there may be four respiratory monitoring tools with two marker points each, so that the same solution can be obtained from eight marker points in total. In a more extreme example, there may be five respiratory monitoring tools with one marker point each, so that the optimal coordinate transformation matrix and the corresponding conversion error can still be solved from five marker points in total.
Fig. 2 shows an exemplary structural schematic of a respiratory monitoring tool 120 according to an embodiment of the present invention. The respiratory monitoring tool 120 can be composed of a plastic frame and identifiable objects. On the plastic frame, four image-recognizable markers (such as small metal pellets, for example of lead) are placed at the positions indicated by reference numeral 121. The image parameters of these markers in CT or MRI images (mainly the CT value and the MRI value) differ distinctly from those of the surrounding materials, so that, for example, good contrast can be obtained in the CT scanner and the markers can easily be extracted by known methods. As another example of a marker, an electrode patch of the kind used in electrocardiography can be used. Subsequent post-processing of the CT or MRI images can automatically extract the identifiable markers from the images and further calculate the position and attitude of the tool (hereinafter simply called the pose for convenience), including the coordinates of the tool center and its orientation.

A magnetic induction positioning device can be placed on the front surface of the respiratory monitoring tool 120 illustrated in Fig. 2; the circular hole 122 in the middle and the concave structure on the left mainly facilitate detaching the magnetic induction positioning device.

The three extended straight rods mainly facilitate attachment to the surface of the human body; markers recognizable by CT or MR can be placed on the rods, and the pose information can be calculated after recognition.

Through the circular holes 123 on the small crossbars at the ends of the straight rods, position information can be recorded on the body surface with a marker pen, so that after the respiratory monitoring tool 120 is detached from the patient's body surface, it can be accurately re-attached to the body surface according to the recorded pose information.

The structure and shape of the respiratory monitoring tool shown in Fig. 2 are merely examples; the configuration can be changed as needed.

As an example, the use of the respiratory monitoring tools can be divided into a CT scan phase and an intraoperative phase. Specifically, before the CT scan, two or more respiratory monitoring tools are fixed at the patient's chest, the worn poses are recorded on the body surface, for example with a marker pen, and the patient is moved into the CT room for a CT scan, yielding a CT scan data set from which the positions and attitudes of the respiratory monitoring tools can be identified. Incidentally, a three-dimensional model of the patient's relevant medical site can be reconstructed from such a CT scan data set by image processing, visualized on the display device, and displayed fused with the image of the tracked medical tool to assist the doctor during the operation. During the operation and before the surgical tool enters the human body, the respiratory monitoring tools are worn according to the pose information recorded on the patient's body surface as described above, the pose of each respiratory monitoring tool in the positioning coordinate system is monitored in real time, the coordinate transformation matrix for all respiratory monitoring tools and the corresponding total error are obtained by optimization, and the motion of the human organs is monitored based on the change of the total error over time.
The three-dimensional medical image obtaining unit 130 may be part of the surgical navigation workstation 140 or a separate device. The three-dimensional medical image obtaining unit 130 obtains the pre-operative three-dimensional medical image of the medical site of the patient to whom the motion monitoring tools are fixed, the stereoscopic medical image having an associated image coordinate system. For example, the three-dimensional data set obtained by the pre-operative CT scan is stored in a database; the three-dimensional medical image obtaining unit 130 is operatively coupled to the database, reads a specific three-dimensional data set from it, and visually displays it on the display device through image processing such as rendering. The three-dimensional medical image obtaining unit 130 may simply read and display an already reconstructed three-dimensional model, or may read the pre-reconstruction CT image data set and perform reconstruction processing to obtain and display the three-dimensional model. For methods of reconstructing a three-dimensional model from a CT or MRI scan data set, as well as for the coordinate transformation and error calculation, reference may be made, for example, to chapters 3 and 4 of the author Zhai Weiming's doctoral thesis entitled "Research on key techniques of image-guided computer-assisted interventional surgical navigation", which content is hereby incorporated by reference.

The console 210 may be connected to the surgical navigation workstation 140 or be part of the surgical navigation workstation 140. The console 210 can receive, from the positioning device 110, the pose information of the respiratory monitoring tools and of the surgical tool. The console 210 can also issue instructions to operate the puncture needle according to the clinician's directions.
The surgical navigation workstation 140 is configured to register and combine the pre-operative three-dimensional medical image with the intraoperative image of the surgical tool and to display them visually on the connected display device to guide the operator's surgical procedure.

The surgical navigation workstation 140 may include a processor 141 and a memory 142 in which an application 1421 and an operating system 1422 may be stored. The application 1421 may include program code which, when executed by the processor 141, performs the human organ motion monitoring method described in detail below. The term "processor" here is a broad concept and can be implemented by a single dedicated processor, a single shared processor, or multiple individual processors, one of which may be shared; moreover, the use of the words "processor" or "controller" should not be interpreted as referring exclusively to hardware capable of executing software, but can implicitly include, without limitation, digital signal processor ("DSP") hardware, read-only memory ("ROM") for storing software, random access memory ("RAM") and non-volatile storage. The surgical navigation workstation 140 may also include an interface for interacting with the display 150 in real time.

The display 150 can, under the control of the surgical navigation workstation 140, display the orientation, shape and/or position of the medical target tissue, and can also display the fusion of the three-dimensional model of the patient's relevant site with the image of the surgical tool. Peripherals such as a mouse and a touch-screen system may also be connected. The surgical navigation workstation 140 may also include an interface for interacting with the three-dimensional medical image obtaining unit 130, an interface for interacting with the console 210, and the like.
For ease of understanding, the clinical workflow is outlined below. Generally, a clinical workflow may include preoperative preparation, preoperative image scanning, preoperative surgical planning, intraoperative surgical navigation, postoperative surgical evaluation and other processing. Preoperative preparation includes, for example, preparing the navigation operating room equipment, fixing the patient with a vacuum pad, and fixing the motion monitoring tools (or marker points, etc.) on the patient's body surface. Preoperative image scanning includes, for example, a pre-operative CT or MRI scan of the patient. Preoperative surgical planning may include three-dimensional visualization of the medical images, three-dimensional reconstruction of organs and lesions, interactive surgical path planning and simulation of surgical effects, where the surgical path plan includes, for example, selection of the needle insertion point and insertion angle, and the surgical effect simulation may include thermal field simulation, damage field model reconstruction and the like. Intraoperative surgical navigation includes completing the puncture according to the path in the surgical plan. Postoperative surgical evaluation may include re-scanning CT or MRI images at a scheduled time after surgery, measuring the size of the actual damaged region on the CT data and comparing it with the data calculated in the surgical plan, to assess the accuracy and effect of the surgery.

The human organ motion monitoring method according to an embodiment of the present invention can be used, for example, in the intraoperative surgical navigation process.

An exemplary surgical navigation process carried out with the navigation system is described below with reference to Fig. 3.
Fig. 3 shows an exemplary process of a surgical navigation method 200 according to an embodiment of the present invention.

In step S210, two or more motion monitoring tools are fixed on the patient's body before surgery.

In one example, 2-4 respiratory monitoring tools are placed on the surface of the patient's body, preferably at positions strongly affected by respiration, so that the relative positions of the several tools change to some extent with the respiratory motion; tracking of the poses of the respiratory monitoring tools is then more representative of the characteristics of the respiratory motion and better describes the target medical site (lesion) that moves with it.

The above numbers of respiratory monitoring tools are only examples; a larger number, for example five, six or even more, can be chosen as needed.
In step S220, a scan is performed before surgery to obtain a three-dimensional medical image of the medical site of the patient to whom the motion monitoring tools are fixed, the three-dimensional medical image having an associated image coordinate system. The pre-operative scan is, for example, a CT scan or an MRI scan. The three-dimensional medical image is reconstructed from the CT scan image data set, for example by the three-dimensional medical image obtaining unit 130 shown in Fig. 1.

After the pre-operative scan is performed and the three-dimensional medical image of the medical site of the patient with the motion monitoring tools fixed is obtained, such a stereoscopic medical image can be visually displayed on the display device, and the clinician can plan the surgical path with reference to it, for example with the help of the navigation software system.

In step S230, the first position and attitude of each motion monitoring tool in the image coordinate system is identified from the pre-operative three-dimensional medical image, for example by the surgical navigation workstation 140. As described above, marker points can be arranged on each motion monitoring tool whose image parameters in the CT image differ markedly from those of human tissue, so that each marker point, and thus the position and attitude of each motion monitoring tool in the image coordinate system, can be identified manually or automatically from the CT image.

It should be noted that steps S220 and S230 are connected by a dotted arrow, indicating that a long time, for example several hours or days, may elapse between them.
In step S240, during surgery, with the same motion monitoring tools fixed on the patient's body in the same positions and attitudes as during the pre-operative scan, the second position and attitude of each motion monitoring tool in the positioning coordinate system is determined in real time, the positioning coordinate system being the coordinate system associated with the positioning device's tracking of the position and attitude of the surgical tool. For example, the positioning device shown in Fig. 1 and the positioning sensors on the motion monitoring tools are used to obtain the second position and attitude of each motion monitoring tool in the positioning coordinate system, which is passed to the surgical navigation workstation 140.

In step S250, the surgical navigation workstation calculates in real time, based on the first position and attitude of each motion monitoring tool in the image coordinate system and its second position and attitude in the positioning coordinate system, the optimal coordinate transformation relationship between the positioning coordinate system and the image coordinate system, and calculates, based on this optimal transformation relationship, the overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system.

For an introduction to the coordinate transformation and error calculation, reference may be made, for example, to section 6.4 of the author Zhai Weiming's doctoral thesis entitled "Research on key techniques of image-guided computer-assisted interventional surgical navigation", which content is hereby incorporated by reference.

In step S260, the degree of motion of the human organs at each moment relative to the pre-operative scan time is evaluated based on the overall coordinate transformation error of the motion monitoring tools determined in real time at each moment.
As mentioned above, in theory, among the phases of the respiratory cycle of the lungs, if the motion state at some moment is closest to that at the CT scan time, then the mapping of the respiratory monitoring tools into the image coordinate system at that moment should be closest to the poses of the respiratory monitoring tools in the image coordinate system obtained from the pre-operative CT scan, so the coordinate conversion error corresponding to the transformation matrix optimized at that moment during surgery should be minimal.

Fig. 4 shows a schematic diagram of a portion of the curve of the overall coordinate transformation error over time during respiratory motion, where the horizontal axis represents time and the vertical axis represents the overall coordinate conversion error. As shown in Fig. 4, at moments t1, t2 and t3 the overall error is small and the respiratory state at those moments is close to that at the pre-operative scan time, while at other moments the respiratory state deviates from it to varying degrees and the overall error changes accordingly.

By examining how the overall coordinate conversion error changes over time during surgery, the degree of difference between the state of the human organs at each moment and their state at the pre-operative scan time can be evaluated.
The human organ motion monitoring results of embodiments of the present invention can be used in various forms.

In one example, the curve of the overall coordinate conversion error over time, obtained in real time, can be visually displayed on the display device, for example in the form of Fig. 4. By observing this curve, the clinician can determine, for example, the timing at which the puncture needle should enter the patient.

In one example, the surgical navigation workstation of Fig. 1 determines the timing for the surgical tool to intervene in the human lesion based on the evaluated degree of motion of the human organs at each moment relative to the pre-operative three-dimensional scan time, and prompts the operator performing the surgery with that timing. In one example, when the overall coordinate conversion error at a certain moment is 50% of the average overall error over a certain period and the trend is decreasing, the moment is determined to be suitable for the surgical tool to intervene in the human lesion; the clinician is then alerted, for example by a short message displayed on the display device or by speech output from a loudspeaker.

In one example, because the positioning coordinate system in which the poses of the respiratory monitoring tools are described is the same coordinate system referenced when locating the position and attitude of the surgical tool during surgery, the moment at which the pose of the human organs substantially coincides with their pose at the CT scan time can be determined from the evaluated degree of motion (or difference) of the human organs at each moment relative to the pre-operative CT scan time, and the coordinate transformation matrix from the positioning coordinate system to the image coordinate system calculated at that moment can be used to transform the position of the surgical tool tracked in real time into the image coordinate system, fuse it with the stereoscopic medical image and visually display it on the display device. Specifically, the human organ motion monitoring method may further include: determining a moment at which the overall error of the coordinate transformation is less than a predetermined threshold; determining the coordinate transformation matrix from the positioning coordinate system to the image coordinate system at that moment; and determining, based on this matrix, the position of the surgical tool in the image coordinate system and displaying the image of the surgical tool at that position on the display device in combination with the stereoscopic medical image. That is, at a moment when the overall error of the coordinate transformation is below the predetermined threshold, the registration from the positioning coordinate system in which the surgical tool is located to the image coordinate system is performed, and, based on the position of the surgical tool tracked, for example, by the positioning device 110, the image of the surgical tool is fused with the three-dimensional model of the human body and visually displayed on the display device to assist the clinician in the operation.
It should be noted that this respiration monitoring process may be carried out continuously during surgery.
In addition, regularities obtained from respiration monitoring, such as the time interval between one moment of minimum overall error and the next (as shown in Fig. 4), can also be provided to the clinician, so that the clinician can prepare mentally and operationally before the next needle insertion.
With the surgical navigation system and organ motion monitoring method described above, the degree of motion (or degree of difference) of an organ's state relative to its state at the moment of the CT scan can be monitored without resorting to a dedicated organ monitoring instrument; for respiratory motion, for example, no dedicated respiratory gating device is needed. The present invention does not, however, preclude cooperative use with dedicated organ monitoring instruments.
With the surgical navigation system and organ motion monitoring method of embodiments of the present invention, registration between the positioning coordinate system and the image coordinate system is not performed by simply treating the human body as a rigid body; instead, internal body motion is taken into account to monitor the degree of difference between the organ's state and its state at the moment of the CT scan. The system can thus prompt the physician with the proper timing of operations, enabling the preoperative surgical plan to be executed more accurately and reducing dependence on the individual experience of the clinician.
Furthermore, embodiments of the present invention may take the form of a computer program product, accessible via a computer-readable medium for use by, or in conjunction with, a computer or any instruction execution system. A computer-readable medium may be any device that can contain, store, communicate, propagate or transmit the program, and may be an electronic, magnetic, optical, electromagnetic, infrared or semiconductor system (or apparatus or device), or a propagation medium. Examples of computer-readable media include semiconductor or solid-state memory, magnetic tape, removable computer diskettes, random access memory (RAM), read-only memory (ROM), and magnetic or optical disks.
The embodiments of the present invention have been described above; the foregoing description is illustrative rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, their practical application or improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims

1. An organ motion monitoring method for monitoring the motion of a human organ in real time during surgery, wherein a three-dimensional stereoscopic medical image of the medical site of a patient, to whom two or more motion monitoring tools are fixed, has been obtained by scanning before surgery, the three-dimensional medical image having an associated image coordinate system, the organ motion monitoring method comprising the steps of:
obtaining the first position and pose, in the image coordinate system, of each motion monitoring tool as identified from the preoperative three-dimensional medical image;
with the same motion monitoring tools fixed to the patient's body in the same positions and poses as during the preoperative scan:
determining in real time the second position and pose of each motion monitoring tool in a positioning coordinate system, the positioning coordinate system being the coordinate system referenced in locating the position and pose of a surgical tool;
based on the first position and pose of each motion monitoring tool in the image coordinate system and its second position and pose in the positioning coordinate system, computing in real time the optimal coordinate transformation from the positioning coordinate system to the image coordinate system, and, based on the optimal coordinate transformation, computing an overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system; and
based on the overall coordinate-transformation error of the motion monitoring tools determined in real time at each moment, evaluating the degree of motion of the human organ at that moment relative to the moment of the preoperative scan.
2. The organ motion monitoring method according to claim 1, wherein the motion of the human organ comprises the phases of the respiratory cycle of the lungs.
3. The organ motion monitoring method according to claim 1, further comprising determining, based on the evaluated degree of motion of the human organ at each moment relative to the moment of the preoperative three-dimensional scan, the timing for the surgical tool to intervene at a lesion, and prompting the operator performing the surgery with that timing.
4. The organ motion monitoring method according to claim 1, further comprising:
determining a moment at which the overall error of the coordinate transformation is smaller than a predetermined threshold;
determining the coordinate transformation matrix from the positioning coordinate system to the image coordinate system at that moment; and
based on the coordinate transformation matrix, determining the position of the surgical tool in the image coordinate system, and displaying an image of the surgical tool at that position in combination with the stereoscopic medical image on a display device.
5. The organ motion monitoring method according to claim 1, wherein the surgical tool is a puncture needle.
6. The organ motion monitoring method according to claim 1, wherein each motion monitoring tool has at least four marker points trackable by a positioning device, no three of which are collinear, wherein the first position and pose of each motion monitoring tool in the image coordinate system are identified by identifying the marker points of each motion monitoring tool in the three-dimensional medical image, and during surgery the positioning device determines the second position and pose of each motion monitoring tool in the positioning coordinate system by tracking the marker points of each motion monitoring tool.
7. The organ motion monitoring method according to claim 1, wherein the motion monitoring tools are fixed to parts of the patient's body that are strongly affected by the motion of the human organ.
8. The organ motion monitoring method according to claim 1, wherein the positioning device is an electromagnetic tracker.
9. A surgical navigation system, comprising:
a positioning device for tracking the positions and poses of a surgical tool and of motion monitoring tools in a positioning coordinate system;
two or more motion monitoring tools fixed to the patient's body, whose positions and poses in the positioning coordinate system can be tracked by the positioning device;
a three-dimensional medical image obtaining unit for obtaining a preoperative three-dimensional stereoscopic medical image of the medical site of the patient to whom the motion monitoring tools are fixed, the stereoscopic medical image having an associated image coordinate system; and
a surgical navigation workstation, which registers and combines the preoperative three-dimensional medical image with an intraoperative image of the surgical tool and displays them visually on a connected display device to guide the operator performing the surgery;
wherein the surgical navigation workstation further monitors the motion state of the human organ through the following operations:
identifying from the preoperative three-dimensional medical image the first position and pose of each motion monitoring tool in the image coordinate system;
determining in real time during surgery the second position and pose of each motion monitoring tool in the positioning coordinate system;
based on the first position and pose of each motion monitoring tool in the image coordinate system and its second position and pose in the positioning coordinate system, computing in real time the optimal coordinate transformation from the positioning coordinate system to the image coordinate system, and computing an overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system; and
based on the overall coordinate-transformation error determined in real time at each moment, evaluating the degree of motion of the human organ at that moment relative to the moment of the preoperative scan.
10. The surgical navigation system according to claim 9, wherein the motion of the human organ comprises the phases of the respiratory cycle of the lungs.
11. The surgical navigation system according to claim 9, wherein the surgical navigation workstation determines, based on the evaluated degree of motion of the human organ at each moment relative to the moment of the preoperative scan, the timing for the surgical tool to intervene at a lesion, and prompts the operator performing the surgery with that timing.
12. The surgical navigation system according to claim 9, wherein the surgical navigation workstation is further configured to:
determine a moment at which the overall error of the coordinate transformation is smaller than a predetermined threshold;
determine the coordinate transformation matrix from the positioning coordinate system to the image coordinate system at that moment; and
based on the coordinate transformation matrix, determine in real time the position of the surgical tool in the image coordinate system, and display an image of the surgical tool at that position in combination with the three-dimensional medical image on the display device.
13. The surgical navigation system according to claim 9, wherein the surgical tool is a puncture needle.
14. The surgical navigation system according to claim 9, wherein each motion monitoring tool has at least four marker points trackable by the positioning device, no three of which are collinear, wherein the first position and pose of each motion monitoring tool in the image coordinate system are identified by identifying the marker points of each motion monitoring tool in the three-dimensional medical image, and during surgery the positioning device determines the second position and pose of each motion monitoring tool in the positioning coordinate system by tracking the marker points of each motion monitoring tool.
15. The surgical navigation system according to claim 9, wherein the motion monitoring tools are fixed to parts of the patient's body that are strongly affected by the motion of the human organ.
16. A computer-readable medium having recorded thereon a computer program for use in conjunction with a surgical navigation system, the computer program, when executed by a processing device, performing the following operations:
obtaining the first position and pose, in an image coordinate system, of each motion monitoring tool as identified from a preoperative three-dimensional stereoscopic medical image, the preoperative three-dimensional medical image being obtained by scanning, before surgery, the medical site of a patient to whom two or more motion monitoring tools are fixed, the three-dimensional medical image having the associated image coordinate system;
with the same motion monitoring tools fixed to the patient's body in the same positions and poses as during the preoperative scan:
determining in real time the second position and pose of each motion monitoring tool in a positioning coordinate system, the positioning coordinate system being the coordinate system referenced in locating the position and pose of a surgical tool;
based on the first position and pose of each motion monitoring tool in the image coordinate system and its second position and pose in the positioning coordinate system, computing in real time the optimal coordinate transformation from the positioning coordinate system to the image coordinate system, and, based on the optimal coordinate transformation, computing an overall error of the coordinate transformation of the motion monitoring tools from the positioning coordinate system to the image coordinate system; and
based on the overall coordinate-transformation error of the motion monitoring tools determined in real time at each moment, evaluating the degree of motion of the human organ at that moment relative to the moment of the preoperative scan.
17. The computer-readable medium according to claim 16, wherein the computer program, when executed by a computing device, further performs the following operations:
determining a moment at which the overall error of the coordinate transformation is smaller than a predetermined threshold;
determining the coordinate transformation matrix from the positioning coordinate system to the image coordinate system at that moment; and
based on the coordinate transformation matrix, determining the position and pose of the surgical tool in the image coordinate system, and displaying an image of the surgical tool, based on that position and pose, in combination with the stereoscopic medical image on a display device.
PCT/CN2014/080237 2014-06-11 2014-06-18 Human organ movement monitoring method, surgical navigation system and computer readable medium WO2015188393A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/106,746 US10258413B2 (en) 2014-06-11 2014-06-18 Human organ movement monitoring method, surgical navigation system and computer readable medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410259145.3 2014-06-11
CN201410259145.3A CN104055520B (zh) 2014-06-11 2014-06-11 Human organ movement monitoring method and surgical navigation system

Publications (1)

Publication Number Publication Date
WO2015188393A1 true WO2015188393A1 (zh) 2015-12-17

Family

ID=51543654

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2014/080237 WO2015188393A1 (zh) 2014-06-11 2014-06-18 人体器官运动监测方法、手术导航系统和计算机可读介质

Country Status (3)

Country Link
US (1) US10258413B2 (zh)
CN (1) CN104055520B (zh)
WO (1) WO2015188393A1 (zh)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113499138A (zh) * 2021-07-07 2021-10-15 南开大学 一种外科手术的主动导航系统及其控制方法
CN114159161A (zh) * 2020-09-10 2022-03-11 杭州三坛医疗科技有限公司 手术导航系统
CN114176777A (zh) * 2021-12-20 2022-03-15 北京诺亦腾科技有限公司 手术辅助导航系统的精度检测方法、装置、设备及介质
CN114343845A (zh) * 2022-01-11 2022-04-15 上海睿触科技有限公司 一种用于辅助穿刺系统的病灶位置动态跟踪方法
CN116942317A (zh) * 2023-09-21 2023-10-27 中南大学 一种手术导航定位系统

Families Citing this family (85)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8672837B2 (en) 2010-06-24 2014-03-18 Hansen Medical, Inc. Methods and devices for controlling a shapeable medical device
US9057600B2 (en) 2013-03-13 2015-06-16 Hansen Medical, Inc. Reducing incremental measurement sensor error
US9014851B2 (en) 2013-03-15 2015-04-21 Hansen Medical, Inc. Systems and methods for tracking robotically controlled medical instruments
US9629595B2 (en) 2013-03-15 2017-04-25 Hansen Medical, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US9271663B2 (en) 2013-03-15 2016-03-01 Hansen Medical, Inc. Flexible instrument localization from both remote and elongation sensors
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
CN108778113B (zh) 2015-09-18 2022-04-15 奥瑞斯健康公司 管状网络的导航
US10143526B2 (en) 2015-11-30 2018-12-04 Auris Health, Inc. Robot-assisted driving systems and methods
CN105395252A (zh) * 2015-12-10 2016-03-16 哈尔滨工业大学 具有人机交互的可穿戴式血管介入手术三维立体图像导航装置
CN105616003B (zh) * 2015-12-24 2017-11-21 电子科技大学 一种基于径向样条插值的软组织三维视觉跟踪方法
CN106109016A (zh) * 2016-08-17 2016-11-16 北京柏惠维康医疗机器人科技有限公司 腹腔微创手术系统及其中下针时刻的确定方法
CN106073898B (zh) * 2016-08-17 2019-06-14 北京柏惠维康医疗机器人科技有限公司 腹腔介入手术系统
CN106236258B (zh) * 2016-08-17 2019-03-12 北京柏惠维康科技有限公司 腹腔微创手术穿刺路径的规划方法及装置
CN106420056B (zh) * 2016-11-03 2023-11-03 中国人民解放军总医院 器械以及器械的定位与引导装置及其方法
US10244926B2 (en) 2016-12-28 2019-04-02 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
CN107028659B (zh) * 2017-01-23 2023-11-28 新博医疗技术有限公司 一种ct图像引导下的手术导航系统及导航方法
CN106859742B (zh) * 2017-03-21 2023-11-10 北京阳光易帮医疗科技有限公司 一种穿刺手术导航定位系统和方法
CN108990412B (zh) 2017-03-31 2022-03-22 奥瑞斯健康公司 补偿生理噪声的用于腔网络导航的机器人系统
US10022192B1 (en) 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
CN107578443B (zh) * 2017-07-26 2020-11-06 北京理工大学 针刀显示方法及装置
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
CN108175502B (zh) * 2017-11-29 2021-08-17 苏州朗开医疗技术有限公司 一种支气管镜电磁导航系统
US11471217B2 (en) * 2017-12-11 2022-10-18 Covidien Lp Systems, methods, and computer-readable media for improved predictive modeling and navigation
US11510736B2 (en) 2017-12-14 2022-11-29 Auris Health, Inc. System and method for estimating instrument location
US10413363B2 (en) * 2017-12-15 2019-09-17 Medtronic, Inc. Augmented reality solution to optimize the directional approach and therapy delivery of interventional cardiology tools
EP3684283A4 (en) 2017-12-18 2021-07-14 Auris Health, Inc. METHODS AND SYSTEMS FOR MONITORING AND NAVIGATION OF INSTRUMENTS IN LUMINAL NETWORKS
CN111770716B (zh) * 2018-02-21 2023-12-01 奥林巴斯株式会社 医疗系统和医疗系统的控制方法
JP7225259B2 (ja) 2018-03-28 2023-02-20 オーリス ヘルス インコーポレイテッド 器具の推定位置を示すためのシステム及び方法
JP7214747B2 (ja) 2018-03-28 2023-01-30 オーリス ヘルス インコーポレイテッド 位置センサの位置合わせのためのシステム及び方法
CN108682048B (zh) * 2018-05-15 2022-05-17 杭州三坛医疗科技有限公司 导向通道的姿态显示方法、装置和系统、可读存储介质
CN108937987B (zh) * 2018-05-22 2021-07-02 上海联影医疗科技股份有限公司 一种确定模体中标记物位置的方法和系统
CN110547874B (zh) * 2018-05-30 2022-09-23 上海舍成医疗器械有限公司 制定移动路径的方法及其组件和在自动化设备中的应用
CN114601559B (zh) 2018-05-30 2024-05-14 奥瑞斯健康公司 用于基于定位传感器的分支预测的系统和介质
KR102455671B1 (ko) 2018-05-31 2022-10-20 아우리스 헬스, 인코포레이티드 이미지-기반 기도 분석 및 매핑
EP3801189B1 (en) 2018-05-31 2024-09-11 Auris Health, Inc. Path-based navigation of tubular networks
CN112236083B (zh) 2018-05-31 2024-08-13 奥瑞斯健康公司 用于导航检测生理噪声的管腔网络的机器人系统和方法
DE102018116558A1 (de) * 2018-07-09 2020-01-09 Aesculap Ag Medizintechnisches Instrumentarium und Verfahren
CN109247915B (zh) * 2018-08-30 2022-02-18 北京连心医疗科技有限公司 一种皮肤表面形变的检测标贴及实时检测方法
US11944388B2 (en) 2018-09-28 2024-04-02 Covidien Lp Systems and methods for magnetic interference correction
JP7536752B2 (ja) 2018-09-28 2024-08-20 オーリス ヘルス インコーポレイテッド 内視鏡支援経皮的医療処置のためのシステム及び方法
CN109192034A (zh) * 2018-10-26 2019-01-11 刘林 一种可控的脊髓损伤模型装置及制作和使用方法
EP3657512B1 (en) * 2018-11-23 2022-02-16 Siemens Healthcare GmbH Integrated medical image visualization and exploration
CN109620201B (zh) * 2018-12-07 2023-04-14 南京国科精准医学科技有限公司 柔性多导联帽式脑磁仪及其高精度成像方法
WO2020133430A1 (zh) * 2018-12-29 2020-07-02 深圳迈瑞生物医疗电子股份有限公司 基于生理参数的监护方法、监护设备及计算机存储介质
CN109767469B (zh) * 2018-12-29 2021-01-29 北京诺亦腾科技有限公司 一种安装关系的标定方法、系统及存储介质
CN109464196B (zh) * 2019-01-07 2021-04-20 北京和华瑞博医疗科技有限公司 采用结构光影像配准的手术导航系统及配准信号采集方法
DE102019201227A1 (de) 2019-01-31 2020-08-06 Siemens Healthcare Gmbh Bildgebendes Gerät und Verfahren zum Erzeugen eines bewegungskompensierten Bildes oder Videos, Computerprogrammprodukt und computerlesbares Speichermedium
CN111588464B (zh) * 2019-02-20 2022-03-04 忞惪医疗机器人(苏州)有限公司 一种手术导航方法及系统
EP3712900A1 (en) 2019-03-20 2020-09-23 Stryker European Holdings I, LLC Technique for processing patient-specific image data for computer-assisted surgical navigation
WO2021007803A1 (zh) 2019-07-17 2021-01-21 杭州三坛医疗科技有限公司 骨折复位闭合手术定位导航方法和用于该方法的定位装置
WO2021038495A1 (en) 2019-08-30 2021-03-04 Auris Health, Inc. Instrument image reliability systems and methods
JP2022546421A (ja) 2019-08-30 2022-11-04 オーリス ヘルス インコーポレイテッド 位置センサの重みベースの位置合わせのためのシステム及び方法
CN110613519B (zh) * 2019-09-20 2020-09-15 真健康(北京)医疗科技有限公司 动态配准定位装置及方法
CN110731821B (zh) * 2019-09-30 2021-06-01 艾瑞迈迪医疗科技(北京)有限公司 基于ct/mri用于肿瘤微创消融术的方法及引导支架
CN112489745A (zh) * 2019-11-27 2021-03-12 上海联影智能医疗科技有限公司 用于医疗设施的感测装置及实施方法
WO2021137108A1 (en) 2019-12-31 2021-07-08 Auris Health, Inc. Alignment interfaces for percutaneous access
EP4084720A4 (en) 2019-12-31 2024-01-17 Auris Health, Inc. ALIGNMENT TECHNIQUES FOR PERCUTANE ACCESS
EP4084721A4 (en) 2019-12-31 2024-01-03 Auris Health, Inc. IDENTIFICATION OF AN ANATOMIC FEATURE AND AIMING
CN111265299B (zh) * 2020-02-19 2023-08-18 上海理工大学 基于光纤形状传感的手术导航系统
CN111513850B (zh) * 2020-04-30 2022-05-06 京东方科技集团股份有限公司 引导装置、穿刺针调整方法、存储介质及电子设备
CN111881887A (zh) * 2020-08-21 2020-11-03 董秀园 基于多摄像头的运动姿态监测和指导方法及装置
CN112057165B (zh) * 2020-09-22 2023-12-22 上海联影医疗科技股份有限公司 一种路径规划方法、装置、设备和介质
CN112348878B (zh) * 2020-10-23 2023-03-21 歌尔科技有限公司 定位测试方法、装置及电子设备
CN112515767B (zh) * 2020-11-13 2021-11-16 中国科学院深圳先进技术研究院 手术导航装置、设备及计算机可读存储介质
CN112707340B (zh) * 2020-12-10 2022-07-19 安徽有光图像科技有限公司 基于视觉识别的设备控制信号生成方法、装置及叉车
CN113017852B (zh) * 2021-03-03 2022-11-15 上海联影医疗科技股份有限公司 医学成像扫描过程的交互体验方法、设备和电子装置
CN113133828B (zh) * 2021-04-01 2023-12-01 上海复拓知达医疗科技有限公司 一种用于手术导航的交互配准系统、方法、电子设备和可读存储介质
CN114073581B (zh) * 2021-06-29 2022-07-12 成都科莱弗生命科技有限公司 一种支气管电磁导航系统
CN113764076B (zh) * 2021-07-26 2024-02-20 北京天智航医疗科技股份有限公司 检测医疗透视图像中标记点的方法、装置及电子设备
CN113425411B (zh) * 2021-08-04 2022-05-10 成都科莱弗生命科技有限公司 一种病变定位导航的装置
CN113662663B (zh) * 2021-08-20 2023-04-28 中国人民解放军陆军军医大学第二附属医院 一种ar全息手术导航系统坐标系转换方法、装置及系统
CN113729945B (zh) * 2021-08-24 2022-04-15 真健康(北京)医疗科技有限公司 体表定位装置的配准方法、穿刺引导方法及设备
US20230100698A1 (en) * 2021-09-29 2023-03-30 Cilag Gmbh International Methods for Controlling Cooperative Surgical Instruments
CN113855240B (zh) * 2021-09-30 2023-05-19 上海寻是科技有限公司 一种基于磁导航的医学影像配准系统和方法
CN114451992B (zh) * 2021-10-11 2023-08-15 佗道医疗科技有限公司 一种术后置钉精度评价方法
CN113940756B (zh) * 2021-11-09 2022-06-07 广州柏视医疗科技有限公司 一种基于移动dr影像的手术导航系统
CN114271909B (zh) * 2021-12-13 2024-08-09 杭州堃博生物科技有限公司 胸部穿刺的信息处理方法、装置、系统、设备与介质
CN114681058B (zh) * 2022-03-02 2023-02-28 北京长木谷医疗科技有限公司 用于关节置换术的导航定位系统精度验证方法及装置
CN114587533B (zh) * 2022-03-25 2023-04-28 深圳市华屹医疗科技有限公司 穿刺位置引导方法、装置、设备、存储介质和程序产品
CN114813798B (zh) * 2022-05-18 2023-07-07 中国工程物理研究院化工材料研究所 用于表征材料内部结构及成分的ct检测装置和成像方法
CN115089294B (zh) * 2022-08-24 2023-03-21 北京思创贯宇科技开发有限公司 介入手术导航的方法
CN115414121B (zh) * 2022-11-07 2023-03-24 中南大学 一种基于射频定位芯片的外科手术导航系统
CN117618104B (zh) * 2024-01-25 2024-04-26 广州信筑医疗技术有限公司 一种具有术中监测功能的激光手术系统
CN117838311B (zh) * 2024-03-07 2024-05-31 杭州海沛仪器有限公司 基于光学定位的靶点消融呼吸门控系统

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2748042Y (zh) * 2004-10-22 2005-12-28 上海导向医疗系统有限公司 脏器冷冻介入手术计算机辅助导航系统
CN1806771A (zh) * 2006-01-26 2006-07-26 清华大学深圳研究生院 计算机辅助经皮肾穿刺取石术中的穿刺导航系统及方法
US20070167712A1 (en) * 2005-11-24 2007-07-19 Brainlab Ag Medical tracking system using a gamma camera
CN101327148A (zh) * 2008-07-25 2008-12-24 清华大学 用于被动式光学手术导航的器械识别方法
CN202437059U (zh) * 2011-11-29 2012-09-19 北京集翔多维信息技术有限公司 支气管镜电磁导航系统
CN102949240A (zh) * 2011-08-26 2013-03-06 高欣 一种影像导航肺部介入手术系统
CN103356284A (zh) * 2012-04-01 2013-10-23 中国科学院深圳先进技术研究院 手术导航方法和系统

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040034297A1 (en) * 2002-08-13 2004-02-19 General Electric Company Medical device positioning system and method
CN201353203Y (zh) * 2009-02-09 2009-12-02 李晴航 计算机辅助手术术中定位系统
CN102266250B (zh) * 2011-07-19 2013-11-13 中国科学院深圳先进技术研究院 超声手术导航系统
US9138165B2 (en) * 2012-02-22 2015-09-22 Veran Medical Technologies, Inc. Systems, methods and devices for forming respiratory-gated point cloud for four dimensional soft tissue navigation
CN102525662B (zh) * 2012-02-28 2013-09-04 中国科学院深圳先进技术研究院 组织器官三维可视化手术导航系统

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2748042Y (zh) * 2004-10-22 2005-12-28 上海导向医疗系统有限公司 脏器冷冻介入手术计算机辅助导航系统
US20070167712A1 (en) * 2005-11-24 2007-07-19 Brainlab Ag Medical tracking system using a gamma camera
CN1806771A (zh) * 2006-01-26 2006-07-26 清华大学深圳研究生院 计算机辅助经皮肾穿刺取石术中的穿刺导航系统及方法
CN101327148A (zh) * 2008-07-25 2008-12-24 清华大学 用于被动式光学手术导航的器械识别方法
CN102949240A (zh) * 2011-08-26 2013-03-06 高欣 一种影像导航肺部介入手术系统
CN202437059U (zh) * 2011-11-29 2012-09-19 北京集翔多维信息技术有限公司 支气管镜电磁导航系统
CN103356284A (zh) * 2012-04-01 2013-10-23 中国科学院深圳先进技术研究院 手术导航方法和系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHAI, WEIMING: "Study on Computer-assisted Intervention Surgery Navigation Key Technique under the Guidance of Video", MEDICINE & PUBLIC HEALTH, CHINA DOCTORAL DISSERTATIONS FULL-TEXT DATABASE, vol. E076-1, no. 05, 2012, pages 84 - 85, ISSN: 1674-022X *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114159161A (zh) * 2020-09-10 2022-03-11 杭州三坛医疗科技有限公司 手术导航系统
CN113499138A (zh) * 2021-07-07 2021-10-15 南开大学 一种外科手术的主动导航系统及其控制方法
WO2023280326A1 (zh) * 2021-07-07 2023-01-12 南开大学深圳研究院 一种外科手术的主动导航系统及其控制方法
CN114176777A (zh) * 2021-12-20 2022-03-15 北京诺亦腾科技有限公司 手术辅助导航系统的精度检测方法、装置、设备及介质
CN114176777B (zh) * 2021-12-20 2022-07-01 北京诺亦腾科技有限公司 手术辅助导航系统的精度检测方法、装置、设备及介质
CN114343845A (zh) * 2022-01-11 2022-04-15 上海睿触科技有限公司 一种用于辅助穿刺系统的病灶位置动态跟踪方法
CN114343845B (zh) * 2022-01-11 2023-12-12 上海睿触科技有限公司 一种用于辅助穿刺系统的病灶位置动态跟踪方法
CN116942317A (zh) * 2023-09-21 2023-10-27 中南大学 一种手术导航定位系统
CN116942317B (zh) * 2023-09-21 2023-12-26 中南大学 一种手术导航定位系统

Also Published As

Publication number Publication date
US10258413B2 (en) 2019-04-16
US20170215969A1 (en) 2017-08-03
CN104055520A (zh) 2014-09-24
CN104055520B (zh) 2016-02-24

Similar Documents

Publication Publication Date Title
WO2015188393A1 (zh) Human organ movement monitoring method, surgical navigation system and computer readable medium
US11553968B2 (en) Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US11547377B2 (en) System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US20210137351A1 (en) Apparatus and Method for Airway Registration and Navigation
US20200146588A1 (en) Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
EP3289964B1 (en) Systems for providing proximity awareness to pleural boundaries, vascular structures, and other critical intra-thoracic structures during electromagnetic navigation bronchoscopy
US20070244369A1 (en) Medical Imaging System for Mapping a Structure in a Patient's Body
US11471217B2 (en) Systems, methods, and computer-readable media for improved predictive modeling and navigation
JP2008126075A (ja) Ctレジストレーション及びフィードバックの視覚的検証のためのシステム及び方法
US10524695B2 (en) Registration between coordinate systems for visualizing a tool
EP3184035B1 (en) Ascertaining a position and orientation for visualizing a tool
CN113907883A (zh) 一种耳侧颅底外科3d可视化手术导航系统及方法
EP4091570B1 (en) Probe for improving registration accuracy between a tomographic image and a tracking system
WO2020106664A1 (en) System and method for volumetric display of anatomy with periodic motion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14894441

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15106746

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14894441

Country of ref document: EP

Kind code of ref document: A1