CN114938994A - Lung cancer accurate puncture navigation system and method based on respiratory motion compensation - Google Patents

Lung cancer accurate puncture navigation system and method based on respiratory motion compensation

Info

Publication number
CN114938994A
CN114938994A
Authority
CN
China
Prior art keywords
image
patient
lung
module
motion compensation
Prior art date
Legal status
Pending
Application number
CN202210581395.3A
Other languages
Chinese (zh)
Inventor
李海 (Li Hai)
王腾飞 (Wang Tengfei)
王宏志 (Wang Hongzhi)
江海河 (Jiang Haihe)
Current Assignee
Hefei Institutes of Physical Science of CAS
Original Assignee
Hefei Institutes of Physical Science of CAS
Priority date
Filing date
Publication date
Application filed by Hefei Institutes of Physical Science of CAS filed Critical Hefei Institutes of Physical Science of CAS
Priority to CN202210581395.3A
Publication of CN114938994A

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 - Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 - Tracking techniques
    • A61B 2034/2051 - Electromagnetic tracking systems
    • A61B 2034/2065 - Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to a lung cancer accurate puncture navigation system based on respiratory motion compensation, which comprises: a magnetic field generator for acquiring, in real time, the three-dimensional spatial coordinates of the electromagnetic puncture needle and of the electromagnetic sensors on the patient's skin surface, and sending the data to the spatial registration module and the respiratory motion compensation module; an image segmentation module for finely segmenting the lung structures and blood vessels in the initial image; a spatial registration module; an image registration module, which registers the three-dimensional structural information of the patient's surgical site obtained from the preoperative CT with the CT image acquired during the operation; a respiratory motion compensation module; and a visualization interface module, which displays the position and motion trajectory of the surgical instrument in the patient image in real time. The invention can track the motion of the lung, nodules and tumors in real time while accurately locating the real-time position of the surgical instrument inside the subject during the operation, providing accurate and reliable navigation for lung puncture surgery; the system is simple to operate and widely applicable.

Description

Lung cancer accurate puncture navigation system and method based on respiratory motion compensation
Technical Field
The invention relates to the technical field of medical image processing and computer vision, in particular to a lung cancer accurate puncture navigation system and method based on respiratory motion compensation.
Background
With the development of computing, medical imaging, high-precision measurement and related technologies, image-guided intelligent surgical navigation has gradually emerged. Intelligent surgical navigation uses medical image analysis, three-dimensional localization, computer visualization and related techniques to track and display, in real time, the three-dimensional spatial position and motion of surgical instruments and the relevant body parts, so that instruments inside the body can be monitored in real time and the surgeon can operate accurately. Surgical navigation extends the surgeon's field of view, increases the surgeon's initiative and flexibility during the operation, and is of great significance for optimizing the surgical path, improving surgical accuracy and success rate, and reducing surgical trauma.
Intelligent surgical navigation was first applied mainly in spinal and neurosurgical procedures, because the spine is a rigid structure and the brain is enclosed by the rigid skull; neither deforms much during the operation, so precise localization of the organ and the instrument is comparatively easy. However, for organs strongly affected by breathing or heartbeat, such as the lung and the heart, procedures such as percutaneous lung biopsy are difficult because the surgical target continuously deforms and moves.
Therefore, developing an intelligent surgical navigation system for moving organs, such as a real-time lung motion navigation system, has important clinical value and social significance, and is a research hot spot and development trend in the field of high-end medical devices.
Disclosure of Invention
The invention aims to provide a lung cancer accurate puncture navigation system based on respiratory motion compensation, which can track the motion of the lung, nodules and tumors in a complex surgical scene, guide surgical instruments to reach a specific position quickly and accurately, and assist in successfully completing the puncture operation.
To achieve this aim, the invention adopts the following technical solution. A lung cancer accurate puncture navigation system based on respiratory motion compensation comprises:
the magnetic field generator is used for acquiring three-dimensional space coordinate data of the electromagnetic puncture needle and the electromagnetic sensor on the surface of the skin of the patient in real time and sending the data to the space registration module and the respiratory motion compensation module;
the image segmentation module is used for finely segmenting the lung structure and the blood vessels in the initial image and accurately segmenting and labeling the three-dimensionally reconstructed lung and blood vessels; the initial image is a CT image before the operation of the patient;
the spatial registration module is used for acquiring image coordinates of a plurality of electromagnetic sensors adhered to the skin surface of a patient in an initial image, acquiring spatial coordinates of the electromagnetic sensors through the magnetic field generator, and establishing a relation between the image coordinates and the spatial coordinates;
the image registration module is used for obtaining the three-dimensional structural information of the patient's surgical site from the preoperative CT, registering it with the CT image acquired during the operation, and fusing the preoperative and intraoperative patient images;
the respiratory motion compensation module is used for acquiring CT images of expiration and inspiration phases of a patient before operation, acquiring a deformation field between the expiration and inspiration CT images through image registration, and interpolating the deformation field so as to reconstruct a 4D-CT image of the lung, and predicting and tracking the motion of organs, nodules and tumors of the lung by combining three-dimensional space coordinate data of an electromagnetic sensor on the surface of the skin of the patient;
and the visual interface module is used for displaying the position of the surgical instrument in the patient image and the motion track of the surgical instrument in real time to assist a doctor in performing an operation.
The magnetic field generator acquires three-dimensional space coordinates of the electromagnetic sensor on the surface of the skin of the patient in real time and sends the three-dimensional space coordinates to the space registration module and the respiratory motion compensation module; the image segmentation module segments the lung region and the blood vessel in the initial image and sends the segmented lung region and blood vessel to the image registration module; the spatial registration module, the image registration module and the respiratory motion compensation module correlate the spatial coordinates with the image coordinates, align the spatial coordinates of intraoperative and preoperative CT images, predict real-time lung motion images of the patient at the same time, and send the results to the visualization interface module.
Another objective of the present invention is to provide a respiratory motion compensation method for a precise lung cancer puncture navigation system based on respiratory motion compensation, the method comprising the following steps in sequence:
(1) acquiring the patient's preoperative expiration image P_ex(x) and inspiration image P_in(x), i.e. 3D-CT images of the two breathing phases, together with the corresponding three-dimensional spatial coordinates of the electromagnetic sensor;
(2) registering the expiration image to the inspiration image to obtain the deformation field μ(x) from the expiration image to the inspiration image;
(3) linearly interpolating the deformation field μ(x) to obtain the sequence of intermediate deformation fields over the patient's whole respiratory cycle, and reconstructing the 4D-CT images of the lung and the corresponding electromagnetic sensor positions;
(4) acquiring the real-time spatial coordinate v_c of the skin-surface electromagnetic sensor during the operation, selecting the interpolated image that best matches the current phase as the predicted real-time lung image P_est, and displaying it in real time in the visualization interface module.
The step (3) specifically comprises the following steps:
(3a) the deformation field μ(x) is divided into n equal parts to obtain n intermediate deformation fields μ_i(x), where n is any integer from 8 to 12;
(3b) using the expiration image P_ex(x) as the reference image, the n intermediate deformation fields μ_i(x) are applied to compute 3D-CT images of n respiratory phases, which form the 4D-CT images P_i(x), i = 1...n, of the subject's whole respiratory cycle;
(3c) the spatial coordinate positions v_i, i = 1...n, of the corresponding electromagnetic sensor are extracted from the 4D-CT images.
The step (4) specifically comprises the following steps:
(4a) the real-time spatial coordinate v_c of the electromagnetic sensor is obtained during the patient's operation;
(4b) the distances between the real-time spatial coordinate v_c of the electromagnetic sensor and the sensor coordinate positions v_i in the 4D-CT images are calculated, the sensor position with the minimum distance is selected, and the corresponding phase i is extracted;
(4c) the 4D-CT image P_i(x) corresponding to the current phase i is the predicted real-time lung image P_est, which is displayed in real time in the visualization interface module.
According to the above technical solution, the beneficial effects of the invention are as follows. First, the invention can track the motion of the lung, nodules and tumors in real time while accurately locating the real-time position of the surgical instrument inside the subject during the operation, providing accurate and reliable navigation for lung puncture surgery; the system is simple to operate and widely applicable. Second, the respiratory motion compensation method can compensate the respiratory motion of non-rigid organs such as the lung with high accuracy, so that the motion of the lung, nodules and tumors can be tracked in real time, meeting the requirements of accurate lung puncture surgical navigation.
Drawings
FIG. 1 is a block diagram of the circuit configuration of the present invention;
FIG. 2 is a flow chart of a method of the present invention;
FIG. 3 is a flow chart of the operation of the respiratory motion compensation module of the present invention;
fig. 4 is a display diagram of a visualization interface module according to the present invention.
Detailed Description
As shown in fig. 1, an accurate lung cancer puncture navigation system based on respiratory motion compensation includes:
the magnetic field generator is used for acquiring three-dimensional space coordinate data of the electromagnetic puncture needle and the electromagnetic sensor on the surface of the skin of the patient in real time and sending the data to the space registration module and the respiratory motion compensation module;
the image segmentation module is used for finely segmenting the lung structure and the blood vessels in the initial image and accurately segmenting and labeling the three-dimensionally reconstructed lung and blood vessels, so that the three-dimensional structural information of the lung of the patient is quickly acquired, and the damage to the main blood vessels is avoided; the initial image is a CT image before the operation of the patient;
the space registration module is used for acquiring image coordinates of a plurality of electromagnetic sensors adhered to the skin surface of a patient in an initial image, obtaining space coordinates of the electromagnetic sensors through the magnetic field generator and establishing a relation between the image coordinates and the space coordinates;
the image registration module is used for obtaining the three-dimensional structural information of the patient's surgical site from the preoperative CT, registering it with the CT image acquired during the operation, and fusing the preoperative and intraoperative patient images;
the respiratory motion compensation module is used for acquiring CT images of the preoperative exhalation phase and the inhalation phase of a patient, acquiring a deformation field between the exhalation CT images and the inhalation CT images through image registration, interpolating the deformation field so as to reconstruct a 4D-CT image of the lung, and predicting and tracking the motion of lung organs, nodules and tumors by combining three-dimensional space coordinate data of an electromagnetic sensor on the surface of the skin of the patient;
and the visual interface module is used for displaying the position of the surgical instrument in the patient image and the motion track of the surgical instrument in real time to assist a doctor in performing an operation.
The magnetic field generator acquires three-dimensional space coordinates of the electromagnetic sensor on the surface of the skin of the patient in real time and sends the three-dimensional space coordinates to the space registration module and the respiratory motion compensation module; the image segmentation module segments the lung region and the blood vessel in the initial image and sends the segmented lung region and blood vessel to the image registration module; the spatial registration module, the image registration module and the respiratory motion compensation module correlate the spatial coordinates with the image coordinates, align the spatial coordinates of intraoperative and preoperative CT images, predict real-time lung motion images of the patient at the same time, and send the results to the visualization interface module.
In practical application, the system is built into a prototype large-bore CT room. The magnetic field generator is fixed above the patient's chest, with its working volume covering the electromagnetic puncture needle and the skin-surface electromagnetic sensors; their spatial positions are acquired and localized in real time and transmitted to the sensor interface unit. The image segmentation module, spatial registration module, image registration module, respiratory motion compensation module and visualization interface module are integrated in the workstation.
As shown in fig. 3, the respiratory motion compensation method comprises the following steps in sequence (illustrative code sketches are given after the step lists below):
(1) acquiring the patient's preoperative expiration image P_ex(x) and inspiration image P_in(x), i.e. 3D-CT images of the two breathing phases, together with the corresponding three-dimensional spatial coordinates of the electromagnetic sensor;
(2) registering the expiration image to the inspiration image to obtain the deformation field μ(x) from the expiration image to the inspiration image;
(3) linearly interpolating the deformation field μ(x) to obtain the sequence of intermediate deformation fields over the patient's whole respiratory cycle, and reconstructing the 4D-CT images of the lung and the corresponding electromagnetic sensor positions;
(4) acquiring the real-time spatial coordinate v_c of the skin-surface electromagnetic sensor during the operation, selecting the interpolated image that best matches the current phase as the predicted real-time lung image P_est, and displaying it in real time in the visualization interface module.
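As an illustration of steps (1) and (2), the following Python sketch loads the two breathing-phase CTs and estimates the expiration-to-inspiration deformation field with a deformable registration. The patent does not name a particular registration algorithm; the demons filter from SimpleITK is used here only as one plausible choice, and the file names are placeholders.

```python
# Illustrative sketch of steps (1)-(2); the registration algorithm and file
# names are assumptions, not part of the patent.
import SimpleITK as sitk

p_ex = sitk.ReadImage("preop_expiration_ct.nii.gz", sitk.sitkFloat32)   # P_ex(x)
p_in = sitk.ReadImage("preop_inspiration_ct.nii.gz", sitk.sitkFloat32)  # P_in(x)

demons = sitk.SymmetricForcesDemonsRegistrationFilter()
demons.SetNumberOfIterations(100)
demons.SetStandardDeviations(2.0)   # Gaussian smoothing of the update field

# Displacement field defined on the inspiration grid; resampling P_ex(x)
# through it approximately reproduces P_in(x), i.e. it plays the role of mu(x).
mu = demons.Execute(p_in, p_ex)
sitk.WriteImage(mu, "deformation_field_mu.mha")
```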
The step (3) specifically comprises the following steps (a code sketch follows these sub-steps):
(3a) the deformation field μ(x) is divided into n equal parts to obtain n intermediate deformation fields μ_i(x), where n is any integer from 8 to 12;
(3b) using the expiration image P_ex(x) as the reference image, the n intermediate deformation fields μ_i(x) are applied to compute 3D-CT images of n respiratory phases, which form the 4D-CT images P_i(x), i = 1...n, of the subject's whole respiratory cycle;
(3c) the spatial coordinate positions v_i, i = 1...n, of the corresponding electromagnetic sensor are extracted from the 4D-CT images.
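A sketch of sub-steps (3a) and (3b) follows. Scaling the displacement vectors of μ(x) by i/n is one straightforward reading of the "n equal parts" description; n = 10 lies inside the stated range of 8 to 12, the extraction of the sensor positions v_i in sub-step (3c) is not shown, and the file and variable names continue from the previous sketch purely for illustration.

```python
# Illustrative sketch of (3a)-(3b): build n intermediate fields mu_i(x) = (i/n)*mu(x)
# and warp the expiration image with each of them to obtain the 4D-CT frames P_i(x).
import SimpleITK as sitk

n = 10                                                # within the stated 8-12 range
p_ex = sitk.ReadImage("preop_expiration_ct.nii.gz", sitk.sitkFloat32)
mu = sitk.Cast(sitk.ReadImage("deformation_field_mu.mha"), sitk.sitkVectorFloat64)

phases = []                                           # P_1(x) ... P_n(x)
for i in range(1, n + 1):
    # (3a) intermediate deformation field: every displacement vector scaled by i/n
    components = [sitk.VectorIndexSelectionCast(mu, d, sitk.sitkFloat64) * (i / n)
                  for d in range(mu.GetNumberOfComponentsPerPixel())]
    mu_i = sitk.Compose(*components)
    mu_i.CopyInformation(mu)                          # keep origin/spacing/direction
    # (3b) warp the expiration (reference) image through mu_i to obtain phase i
    tx_i = sitk.DisplacementFieldTransform(mu_i)      # note: takes ownership of mu_i
    p_i = sitk.Resample(p_ex, mu, tx_i, sitk.sitkLinear, -1000.0)  # air outside FOV
    phases.append(p_i)
```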
The step (4) specifically comprises the following steps (a code sketch follows these sub-steps):
(4a) the real-time spatial coordinate v_c of the electromagnetic sensor is obtained during the patient's operation;
(4b) the distances between the real-time spatial coordinate v_c of the electromagnetic sensor and the sensor coordinate positions v_i in the 4D-CT images are calculated, the sensor position with the minimum distance is selected, and the corresponding phase i is extracted;
(4c) the 4D-CT image P_i(x) corresponding to the current phase i is the predicted real-time lung image P_est, which is displayed in real time in the visualization interface module.
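A minimal sketch of sub-steps (4a)-(4c) follows, assuming the per-phase sensor positions v_i have already been extracted as an n x 3 array in the tracker's coordinate system (millimetres) and that `phases` holds the 4D-CT frames from the previous sketch; all names and the example coordinates are illustrative only.

```python
# Illustrative sketch of (4a)-(4c): choose the phase whose stored sensor position
# v_i is nearest to the live reading v_c, and use that 4D-CT frame as P_est.
import numpy as np

def predict_phase(v_c, v_all):
    """v_c: (3,) live sensor coordinate; v_all: (n, 3) array of v_1..v_n."""
    distances = np.linalg.norm(v_all - np.asarray(v_c, dtype=float), axis=1)
    return int(np.argmin(distances)) + 1          # phase index i in 1..n

# Made-up sensor trajectory and live reading, in millimetres:
v_all = np.array([[0.0, 0.0, z] for z in np.linspace(0.0, 9.0, 10)])
i = predict_phase([0.2, -0.1, 4.3], v_all)        # -> phase 5 in this example
p_est = phases[i - 1]                             # P_est(x), shown in the interface
```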
The invention is further described below with reference to fig. 1 to 4.
As shown in fig. 2, the main workflow of the whole accurate lung cancer puncture navigation system consists of two parts: preoperative planning and intraoperative positioning. Preoperative planning performs image preprocessing, lung region segmentation and surgical path planning on the basis of the preoperative CT images and the detailed anatomical information of the surgical site obtained by three-dimensional reconstruction; at the same time, by collecting the patient's preoperative inspiration- and expiration-phase images, the correlation between the spatial coordinates of the skin-surface electromagnetic sensors and the lung motion images is established, so that the lung image can be predicted in real time. Intraoperative positioning determines the spatial position of the patient's body from the intraoperative images and fuses it into the preoperative three-dimensional reconstructed image space by image registration. The magnetic field generator acquires the position and trajectory of the surgical instrument in space; the transformation between the electromagnetic spatial coordinates and the CT image coordinates is established, and the spatial position of the surgical instrument is overlaid onto the image space. Finally, the prediction relation established before the operation is loaded into the respiratory motion compensation module, the lung image is predicted and displayed in real time, and the accuracy of the system is evaluated.
The spatial registration module is used for registering the three-dimensional space coordinates of the plurality of electromagnetic sensors on the surface of the skin of the patient with the corresponding CT image coordinates; obtaining spatial coordinates of the electromagnetic sensors based on the magnetic field generator by fixing the plurality of electromagnetic sensors on the skin surface of the patient; meanwhile, three-dimensional image coordinates of the electromagnetic sensor are obtained through the acquired CT images of the patient; establishing a space-to-image conversion matrix according to the relation between the space coordinate system and the CT image coordinate system of the subject, and sending the space-to-image conversion matrix to a visual interface module;
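The patent states that a space-to-image conversion matrix is established from the paired sensor coordinates but does not specify how it is computed. A common choice for this kind of paired-point problem is a least-squares rigid fit (the Kabsch/Umeyama method); the sketch below shows that approach under the assumption of at least three non-collinear sensors with known correspondences.

```python
# Hedged sketch: least-squares rigid fit of the space-to-image transform from
# paired electromagnetic (space) and CT (image) coordinates of the skin sensors.
import numpy as np

def fit_space_to_image(space_pts, image_pts):
    """space_pts, image_pts: (k, 3) paired coordinates of the same k sensors."""
    cs, ci = space_pts.mean(axis=0), image_pts.mean(axis=0)
    H = (space_pts - cs).T @ (image_pts - ci)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = ci - R @ cs
    T = np.eye(4)                                      # homogeneous 4x4 matrix
    T[:3, :3], T[:3, 3] = R, t
    return T                                           # image_pt ~= T @ [space_pt, 1]
```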
the image registration module is used for acquiring three-dimensional structure information of the surgical part of the patient based on the lung segmentation images of the intraoperative CT and the preoperative CT obtained by the image segmentation module; registering three-dimensional structural information of an operation part of a patient in an operation with a preoperative CT segmentation image containing detailed anatomical structural information of the patient so as to fuse preoperative and intraoperative patient images and send the images to a visual interface module;
the visual interface module is used for obtaining real-time three-dimensional space coordinates of the electromagnetic puncture needle based on the magnetic field generator; based on a space registration module, converting the real-time three-dimensional space coordinate of the electromagnetic puncture needle into a three-dimensional image coordinate and fusing the three-dimensional image coordinate into a registration image, displaying the position and the motion track of the electromagnetic puncture needle in the patient image in real time and outputting from multiple angles;
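For illustration, applying the space-to-image matrix from the previous sketch to a live needle-tip coordinate reduces to one homogeneous matrix-vector product; the rendering of the needle in the registered image is not shown, and the function name is hypothetical.

```python
# Hedged sketch: map the tracked electromagnetic needle-tip coordinate into the
# CT image coordinate system so it can be overlaid on the registered image.
import numpy as np

def needle_tip_in_image(T_space_to_image, tip_space_mm):
    tip_h = np.append(np.asarray(tip_space_mm, dtype=float), 1.0)  # homogeneous point
    return (T_space_to_image @ tip_h)[:3]                          # image coordinates
```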
the respiratory motion compensation module, visualizing the patient's anatomy in real-time and tracking the position of the surgical instrument, is a key core of the surgical navigation system. For non-rigid organs (such as the lungs), respiratory motion can cause geometric distortions that make precise positioning of surgical instruments difficult. Therefore, the research on the influence of the respiratory motion on the pulmonary nodules has important clinical value and significance for improving the operation precision and ensuring the operation safety.
The accurate lung cancer puncture navigation system carries out space mapping on preoperative three-dimensional images and an electromagnetic puncture needle through a skin surface electromagnetic sensor, carries out virtual-real fusion on virtual images and the position of the electromagnetic puncture needle, simultaneously tracks the motion of lung organs, nodules and tumors in real time through a respiratory motion compensation module, and displays the fused images in a visual interface module in real time.
The methods currently used clinically to deal with respiratory motion include breath holding/respiratory gating and motion tracking. Breath holding/respiratory gating is the simplest approach, but its control of the respiratory phase is coarse, its accuracy is hard to guarantee, it lengthens the operation, and some patients cannot comply with the breath-holding or gating requirements. Motion tracking requires implanting markers in the lung region and monitoring respiratory motion by tracking the markers with an imaging device (such as X-ray); however, this is invasive, accurate lung motion information is obtained only near the markers, and the monitoring range is small.
The method of the invention was tested for the accuracy of the predicted lung image; the error results for each respiratory phase are shown in Table 1:
TABLE 1
            Phase 1  Phase 2  Phase 3  Phase 4  Phase 5  Phase 6  Phase 7  Phase 8
Subject 1     1.56     1.75     1.59     1.02     1.06     1.41     1.52     2.25
Subject 2     1.13     1.09     1.02     0.98     1.07     1.05     1.12     3.1
Subject 3     2.1      1.9      1.12     1.06     0.87     1.17     2.31     2.8
Subject 4     1.2      1.41     1.2      1.5      1.21     1        1.56     3.09
Subject 5     1.56     1.52     1        1.2      1.2      1.55     2.2      3.3
Subject 6     1.52     1.34     1.41     1.48     1.18     1.8      2.3      2.38
Subject 7     2.1      1.5      0.98     1        1.49     1.1      1.67     1.41
Subject 8     1.7      1.62     0.9      1.3      1.4      1.78     2.6      2.15
The visualization interface module obtains the real-time three-dimensional spatial coordinates of the electromagnetic puncture needle from the magnetic field generator; based on the spatial registration module, it converts these coordinates into three-dimensional image coordinates, fuses them into the registered image, displays the position and motion trajectory of the puncture needle in the subject's image in real time, and outputs views from multiple angles. Fig. 4 shows an example of the visualization interface module: it displays the three-dimensional image of the subject's lung and the two-dimensional images in each direction in real time, while showing the position and motion trajectory of the surgical instrument in the subject's image, feeding this back to the surgeon from multiple angles to assist the operation.
The feasibility and accuracy of the navigation system were evaluated using a human body model. In the preoperative surgical plan, the navigation system can map a path from the skin-surface entry point to the target tumor and display the real-time position of the electromagnetic puncture needle inside the subject, guiding the surgical instrument to the target nodule. The actual position of the surgical instrument was determined by a postoperative CT scan registered to the preoperative CT space. The distance between the surgical tip in the predicted image and in the actual image is defined as the target registration error; the results are shown in Table 2:
TABLE 2
[Table 2 (target registration error results) appears only as an image in the original publication.]
In conclusion, the invention can track the motion of the lung, nodules and tumors in real time while accurately locating the real-time position of the surgical instrument inside the subject during the operation, providing accurate and reliable navigation for lung puncture surgery; the system is simple to operate and widely applicable. The respiratory motion compensation method can compensate the respiratory motion of non-rigid organs such as the lung with high accuracy, so that the motion of the lung, nodules and tumors can be tracked in real time, meeting the requirements of accurate lung puncture surgical navigation.

Claims (5)

1. A lung cancer accurate puncture navigation system based on respiratory motion compensation, characterized in that it comprises:
the magnetic field generator is used for acquiring three-dimensional space coordinate data of the electromagnetic puncture needle and the electromagnetic sensor on the surface of the skin of the patient in real time and sending the data to the space registration module and the respiratory motion compensation module;
the image segmentation module is used for finely segmenting the lung structure and the blood vessels in the initial image and accurately segmenting and labeling the three-dimensionally reconstructed lung and blood vessels; the initial image is a CT image before the operation of the patient;
the spatial registration module is used for acquiring image coordinates of a plurality of electromagnetic sensors adhered to the skin surface of a patient in an initial image, acquiring spatial coordinates of the electromagnetic sensors through the magnetic field generator, and establishing a relation between the image coordinates and the spatial coordinates;
the image registration module is used for obtaining the three-dimensional structural information of the patient's surgical site from the preoperative CT, registering it with the CT image acquired during the operation, and fusing the preoperative and intraoperative patient images;
the respiratory motion compensation module is used for acquiring CT images of expiration and inspiration phases of a patient before operation, acquiring a deformation field between the expiration and inspiration CT images through image registration, and interpolating the deformation field so as to reconstruct a 4D-CT image of the lung, and predicting and tracking the motion of organs, nodules and tumors of the lung by combining three-dimensional space coordinate data of an electromagnetic sensor on the surface of the skin of the patient;
and the visual interface module is used for displaying the position of the surgical instrument in the patient image and the motion track of the surgical instrument in real time to assist a doctor in performing an operation.
2. The respiratory motion compensation based lung cancer precision puncture navigation system according to claim 1, characterized in that: the magnetic field generator acquires three-dimensional space coordinates of the electromagnetic sensor on the surface of the skin of the patient in real time and sends the three-dimensional space coordinates to the space registration module and the respiratory motion compensation module; the image segmentation module segments the lung region and the blood vessel in the initial image and sends the segmented lung region and blood vessel to the image registration module; the spatial registration module, the image registration module and the respiratory motion compensation module correlate the spatial coordinates with the image coordinates, align the spatial coordinates of intraoperative and preoperative CT images, predict real-time lung motion images of the patient at the same time, and send the results to the visualization interface module.
3. The respiratory motion compensation method of the lung cancer accurate puncture navigation system based on respiratory motion compensation according to claim 1 or 2, wherein the method comprises the following steps in sequence:
(1) acquiring the patient's preoperative expiration image P_ex(x) and inspiration image P_in(x), i.e. 3D-CT images of the two breathing phases, together with the corresponding three-dimensional spatial coordinates of the electromagnetic sensor;
(2) registering the expiration image to the inspiration image to obtain the deformation field μ(x) from the expiration image to the inspiration image;
(3) linearly interpolating the deformation field μ(x) to obtain the sequence of intermediate deformation fields over the patient's whole respiratory cycle, and reconstructing the 4D-CT images of the lung and the corresponding electromagnetic sensor positions;
(4) acquiring the real-time spatial coordinate v_c of the skin-surface electromagnetic sensor during the operation, selecting the interpolated image that best matches the current phase as the predicted real-time lung image P_est, and displaying it in real time in the visualization interface module.
4. The respiratory motion compensation method of claim 3, wherein the step (3) specifically comprises the following steps:
(3a) the deformation field μ(x) is divided into n equal parts to obtain n intermediate deformation fields μ_i(x), where n is any integer from 8 to 12;
(3b) using the expiration image P_ex(x) as the reference image, the n intermediate deformation fields μ_i(x) are applied to compute 3D-CT images of n respiratory phases, which form the 4D-CT images P_i(x), i = 1...n, of the subject's whole respiratory cycle;
(3c) the spatial coordinate positions v_i, i = 1...n, of the corresponding electromagnetic sensor are extracted from the 4D-CT images.
5. The respiratory motion compensation method of claim 3, wherein the step (4) specifically comprises the following steps:
(4a) the real-time spatial coordinate v_c of the electromagnetic sensor is obtained during the patient's operation;
(4b) the distances between the real-time spatial coordinate v_c of the electromagnetic sensor and the sensor coordinate positions v_i in the 4D-CT images are calculated, the sensor position with the minimum distance is selected, and the corresponding phase i is extracted;
(4c) the 4D-CT image P_i(x) corresponding to the current phase i is the predicted real-time lung image P_est, which is displayed in real time in the visualization interface module.
CN202210581395.3A 2022-05-26 2022-05-26 Lung cancer accurate puncture navigation system and method based on respiratory motion compensation Pending CN114938994A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210581395.3A CN114938994A (en) 2022-05-26 2022-05-26 Lung cancer accurate puncture navigation system and method based on respiratory motion compensation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210581395.3A CN114938994A (en) 2022-05-26 2022-05-26 Lung cancer accurate puncture navigation system and method based on respiratory motion compensation

Publications (1)

Publication Number Publication Date
CN114938994A 2022-08-26

Family

ID=82909263

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210581395.3A Pending CN114938994A (en) 2022-05-26 2022-05-26 Lung cancer accurate puncture navigation system and method based on respiratory motion compensation

Country Status (1)

Country Link
CN (1) CN114938994A (en)

Similar Documents

Publication Publication Date Title
US20210137351A1 (en) Apparatus and Method for Airway Registration and Navigation
US11553968B2 (en) Apparatuses and methods for registering a real-time image feed from an imaging device to a steerable catheter
US11903659B2 (en) Robotic device for a minimally invasive medical intervention on soft tissues
US20200146588A1 (en) Apparatuses and methods for endobronchial navigation to and confirmation of the location of a target tissue and percutaneous interception of the target tissue
CN100506153C (en) Method for aligning and overlapping image data of medical imaging in serial shooting
EP1719078B1 (en) Device and process for multimodal registration of images
CN107106241B (en) System for navigating to surgical instruments
JP5667988B2 (en) System and method for dynamic metal strain compensation of electromagnetic tracking system
RU2594811C2 (en) Visualisation for navigation instruction
CN106725852A (en) The operation guiding system of lung puncture
US20070244369A1 (en) Medical Imaging System for Mapping a Structure in a Patient's Body
US10111717B2 (en) System and methods for improving patent registration
WO2008035271A2 (en) Device for registering a 3d model
CN114746901A (en) Registration of images with tracking systems
JP2023149127A (en) Image processing device, method, and program
Wein et al. Ultrasound based respiratory motion compensation in the abdomen
CN113855235B (en) Magnetic resonance navigation method and device used in microwave thermal ablation operation of liver part
CN114938994A (en) Lung cancer accurate puncture navigation system and method based on respiratory motion compensation
WO2022165112A1 (en) Systems and methods for c-arm fluoroscope camera pose refinement with secondary movement compensation
Bucki et al. Real-time SPECT and 2D ultrasound image registration
EP4091570B1 (en) Probe for improving registration accuracy between a tomographic image and a tracking system
CN113940756B (en) Operation navigation system based on mobile DR image
US20240216010A1 (en) Method and device for registration and tracking during a percutaneous procedure
WO2024033861A1 (en) Surgical navigation system, surgical navigation method, calibration method of surgical navigation system
Hawkes et al. Computational models in image guided interventions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination