CN114176726B - Puncture method based on phase registration - Google Patents
- Publication number
- CN114176726B (application CN202111507399.9A)
- Authority
- CN
- China
- Prior art keywords
- patient
- curve
- image
- focus
- focus characteristic
- Prior art date
- Legal status: Active (an assumption, not a legal conclusion; no legal analysis has been performed)
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods, e.g. tourniquets
- A61B17/34—Trocars; Puncturing needles
- A61B17/3403—Needle locating or guiding means
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/30—Surgical robots
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107—Visualisation of planned trajectories or target regions
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
Landscapes
- Health & Medical Sciences (AREA)
- Surgery (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Robotics (AREA)
- Pathology (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
Abstract
The invention discloses a puncture method based on phase registration, comprising the following steps: selecting any 3D image from a 4D image sequence as the planning image and performing the puncture planning on it; acquiring the patient's breathing curve; extracting the positions of the lesion feature from the 4D images to obtain a lesion-feature motion curve; phase-matching the breathing curve with the lesion-feature motion curve to obtain the mapping between the two; from the phase point of the planning image on the lesion-feature motion curve and the mapping, obtaining the planning image's corresponding amplitude on the breathing curve, recorded as the target amplitude; and monitoring the patient's respiration in real time, executing the planned result when the respiratory amplitude reaches the target amplitude. Because the phase point of the planning image corresponds to a specific respiratory amplitude on the breathing curve, the amplitude change of the external breathing curve can signal when the puncture phase has been reached, enabling an accurate puncture.
Description
Technical Field
The invention relates to the technical field of image processing, in particular to a puncture method based on phase registration.
Background
In clinical practice, especially in tumor treatment, medical images are increasingly used for guidance, with robotic arms then assisting in positioning the surgical tools. This enables minimally invasive surgery, reduces the patient's wound surface, and speeds recovery.
In lung-puncture procedures such as biopsy and ablation, the patient breathes continuously, and the main puncture and ablation targets (the lung, the liver, and similar organs) are precisely the ones most affected by respiration. Because of the mediastinal movement caused by breathing, the displacement of a lesion within these organs can reach up to 5 cm, so even highly experienced doctors cannot guarantee a one-hundred-percent success rate. Surgical-navigation robots for lung puncture and ablation are therefore a future trend, but current navigation robots cannot accurately capture the motion pattern of the lung, which makes precise operation on the lesion difficult.
Disclosure of Invention
Aim of the invention: to solve the above problems, the invention provides a puncture method based on phase registration that can guarantee puncture precision.
The technical scheme is as follows. A puncture method based on phase registration comprises the following steps:
selecting any 3D image from the 4D image sequence as the planning image and performing the planning on it;
acquiring the patient's breathing curve;
extracting the positions of the lesion feature from the 4D images to obtain a lesion-feature motion curve;
phase-matching the breathing curve with the lesion-feature motion curve to obtain the mapping between the two;
from the phase point of the planning image on the lesion-feature motion curve and the mapping, obtaining the planning image's corresponding amplitude on the breathing curve, recorded as the target amplitude;
and monitoring the patient's respiration in real time, executing the planned result when the respiratory amplitude reaches the target amplitude.
The planning image is preferably the 3D image corresponding to end-expiration or end-inspiration.
The scan time resolution of the 4D images is set to 0.25 s.
The patient's breathing curve is obtained as follows: respiratory data covering at least one breathing cycle are acquired with a respiration collector; the patient's end-inspiration is taken as the reference pose, i.e. the datum point; the axis along which the patient's respiratory amplitude varies most during one breathing cycle is defined as the datum axis; and the amplitude along the datum axis during the patient's subsequent breathing is taken as the ordinate of the breathing curve, yielding the patient's breathing curve.
The respiration collector is a tracer identified by a binocular camera, a respiratory belt, or a respiratory flow valve.
If, during subsequent breathing, the patient's datum point shifts by more than a set threshold along either of the two directions orthogonal to the datum axis, the difference between the respiratory amplitude at the same phase in the historical data (recorded before the datum point changed) and the most recently acquired amplitude is computed, and this difference is interpolated onto the corresponding phase of the current breathing curve, yielding the adjusted breathing curve.
The lesion-feature motion curve is obtained as follows: an image-processing algorithm extracts from the 4D images the position of the patient's lesion feature in the image coordinate system together with the corresponding scan sampling time, and dimensionality reduction of these data yields the patient's lesion-feature motion curve.
Further, the lesion-feature motion curve can be obtained by taking bifurcation points of the trachea or blood vessels on the lesion side as the lesion feature points: the feature points are extracted from the first 3D image and then tracked, or located by an image-matching algorithm, in the subsequent 3D images, yielding the curve of the lesion feature's movement over time.
Alternatively, the lesion-feature motion curve can be obtained by selecting a slice containing the lesion feature as the reference picture, differencing the slices at the corresponding slice position acquired at the other scan times against the reference picture, and deriving the thickness of the lesion feature along the motion direction, which yields the lesion-feature motion curve.
The mapping between the breathing curve and the lesion-feature motion curve is obtained as follows: the lesion-feature motion curve is shifted along the time axis of the breathing curve in set steps, traversing the entire range of the breathing curve; after each shift, the mean square error between the amplitudes of the two curves at each common time sample is computed; the shift with the minimum mean square error gives the best match, and hence the mapping between the two curves.
Before the mean square error is computed, the lesion-feature motion curve and the breathing curve are denoised and normalized, and amplitudes at different phases are assigned weights according to the characteristics of the lesion-feature motion curve.
The assigned weights decrease gradually from the end-expiration and end-inspiration phases of the lesion-feature motion curve toward its middle phases.
Beneficial effects: the invention establishes a mapping between the external breathing curve and the internal lesion-feature motion curve, and combines it with the phase of the planning image on the lesion-feature motion curve to obtain the planning image's corresponding respiratory amplitude on the breathing curve. The amplitude change of the external breathing curve can thus signal when the puncture phase is reached, enabling an accurate puncture.
Drawings
FIG. 1 is a flow chart of the puncture method according to the invention;
FIG. 2 is an external breathing curve;
FIG. 3 is an internal lesion-feature motion curve;
FIG. 4 is a schematic diagram of planning a puncture channel in the planning image;
FIG. 5 is a schematic diagram of the phase registration of the breathing curve and the lesion-feature motion curve.
Detailed Description
The invention is further elucidated below with reference to the drawings and specific embodiments.
As shown in fig. 1, the puncturing method based on phase registration of the invention comprises the following steps:
(1) Fixing the data acquisition tools;
The lesion position is confirmed by preoperative enhanced CT and fluoroscopic positioning; a registration marker (register) is placed at the patient's lesion region, and a respiration sampler is placed on the patient's abdomen. The register is used to register the image coordinate system with the robot coordinate system, so that the pose of the planned channel in the image coordinate system can be mapped into the robot's execution space under the robot coordinate system. The respiration sampler, which includes but is not limited to an optical tracking tracer, an electromagnetic navigation sensor, or a radar reflector, captures the respiratory undulation, and the acquired data are processed into the patient's breathing curve, as shown in fig. 2;
Specifically, in the disclosed embodiment the register comprises a plurality of radiopaque steel balls distributed asymmetrically according to a preset topological rule; the balls may be identical or different in size, with a preferred diameter of 2-4 mm;
In the disclosed embodiment the respiration sampler is a tracer identified by an infrared binocular camera: a sphere emitting infrared light, which the camera captures to determine the tracer's position;
(2) CT scanning is carried out to obtain a 4D image;
A 4D scan of the patient's lesion region is performed to obtain 4D images containing the lesion feature; the scan sampling time resolution should be as small as possible and is set here to 0.25 s;
The scanned 4D images must cover at least one breathing cycle, and the lesion feature should be as clear and interference-free as possible;
(3) Confirming a planning image;
The doctor selects any 3D image from the 4D images as the planning image, treats it as the puncture image, plans the puncture channel on it, and extracts the scan sampling time point of the planning image, as shown in fig. 4;
Because the identifying features are most distinct at end-expiration and end-inspiration, where the extraction results are most reliable, the 3D image corresponding to end-expiration or end-inspiration is selected as the planning image;
(4) Acquiring a breathing curve of a patient;
The position of the respiration sampler on the patient's abdomen is tracked in real time by the infrared binocular camera, and the patient's breathing curve is obtained by fitting after data denoising, dimensionality reduction, and a dynamic-reference adjustment algorithm, as shown in fig. 2;
Specifically, the infrared binocular camera pre-samples the tracer position over one breathing cycle, and the patient's end-inspiration is taken as the reference pose, i.e. the datum point. The axis along which the sampler's position changes most during one breathing cycle is defined as the datum axis, and the sampler's position along the datum axis during subsequent breathing is taken as the ordinate of the breathing curve, yielding the patient's breathing curve;
Further, if during subsequent breathing the datum point shifts by more than a set threshold along either of the two directions orthogonal to the datum axis, a warning indicates that the tracer position has changed, i.e. the datum point must be adjusted. The difference between the sampler position at the same phase in the historical data (recorded before the tracer moved) and the most recently acquired position is then computed and interpolated onto the corresponding phase of the current breathing curve, yielding the adjusted curve. This dynamic-reference adjustment algorithm provides a stable breathing curve in real time;
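The datum-axis selection and datum-point drift correction described above can be sketched as follows. This is a minimal pure-Python illustration on synthetic tracer positions; the function names (`pick_datum_axis`, `drift_offset`, etc.) are illustrative assumptions, not taken from the patent.

```python
import math

def pick_datum_axis(presample):
    # presample: list of (x, y, z) tracer positions over >= one breathing cycle.
    # The datum axis is the one with the largest peak-to-peak excursion.
    ranges = [max(p[i] for p in presample) - min(p[i] for p in presample)
              for i in range(3)]
    return ranges.index(max(ranges))

def breathing_ordinate(position, datum_axis, datum_point):
    # Amplitude along the datum axis, relative to the end-inspiration datum.
    return position[datum_axis] - datum_point[datum_axis]

def drift_offset(hist_amp_same_phase, latest_amp):
    # Correction interpolated into the current cycle once the datum point
    # has shifted beyond the threshold in an orthogonal direction.
    return hist_amp_same_phase - latest_amp

# Synthetic demo: respiratory motion is largest along z (index 2).
presample = [(0.1 * math.sin(2 * math.pi * k / 50),
              0.05 * math.cos(2 * math.pi * k / 50),
              1.0 * math.sin(2 * math.pi * k / 50)) for k in range(100)]
axis = pick_datum_axis(presample)
```

In a real system `presample` would come from the infrared binocular camera, and the drift offset would be spread (interpolated) over the affected phase range rather than applied as a single constant.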
(5) Acquiring the patient's lesion-feature motion curve;
An image-processing algorithm extracts from the 4D images of step (2) the position of the lesion feature in the image coordinate system and the corresponding scan sampling time; dimensionality reduction of these data yields the lesion-feature motion curve, and combining it with the scan sampling time point of the planning image gives the planning image's corresponding phase point on the curve, as shown in fig. 3;
Specifically, one of the following two methods may be selected to obtain the lesion-feature motion curve, according to the actual situation:
the method comprises the following steps: because the characteristics of the focus of the patient are various and complex and the direct extraction is difficult, the invention takes the obvious bifurcation points of the trachea or the blood vessels around the focus of the patient as the characteristics of the focus of the patient, a doctor picks up the obvious bifurcation points in a certain 3D image and then carries out the operations of tracking, 8-word bifurcation identification, segmentation extraction, morphological analysis, intersection extraction and the like to extract the characteristics of the focus of the patient along with the time;
the second method is as follows: for the 3D image with less 2D or 3D slices scanned along with time and no obvious patient focus feature on the 3D image, the invention selects the slice with obvious patient focus feature as a reference picture, and the slices at the corresponding slice positions obtained by other time scanning are respectively subjected to difference operation with the reference picture, and sequentially undergo a series of operations such as anisotropic filtering, threshold segmentation, morphological processing, blob feature calculation and the like to obtain the thickness feature of the patient focus feature in the motion direction, so as to obtain a patient focus feature motion curve;
(6) Phase matching;
Based on the periodicity of the patient's respiratory motion, the breathing curve from step (4) is phase-matched with the lesion-feature motion curve from step (5) to obtain the mapping between the two. From this mapping and the planning image's phase point on the lesion-feature motion curve, the planning image's corresponding respiratory amplitude on the external breathing curve is obtained and recorded as the target amplitude, as shown in fig. 5;
the method comprises the following specific steps:
(61) The lesion-feature motion curve and the breathing curve are denoised and normalized, and amplitudes at different phases are weighted according to the characteristics of the lesion-feature motion curve. Specifically, because the identifying features are distinct at end-expiration and end-inspiration and the extraction results there are highly reliable, those phases receive larger weights, and the weights decrease toward the middle phases;
(62) Because the lesion-feature motion curve has a low sampling rate, a short time range, and few samples, it is shifted along the time axis of the breathing curve in set steps, traversing the breathing curve's entire range. After each shift, the mean square error between the amplitudes of the two curves at each common time sample is computed, and the shift with the minimum mean square error gives the best match, from which the mapping between the lesion-feature motion curve and the breathing curve follows. In other embodiments step (61) may be omitted; the resulting mapping then carries some error, but it remains within acceptable limits for the final real-time result;
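Steps (61) and (62) can be sketched as a weighted sliding mean-square-error search. The toy below uses synthetic sinusoidal curves, simple min-max normalization, and a weight profile that is largest near the curve extremes (end-expiration/end-inspiration) and smallest mid-phase; the function names and the specific weight formula are illustrative assumptions, not the patent's exact procedure.

```python
import math

def normalize(xs):
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def phase_weights(curve):
    # Larger weight near the normalized extremes (end-expiration and
    # end-inspiration), decreasing toward the mid-phase, as in step (61).
    return [0.5 + abs(v - 0.5) for v in normalize(curve)]

def best_offset(breath, lesion):
    # Slide the short lesion-feature curve over the breathing curve (step 62)
    # and keep the offset with the minimum weighted mean square error.
    b, l = normalize(breath), normalize(lesion)
    w = phase_weights(lesion)
    best, best_k = float("inf"), 0
    for k in range(len(b) - len(l) + 1):
        mse = sum(wi * (b[k + i] - li) ** 2
                  for i, (li, wi) in enumerate(zip(l, w))) / sum(w)
        if mse < best:
            best, best_k = mse, k
    return best_k

# Synthetic demo: the lesion curve is the same breathing motion (different
# amplitude scale) seen through a short window starting at sample 40.
breath = [math.sin(2 * math.pi * i / 50) for i in range(100)]
lesion = [2.5 * math.sin(2 * math.pi * (i + 40) / 50) for i in range(60)]
k = best_offset(breath, lesion)
# With the offset known, a planning-image phase point on the lesion curve
# (here a hypothetical index 10) maps to a breathing-curve target amplitude.
target_amplitude = breath[k + 10]
```

Normalization makes the two curves comparable despite the internal/external amplitude difference; the weights down-rank the less reliably extracted mid-phase samples.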
(7) Performing a puncture;
The patient's respiration is monitored in real time, and the planned result is executed when the respiratory amplitude reaches the target amplitude.
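The real-time execution condition reduces to an amplitude-window test on the live breathing signal. A minimal sketch, assuming a tolerance band around the target amplitude (the patent does not specify one):

```python
def reached_target(amplitude, target, tol=0.05):
    # Fire the planned action when the live amplitude is within tolerance
    # of the target amplitude derived from the phase registration.
    return abs(amplitude - target) <= tol

def first_trigger_index(stream, target, tol=0.05):
    # Index of the first live sample at which the plan may be executed.
    for i, a in enumerate(stream):
        if reached_target(a, target, tol):
            return i
    return -1

# Hypothetical live stream approaching an end-inspiration target of 1.0.
stream = [0.2, 0.5, 0.8, 0.93, 0.99, 1.0, 0.97]
idx = first_trigger_index(stream, target=1.0)
```

A production system would additionally gate on breathing-cycle direction and stability before releasing the robot, which this sketch omits.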
In the invention, the step of acquiring the patient's breathing curve and the step of acquiring the lesion-feature motion curve are mutually independent; their order can be swapped, provided both are completed before the phase matching.
The invention maps the external, visible breathing curve, whose monitoring causes the patient no radiation injury, onto the internal, invisible lesion-feature motion curve, whose extraction does involve radiation exposure. Combined with the phase of the planning image on the lesion-feature motion curve, this yields the planning image's corresponding respiratory amplitude on the breathing curve, so that the amplitude change of the external breathing curve can signal when the puncture phase is reached, enabling an accurate puncture.
The preferred embodiments of the invention have been described in detail above, but the invention is not limited to the specific details of these embodiments; various equivalent changes (for example in number, shape, or position) may be made to the technical solution within the scope of the inventive concept, and all such equivalent changes fall within the protection scope of the invention.
Claims (11)
1. A puncture guiding method based on phase registration, characterized by comprising the following steps:
selecting any 3D image from the 4D image sequence as the planning image and performing the planning on it;
acquiring the patient's breathing curve;
extracting the positions of the lesion feature from the 4D images to obtain a lesion-feature motion curve;
phase-matching the breathing curve with the lesion-feature motion curve to obtain the mapping between the two, specifically:
shifting the lesion-feature motion curve along the time axis of the breathing curve in set steps, traversing the entire range of the breathing curve; after each shift, computing the mean square error between the amplitudes of the two curves at each common time sample; and taking the shift with the minimum mean square error as the best match, thereby obtaining the mapping between the two;
from the phase point of the planning image on the lesion-feature motion curve and the mapping, obtaining the planning image's corresponding amplitude on the breathing curve, recorded as the target amplitude;
and monitoring the patient's respiration in real time, and executing the planned result when the respiratory amplitude reaches the target amplitude.
2. The phase-registration-based puncture guiding method according to claim 1, characterized in that the planning image is the 3D image corresponding to end-expiration or end-inspiration.
3. The phase-registration-based puncture guiding method according to claim 1, characterized in that the scan time resolution of the 4D images is set to 0.25 s.
4. The phase-registration-based puncture guiding method according to claim 1, characterized in that the patient's breathing curve is obtained as follows: respiratory data covering at least one breathing cycle are acquired with a respiration collector; the patient's end-inspiration is taken as the reference pose, i.e. the datum point; the axis along which the respiratory amplitude varies most during one breathing cycle is defined as the datum axis; and the amplitude along the datum axis during subsequent breathing is taken as the ordinate of the breathing curve, yielding the patient's breathing curve.
5. The phase-registration-based puncture guiding method according to claim 4, characterized in that the respiration collector is a tracer identified by a binocular camera, a respiratory belt, or a respiratory flow valve.
6. The phase-registration-based puncture guiding method according to claim 4, characterized in that, when the datum point shifts during subsequent breathing by more than a set threshold along either of the two directions orthogonal to the datum axis, the difference between the respiratory amplitude at the same phase in the historical data (recorded before the datum point changed) and the most recently acquired amplitude is computed and interpolated onto the corresponding phase of the current breathing curve, yielding the adjusted breathing curve.
7. The phase-registration-based puncture guiding method according to claim 1, characterized in that the lesion-feature motion curve is obtained as follows: an image-processing algorithm extracts from the 4D images the position of the lesion feature in the image coordinate system and the corresponding scan sampling time, and dimensionality reduction of these data yields the lesion-feature motion curve.
8. The phase-registration-based puncture guiding method according to claim 7, characterized in that bifurcation points of the trachea or blood vessels on the lesion side are taken as the lesion feature points, the feature points are extracted from the first 3D image, and tracking or an image-matching algorithm locates them in the subsequent 3D images, yielding the curve of the lesion feature's movement over time.
9. The phase-registration-based puncture guiding method according to claim 7, characterized in that a slice containing the lesion feature is selected as the reference picture, and the slices at the corresponding slice position acquired at the other scan times are differenced against the reference picture to obtain the thickness of the lesion feature along the motion direction, thereby yielding the lesion-feature motion curve.
10. The phase-registration-based puncture guiding method according to claim 1, characterized in that, before the mean square error is computed, the lesion-feature motion curve and the breathing curve are denoised and normalized, and amplitudes at different phases are assigned weights according to the characteristics of the lesion-feature motion curve.
11. The phase-registration-based puncture guiding method according to claim 10, characterized in that the amplitude weights decrease gradually from the end-expiration and end-inspiration phases of the lesion-feature motion curve toward its middle phases.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111507399.9A CN114176726B (en) | 2021-12-10 | 2021-12-10 | Puncture method based on phase registration |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111507399.9A CN114176726B (en) | 2021-12-10 | 2021-12-10 | Puncture method based on phase registration |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114176726A CN114176726A (en) | 2022-03-15 |
CN114176726B true CN114176726B (en) | 2023-08-04 |
Family
ID=80604385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111507399.9A Active CN114176726B (en) | 2021-12-10 | 2021-12-10 | Puncture method based on phase registration |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114176726B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114521939B (en) * | 2022-04-24 | 2022-09-06 | 北京智愈医疗科技有限公司 | Automatic water jet cutting implementation method and system |
CN114601539B (en) * | 2022-05-16 | 2022-09-06 | 南京佗道医疗科技有限公司 | Puncture guiding method based on 3D image |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106952285A (en) * | 2017-02-15 | 2017-07-14 | 上海交通大学 | The pulmonary movements method of estimation of motion model and auto-registration is counted based on priori |
CN107374678A (en) * | 2017-08-15 | 2017-11-24 | Chen Xiaoyang | Virtual-positioning CT-guided lung biopsy method
CN111067622A (en) * | 2019-12-09 | 2020-04-28 | 天津大学 | Respiratory motion compensation method for percutaneous lung puncture |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10159421B2 (en) * | 2015-03-30 | 2018-12-25 | Resmed Sensor Technologies Limited | Detection of periodic breathing |
WO2017030915A1 (en) * | 2015-08-14 | 2017-02-23 | Intuitive Surgical Operations, Inc. | Systems and methods of registration for image-guided surgery |
- 2021-12-10: Application CN202111507399.9A filed in CN; patent CN114176726B, status Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106952285A (en) * | 2017-02-15 | 2017-07-14 | Shanghai Jiao Tong University | Pulmonary motion estimation method based on prior statistical motion model and automatic registration
CN107374678A (en) * | 2017-08-15 | 2017-11-24 | Chen Xiaoyang | Virtual-positioning CT-guided lung biopsy method
CN111067622A (en) * | 2019-12-09 | 2020-04-28 | 天津大学 | Respiratory motion compensation method for percutaneous lung puncture |
Also Published As
Publication number | Publication date |
---|---|
CN114176726A (en) | 2022-03-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114176726B (en) | Puncture method based on phase registration | |
US9659374B2 (en) | Feature-based registration method | |
US20200170623A1 (en) | Methods for using radial endobronchial ultrasound probes for three-dimensional reconstruction of images and improved target localization | |
EP3127485A1 (en) | System and method for local three dimensional volume reconstruction using a standard fluoroscope | |
US10154820B2 (en) | Method and apparatus for navigating CT scan with a marker | |
KR101982149B1 (en) | Method and apparatus for creating medical image using partial medical image | |
EP3441939A1 (en) | A method and system for registering a 3d pre-acquired image coordinates system with a medical positioning system coordinate system and with a 2d image coordinate system | |
CN101229080A (en) | Registration of images of an organ using anatomical features outside the organ | |
KR20140096919A (en) | Method and Apparatus for medical image registration | |
CN114886560A (en) | System and method for local three-dimensional volume reconstruction using standard fluoroscopy | |
CN112509022A (en) | Non-calibration object registration method for preoperative three-dimensional image and intraoperative perspective image | |
WO2023066072A1 (en) | Catheter positioning method, interventional surgery system, electronic device and storage medium | |
CN114452508B (en) | Catheter motion control method, interventional operation system, electronic device, and storage medium | |
CN112617877B (en) | Autonomous scanning method of mobile CT system, storage medium and CT scanning device | |
US20230030343A1 (en) | Methods and systems for using multi view pose estimation | |
CN114283179A (en) | Real-time fracture far-near end space pose acquisition and registration system based on ultrasonic images | |
CN115462903B (en) | Human body internal and external sensor cooperative positioning system based on magnetic navigation | |
EP3944190A1 (en) | Systems and methods for estimating the movement of a target using universal deformation models for anatomic tissue | |
EP4082444A1 (en) | Automatic frame selection for 3d model construction | |
CN114795473A (en) | Respiration tracking method based on spliced image | |
Jian-jun et al. | A Rapid Temporal Bone Localization Method Based on Machine Visual Detection Markers | |
Cao et al. | An improved multi-resolution 2D/3D registration method | |
CN115040243A (en) | Method for judging motion state similarity of target points | |
CN115040217A (en) | Method for judging motion state similarity of target points | |
CN118141360A (en) | Electrophysiology three-dimensional mapping system and non-contact respiration measurement gating method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
CB02 | Change of applicant information | ||
Address after: 210000, Building 3, No. 34, Dazhou Road, Yuhuatai District, Nanjing, Jiangsu Province
Applicant after: Tuodao Medical Technology Co., Ltd.
Address before: Room 102-86, Building 6, 57 Andemen Street, Yuhuatai District, Nanjing, Jiangsu 210000
Applicant before: Nanjing Tuodao Medical Technology Co., Ltd.
GR01 | Patent grant | ||