CN115040243A - Method for judging motion state similarity of target points - Google Patents

Method for judging motion state similarity of target points

Info

Publication number
CN115040243A
CN115040243A (Application No. CN202210907889.6A)
Authority
CN
China
Prior art keywords
pose
time
target
tracer
similarity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210907889.6A
Other languages
Chinese (zh)
Inventor
王英杰 (Wang Yingjie)
曹红洋 (Cao Hongyang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Tuodao Medical Technology Co Ltd
Original Assignee
Nanjing Tuodao Medical Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Tuodao Medical Technology Co Ltd filed Critical Nanjing Tuodao Medical Technology Co Ltd
Publication of CN115040243A publication Critical patent/CN115040243A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B 17/34 Trocars; Puncturing needles
    • A61B 17/3403 Needle locating or guiding means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Robotics (AREA)
  • Urology & Nephrology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)

Abstract

The invention discloses a method for judging the motion state similarity of a target point, comprising the following steps: placing a follow-up tracer that represents the motion state of the target point; acquiring a 3D image of the patient, and planning on the 3D image to obtain a planning channel; taking the exposure time of the image slice at the end of the planning channel as the alignment time, and taking the follow-up tracer pose corresponding to the alignment time as the target pose; acquiring the effective real-time pose of the follow-up tracer and comparing it with the target pose, and, if the similarity between the two meets a set condition, judging that the current motion state of the target point is similar to the motion state corresponding to the alignment time. By comparing the similarity between the effective real-time pose of the follow-up tracer and the target pose, the invention finds the moment at which the current breathing state is close to the target breathing state and uses it to guide execution, thereby improving the one-time execution success rate.

Description

Method for judging motion state similarity of target points
Technical Field
The invention relates to the technical field of surgical navigation, in particular to a method for judging motion state similarity of target points.
Background
In clinical practice, particularly in tumor treatment, medical images are widely used for guidance, with a mechanical arm assisting in positioning the surgical tool to perform minimally invasive surgery; this reduces the patient's wound surface and accelerates recovery.
Medical surgical navigation robots provide doctors with high-precision position and angle navigation in fields such as percutaneous puncture, biopsy and ablation, reducing the doctors' dependence on hand feel and experience. However, in puncture, biopsy and ablation operations on organs such as the lungs that are strongly affected by respiration, the puncture channel is planned by the doctor based on anatomical features and the patient's current physiological condition, which is a static situation, whereas during the actual operation the patient is in a state of continuous respiration, and the main puncture and ablation sites, such as the lungs and the liver, are strongly affected by that respiration. Because the patient's breathing causes the mediastinum to move, the displacement of the focus point within the organ can reach up to 5 cm, which is a dynamic situation; therefore even an experienced doctor cannot guarantee a one-hundred-percent puncture success rate. Conventional active or passive breath holding is difficult for patients to cooperate with, and current surgical navigation robots cannot accurately grasp the motion law of the focus, so it is difficult to find the moment at which the focus state is basically consistent with its state during channel planning; surgical precision therefore cannot be guaranteed, and the one-time execution success rate is low.
Disclosure of Invention
Purpose of the invention: in view of the above problems, the invention provides a method for judging the motion state similarity of a target point that is easy to realize, simple in calculation and low in execution difficulty, and that improves the one-time execution success rate.
Technical scheme: a method for judging the motion state similarity of a target point comprises the following steps: placing a follow-up tracer for representing the motion state of the target point; acquiring a 3D image of the patient, and planning on the 3D image to obtain a planning channel; taking the exposure time of the image slice at the end of the planning channel as the alignment time, and taking the follow-up tracer pose corresponding to the alignment time as the target pose; acquiring the effective real-time pose of the follow-up tracer and comparing it with the target pose, and, if the similarity between the effective real-time pose and the target pose meets the set condition, judging that the current motion state of the target point is similar to the motion state corresponding to the alignment time.
Specifically, the pose of the follow-up tracer is obtained with an optical tracking system, although other devices may of course be used to obtain the tracer pose.
Further, the effective real-time pose of the follow-up tracer is obtained as follows:
when the operating table has moved, the real-time pose of the follow-up tracer captured by the optical tracking system is combined with the movement amount of the operating table, and the effective real-time pose of the follow-up tracer is obtained by calculation.
Further, the similarity between the effective real-time pose and the target pose is judged as follows:
the effective real-time pose and the target pose are converted into translation transformation matrices V, V′ and rotation transformation matrices A, A′ respectively, and the similarity between the target pose and the effective real-time pose is jointly represented by the Euclidean distance ‖V′ − V‖ of the translation transformation and the Euclidean distance ‖A′ − A‖ of the rotation transformation.
Further, when ‖V′ − V‖ < a first threshold and ‖A′ − A‖ < a second threshold, the effective real-time pose is judged to be similar to the target pose.
Further, the first threshold is set to 2 mm and the second threshold to 0.2°.
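The two-threshold similarity test above can be sketched as follows (NumPy; the function name `poses_similar` is an assumption for illustration, not from the patent, and the default thresholds are the 2 mm and 0.2° values of this embodiment):

```python
import numpy as np

# Illustrative thresholds taken from this embodiment
TRANS_THRESHOLD_MM = 2.0   # first threshold: 2 mm on ||V' - V||
ROT_THRESHOLD_DEG = 0.2    # second threshold: 0.2 degrees on ||A' - A||

def poses_similar(V, A, V_prime, A_prime,
                  t_thresh=TRANS_THRESHOLD_MM, r_thresh=ROT_THRESHOLD_DEG):
    """Return True when both the translation deviation and the rotation
    deviation fall below their thresholds, i.e. when
    ||V' - V|| < first threshold and ||A' - A|| < second threshold."""
    trans_dev = np.linalg.norm(np.asarray(V_prime, dtype=float) - np.asarray(V, dtype=float))
    rot_dev = np.linalg.norm(np.asarray(A_prime, dtype=float) - np.asarray(A, dtype=float))
    return bool(trans_dev < t_thresh and rot_dev < r_thresh)
```

Comparing the translation and rotation deviations against separate thresholds, rather than mixing them into one number, avoids weighing millimetres against degrees.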
Further, a plane rectangular coordinate system is constructed with ‖V′ − V‖ as the x axis and ‖A′ − A‖ as the y axis, and the similarity between the effective real-time pose and the target pose is represented, more intuitively, by the distance between the point (‖V′ − V‖, ‖A′ − A‖) and the origin of the coordinate system.
Beneficial effects: the invention converts the change of the motion state of the target point into the pose change of the follow-up tracer caused by respiratory motion, finds, by comparing the similarity between the effective real-time pose of the follow-up tracer and the target pose, the moment at which the current motion state of the target point is close to the target motion state, and uses it to guide execution, thereby improving the one-time execution success rate.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a schematic diagram of the motion trajectory of the point p on the plane coordinate system.
Detailed Description
The invention is further elucidated with reference to the drawings and the embodiments.
FIG. 1 is a flow chart of the present invention; as shown in FIG. 1, the invention comprises the following steps:
(1) placing a follow-up tracer;
the method is characterized in that a follow-up tracer is placed on a patient, the pose of the follow-up tracer changes along with the respiration of the patient, and the motion state of a focus (target point) can be represented, and specifically, the follow-up tracer is placed on the surface of a human body close to the focus of the patient, such as the position between the navel and diaphragm muscle, and further placed 2cm above the navel.
(2) Acquiring a 3D image containing a focus of a patient and a pose of a follow-up tracer;
the patient is scanned to obtain a 3D image containing the focus of the patient, and in the process, the follow-up tracer is ensured to be always in the visual field range of the optical tracking system, so that the optical tracking system can capture the pose of the follow-up tracer.
(3) Acquiring the pose of the follow-up tracer at the corresponding moment of the alignment time as a target pose:
The doctor performs channel planning on the 3D image to obtain a planning channel, extracts the sampling time of the image slice at the end of the planning channel (i.e. the timestamp of the slice exposure) as the alignment time, and takes the pose of the follow-up tracer at the moment corresponding to the alignment time as the target pose Z₁ of the follow-up tracer.
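Pairing the alignment time with a recorded tracer pose might be done by a nearest-timestamp lookup; the sketch below assumes the tracking system records (timestamp, pose) samples, and the function name `pose_at_alignment_time` and the sample format are illustrative assumptions, not from the patent:

```python
def pose_at_alignment_time(samples, t_align):
    """samples: list of (timestamp, pose) pairs recorded by the tracking
    system. Return the pose whose timestamp is closest to t_align, the
    exposure timestamp of the slice at the end of the planning channel."""
    if not samples:
        raise ValueError("no tracer pose samples recorded")
    ts, pose = min(samples, key=lambda s: abs(s[0] - t_align))
    return pose
```

A real system would also check that the nearest sample is not too far from the alignment time, e.g. within one tracking-frame period.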
(4) Acquiring the effective real-time pose of the follow-up tracer and judging the similarity between the effective real-time pose and the target pose;
During acquisition of the 3D image, the body part near the focus must be placed inside the imaging equipment, so the patient's focus may be inaccessible for operation because of occlusion by the imaging equipment. Before the operation is performed, the position of the optical tracking system is kept fixed and the operating table is moved to a position suitable for the operation; let the movement transformation of the operating table relative to its position in step (3) be T. The movement amount T of the operating table is removed from the real-time pose Z₂ of the follow-up tracer acquired in real time by the optical tracking system, giving the effective real-time pose Z₂′ = T⁻¹Z₂ of the follow-up tracer, which eliminates the influence of the table movement.
If the patient's focus is not occluded by the imaging equipment and the operation can be carried out directly, the operating table does not need to be moved, and in this case the effective real-time pose Z₂′ of the follow-up tracer is simply the real-time pose Z₂ captured directly by the optical tracking system.
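With 4 × 4 homogeneous pose matrices, removing the table motion T amounts to a left multiplication by T⁻¹, as sketched below (NumPy; the function name `effective_pose` is an assumption for illustration):

```python
import numpy as np

def effective_pose(Z2, T=None):
    """Z2: 4x4 real-time tracer pose captured by the optical tracking system.
    T:  4x4 movement transformation of the operating table, or None when
        the table was not moved.
    Returns the effective real-time pose Z2' = T^{-1} Z2 (or Z2 unchanged
    when there is no table movement)."""
    Z2 = np.asarray(Z2, dtype=float)
    if T is None:
        return Z2
    return np.linalg.inv(np.asarray(T, dtype=float)) @ Z2
```

For a rigid transformation, T⁻¹ could also be formed in closed form (transpose the rotation block, negate and rotate the translation), which is cheaper and numerically safer than a general matrix inverse.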
The more similar the effective real-time pose of the follow-up tracer is to its target pose, the closer the patient's current breathing state is to the target breathing state corresponding to the alignment time, and the more basically consistent the motion state of the focus is with its target motion state at the alignment time; executing at that moment therefore ensures that the actual execution path is consistent with the planning channel.
The similarity between the effective real-time pose Z₂′ and the target pose Z₁ is judged as follows:
Z₂′ and Z₁ are 4 × 4 matrices. For convenience of calculation, both are converted into a translation transformation matrix and a rotation transformation matrix, represented respectively as Z₁ → (V(x, y, z), A(r, p, q)) and Z₂′ → (V′(x′, y′, z′), A′(r′, p′, q′)). The similarity between Z₂′ and Z₁ is jointly represented by the Euclidean distance ‖V′ − V‖ of the translation transformation and the Euclidean distance ‖A′ − A‖ of the rotation transformation: when ‖V′ − V‖ < the first threshold and ‖A′ − A‖ < the second threshold, Z₂′ and Z₁ are considered similar. Further, the first threshold is set to 2 mm and the second threshold to 0.2°.
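The split of a 4 × 4 pose into a translation V(x, y, z) and angles A(r, p, q) can be sketched as follows; note the patent does not specify which Euler-angle convention its (r, p, q) uses, so the ZYX (yaw-pitch-roll) convention assumed here, like the function name `decompose_pose`, is an illustrative choice:

```python
import numpy as np

def decompose_pose(Z):
    """Split a 4x4 homogeneous pose matrix into a translation vector
    V = (x, y, z) and Euler angles A = (r, p, q) in degrees.
    Assumes R = Rz(q) @ Ry(p) @ Rx(r) (ZYX convention); the patent does
    not state its convention, so this is an assumption."""
    Z = np.asarray(Z, dtype=float)
    V = Z[:3, 3]                                  # translation part
    R = Z[:3, :3]                                 # rotation part
    p = np.degrees(np.arcsin(-R[2, 0]))           # pitch
    r = np.degrees(np.arctan2(R[2, 1], R[2, 2]))  # roll
    q = np.degrees(np.arctan2(R[1, 0], R[0, 0]))  # yaw
    return V, np.array([r, p, q])
```

The deviations ‖V′ − V‖ and ‖A′ − A‖ are then ordinary Euclidean norms of the differences of these two triples.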
To represent the similarity between the effective real-time pose Z₂′ of the follow-up tracer and its target pose Z₁ visually, as shown in FIG. 2, a plane rectangular coordinate system is constructed with ‖V′ − V‖ as the abscissa and ‖A′ − A‖ as the ordinate. Driven by the patient's respiration, the effective real-time pose of the follow-up tracer changes, so the point (‖V′ − V‖, ‖A′ − A‖) moves irregularly on the plane; its motion trajectory is the irregular line shown in FIG. 2. The distance between this point and the origin of the coordinate system represents the similarity between the effective real-time pose and the target pose: the closer the point is to the origin, the more similar Z₂′ and Z₁ are, so the similarity can be seen intuitively from the motion trajectory of the point on the plane.
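The planar representation of FIG. 2 reduces the two deviations to a single point and its distance to the origin, which might be computed as below (the function name `similarity_point` is an illustrative assumption):

```python
import math

def similarity_point(trans_dev, rot_dev):
    """Map the deviations to the point (||V' - V||, ||A' - A||) on the
    plane of FIG. 2 and return the point together with its Euclidean
    distance to the origin; a smaller distance means the effective
    real-time pose is more similar to the target pose."""
    point = (trans_dev, rot_dev)
    distance = math.hypot(trans_dev, rot_dev)
    return point, distance
```

Since the abscissa is in millimetres and the ordinate in degrees, this combined distance mixes units; in the patent it serves as an intuitive visualization, while the actual accept/reject decision uses the two separate thresholds.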
In the disclosed embodiment of the invention, the pose of the follow-up tracer is acquired by an optical tracking system, but the invention is not limited thereto; in other embodiments other means may be employed to acquire the tracer pose.
By comparing the similarity between the effective real-time pose of the follow-up tracer and the target pose, the invention finds the moment at which the current breathing state is close to the target breathing state and uses it to guide execution, improving the one-time execution success rate. Only the pose of the follow-up tracer needs to be captured; the pose similarity comparison is simplified by a dimension-reduction approach, with the translation deviation and the rotation deviation compared separately and independently, making the calculation simpler. In addition, a plane rectangular coordinate system can be established from the translation deviation and the rotation deviation, so that the similarity between the effective real-time pose and the target pose is reflected intuitively by the position, on that coordinate system, of the point representing the pose similarity.
Although preferred embodiments of the present invention have been described in detail, the present invention is not limited to the details of these embodiments; various equivalent modifications can be made within the technical spirit of the present invention, and such technical solutions fall within the scope of the present invention.

Claims (7)

1. A method for judging the motion state similarity of a target point, characterized by comprising the following steps:
placing a follow-up tracer for representing the motion state of a target point;
acquiring a 3D image of a patient, and planning on the 3D image to obtain a planning channel;
taking the exposure time of the image slice at the end of the planning channel as the alignment time, and taking the follow-up tracer pose corresponding to the alignment time as the target pose;
acquiring the effective real-time pose of the follow-up tracer, comparing it with the target pose, and, if the similarity between the effective real-time pose and the target pose meets the set condition, judging that the current motion state of the target point is similar to the motion state corresponding to the alignment time.
2. The method for judging the motion state similarity of a target point according to claim 1, characterized in that the pose of the follow-up tracer is obtained with an optical tracking system.
3. The method for judging the motion state similarity of a target point according to claim 2, characterized in that the effective real-time pose of the follow-up tracer is obtained as follows: when the operating table has moved, the real-time pose of the follow-up tracer captured by the optical tracking system is combined with the movement amount of the operating table, and the effective real-time pose is obtained by calculation.
4. The method for judging the motion state similarity of a target point according to any one of claims 1 to 3, characterized in that the similarity between the effective real-time pose and the target pose is judged as follows:
the effective real-time pose and the target pose are converted into translation transformation matrices V, V′ and rotation transformation matrices A, A′ respectively, and the similarity between the target pose and the effective real-time pose is jointly represented by the Euclidean distance ‖V′ − V‖ of the translation transformation and the Euclidean distance ‖A′ − A‖ of the rotation transformation.
5. The method for judging the motion state similarity of a target point according to claim 4, characterized in that when ‖V′ − V‖ < a first threshold and ‖A′ − A‖ < a second threshold, the effective real-time pose is judged to be similar to the target pose.
6. The method according to claim 5, characterized in that the first threshold is set to 2 mm and the second threshold to 0.2°.
7. The method according to claim 4, characterized in that a plane rectangular coordinate system is constructed with ‖V′ − V‖ as the x axis and ‖A′ − A‖ as the y axis, and the similarity between the effective real-time pose and the target pose is represented by the distance between the point (‖V′ − V‖, ‖A′ − A‖) and the origin of the coordinate system.
CN202210907889.6A 2022-05-16 2022-07-29 Method for judging motion state similarity of target points Pending CN115040243A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210526390 2022-05-16
CN2022105263900 2022-05-16

Publications (1)

Publication Number Publication Date
CN115040243A true CN115040243A (en) 2022-09-13

Family

ID=83167579

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210907889.6A Pending CN115040243A (en) 2022-05-16 2022-07-29 Method for judging motion state similarity of target points

Country Status (1)

Country Link
CN (1) CN115040243A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 210000 building 3, No. 34, Dazhou Road, Yuhuatai District, Nanjing, Jiangsu Province

Applicant after: Tuodao Medical Technology Co.,Ltd.

Address before: 210000 building 3, No. 34, Dazhou Road, Yuhuatai District, Nanjing, Jiangsu Province

Applicant before: Nanjing Tuodao Medical Technology Co.,Ltd.