CN112451093A - Physical space and image space registration method for image-guided robot minimally invasive surgery - Google Patents

Physical space and image space registration method for image-guided robot minimally invasive surgery

Info

Publication number
CN112451093A
CN112451093A (application CN202110104422.3A)
Authority
CN
China
Prior art keywords
image
coordinate system
projection
space
physical space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110104422.3A
Other languages
Chinese (zh)
Other versions
CN112451093B (en)
Inventor
王静
牛田野
罗辰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN202110104422.3A priority Critical patent/CN112451093B/en
Priority to PCT/CN2021/076194 priority patent/WO2022160384A1/en
Publication of CN112451093A publication Critical patent/CN112451093A/en
Application granted granted Critical
Publication of CN112451093B publication Critical patent/CN112451093B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/107 Visualisation of planned trajectories or target regions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A61B 2034/108 Computer aided selection or customisation of medical implants or cutting guides
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Robotics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention relates to a physical space to image space registration method for image-guided robotic minimally invasive surgery, belonging to the technical field of X-ray imaging. An intraoperative CBCT scans a calibration phantom, the imaging geometric parameters are identified automatically, and a three-dimensional reconstruction algorithm model is established in the phantom coordinate system; the intraoperative CBCT then scans the surgical object, and a three-dimensional CT image is reconstructed directly in the phantom coordinate system. The calibration phantom is rigidly attached to the robot through the base, so the fixed transformation between the phantom coordinate system and the robot coordinate system needs to be calibrated only once. By using the calibration phantom to unify the physical space and the image space, the method avoids the complex acquisition and registration of paired physical-space and image-space marker coordinates required by conventional methods, greatly improves the registration accuracy of physical space to image space, simplifies the registration workflow, shortens the registration time, and causes no injury to the surgical object.

Description

Physical space and image space registration method for image-guided robot minimally invasive surgery
Technical Field
The invention relates to the technical field of X-ray imaging, in particular to a physical space to image space registration method for image-guided robotic minimally invasive surgery.
Background
Registration of physical space to image space is a key step in image-guided robotic minimally invasive surgery. The registration process relates the physical space, which contains the robot and the patient, to the image space, which contains the CT image. In cochlear implantation, for example, the drilled implant channel passes within about 0.5 mm of critical anatomical structures such as the facial nerve and the chorda tympani; the surgical path is planned and executed in physical space, while these structures are identified and located in image space, so the registration accuracy of physical space to image space largely determines whether a robotic minimally invasive procedure succeeds.
At present, physical space to image space registration is usually performed with a point-based or a surface-based method. Among point-based methods, the highest accuracy is obtained by surgically implanting fiducial markers in bone: the existing literature (Gerber, N., et al. (2013). "High-Accuracy Patient-to-Image Registration for the Facilitation of Image-Guided Robotic Microsurgery on the Head", IEEE Transactions on Biomedical Engineering, 60(4): 960-968) reports anchor screws implanted in the temporal bone as fiducial markers, achieving a target registration error (TRE) of 0.2 mm, but the implantation is invasive and injures the patient, while the accuracy obtained with skin-affixed or anatomical fiducial markers is poor. In addition, the physical-space coordinates of the fiducial markers are measured with optical or magnetic position sensors, such as high-precision optical trackers or dedicated six-degree-of-freedom sensing devices, but the three-dimensional measurement is complex, its accuracy is limited, and the cost is high. Surface-based methods acquire surface point-cloud coordinates by laser surface scanning or a tracked pointer and then register them with an iterative closest point algorithm; their accuracy is no better than that of point-based registration. For example, the existing literature (Fan, Y., et al. (2020). "A robust automated surface-matching registration method for neuronavigation", Medical Physics, 47(7): 2755-2767) obtains the head surface point cloud with a three-dimensional white-light scanner and reports a TRE of 1.17 mm after coarse and fine registration.
In summary, existing physical space to image space registration methods suffer from low accuracy, complex workflows, high cost, and injury to the body, which severely limits the development and application of image-guided robotic minimally invasive surgery in complex and delicate procedures.
Disclosure of Invention
The invention aims to provide a physical space to image space registration method for image-guided robotic minimally invasive surgery that effectively avoids the complex acquisition and registration of paired physical-space and image-space marker coordinates required by conventional methods.
In order to achieve the above object, the present invention provides a physical space to image space registration method for image-guided robotic minimally invasive surgery, comprising the following steps:
1) fixing the calibration phantom to the robot and determining the transformation between the phantom coordinate system and the robot coordinate system;
2) scanning the calibration phantom with intraoperative CBCT to obtain phantom projection images at different angles;
3) for each projection image obtained in step 2), identifying the imaging geometric parameters at each angle of the intraoperative CBCT scan with a phantom-based CBCT imaging geometric parameter calibration algorithm, and solving the transformation between the phantom coordinate system and the detector coordinate system;
4) correcting the CBCT three-dimensional reconstruction algorithm with the intraoperative CBCT imaging geometric parameters obtained in step 3), and establishing a three-dimensional reconstruction algorithm model in the phantom coordinate system;
5) scanning the surgical object with intraoperative CBCT to obtain body projection images at different angles, the acquisition angles corresponding one-to-one with the phantom projection angles in step 2);
6) substituting the body projection images obtained in step 5) into the three-dimensional reconstruction algorithm model established in step 4), and reconstructing a three-dimensional CT image of the surgical object in the phantom coordinate system;
7) obtaining the relative position of the robot and the internal structures of the surgical object from the transformation between the phantom coordinate system and the robot coordinate system.
In this scheme, no fiducial markers and no three-dimensional measuring device are required. The intraoperative CBCT scans the calibration phantom, the imaging geometric parameters are identified automatically, and a three-dimensional reconstruction algorithm model is established in the phantom coordinate system; the intraoperative CBCT then scans the surgical object, and the three-dimensional CT image is reconstructed directly in the phantom coordinate system. The calibration phantom is rigidly attached to the robot through the base, so the fixed transformation between the phantom coordinate system and the robot coordinate system needs to be calibrated only once.
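Since the CT volume is reconstructed directly in the phantom coordinate system, registration reduces to applying this single, once-calibrated rigid transform. The following minimal sketch (Python with NumPy; the matrix values and the name T_robot_phantom are illustrative assumptions, not taken from the patent) shows how a target identified in the image maps into robot coordinates:

```python
import numpy as np

def homogeneous(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Phantom frame -> robot frame, calibrated once when the phantom is fixed to
# the robot base (identity rotation and this translation are made-up values).
T_robot_phantom = homogeneous(np.eye(3), np.array([120.0, -35.0, 410.0]))  # mm

# The CT volume is reconstructed in the phantom frame, so a target picked in
# the image is already expressed in phantom coordinates (mm) and maps to
# robot coordinates with one matrix product, without paired-point matching.
target_phantom = np.array([0.0, 0.0, 39.5, 1.0])   # homogeneous image point
target_robot = T_robot_phantom @ target_phantom
print(target_robot[:3])                             # [120.  -35.  449.5]
```

In a conventional point-based workflow this product would instead have to be re-derived for every patient from freshly acquired paired marker coordinates.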
Optionally, in one embodiment, the calibration phantom comprises a cylindrical barrel and spheres regularly arranged inside it, with a mounting seat at the bottom of the barrel for connection to the robot base.
The spheres are arranged as follows (see the sketch after this list):
a sphere at each end of the barrel's central axis and a sphere at the barrel's center;
a ring of spheres evenly spaced around the top end cover and around the bottom end cover.
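For illustration, the sketch below generates nominal sphere-center coordinates in the phantom frame for this arrangement. The ring radius and the number of ring spheres are assumed placeholders; only the axial positions of ±39.5 mm come from the embodiment described later:

```python
import numpy as np

def phantom_markers(radius: float = 40.0, half_height: float = 39.5,
                    n_ring: int = 12) -> np.ndarray:
    """Nominal sphere-center coordinates (mm) in the phantom frame.

    One sphere at each end of the central axis, one at the center, and a ring
    of n_ring spheres around each end cover, as in the described arrangement.
    """
    axis_spheres = np.array([[0.0, 0.0,  half_height],
                             [0.0, 0.0, -half_height],
                             [0.0, 0.0,  0.0]])
    ang = 2.0 * np.pi * np.arange(n_ring) / n_ring
    ring_xy = np.stack([radius * np.cos(ang), radius * np.sin(ang)], axis=1)
    top = np.column_stack([ring_xy, np.full(n_ring,  half_height)])
    bottom = np.column_stack([ring_xy, np.full(n_ring, -half_height)])
    return np.vstack([axis_spheres, top, bottom])

print(phantom_markers().shape)   # (27, 3) with the default n_ring = 12
```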
Further, step 3) comprises:
3-1) segmenting the projection of each sphere of the calibration phantom in each projection image;
3-2) computing the center coordinates of each sphere projection;
3-3) solving the imaging geometric parameters and the transformation matrix between the phantom coordinate system and the detector coordinate system from the sphere-projection center coordinates and the geometric relations among the sphere projections.
The segmentation in step 3-1) can use a gray-threshold method, an edge detection plus Hough circle detection method, or a U-Net fully convolutional network.
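As one concrete instance of the edge detection plus Hough circle option, the sketch below applies OpenCV's HoughCircles to a single projection image; every threshold and radius parameter is an assumption to be tuned to the actual detector resolution and sphere size:

```python
import cv2
import numpy as np

def segment_sphere_projections(proj: np.ndarray) -> np.ndarray:
    """Detect the circular shadows of the phantom spheres in one projection.

    Returns an (N, 3) array of (x, y, radius) candidates, or an empty array
    when nothing is found. Parameter values are illustrative only.
    """
    img = cv2.normalize(proj, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    img = cv2.medianBlur(img, 5)                      # suppress detector noise
    circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=20,
                               param1=100, param2=30, minRadius=5, maxRadius=30)
    return np.empty((0, 3)) if circles is None else circles[0]
```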
In step 3-2), the center coordinates of each sphere projection are computed by weighting with pixel gray values.
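A minimal sketch of such gray-value-weighted center estimation, assuming a small patch cut around one detected circle and expressed in the attenuation (log) domain, where the sphere appears bright; invert the patch if the projection stores transmitted intensity instead:

```python
import numpy as np

def weighted_centroid(patch: np.ndarray) -> tuple:
    """Sub-pixel (x, y) center of one sphere projection by gray-value weighting.

    A sphere attenuates most along its central ray, so the background-
    subtracted intensity of each pixel serves as its weight.
    """
    w = patch.astype(float) - patch.min()     # remove the background offset
    ys, xs = np.indices(patch.shape)          # row (y) and column (x) grids
    total = w.sum() + 1e-12                   # guard against an empty patch
    return float((xs * w).sum() / total), float((ys * w).sum() / total)
```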
Further, in step 4) the CBCT three-dimensional reconstruction algorithm is corrected in the form of a homogeneous transformation matrix; step 4) comprises:
4-1) in the phantom coordinate system, applying cosine correction and row-by-row linear shift-invariant filtering to the projection data using the imaging geometric parameters obtained in step 3-3);
4-2) transforming the focal-spot coordinates and the coordinates of the voxels to be reconstructed from the phantom coordinate system into the detector coordinate system using the imaging geometric parameters and the transformation matrix obtained in step 3-3);
4-3) in the detector coordinate system, correcting the weighted back-projection according to where the line through the focal spot and each voxel intersects the detector plane, on the basis of a voxel-driven back-projection, and traversing and solving the voxels with a fast incremental algorithm to obtain the three-dimensional reconstruction algorithm model (see the sketch below).
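The geometric core of steps 4-2) and 4-3) is mapping each voxel, through the phantom-to-detector transform solved in step 3-3), to the point where the ray from the focal spot through the voxel meets the detector plane. Below is a sketch under the assumption that the detector plane is z = 0 in the detector frame with in-plane axes (u, v); all names are illustrative placeholders, not the patent's notation:

```python
import numpy as np

def voxel_to_detector(T_det_phantom: np.ndarray,
                      focal_phantom: np.ndarray,
                      voxel_phantom: np.ndarray) -> tuple:
    """Map one voxel to its (u, v) intersection on the detector plane.

    T_det_phantom is the 4x4 phantom -> detector homogeneous transform;
    focal_phantom and voxel_phantom are 3-vectors in the phantom frame.
    """
    f = (T_det_phantom @ np.append(focal_phantom, 1.0))[:3]  # focal spot
    p = (T_det_phantom @ np.append(voxel_phantom, 1.0))[:3]  # voxel
    d = p - f                                                # ray direction
    s = -f[2] / d[2]            # ray parameter where z = 0 (detector plane)
    hit = f + s * d
    return float(hit[0]), float(hit[1])
```

The weighted back-projection then accumulates the filtered projection value interpolated at (u, v) into that voxel; the fast incremental algorithm exploits the fact that (u, v) changes by a nearly constant increment between neighboring voxels along a row, so the full transform need not be recomputed for every voxel.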
The imaging geometric parameters comprise the focal-spot coordinates, the detector center coordinates, and the detector deflection angle of the calibration phantom scan.
Compared with the prior art, the invention has the following advantages:
The invention unifies the physical space and the image space through the calibration phantom: the phantom is used to identify the intraoperative CBCT imaging geometric parameters, a three-dimensional reconstruction algorithm model is established in the phantom coordinate system from those parameters, and the three-dimensional CT image of the surgical object is reconstructed directly in the phantom coordinate system. This avoids the complex acquisition and registration of paired physical-space and image-space marker coordinates required by conventional methods, greatly improves the registration accuracy of physical space to image space, simplifies the registration workflow, shortens the registration time, causes no injury to the surgical object, and can effectively promote the development and application of image-guided robotic minimally invasive surgery in complex and delicate procedures.
Drawings
Fig. 1 is a flowchart of the physical space to image space registration method for image-guided robotic minimally invasive surgery in an embodiment of the invention.
FIG. 2 is a schematic structural diagram of a calibration phantom in an embodiment of the present invention.
Fig. 3 is a flow chart of obtaining imaging geometry parameters in an embodiment of the present invention.
FIG. 4 is a flowchart of obtaining CT images of a surgical object according to an embodiment of the present invention.
FIG. 5 is a schematic diagram of the intraoperative CBCT scanning the calibration phantom in an embodiment of the present invention.
FIG. 6 is a schematic diagram of the intraoperative CBCT scanning the surgical object in an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention more apparent, the present invention is further described below with reference to the embodiments and the accompanying drawings. It should be understood that the described embodiments are only some, not all, of the embodiments of the present invention; all other embodiments obtained by a person of ordinary skill in the art from the described embodiments without inventive effort fall within the scope of protection of the invention.
Unless defined otherwise, technical or scientific terms used herein have the ordinary meaning understood by a person of ordinary skill in the art to which the invention belongs. The words "comprise", "comprises", and the like in this application mean that the elements or items preceding the word encompass the elements or items listed after the word, without excluding other elements or items.
Examples
The calibration phantom 100 used in this embodiment is shown in fig. 2. It comprises a cylindrical barrel 4 with spheres regularly arranged inside; the top of the barrel 4 carries an upper end cover 1, and the bottom carries a lower end cover 9 and a mounting seat 10 for connection to the robot 200. The spheres comprise: an upper central ball 3 and a lower central ball 7 at the two ends of the central column 6, a middle central ball 5 at the center of the column 6, an upper ring of balls 2 evenly distributed around the upper end cover 1, and a lower ring of balls 8 evenly distributed around the lower end cover 9.
Of course, the invention is not limited to the above calibration phantom, which is only one embodiment of the invention.
Referring to fig. 1, the physical space to image space registration method for image-guided robotic minimally invasive surgery in this embodiment comprises the following steps:
S1: referring to fig. 5, the calibration phantom 100 is fixed on the base 201 connected to the robot 200 and lies within the field of view of the intraoperative CBCT 400. From its projection images, the calibration phantom 100 allows all the imaging geometric parameters to be calibrated, including the focal-spot coordinates, the detector center coordinates, and the detector deflection angle.
S2: the intraoperative CBCT 400 scans the calibration phantom 100 to obtain a number of projection images at different angles over a 360° range. The projection angle at which each projection is acquired can be read from a tilt sensor mounted on the intraoperative CBCT 400.
S3: for each projection image obtained in step S2, the imaging geometric parameters at each angle of the intraoperative CBCT 400 scan are automatically identified with the CBCT imaging geometric parameter calibration algorithm based on the phantom shown in fig. 2. Referring to fig. 3, this step comprises the following sub-steps:
S31: in each projection image, segment the projection of each ball of the calibration phantom with a gray-threshold method, an edge detection plus Hough circle detection method, or a U-Net fully convolutional network;
S32: for each ball projection, compute its center coordinates by pixel gray-value weighting;
S33: from the ball-projection center coordinates and the geometric relations among the ball projections, solve the imaging geometric parameters (focal-spot coordinates, detector center coordinates, detector deflection angle, etc.) and the transformation matrix between the phantom coordinate system and the detector coordinate system.
S4: as shown in fig. 4, the CBCT three-dimensional reconstruction algorithm is corrected in the form of a homogeneous transformation matrix according to the actual imaging geometric parameters of the intraoperative CBCT 400 obtained in step S3, and a three-dimensional reconstruction algorithm model in the phantom coordinate system is established. This comprises the following sub-steps:
S41: in the phantom coordinate system, apply cosine correction and row-by-row linear shift-invariant filtering to the projection data using the imaging geometric parameters obtained in step S33;
S42: transform the focal-spot coordinates and the coordinates of the voxels to be reconstructed from the phantom coordinate system into the detector coordinate system using the imaging geometric parameters and the transformation matrix obtained in step S33;
S43: in the detector coordinate system, carry out the voxel-driven back-projection according to where the line through the focal spot and each voxel intersects the detector plane, correct the weighted back-projection accordingly, and traverse and solve the voxels with a fast incremental algorithm.
S5: referring to fig. 6, the intraoperative CBCT 400 scans the surgical object 300 to obtain a number of projection images at different angles over a 360° range; the acquisition angles correspond one-to-one with the projection angles of step S2, and the position of the intraoperative CBCT 400 while scanning the surgical object 300 is the same as its position while scanning the calibration phantom 100 in step S2. As before, each projection angle can be read from the tilt sensor mounted on the intraoperative CBCT 400.
S6: the projection images obtained in step S5 are substituted into the three-dimensional reconstruction algorithm model in the phantom coordinate system established in step S4, and the three-dimensional CT image of the surgical object 300 is reconstructed in the phantom coordinate system. Because the CT image is reconstructed directly in the phantom coordinate system, the physical space and the image space are unified: an anatomical structure identified in the image space is located directly in the physical space defined by the phantom coordinate system.
S7: the relative position of the robot 200 with respect to the surgical object 300 and its anatomical structures is obtained from the transformation determined between the phantom coordinate system and the robot coordinate system. This fixed transformation in physical space is determined once, when the surgical robot is set up.
This embodiment also gives the results of a target-test-ball experiment performed with the physical space to image space registration method while scanning the calibration phantom. The target test balls are the upper central ball 3 and the lower central ball 7 of the calibration phantom shown in fig. 2. The origin O of the phantom coordinate system is placed at the center of the middle central ball 5, the phantom axis is taken as the Z axis, and the middle plane as the XOY plane, so the physical-space center coordinates are (0, 0, 39.5) for the upper central ball 3 and (0, 0, -39.5) for the lower central ball 7 (the gold standard). In the three-dimensional CT reconstruction of the calibration phantom in the phantom coordinate system, the linear attenuation of the acrylic used for the upper end cover 1, lower end cover 9, barrel 4, central column 6, and mounting seat 10 is about 0.021, while that of the balls is about 0.56, so the ball voxels can be segmented from the reconstructed volume with a voxel threshold (0.1) and the image-space center coordinates obtained by voxel-value weighting. In image space, the center of the upper central ball 3 is at (-0.0253, -0.0474, 39.5163) and the center of the lower central ball 7 at (-0.0254, -0.0299, -39.4921). The target registration error (TRE) between physical space and image space is then, for the upper central ball 3,

$\mathrm{TRE} = \sqrt{(-0.0253 - 0)^2 + (-0.0474 - 0)^2 + (39.5163 - 39.5)^2} = 0.056\ \mathrm{mm},$

and for the lower central ball 7,

$\mathrm{TRE} = \sqrt{(-0.0254 - 0)^2 + (-0.0299 - 0)^2 + (-39.4921 + 39.5)^2} = 0.040\ \mathrm{mm}.$

The intraoperative CBCT scan of the calibration phantom takes 2 minutes, the automatic geometric parameter identification and CT reconstruction algorithm correction program takes 0.5 minute, the intraoperative CBCT scan of the surgical object takes 2 minutes, and the reconstruction of the surgical object's three-dimensional CT image takes 0.5 minute, so the whole registration process takes 5 minutes.
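The reported TRE values can be verified directly from the quoted coordinates; a short check in Python with NumPy:

```python
import numpy as np

# Gold-standard (physical-space) and measured (image-space) ball centers in
# mm, as quoted in the embodiment above.
gold = {"upper": np.array([0.0, 0.0, 39.5]),
        "lower": np.array([0.0, 0.0, -39.5])}
measured = {"upper": np.array([-0.0253, -0.0474, 39.5163]),
            "lower": np.array([-0.0254, -0.0299, -39.4921])}

for name in ("upper", "lower"):
    tre = np.linalg.norm(measured[name] - gold[name])  # Euclidean distance
    print(f"{name} central ball TRE = {tre:.3f} mm")
# upper central ball TRE = 0.056 mm
# lower central ball TRE = 0.040 mm
```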
In the physical space to image space registration of the target test balls, the registration accuracy obtained with the calibration-phantom-based method of this embodiment is clearly superior to that of conventional methods (typically several millimeters for surface registration and several tenths of a millimeter for point registration). Moreover, the registration workflow is simple: no fiducial markers need to be implanted surgically and no high-precision three-dimensional measuring device is needed, which effectively shortens the registration time and ensures the repeatability and reliability of the result. A conventional point registration method adds a preoperative step to implant the fiducial markers, a surface registration method measures the physical-space surface point cloud manually, and their registration typically takes 20-30 minutes.
It should be understood that the above examples are given only for clarity of illustration and do not limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaust all embodiments here, and obvious variations or modifications derived therefrom remain within the scope of protection of the invention.

Claims (9)

1. A physical space to image space registration method for image-guided robotic minimally invasive surgery, comprising the following steps:
1) fixing the calibration phantom to the robot and determining the transformation between the phantom coordinate system and the robot coordinate system;
2) scanning the calibration phantom with intraoperative CBCT to obtain phantom projection images at different angles;
3) for each projection image obtained in step 2), identifying the imaging geometric parameters at each angle of the intraoperative CBCT scan with a phantom-based CBCT imaging geometric parameter calibration algorithm, and solving the transformation between the phantom coordinate system and the detector coordinate system;
4) correcting the CBCT three-dimensional reconstruction algorithm with the intraoperative CBCT imaging geometric parameters obtained in step 3), and establishing a three-dimensional reconstruction algorithm model in the phantom coordinate system;
5) scanning the surgical object with intraoperative CBCT to obtain body projection images at different angles, the acquisition angles corresponding one-to-one with the phantom projection angles in step 2);
6) substituting the body projection images obtained in step 5) into the three-dimensional reconstruction algorithm model established in step 4), and reconstructing a three-dimensional CT image of the surgical object in the phantom coordinate system;
7) obtaining the relative position of the robot and the internal structures of the surgical object from the transformation between the phantom coordinate system and the robot coordinate system.
2. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 1, wherein the calibration phantom comprises a cylindrical barrel and spheres regularly arranged inside it, and a mounting seat for connection to the robot base is provided at the bottom of the barrel.
3. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 2, wherein the spheres are arranged as follows:
a sphere at each end of the barrel's central axis and a sphere at the barrel's center;
a ring of spheres evenly spaced around the top end cover and around the bottom end cover.
4. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 2, wherein step 3) comprises:
3-1) segmenting the projection of each sphere of the calibration phantom in each projection image;
3-2) computing the center coordinates of each sphere projection;
3-3) solving the imaging geometric parameters and the transformation matrix between the phantom coordinate system and the detector coordinate system from the sphere-projection center coordinates and the geometric relations among the sphere projections.
5. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 4, wherein the segmentation in step 3-1) uses a gray-threshold method, an edge detection plus Hough circle detection method, or a U-Net fully convolutional network.
6. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 4, wherein in step 3-2) the center coordinates of each sphere projection are computed by pixel gray-value weighting.
7. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 4, wherein step 4) comprises:
4-1) in the phantom coordinate system, applying cosine correction and row-by-row linear shift-invariant filtering to the projection data using the imaging geometric parameters obtained in step 3-3);
4-2) transforming the focal-spot coordinates and the coordinates of the voxels to be reconstructed from the phantom coordinate system into the detector coordinate system using the imaging geometric parameters and the transformation matrix obtained in step 3-3);
4-3) in the detector coordinate system, correcting the weighted back-projection according to where the line through the focal spot and each voxel intersects the detector plane, on the basis of a voxel-driven back-projection, and traversing and solving the voxels with a fast incremental algorithm to obtain the three-dimensional reconstruction algorithm model.
8. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to claim 1, wherein in step 4) the CBCT three-dimensional reconstruction algorithm is corrected in the form of a homogeneous transformation matrix.
9. The physical space to image space registration method for image-guided robotic minimally invasive surgery according to any one of claims 1-8, wherein the imaging geometric parameters comprise the focal-spot coordinates, the detector center coordinates, and the detector deflection angle of the calibration phantom scan.
CN202110104422.3A 2021-01-26 2021-01-26 Physical space and image space registration method for image-guided robot minimally invasive surgery Active CN112451093B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110104422.3A CN112451093B (en) 2021-01-26 2021-01-26 Physical space and image space registration method for image-guided robot minimally invasive surgery
PCT/CN2021/076194 WO2022160384A1 (en) 2021-01-26 2021-02-09 Physical space and image space registration method for image-guided robotic minimally invasive surgical operation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110104422.3A CN112451093B (en) 2021-01-26 2021-01-26 Physical space and image space registration method for image-guided robot minimally invasive surgery

Publications (2)

Publication Number Publication Date
CN112451093A 2021-03-09
CN112451093B (en) 2021-05-04

Family

ID=74802758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110104422.3A Active CN112451093B (en) 2021-01-26 2021-01-26 Physical space and image space registration method for image-guided robot minimally invasive surgery

Country Status (2)

Country Link
CN (1) CN112451093B (en)
WO (1) WO2022160384A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113143463A (en) * 2021-03-16 2021-07-23 上海交通大学 Operation navigation device, system, calibration method, medium and electronic equipment
CN113643428A (en) * 2021-08-17 2021-11-12 北京唯迈医疗设备有限公司 Full-parameter geometric calibration method suitable for multi-degree-of-freedom cone beam CT
CN113855288A (en) * 2021-11-01 2021-12-31 杭州柳叶刀机器人有限公司 Image generation method, image generation device, electronic equipment and storage medium
CN113963056A (en) * 2021-09-07 2022-01-21 于留青 CT image reconstruction method, device, electronic equipment and storage medium
CN115005991A (en) * 2022-08-03 2022-09-06 北京壹点灵动科技有限公司 Precision detection method of surgical navigation device and surgical navigation precision detection device
CN116672082A (en) * 2023-07-24 2023-09-01 苏州铸正机器人有限公司 Navigation registration method and device of operation navigation ruler
CN117338422A (en) * 2023-10-30 2024-01-05 赛诺威盛医疗科技(扬州)有限公司 Space registration and kinematics solver control method, system and device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115795579B (en) * 2022-12-23 2023-06-27 岭南师范学院 Rapid coordinate alignment method for measuring error analysis of featureless complex curved surface
CN115880469B (en) * 2023-02-20 2023-05-09 江苏省人民医院(南京医科大学第一附属医院) Registration method of surface point cloud data and three-dimensional image
CN116993791B (en) * 2023-09-28 2023-12-19 真健康(北京)医疗科技有限公司 Medical image registration method and equipment based on body surface positioning device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104083216A (en) * 2014-07-03 2014-10-08 北京天智航医疗科技股份有限公司 Location ruler used in operation
US20140312229A1 (en) * 2011-12-21 2014-10-23 Rolls-Royce Plc Position measurement
EP3295887A1 (en) * 2016-09-16 2018-03-21 Globus Medical, Inc. Robotic fluoroscopic navigation
CN111388091A (en) * 2020-03-17 2020-07-10 京东方科技集团股份有限公司 Optical scale and coordinate system registration method
CN111388089A (en) * 2020-03-19 2020-07-10 京东方科技集团股份有限公司 Treatment equipment and registration method and registration device thereof

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140312229A1 (en) * 2011-12-21 2014-10-23 Rolls-Royce Plc Position measurement
CN104083216A (en) * 2014-07-03 2014-10-08 北京天智航医疗科技股份有限公司 Location ruler used in operation
EP3295887A1 (en) * 2016-09-16 2018-03-21 Globus Medical, Inc. Robotic fluoroscopic navigation
CN111388091A (en) * 2020-03-17 2020-07-10 京东方科技集团股份有限公司 Optical scale and coordinate system registration method
CN111388089A (en) * 2020-03-19 2020-07-10 京东方科技集团股份有限公司 Treatment equipment and registration method and registration device thereof

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113143463A (en) * 2021-03-16 2021-07-23 上海交通大学 Operation navigation device, system, calibration method, medium and electronic equipment
CN113643428A (en) * 2021-08-17 2021-11-12 北京唯迈医疗设备有限公司 Full-parameter geometric calibration method suitable for multi-degree-of-freedom cone beam CT
CN113963056A (en) * 2021-09-07 2022-01-21 于留青 CT image reconstruction method, device, electronic equipment and storage medium
CN113963056B (en) * 2021-09-07 2022-08-26 于留青 CT image reconstruction method, device, electronic equipment and storage medium
CN113855288A (en) * 2021-11-01 2021-12-31 杭州柳叶刀机器人有限公司 Image generation method, image generation device, electronic equipment and storage medium
CN115005991A (en) * 2022-08-03 2022-09-06 北京壹点灵动科技有限公司 Precision detection method of surgical navigation device and surgical navigation precision detection device
CN116672082A (en) * 2023-07-24 2023-09-01 苏州铸正机器人有限公司 Navigation registration method and device of operation navigation ruler
CN116672082B (en) * 2023-07-24 2024-03-01 苏州铸正机器人有限公司 Navigation registration method and device of operation navigation ruler
CN117338422A (en) * 2023-10-30 2024-01-05 赛诺威盛医疗科技(扬州)有限公司 Space registration and kinematics solver control method, system and device
CN117338422B (en) * 2023-10-30 2024-04-05 赛诺威盛医疗科技(扬州)有限公司 Space registration and kinematics solver control method, system and device

Also Published As

Publication number Publication date
CN112451093B (en) 2021-05-04
WO2022160384A1 (en) 2022-08-04

Similar Documents

Publication Publication Date Title
CN112451093B (en) Physical space and image space registration method for image-guided robot minimally invasive surgery
US9542743B2 (en) Calibration and transformation of a camera system's coordinate system
WO2018218611A1 (en) Geometric parameter determination method for cone beam computed tomography system
US5901199A (en) High-speed inter-modality image registration via iterative feature matching
US20040171927A1 (en) Method and apparatus for measuring and compensating for subject motion during scanning
CN112258593B (en) CT or PET-CT intelligent positioning scanning method under monocular camera
Voie et al. Three-dimensional reconstruction of the cochlea from two-dimensional images of optical sections
CN110464462B (en) Image navigation registration system for abdominal surgical intervention and related device
US9355454B2 (en) Automatic estimation of anatomical extents
CN104771189A (en) Three-dimensional head image alignment method and device
CN112617877B (en) Autonomous scanning method of mobile CT system, storage medium and CT scanning device
CN116883471B (en) Line structured light contact-point-free cloud registration method for chest and abdomen percutaneous puncture
CN109829922B (en) Brain image redirection method, device, equipment and storage medium
Martinez et al. Super resolution for improved positioning of an mri-guided spinal cellular injection robot
Banks Model based 3D kinematic estimation from 2D perspective silhouettes: application with total knee prostheses
CN116529756A (en) Monitoring method, device and computer storage medium
US8311365B2 (en) Analysis method for regional image
Tomaževič et al. "Gold Standard" 2D/3D registration of X-ray to CT and MR images
CN116452755B (en) Skeleton model construction method, system, medium and equipment
CN110992406B (en) Radiotherapy patient positioning rigid body registration algorithm based on region of interest
US20230398376A1 (en) Methods and systems for radiation therapy guidance
US20230154019A1 (en) Method and navigation system for registering two-dimensional image data set with three-dimensional image data set of body of interest
EP4187496A1 (en) System and method for autonomous identification of heterogeneous phantom regions
Chaoui et al. Virtual movements-based calibration method of ultrasound probe for computer assisted surgery
Maurer Jr Registration of multimodal three-dimensional medical images using points and surfaces

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant