CN114224489B - Track tracking system for surgical robot and tracking method using same - Google Patents
- Publication number
- CN114224489B (application number CN202111520421.3A)
- Authority
- CN
- China
- Prior art keywords
- surgical
- marker
- surgical instrument
- dimensional
- binocular
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/30—Surgical robots
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B90/39—Markers, e.g. radio-opaque or breast lesions markers
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/102—Modelling of surgical devices, implants or prosthesis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2090/3983—Reference marker arrangements for use with image guided surgery
Abstract
The invention relates to trajectory-tracking technology for surgical robots, and provides a trajectory tracking system for a surgical robot together with a tracking method using the system. The system comprises a plurality of sheet-shaped markers whose surfaces bear checkerboard patterns; targets, namely surgical instruments and surgical sites; a binocular imaging device; and a processing unit connected to the binocular imaging device by a signal line, which processes the images acquired by the binocular imaging device, calculates the position and spatial relationship of each marker, and, after matching against the target model, obtains the three-dimensional coordinate position and spatial relationship of the target. By combining relatively miniature markers with binocular imaging equipment, the invention achieves high-precision tracking without altering the appearance of the surgical instruments or the surgical site. The method reduces interference with the clinical surgical workflow, allows any surgical instrument or surgical site to be marked and located, and is comparatively unconstrained by the surgical space.
Description
Technical Field
The present invention relates to trajectory-tracking technology for surgical robots, and more particularly to a trajectory tracking system that acquires the spatial position information and posture information of a target object by tracking markers attached to the surgical instrument and the surgical site, and to a tracking method using the system.
Background
When tumor ablation surgery is performed, ablation surgical robots have been studied in the industry in order to reduce the surgeon's workload and improve the accuracy of the procedure. When operating with an ablation surgical robot, minimizing surgical risk and improving precision requires a navigator that accurately tracks and detects the spatial position information and posture information of the surgical instrument and of the markers at the surgical site, and then guides the instrument precisely to the affected part of the patient.
To this end, it is generally necessary to construct a tracking system comprising: markers for labelling the surgical instruments and surgical sites; an optical locator for recording the optical information of the markers; and a processor for processing the relationships between the markers, acquiring the three-dimensional coordinates and posture information of the marked surgical instrument and surgical site, calculating the spatial relationship between them, and guiding the subsequent motion plan of the surgical robot.
In the field of optical marker tracking, Northern Digital Inc. (NDI) is a technology leader, and the NDI optical locators and matching Track software it produces have been applied to surgical robots. The tools used with such an optical locator carry optical marker spheres; typically several (four) spheres form one marker group, which occupies roughly 10 cm × 1 cm and is attached externally to the surgical instrument or the surgical site to mark its posture and position. For conventional ablation instruments and surgical sites, however, such a marker group is oversized, affects the actual operation, and disrupts the clinical workflow, which restricts its application.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art by providing a trajectory tracking system for a surgical robot and a tracking method using the system.
To solve this technical problem, the invention adopts the following solution:
A trajectory tracking system for a surgical robot is provided, comprising: a plurality of sheet-shaped markers whose surfaces bear checkerboard patterns, to be attached to different surfaces of a target object whose spatial position information and posture information are to be determined, the targets comprising surgical instruments and surgical sites; a binocular imaging device for recording live views of each marker; and a processing unit connected to the binocular imaging device by a signal line, which processes the images acquired by the binocular imaging device, calculates the position and spatial relationship of each marker, and, after matching against the target model, obtains the three-dimensional coordinate position and spatial relationship of the target.
Preferably, the checkerboard pattern on the marker surface comprises at least 2 × 3 alternating black and white squares, each square having a side length of 0.45-0.75 mm.
Preferably, there are at least 4 of said markers on each target.
Preferably, the marker is adhesive-backed paper printed with a checkerboard pattern, stuck directly onto the target surface or onto an attachment on that surface; or the marker is a metal or plastic sheet printed with the checkerboard pattern and fixed to the target surface by a snap fit; or the marker is a checkerboard-patterned coating printed or sprayed onto the surface of the surgical instrument.
Preferably, the resolution of the binocular imaging device is at least 65 megapixels.
Preferably, the processing unit is a computer, or another processor with sufficient computing power.
Preferably, the system further comprises an operating lamp serving as the light source, with an illumination intensity of more than 10,000 lux.
The invention further provides a tracking method using the trajectory tracking system, comprising the following steps:
(1) Attach markers to different surfaces of the surgical instrument and the surgical site, and place the surgical instrument and the surgical site in a calibration position;
(2) Photograph the surgical instrument and the surgical site with the binocular imaging device, process the resulting images in the processing unit, calculate the three-dimensional coordinate position and spatial relationship of each marker at the calibration position, and model from these a target model of the surgical instrument and the surgical site in the calibration position;
(3) During operation of the surgical robot, acquire real-time images of the surgical instrument and the surgical site through the binocular imaging device; process the resulting images in the processing unit and calculate the real-time three-dimensional coordinate position and spatial relationship of each marker; after matching and comparison with the target model, obtain the real-time three-dimensional coordinate position and spatial relationship of the surgical instrument;
(4) Input the result of step (3) into the control system of the surgical robot as trajectory data of the surgical instrument, and then track and adjust the operating motion of the surgical robot according to the preset surgical plan.
Preferably, the three-dimensional coordinate position and spatial relationship of each marker are determined as follows:
(1) Calculate the three-dimensional posture of the marker using the rules of projection of a three-dimensional object onto a two-dimensional plane; the posture consists of three vectors along the marker's long side, its short side, and the normal to its plane, so each captured and identified marker is represented by a unique triple of three-dimensional vectors;
(2) Using the extrinsic matrix obtained from calibration of the binocular imaging device, convert the three-vector representation of a marker acquired by the imaging unit of one camera into the imaging unit of the other camera; if the three vectors agree, the marker match is considered successful; then compute the marker's world coordinates from the binocular camera projection rules;
(3) Form a three-dimensional point cloud from the computed world coordinates of these markers;
(4) Obtain the target model of the surgical instrument and the surgical site in the calibration position from the three-dimensional point cloud; or match and compare the point cloud computed in real time against the target model to obtain the real-time coordinate information and posture information of the surgical instrument.
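Step (1) above can be sketched as follows: given three ordered corner points of a sheet marker recovered in 3D, the unit vectors along the long side, the short side, and the plane normal form the marker's unique pose triple. This is an illustrative sketch; the function name and the assumed corner ordering are not from the patent.

```python
import numpy as np

def marker_pose_vectors(corners3d):
    """Pose triple of a sheet marker from three ordered 3D corners:
    p0 -> p1 runs along the long side, p1 -> p2 along the short side.
    Returns unit vectors (long side, short side, plane normal)."""
    p0, p1, p2 = (np.asarray(p, dtype=float) for p in corners3d[:3])
    long_v = p1 - p0
    short_v = p2 - p1
    normal = np.cross(long_v, short_v)  # perpendicular to the marker plane

    def unit(v):
        return v / np.linalg.norm(v)

    return unit(long_v), unit(short_v), unit(normal)
```

For a planar rectangular marker the three vectors are mutually orthogonal, which is what makes the triple a convenient per-marker pose signature for the matching test in step (2).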
Description of the inventive principles:
In the present invention, markers of comparatively "miniature" volume are adhered to different surfaces of the targets (surgical instruments and surgical sites); the binocular imaging device acquires images, and the processing unit calibrates, computes, matches, and compares the position and spatial relationship of each marker, achieving high-precision tracking.
In the invention, the target in the calibration state is first imaged and its three-dimensional coordinate position and posture information are modelled, the modelled set of miniature marker points serving as the base model. Then, during the actual operation, the binocular imaging device images the markers, and the real-time three-dimensional coordinate position and posture information of the target are computed by matching the three-dimensional coordinate set of the marker points against the base model, for use in operating the surgical robot.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention combines relatively "miniature" markers with a binocular imaging device, in contrast to the larger marker spheres and optical locator systems. High-precision tracking is achieved without changing the appearance of the surgical instrument or the surgical site.
2. The invention computes the three-dimensional coordinate position and posture information of the target (surgical instrument and surgical site). It therefore reduces interference with the clinical surgical workflow, allows any surgical instrument or surgical site to be marked and located, and is comparatively unconstrained by the surgical space.
Drawings
FIG. 1 is a diagram of a tracking system of an embodiment of the present invention.
FIG. 2 is an illustration of miniature markers attached to a target.
FIG. 3 is a block diagram of steps for calculating three-dimensional coordinate position and pose information of a target object.
Reference numerals: 1 a surgical instrument; 2, an operation position; 3 a marker; 4 binocular camera equipment; 5 a processing unit; 6, an operating lamp; 31-34 marker individuals.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in more detail with reference to the accompanying drawings.
In the present invention, the targets comprise a surgical instrument 1 and a surgical site 2. The surgical instrument 1 is a medical instrument, such as an ablation needle or puncture needle, attached to the distal end of the surgical robot's manipulator. During surgery the non-operative area of the body is draped, leaving exposed the surgical site 2 (the local region of the patient's body where the operation is performed), so a separate marker can be used for the surgical site 2. The surgical robot adjusts the motion plan of its arm according to real-time changes in the three-dimensional coordinate position and posture information of the surgical instrument 1 and the surgical site 2, combined with the preset surgical plan, to complete the operation. Depending on the computing power and control-precision requirements of the system, a surgical drape whose markers are coated on at the factory may further be chosen, and the positioning precision of trajectory tracking may be improved by increasing the number of auxiliary positioning markers. In the invention, posture information refers to the rotational transformation between the target and the base-model pose, and the three-dimensional coordinate position refers to the translational transformation between them. The specific control method of the surgical robot belongs to the prior art and is not part of the present technical solution, so it is not described here.
The implementation of the present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
As shown in figs. 1-2, a trajectory tracking system for a surgical robot comprises: markers 3, a binocular imaging device 4, a processing unit 5, and an operating lamp 6.
The marker 3 bears a checkerboard pattern for machine-vision recognition that images clearly in a high-precision binocular camera. For example, the pattern may be 4 × 5 alternating black and white squares, each square with a side length of 0.45-0.75 mm.
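For a sense of scale, a pattern of this kind can be rendered as a printable binary bitmap; the sketch below assumes a 1200 dpi print resolution, which is an illustrative value, not one given in the patent.

```python
import numpy as np

def checkerboard_bitmap(rows=4, cols=5, square_mm=0.6, dpi=1200):
    """Render a rows x cols checkerboard as a 0/255 image at the given
    print resolution. square_mm in the 0.45-0.75 mm range matches the
    marker dimensions stated in the text; dpi is an assumed value."""
    px = max(1, round(square_mm / 25.4 * dpi))  # pixels per square
    # alternating 0/1 cells, then expand each cell to px x px pixels
    cells = (np.indices((rows, cols)).sum(axis=0) % 2).astype(np.uint8)
    return np.kron(cells, np.ones((px, px), dtype=np.uint8)) * 255
```

At 1200 dpi a 0.6 mm square is only about 28 pixels wide on paper, which illustrates why a high-resolution binocular camera is needed to resolve such small markers at a working distance.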
There are many concrete implementations of the marker 3; a form suited to each case should be chosen for the surgical instrument 1 and the surgical site 2 according to the actual situation. For example:
Mode one: the marker 3 is adhesive-backed paper printed with a checkerboard pattern, which can be stuck onto different surfaces of the surgical instrument 1 and the surgical site 2; the number and the fixing method can be chosen as needed.
Mode two: the marker 3 is a metal or plastic sheet printed with the checkerboard pattern, fixed to the surface of the surgical instrument 1 by a snap fit or adhesive, or to the surface of the surgical site 2 by adhesive.
Mode three: the marker 3 is a checkerboard-patterned coating printed or sprayed onto the surface of the surgical instrument 1, typically applied at the factory. Older, unmarked products can be handled by marking them afterwards.
Mode four: the marker 3 is a checkerboard-patterned coating printed or sprayed onto the surface of the surgical drape, typically applied when the drape is manufactured. Older, unmarked drapes can likewise be marked afterwards.
Typically, as in fig. 1, the targets comprise a surgical instrument 1 and a surgical site 2, each with several different surfaces to which markers 3 can be attached; viewed from different angles, the visible markers form distinct, non-repeating positional patterns. Fig. 2 shows only one viewing angle: at this angle the ablation needle has a plate-like tail end, with 10 markers adhered at random over its surface.
The binocular imaging device 4 may be a high-precision binocular camera with a resolution of at least 65 megapixels and ultra-low-dispersion lenses. The two cameras are mounted at an angle to each other so as to photograph the same area and record live views of every marker on the targets within that area.
The processing unit 5 is connected to the binocular imaging device by a signal line and is used to process the images acquired by the binocular imaging device, calculate the position and spatial relationship of each marker 3, and, after matching against the target model, obtain the three-dimensional coordinate position and spatial relationship of the target. The processing unit 5 may be a computer or a processor with sufficient computing power.
The operating lamp 6 serves as the light source, with an illumination intensity of 10,000 lux or more, providing the illumination required for real-time imaging by the binocular imaging device 4.
In the invention, the tracking method using the trajectory tracking system comprises the following steps:
(1) Attach markers 3 to the surfaces of the surgical instrument 1 and the surgical site 2, and place the surgical instrument 1 and the surgical site 2 in a calibration position;
(2) Photograph the surgical instrument 1 and the surgical site 2 with the binocular imaging device 4, process the resulting images in the processing unit 5, calculate the three-dimensional coordinate position and spatial relationship of each marker 3 at the calibration position, and obtain from these a target model of the surgical instrument 1 and the surgical site 2 in the calibration position;
In this step, the target model may be constructed with existing three-dimensional modelling or point-cloud matching methods. Once built, the model can be embedded in the robot control program and reused many times, so modelling need not be repeated before every tracking session. Of course, if the calibration position changes substantially, this step should be repeated to update the target model and avoid errors.
(3) During operation of the surgical robot, acquire real-time images of the surgical instrument 1 and the surgical site 2 through the binocular imaging device 4; process the resulting images in the processing unit 5 and calculate the real-time three-dimensional coordinate positions and spatial relationships of the markers 3; after matching and comparison with the target model, obtain the real-time three-dimensional coordinate position and spatial relationship of the surgical instrument 1;
(4) Input the result of step (3) into the control system of the surgical robot as trajectory data of the surgical instrument 1, and then track and control the operating motion of the surgical robot according to the preset surgical plan.
For ease of explanation, the method for determining the three-dimensional coordinate position and spatial relationship of each marker is described below using the point set formed by marker individuals 31-34 in fig. 2:
(3.1) At a given position, the two cameras of the binocular imaging device 4 simultaneously capture marker individuals 31-34. Their three-dimensional posture is calculated using the rules of projection of a three-dimensional object onto a two-dimensional plane; the posture consists of three vectors along each marker's long side, short side, and plane normal, so every captured and identified marker individual 31-34 is represented by a unique triple of three-dimensional vectors;
(3.2) Using the extrinsic matrix obtained from calibration of the binocular imaging device, the three-vector representation of marker individuals 31-34 acquired by the imaging unit of one camera is converted into the imaging unit of the other camera; if the three vectors agree, the marker match is considered successful; the coordinates of marker individuals 31-34 in world coordinates are then computed from the binocular camera projection rules;
the world coordinates are calculated as follows:
wherein Int L ,Int R An internal reference matrix of the left camera and the right camera; s is(s) L ,s R The scaling factors of the left camera and the right camera; pixel coordinates of the cameras on the left and right for the tag individuals 31-34; r is R L R R T L T L Representing the left and right camera matrices, respectively.
T=T R -RT L As an extrinsic matrix, only s in the above representation L ,s R ,P w Unknown.
Then let s through calculation L For (A'. Times.A) -1 * A' is the first element of T, whereinThe world coordinate takes the origin of the coordinate of one of the cameras as the origin and the xy axis of the camera as the xy axis, the world coordinate of the marker +.>
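A minimal numeric sketch of this least-squares computation, written in the equivalent projection-matrix form where each camera is described by P = Int · [R | T] and the stacked equations are solved linearly (the standard DLT triangulation); the specific matrix values used anywhere with this function are illustrative.

```python
import numpy as np

def triangulate(P_L, P_R, uv_L, uv_R):
    """Recover the world coordinates P_w of one marker from its pixel
    coordinates in the left and right cameras. Each projection
    s * p = P @ [X, 1] contributes two linear constraints on X; the
    stacked system is solved in the least-squares sense via SVD."""
    rows = []
    for P, (u, v) in ((P_L, uv_L), (P_R, uv_R)):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # null-space direction = homogeneous solution
    return X[:3] / X[3]        # dehomogenize to 3D world coordinates
```

Feeding the matched left/right pixel coordinates of each marker through this function yields the world-coordinate points from which the three-dimensional point cloud of step (3.3) is formed.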
(3.3) A three-dimensional point cloud is formed from the computed world coordinates of these markers;
after the world coordinates of the individual markers 31-34 are calculated, a three-dimensional point cloud consisting of the individual markers 31-34 can be obtained. And matching and comparing the basic models by using the three-dimensional point cloud to obtain the coordinate information and the gesture information of the target object.
(3.4) In the same way, the target model of the surgical instrument and the surgical site in the calibration position can be obtained from the three-dimensional point cloud; or the point cloud computed in real time can be matched and compared against the target model to obtain the real-time coordinate information and posture information of the surgical instrument.
Claims (7)
1. A trajectory tracking system for a surgical robot, the system comprising: a plurality of markers, a binocular imaging device, and a processing unit; wherein,
the markers are sheet-shaped, their surfaces bearing checkerboard patterns, and are to be attached to different surfaces of a target object whose spatial position information and posture information are to be determined; the targets comprise a surgical instrument and a surgical site; the binocular imaging device is used to record live views of each marker; the processing unit is connected to the binocular imaging device by a signal line and is used to process the images acquired by the binocular imaging device, calculate the position and spatial relationship of each marker, and, after matching against the target model, obtain the three-dimensional coordinate position and spatial relationship of the target;
the track tracking system is used for executing the following track tracking method:
(1) Attaching markers on different surfaces of the surgical instrument and the surgical position, and placing the surgical instrument and the surgical position in a calibration position;
(2) Shooting the surgical instrument and the surgical position by using binocular shooting equipment, processing the obtained images by using a processing unit, calculating the three-dimensional coordinate position and the spatial relation of each marker at the calibration position, and modeling based on the three-dimensional coordinate position and the spatial relation to obtain a target object model of the surgical instrument and the surgical position at the calibration position;
the three-dimensional coordinate position and spatial relationship of each marker is determined according to the following method:
(2.1) calculating three-dimensional attitude information of the marker by utilizing a projection rule of the three-dimensional image on a two-dimensional plane; the gesture information is three-dimensional vectors formed by long sides, short sides and straight lines perpendicular to the plane of the marker, and each captured and identified marker is represented by unique three-dimensional vector features;
(2.2) converting three-dimensional vector representations of the markers acquired by the imaging unit of one camera into the imaging unit of the other camera by using an external reference matrix calibrated by the binocular imaging equipment; if the three vectors are equal, the tag matching is considered successful; further calculating the coordinates of the marker under world coordinates by using a binocular camera projection rule;
(2.3) forming a three-dimensional point cloud by using the world coordinates of the part of the markers obtained by calculation;
(2.4) obtaining a target object model of the surgical instrument and the surgical position under the calibration position based on the three-dimensional point cloud; or matching and comparing the three-dimensional point cloud obtained by real-time calculation with the target object model to obtain real-time coordinate information and posture information of the surgical instrument;
(3) In the operation process of the surgical robot, acquiring real-time images of the surgical instrument and the surgical position through binocular imaging equipment; processing the obtained image by a processing unit, and calculating the real-time three-dimensional coordinate position and the spatial relationship of each marker; after matching and comparing with the target object model, obtaining the real-time three-dimensional coordinate position and spatial relation of the surgical instrument;
(4) Inputting the calculation result obtained in the step (3) into a control system of the surgical robot as track data of the surgical instrument, and further tracking and regulating the operation action of the surgical robot according to a preset surgical scheme.
2. The trajectory tracking system of claim 1, wherein the checkerboard pattern on the marker surface comprises at least 2 × 3 alternating black and white squares, each square having a side length of 0.45-0.75 mm.
3. The trajectory tracking system of claim 1 wherein there are at least 4 of said markers on each target.
4. The trajectory tracking system of claim 1, wherein the marker is adhesive-backed paper printed with a checkerboard pattern, stuck directly onto the target surface or onto an attachment on the target surface; or the marker is a metal or plastic sheet printed with the checkerboard pattern and fixed to the target surface by a snap fit; or the marker is a checkerboard-patterned coating printed or sprayed onto the surface of the surgical instrument.
5. The trajectory tracking system of claim 1, wherein the binocular imaging device has a resolution of at least 65 million pixels (65 MP).
6. The trajectory tracking system of claim 1, wherein the processing unit is a computer or another processor with sufficient computing power.
7. The trajectory tracking system of any one of claims 1 to 6, further comprising a surgical light serving as the light source, with an illumination intensity of 10000 lux or more.
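The three-dimensional marker coordinates used throughout the claims come from the binocular imaging device: given the two 3×4 projection matrices obtained during stereo calibration, each checkerboard corner matched between the left and right views can be triangulated. A minimal linear (DLT) triangulation sketch under those assumptions (function and variable names are hypothetical, not taken from the patent):

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one checkerboard corner seen at
    pixel x1 in the left view and pixel x2 in the right view.
    P1, P2 are the 3x4 camera projection matrices from stereo calibration."""
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # null vector of A (least-squares solution)
    return X[:3] / X[3]           # homogeneous -> Euclidean coordinates
```

Running this over all detected corners of all markers yields the three-dimensional point cloud that steps (2.4) and (3) of claim 1 match against the target object model.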
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111520421.3A CN114224489B (en) | 2021-12-12 | 2021-12-12 | Track tracking system for surgical robot and tracking method using same |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114224489A CN114224489A (en) | 2022-03-25 |
CN114224489B true CN114224489B (en) | 2024-02-13 |
Family
ID=80755286
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111520421.3A Active CN114224489B (en) | 2021-12-12 | 2021-12-12 | Track tracking system for surgical robot and tracking method using same |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114224489B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115170646A (en) * | 2022-05-30 | 2022-10-11 | 清华大学 | Target tracking method and system and robot |
CN115089293A (en) * | 2022-07-04 | 2022-09-23 | 山东大学 | Calibration method for spinal endoscopic surgical robot |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107179322A (en) * | 2017-06-15 | 2017-09-19 | 长安大学 | Bridge underside crack detection method based on binocular vision |
CN108154552A (en) * | 2017-12-26 | 2018-06-12 | 中国科学院深圳先进技术研究院 | Stereo laparoscope three-dimensional model reconstruction method and device |
CN109833092A (en) * | 2017-11-29 | 2019-06-04 | 上海复拓知达医疗科技有限公司 | Internal navigation system and method |
CN109903313A (en) * | 2019-02-28 | 2019-06-18 | 中国人民解放军国防科技大学 | Real-time pose tracking method based on target three-dimensional model |
CN111388087A (en) * | 2020-04-26 | 2020-07-10 | 深圳市鑫君特智能医疗器械有限公司 | Surgical navigation system, computer and storage medium for performing surgical navigation method |
CN113034700A (en) * | 2021-03-05 | 2021-06-25 | 广东工业大学 | Anterior cruciate ligament reconstruction surgery navigation method and system based on mobile terminal |
CN113347937A (en) * | 2019-01-25 | 2021-09-03 | 伯恩森斯韦伯斯特(以色列)有限责任公司 | Registration of frame of reference |
CN113693723A (en) * | 2021-08-05 | 2021-11-26 | 北京大学 | Cross-modal navigation positioning system and method for oral and throat surgery |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6490417B2 (en) * | 2014-12-18 | 2019-03-27 | 株式会社東芝 | Moving body tracking treatment apparatus and moving body tracking treatment program |
CN111031958B (en) * | 2017-08-16 | 2023-09-15 | 柯惠有限合伙公司 | Synthesizing spatially aware transitions between multiple camera viewpoints during minimally invasive surgery |
Also Published As
Publication number | Publication date |
---|---|
CN114224489A (en) | 2022-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114224489B (en) | Track tracking system for surgical robot and tracking method using same | |
CN114041875B (en) | Integrated operation positioning navigation system | |
US6816755B2 (en) | Method and apparatus for single camera 3D vision guided robotics | |
CN110051436B (en) | Automated cooperative work assembly and application thereof in surgical instrument | |
US7242818B2 (en) | Position and orientation sensing with a projector | |
Wei et al. | Real-time visual servoing for laparoscopic surgery. Controlling robot motion with color image segmentation | |
US20160000518A1 (en) | Tracking apparatus for tracking an object with respect to a body | |
JPH10253322A (en) | Method and apparatus for designating position of object in space | |
CN112472297A (en) | Pose monitoring system, pose monitoring method, surgical robot system and storage medium | |
US20190086198A1 (en) | Methods, systems and computer program products for determining object distances and target dimensions using light emitters | |
CN114668534B (en) | Intraoperative implantation precision detection system and method for dental implant surgery | |
CN114523471B (en) | Error detection method based on association identification and robot system | |
CN114536399B (en) | Error detection method based on multiple pose identifications and robot system | |
CN112998856B (en) | Three-dimensional real-time positioning method | |
TWI708591B (en) | Three-dimensional real-time positioning method for orthopedic surgery | |
CN111862170A (en) | Optical motion capture system and method | |
CN114536331B (en) | Method for determining external stress of deformable mechanical arm based on association identification and robot system | |
TWI735390B (en) | Method for real-time positioning compensation of image positioning system and image positioning system capable of real-time positioning compensation | |
CN113855240B (en) | Medical image registration system and method based on magnetic navigation | |
CN114926542A (en) | Mixed reality fixed reference system calibration method based on optical positioning system | |
CN115397634A (en) | Device for acquiring position of visual sensor in robot control coordinate system, robot system, method, and computer program | |
Yu et al. | Vision-based method of kinematic calibration and image tracking of position and posture for 3-RPS parallel robot | |
CN110051433B (en) | Method for keeping track of target and application thereof in image-guided surgery | |
US20230248467A1 (en) | Method of medical navigation | |
CN116468648A (en) | Execution arm detection method based on association identification and robot system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||