CN114176779A - Surgical robot navigation positioning method and device - Google Patents
Surgical robot navigation positioning method and device
- Publication number: CN114176779A
- Application number: CN202111662281.3A
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES; A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B34/30—Surgical robots
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B2034/101—Computer-aided simulation of surgical operations
- A61B2034/105—Modelling of the patient, e.g. for ligaments or bones
- A61B2034/107—Visualisation of planned trajectories or target regions
- A61B2034/108—Computer aided selection or customisation of medical implants or cutting guides
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
Abstract
An embodiment of the invention provides a navigation and positioning method and device for a surgical robot, relating to the technical field of data processing. The method comprises: acquiring a three-dimensional image captured by an image acquisition device; planning, in the three-dimensional image, a surgical path for performing surgery on the target object; identifying a marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image; calculating a second conversion relationship between the image coordinate system and the target coordinate system from the obtained image position, a pre-obtained calibration position, and a first conversion relationship; converting the surgical path into a target path in the target coordinate system based on the second conversion relationship; and navigating and positioning the surgical robot according to the target path. Applying the navigation and positioning scheme provided by the embodiment of the invention, the surgical robot can be navigated and positioned.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a navigation and positioning method and device for a surgical robot.
Background
Nowadays, doctors can perform surgery with the assistance of surgical robots. Because surgical robots can generally carry out more delicate and accurate operations, using them can improve the success rate of surgery. During an operation, a moving path of the surgical robot is usually planned first, and the robot is then controlled to move along that path; this realizes navigation and positioning of the surgical robot, and the operation is completed based on the positioned robot.
Therefore, a surgical robot navigation and positioning scheme is needed to navigate and position the surgical robot.
Disclosure of Invention
The embodiment of the invention aims to provide a scheme for navigating and positioning a surgical robot. The specific technical scheme is as follows:
in a first aspect, an embodiment of the present invention provides a surgical robot navigation and positioning method, where the method includes:
obtaining a three-dimensional image acquired by an image acquisition device, wherein the field of view of the image acquisition device comprises: a target object and a calibration plate with a marker;
planning a surgical path for performing a surgery on the target object in the three-dimensional image;
identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image;
calculating a second conversion relationship between the image coordinate system and a target coordinate system according to the obtained image position, a pre-obtained calibration position, and a first conversion relationship, wherein the calibration position is the position of the marker in a calibration coordinate system, the calibration coordinate system is a coordinate system established according to the calibration plate, the target coordinate system is a coordinate system established according to the surgical robot, and the first conversion relationship is the conversion relationship between the calibration coordinate system and the target coordinate system;
converting the surgical path into a target path in the target coordinate system based on the second conversion relation;
and navigating and positioning the surgical robot according to the target path.
In an embodiment of the present invention, the navigating and positioning the surgical robot according to the target path includes:
starting from the initial position of the target path, navigating and positioning the surgical robot along the target path, detecting the position offset of the target object in real time during the navigation and positioning process, and correcting the target path based on the position offset, wherein the position offset is the offset, in the target coordinate system, between the position of the target object at the current time and its position at the acquisition time of the three-dimensional image.
In an embodiment of the present invention, the detecting a position offset of the target object in real time, and correcting the target path based on the position offset includes:
obtaining a first position and a current position in a first coordinate system acquired by a first sensor, wherein the first coordinate system is a coordinate system established according to the first sensor, the first position is the position of the target object at the acquisition time of the three-dimensional image, and the current position is the position of the target object at the current time;
calculating the position offset of the target object according to the first position, the current position, and a pre-obtained conversion relationship between the first coordinate system and the target coordinate system;
and correcting the target path based on the position offset.
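The two steps above can be sketched in code. The following is a minimal illustration, with hypothetical function names, of computing the position offset in the target coordinate system from two sensor-frame positions and a pre-obtained rigid conversion (rotation R, translation t), and of correcting a waypoint path by that offset; it is a sketch of the idea, not the patent's implementation:

```python
import numpy as np

def position_offset(first_pos, current_pos, R_s2t, t_s2t):
    """Offset of the target object in the target coordinate system.

    first_pos / current_pos: positions in the first (sensor) coordinate
    system at the image-acquisition time and at the current time.
    R_s2t, t_s2t: rotation and translation of the pre-obtained conversion
    from the first coordinate system to the target coordinate system.
    """
    p0 = R_s2t @ np.asarray(first_pos, float) + t_s2t    # first position, target frame
    p1 = R_s2t @ np.asarray(current_pos, float) + t_s2t  # current position, target frame
    return p1 - p0

def correct_path(path_points, offset):
    """Shift every waypoint of the target path by the detected offset."""
    return [np.asarray(p, float) + offset for p in path_points]
```

With an identity conversion, for example, an object that moved from (0, 0, 0) to (1, 2, 3) yields the offset (1, 2, 3), and every waypoint is shifted by that vector.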
In one embodiment of the invention, the first sensor is an optical tracker and/or a magnetic sensor.
In an embodiment of the present invention, the navigating and positioning the surgical robot according to the target path includes:
obtaining a second position of the surgical robot in a second coordinate system acquired by a second sensor, wherein the second coordinate system is a coordinate system established according to the second sensor;
converting the second position into a third position in the target coordinate system according to a pre-obtained conversion relationship between the second coordinate system and the target coordinate system;
and navigating and positioning the surgical robot according to the target path and the third position.
In one embodiment of the invention, the second sensor is an optical tracker and/or a magnetic sensor.
In an embodiment of the present invention, the calculating of a second conversion relationship between the image coordinate system and the target coordinate system according to the image position, a pre-obtained calibration position, and a first conversion relationship includes:
calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and a calibration position obtained in advance;
and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and a first conversion relation obtained in advance.
In a second aspect, an embodiment of the present invention further provides a surgical robot navigation and positioning apparatus, where the apparatus includes:
the image acquisition module is used for acquiring a three-dimensional image acquired by image acquisition equipment, wherein the field of view of the image acquisition equipment comprises: a target object and a calibration plate with a marker;
a path planning module for planning a surgical path for performing a surgery on the target object in the three-dimensional image;
the marker identification module is used for identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image;
a relationship calculation module, configured to calculate a second conversion relationship between the image coordinate system and a target coordinate system according to the obtained image position, a pre-obtained calibration position, and a first conversion relationship, wherein the calibration position is the position of the marker in a calibration coordinate system, the calibration coordinate system is a coordinate system established according to the calibration plate, the target coordinate system is a coordinate system established according to the surgical robot, and the first conversion relationship is the conversion relationship between the calibration coordinate system and the target coordinate system;
the path conversion module is used for converting the operation path into a target path under the target coordinate system based on the second conversion relation;
and the navigation positioning module is used for navigating and positioning the surgical robot according to the target path.
In an embodiment of the present invention, the navigation positioning module is specifically configured to:
starting from the initial position of the target path, navigating and positioning the surgical robot along the target path, detecting the position offset of the target object in real time during the navigation and positioning process, and correcting the target path based on the position offset, wherein the position offset is the offset, in the target coordinate system, between the position of the target object at the current time and its position at the acquisition time of the three-dimensional image.
In an embodiment of the present invention, the navigation positioning module is specifically configured to:
obtaining a first position and a current position in a first coordinate system acquired by a first sensor, wherein the first coordinate system is a coordinate system established according to the first sensor, the first position is the position of the target object at the acquisition time of the three-dimensional image, and the current position is the position of the target object at the current time;
calculating the position offset of the target object according to the first position, the current position, and a pre-obtained conversion relationship between the first coordinate system and the target coordinate system;
and correcting the target path based on the position offset.
In one embodiment of the invention, the first sensor is an optical tracker and/or a magnetic sensor.
In an embodiment of the present invention, the navigation positioning module is specifically configured to:
obtaining a second position of the surgical robot in a second coordinate system acquired by a second sensor, wherein the second coordinate system is a coordinate system established according to the second sensor;
converting the second position into a third position in the target coordinate system according to a pre-obtained conversion relationship between the second coordinate system and the target coordinate system;
and navigating and positioning the surgical robot according to the target path and the third position.
In one embodiment of the invention, the second sensor is an optical tracker and/or a magnetic sensor.
In an embodiment of the present invention, the relationship calculation module is specifically configured to:
calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and a calibration position obtained in advance;
and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and a first conversion relation obtained in advance.
In a third aspect, an embodiment of the present invention further provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory communicate with one another via the communication bus;
a memory for storing a computer program;
and the processor is configured to implement the steps of the surgical robot navigation and positioning method of the first aspect when executing the program stored in the memory.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the surgical robot navigation and positioning method according to the first aspect.
The embodiment of the invention has the following beneficial effects:
As can be seen from the above, when the scheme provided by the embodiment of the present invention is applied to navigate and position a surgical robot, the obtained three-dimensional image contains both the target object and the calibration plate, so the image position of the marker in the image coordinate system can be obtained by identifying the marker of the calibration plate in the three-dimensional image. The image position is a position in the image coordinate system, the calibration position is a position in the calibration coordinate system, and the first conversion relationship is the conversion relationship between the calibration coordinate system and the target coordinate system; therefore, the second conversion relationship between the image coordinate system and the target coordinate system can be obtained accurately from the image position, the calibration position, and the first conversion relationship, and after the surgical path is planned in the three-dimensional image it can be converted into the target path in the target coordinate system based on the second conversion relationship.
In addition, in this scheme, when determining the second conversion relationship between the image coordinate system and the target coordinate system, it is only necessary to place the calibration plate within the field of view of the image acquisition device; the position of the marker on the calibration plate in the image coordinate system can then be obtained by identifying the marker in the three-dimensional image, and the second conversion relationship can be calculated from it. The calibration plate can be placed at any position within the field of view of the image acquisition device, and the second conversion relationship can be calculated using the calibration plate without performing any operation on the target object. Therefore, the surgical robot navigation and positioning scheme provided by the embodiment of the present invention enhances the flexibility of navigating and positioning the surgical robot and reduces the difficulty of doing so.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other embodiments from these drawings.
Fig. 1 is a schematic flowchart of a first surgical robot navigation and positioning method according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a second surgical robot navigation and positioning method according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a third surgical robot navigation and positioning method according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a fourth surgical robot navigation and positioning method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a surgical robot navigation and positioning device according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings in the embodiments of the present invention. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention fall within the scope of the present invention.
Referring to fig. 1, fig. 1 is a schematic flow chart of a first surgical robot navigation and positioning method according to an embodiment of the present invention, where the method includes the following steps S101 to S106.
Step S101: and obtaining a three-dimensional image acquired by the image acquisition equipment.
The field of view of the image acquisition device contains a target object and a calibration plate with a marker.
The image acquisition apparatus may be a three-dimensional imaging apparatus, and for example, the image acquisition apparatus may be a CBCT (Cone Beam CT), an O-arm machine, a CT (Computed Tomography), an MR (Magnetic Resonance) apparatus, or the like.
The markers are typically metal beads, such as steel beads, or beads made of other metals.
The calibration plate typically comprises a plurality of markers that are not coplanar.
Specifically, the field of view of the image capturing device includes the target object and the calibration plate, so that when the image capturing device captures an image, the image capturing device can capture images of the target object and the calibration plate, and the three-dimensional image includes a region corresponding to the target object and a region corresponding to the calibration plate.
Step S102: a surgical path for performing a procedure on the target object is planned in the three-dimensional image.
Specifically, according to the three-dimensional image, the body structure of the target object can be known, so that the body part needing to be operated on the target object is determined, and the operation path for operating the part is planned in the three-dimensional image.
In one embodiment of the present invention, there are two implementations of planning a surgical path.
In a first implementation, the physician can view the three-dimensional image and manually plan a surgical path in the three-dimensional image based on past work experience.
In a second implementation, the three-dimensional image may be processed by using an existing surgical path planning technique, so as to obtain a surgical path in the three-dimensional image.
Step S103: and identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image.
The image coordinate system may be a coordinate system established according to three dimensions of the three-dimensional image.
Specifically, by identifying the marker in the three-dimensional image, the position of the pixel point corresponding to the marker in the three-dimensional image in the image coordinate system can be determined, and the position is the image position of the marker in the image coordinate system in which the three-dimensional image is located.
In an embodiment of the present invention, the marker may be treated as the object to be recognized, and an existing object recognition technique may be used to identify it in the three-dimensional image; for example, the three-dimensional image may be processed by an object recognition model to obtain the image position of the marker in the image coordinate system.
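As an illustration only (the patent does not specify the recognition algorithm), metal-bead markers appear as high-intensity blobs in a CT-like volume, so one simple hypothetical recognizer thresholds the volume and takes connected-component centroids; the threshold value below is an assumed cutoff, not a value from the patent:

```python
import numpy as np
from scipy import ndimage

def locate_markers(volume, threshold=2500):
    """Return voxel-space centroids of bright (metal-like) blobs in a volume.

    threshold: intensity above which voxels are treated as metal; this
    cutoff is an illustrative assumption and would be tuned per scanner.
    """
    mask = volume > threshold
    labels, n = ndimage.label(mask)  # connected components of the bright voxels
    # one centroid (z, y, x) per labelled component
    return ndimage.center_of_mass(mask, labels, range(1, n + 1))
```

The returned voxel coordinates are the image positions of the markers in the image coordinate system, which the subsequent registration step consumes.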
Step S104: and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the obtained image position, the pre-obtained calibration position and the first conversion relation.
The calibration position is the position of the marker in a calibration coordinate system, and the calibration coordinate system is a coordinate system established according to the calibration plate.
In an embodiment of the present invention, the calibration coordinate system may be a coordinate system established by using a vertex of the calibration board as a coordinate origin and according to the length, the width, and the height of the calibration board. In this case, the nominal position of the marker may be a coordinate constituted by distances of the marker from the origin of coordinates in the length direction, width direction, and height direction, respectively.
In another embodiment of the present invention, since the markers on the calibration plate are not generally on the same plane, the calibration coordinate system can be established according to the rows, columns and layers formed by the arrangement of the markers on the calibration plate. In this case, the calibration position of the marker may be a coordinate consisting of the number of rows, columns, and layers of the marker on the calibration plate.
The target coordinate system is a coordinate system established according to the surgical robot.
The target coordinate system may be a world coordinate system or a spatial coordinate system established according to an environment in which the surgical robot is located.
The conversion relationship between two coordinate systems can be expressed as the following expression:

p' = R · p + t, with R = [[r1, r2, r3], [r4, r5, r6], [r7, r8, r9]] and t = (tx, ty, tz)^T

where r1 to r9 are the rotation amounts between the two coordinate systems, from which the rotation matrix R is constructed, and tx, ty, tz are the translation amounts of the two coordinate systems along the three coordinate axes, respectively.
The first conversion relationship is the conversion relationship between the calibration coordinate system and the target coordinate system, and can be expressed in the same rotation-and-translation form.
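Such a rotation-plus-translation conversion is conveniently stored as a 4x4 homogeneous matrix, so conversions can be applied and chained uniformly. A small sketch (the function names are illustrative, not from the patent):

```python
import numpy as np

def make_transform(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, float)
    T[:3, 3] = np.asarray(t, float)
    return T

def apply_transform(T, p):
    """Map a 3D point p from the source coordinate system to the destination one."""
    return T[:3, :3] @ np.asarray(p, float) + T[:3, 3]
```

Composing two conversions is then just a matrix product of their homogeneous matrices.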
The specific implementation manner of calculating the second conversion relationship according to the image position, the calibration position and the first conversion relationship can be seen in the following embodiments, and will not be described in detail here.
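One common way to realize such a calculation, offered here purely as an assumption since the patent defers the details to later embodiments, is rigid point-set registration: estimate the image-to-calibration conversion from the corresponding image positions and calibration positions of the markers (for example with the Kabsch algorithm), then compose it with the pre-obtained first conversion relationship:

```python
import numpy as np

def rigid_register(src, dst):
    """Kabsch: least-squares R, t with dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. the image
    positions of the markers and their calibration positions (N >= 3,
    not collinear).
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)                  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

def compose(Ra, ta, Rb, tb):
    """Conversion b followed by conversion a: x -> Ra @ (Rb @ x + tb) + ta."""
    return Ra @ Rb, Ra @ tb + ta
```

Under this assumption the second conversion relationship would be `compose(R_c2t, t_c2t, *rigid_register(image_positions, calibration_positions))`, where `R_c2t, t_c2t` denote the first conversion relationship; all of these names are hypothetical.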
Step S105: and converting the operation path into a target path under a target coordinate system based on the second conversion relation.
Specifically, since the surgical path is planned in the three-dimensional image, it is a path in the image coordinate system. The second conversion relationship is the conversion relationship between the image coordinate system and the target coordinate system, and a position in the image coordinate system can be converted into a position in the target coordinate system based on it; therefore, the surgical path can be converted into the target path in the target coordinate system based on the second conversion relationship.
In an embodiment of the present invention, when the surgical path is a straight line segment, the two endpoint positions of the path may first be determined in the image coordinate system and then converted from the image coordinate system into the target coordinate system based on the second conversion relationship; the straight line segment formed by the two converted positions in the target coordinate system is the target path.
Specifically, the conversion relationship between two coordinate systems usually comprises a rotation parameter and a translation parameter. When the position of a point in one of the two coordinate systems is converted into the other, the point is rotated and translated according to those parameters, and the resulting position is the position of the point in the other coordinate system. Therefore, the two endpoints may be rotated and translated according to the rotation and translation parameters of the second conversion relationship to obtain their two positions in the target coordinate system.
In another embodiment of the present invention, the surgical path may be discretized into the positions of a plurality of points in the image coordinate system, and the positions of these points may then be converted from the image coordinate system into the target coordinate system based on the second conversion relationship; the target path is composed of the converted positions of the points.
Specifically, for the position of each point, the point may be rotated and translated according to the rotation parameter and the translation parameter included in the second conversion relationship, so as to convert the position of the point in the image coordinate system to the target coordinate system.
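The per-point conversion described above can be sketched as follows: sample points along a straight image-space segment and map each one with the rotation `R` and translation `t` of the second conversion relationship (the function name and parameters are illustrative):

```python
import numpy as np

def convert_path(p_start, p_end, R, t, n=50):
    """Sample n points along the image-space segment and map each to the target frame."""
    alphas = np.linspace(0.0, 1.0, n)[:, None]
    # linear interpolation between the two endpoints in the image coordinate system
    pts = (1 - alphas) * np.asarray(p_start, float) + alphas * np.asarray(p_end, float)
    # rotate then translate every sampled point into the target coordinate system
    return pts @ np.asarray(R, float).T + np.asarray(t, float)
```

The returned array of target-frame points is the discretized target path.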
Step S106: and navigating and positioning the surgical robot according to the target path.
The surgical robot may include a robot arm, and after the target path is obtained, the robot arm may be controlled so that the end of the robot arm moves according to the target path, thereby implementing navigation and positioning.
An end tool may be mounted at the end of the robot arm; after the end of the robot arm reaches the position indicated by the target path, the user may perform the operation on the target object using this end tool.
As can be seen from the above, when the scheme provided by the embodiment of the present invention is applied to navigation and positioning of a surgical robot, an object in an obtained three-dimensional image includes a target object and a calibration plate, and by identifying a marker of the calibration plate in the three-dimensional image, an image position of the marker in an image coordinate system can be obtained. The image position is a position under an image coordinate system, the calibration position is a position under the calibration coordinate system, and the first conversion relation is a conversion relation between the calibration coordinate system and a target coordinate system, so that the second conversion relation between the image coordinate system and the target coordinate system can be accurately obtained according to the image position, the calibration position and the first conversion relation, and the surgical path can be converted into the target path under the target coordinate system based on the second conversion relation after the surgical path is planned in the three-dimensional image.
In addition, in the scheme, when the second conversion relation between the image coordinate system and the target coordinate system is determined, the position of the marker on the calibration plate under the image coordinate system can be obtained by identifying the marker in the three-dimensional image only by placing the calibration plate in the visual field range of the image acquisition equipment, so that the second conversion relation is calculated. The calibration plate can be placed at any position in the visual field range of the image acquisition equipment, and the second conversion relation can be calculated by utilizing the calibration plate without performing any operation on a target object, so that the navigation and positioning scheme of the surgical robot provided by the embodiment of the invention can enhance the flexibility of the navigation and positioning of the surgical robot and reduce the operation difficulty of the navigation and positioning of the surgical robot.
The following describes a specific implementation of calculating the second conversion relationship according to the image position, the calibration position, and the first conversion relationship.
In an embodiment of the present invention, referring to fig. 2, a flowchart of a second surgical robot navigation and positioning method is provided, and compared with the foregoing embodiment shown in fig. 1, in this embodiment, the foregoing step S104 can be implemented by the following steps S104A-S104B.
Step S104A: and calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and the calibration position obtained in advance.
Specifically, the calibration plate may generally include a plurality of markers, each marker having a respective position in the calibration coordinate system. When the markers in the three-dimensional image are identified, the positions of the multiple markers can be identified, so that multiple image positions of the multiple markers in the image coordinate system are obtained, and a third conversion relation between the image coordinate system and the calibration coordinate system can be calculated according to the image positions and the calibration positions of the multiple markers.
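One common way to compute such a rigid relation from matched marker positions is the Kabsch (orthogonal Procrustes) method; the patent does not name the algorithm, so the following is an illustrative assumption:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    # Estimate rotation R and translation t with dst ≈ R @ src + t
    # (Kabsch / orthogonal Procrustes) from matched point sets, e.g.
    # marker positions in the image and calibration coordinate systems.
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Recover a known transform from four marker correspondences.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
markers_image = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                          [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
markers_calib = markers_image @ R_true.T + t_true
R_est, t_est = estimate_rigid_transform(markers_image, markers_calib)
```

At least three non-collinear markers are needed for the rotation to be determined uniquely, which is consistent with the plate carrying a plurality of markers.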
Step S104B: and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and the first conversion relation obtained in advance.
Specifically, the third transformation relation is the transformation relation between the image coordinate system and the calibration coordinate system and can be expressed as T(image→calib); the first transformation relation is the transformation relation between the calibration coordinate system and the target coordinate system and can be expressed as T(calib→target). Based on the third transformation relation and the first transformation relation, the second transformation relation between the image coordinate system and the target coordinate system can be calculated as T(image→target) = T(calib→target) · T(image→calib).
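Such a composition of transformation relations can be sketched with 4x4 homogeneous matrices; the matrix values below are illustrative assumptions:

```python
import numpy as np

def make_transform(R, t):
    # Pack rotation R (3x3) and translation t (3,) into a 4x4
    # homogeneous transformation matrix.
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Third relation: image -> calibration; first relation: calibration -> target.
T_calib_from_image = make_transform(np.eye(3), [1.0, 2.0, 3.0])
T_target_from_calib = make_transform(np.eye(3), [10.0, 0.0, 0.0])

# Second relation: image -> target, obtained as the product of the two.
T_target_from_image = T_target_from_calib @ T_calib_from_image

p_image = np.array([0.0, 0.0, 0.0, 1.0])   # a point in homogeneous form
p_target = T_target_from_image @ p_image
```

Using homogeneous matrices keeps the rotation and translation in a single object, so chaining relations reduces to matrix multiplication.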
Therefore, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the third conversion relation between the image coordinate system and the calibration coordinate system can be accurately calculated according to the image position and the calibration position, and then the second conversion relation can be accurately calculated according to the third conversion relation and the first conversion relation. Therefore, the surgical robot navigation positioning scheme provided by the embodiment of the invention can accurately calculate the second conversion relation between the image coordinate system and the target coordinate system, thereby improving the accuracy of navigation positioning.
In another embodiment of the present invention, since the first transformation relation is a transformation relation between the calibration coordinate system and the target coordinate system, the calibration position may be transformed into a position under the target coordinate system according to the first transformation relation, and then the second transformation relation between the image coordinate system and the target coordinate system may be calculated according to the image position and the transformed position.
When the surgical robot performs navigation and positioning, the position of the target object may change, for example because the operating table is raised or lowered, or because the target object itself shifts. In this case, if the surgical robot continues navigation and positioning according to the originally obtained target path, the actual moving path of the surgical robot may deviate from the expected path, reducing the navigation and positioning effect.
In view of the above situation, in an embodiment of the present invention, referring to fig. 3, a flowchart of a third surgical robot navigation and positioning method is provided, and compared with the foregoing embodiment shown in fig. 1, in this embodiment, the step S106 can be implemented by the following step S106A.
Step S106A: and starting from the initial position of the target path, performing navigation positioning on the surgical robot along the target path, detecting the position offset of the target object in real time in the navigation positioning process, and correcting the target path based on the position offset.
Wherein the position offset is: an offset in the target coordinate system between the position of the target object at the current instant and the position at the acquisition instant of the three-dimensional image.
The position deviation may be understood as a deviation of an operation area to be operated in the target object, and the surgical robot may be positioned to the operation area along the target path, so that if the position of the operation area on the target object is changed, the target path needs to be corrected, so that the surgical robot can always accurately position the operation area.
Because the position of the target object may change at any time in the navigation and positioning process of the surgical robot, the target path needs to be continuously corrected in the navigation and positioning process of the surgical robot, so that the surgical robot can always perform navigation and positioning according to an expected path, that is, the navigation and positioning process of the surgical robot and the process of correcting the target path are performed synchronously.
Specifically, when the image acquisition device acquires a three-dimensional image, the position of the target object in the target coordinate system at that moment can be recorded, then in the navigation and positioning process of the surgical robot, the position of the target object in the target coordinate system is detected in real time, if the position of the target object in the target coordinate system is the same as the position recorded at the acquisition time of the three-dimensional image, it is indicated that the position of the target object is not changed, the position offset of the target object is 0 at that time, and the surgical robot performs navigation and positioning along the original target path; if the position of the target object in the target coordinate system is different from the recorded position, it indicates that the position of the target object has changed, and at this time, the position deviation of the target object may be calculated from the detected position and the recorded position, and then the target path may be corrected based on the position deviation, and the surgical robot may continue navigation and positioning along the corrected target path.
For example, if one unit coordinate value on the X axis of the target coordinate system corresponds to 1 cm in the actual scene, and the target object moves 5 cm in the positive X-axis direction, then the target path needs to be moved by 5 unit coordinate values in the positive X-axis direction in the target coordinate system, after which the surgical robot continues navigation and positioning along the moved target path.
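The correction in the example above amounts to translating every point of the target path by the detected position offset; a minimal sketch with assumed values:

```python
import numpy as np

def correct_path(target_path, position_offset):
    # Shift every point of the target path by the detected position
    # offset of the target object (both in the target coordinate system).
    return (np.asarray(target_path, dtype=float)
            + np.asarray(position_offset, dtype=float))

# The target object moved 5 units along +X, so the path shifts with it.
path = [[0.0, 0.0, 0.0], [0.0, 0.0, 10.0]]
corrected = correct_path(path, [5.0, 0.0, 0.0])
```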
Therefore, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the position deviation of the target object is detected in real time in the navigation and positioning process, and the target path is corrected based on the position deviation, so that the target path can be continuously corrected in the navigation and positioning process, the surgical robot can move along the expected path at any time of navigation and positioning, and the accuracy of the navigation and positioning of the surgical robot is improved.
The following describes specific implementations of detecting the position offset of the target object in real time and correcting the target path based on the position offset in step S106A.
In an embodiment of the present invention, a first position and a current position in a first coordinate system acquired by a first sensor may be obtained first, then a position offset of a target object may be calculated according to the first position, the current position and a transformation relationship between the first coordinate system and a target coordinate system obtained in advance, and finally a target path may be corrected based on the position offset.
Wherein, the first coordinate system is as follows: according to the coordinate system established by the first sensor, the first position is: the position of the target object under the acquisition moment of the three-dimensional image is as follows: the position of the target object at the current time.
The first sensor generally includes a tracking module and a positioning module, the tracking module of the first sensor can acquire the position of the positioning module of the first sensor in the first coordinate system, and the position of the target object in the first coordinate system can be obtained by placing the positioning module of the first sensor on the body of the target object.
In addition, before the operation is performed on the target object, the position of the operation area on the body of the target object can generally be known in advance, for example the chest cavity or the heart. Therefore, the positioning module of the first sensor can be placed at or near this position, so that the position offset of the operation area can be detected more accurately, improving the accuracy of navigation and positioning.
Specifically, after the positioning module of the first sensor is placed on the body of the target object, the positioning module of the first sensor may acquire the position of the positioning module at the acquisition time of the three-dimensional image, and use the position as the first position of the target object in the first coordinate system, and may also acquire the position of the positioning module in real time during the navigation positioning process, and use the position as the current position of the target object in the first coordinate system, and then calculate the position offset of the target object according to the acquired first position, current position, and the conversion relationship between the first coordinate system and the target coordinate system, and further correct the target path based on the position offset.
There are two following implementations of calculating the position offset from the first position, the current position, and the above-described conversion relationship.
In a first implementation manner, an initial position offset between the first position and the current position may be first calculated, and after the initial position offset is calculated, the initial position offset may be converted into a position offset in a target coordinate system according to a conversion relationship between the first coordinate system and the target coordinate system.
In a second implementation manner, the first position and the current position may be respectively converted into the target coordinate system according to a conversion relationship between the first coordinate system and the target coordinate system, and then a deviation between the converted first position and the current position is calculated, that is, the position deviation.
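Under a rigid conversion relation with rotation R and translation t, the two implementations give the same result, because the translation cancels when two converted positions are subtracted. A sketch with assumed values:

```python
import numpy as np

# Assumed conversion relation from the first coordinate system to the
# target coordinate system: p_target = R @ p_first + t.
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
t = np.array([4.0, 5.0, 6.0])

first_position = np.array([1.0, 1.0, 0.0])     # at image acquisition time
current_position = np.array([2.0, 1.0, 0.5])   # at the current time

# Implementation 1: offset in the first coordinate system, then rotated
# into the target coordinate system (the translation cancels).
offset_1 = R @ (current_position - first_position)

# Implementation 2: convert both positions first, then subtract.
offset_2 = (R @ current_position + t) - (R @ first_position + t)
```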
In an embodiment of the present invention, when obtaining the transformation relation between the first coordinate system and the target coordinate system, since the transformation relation T(first→calib) between the first coordinate system and the calibration coordinate system where the calibration plate is located can generally be obtained in advance, and the first conversion relation T(calib→target) between the calibration coordinate system and the target coordinate system is known, the transformation relation between the first coordinate system and the target coordinate system can be calculated as T(first→target) = T(calib→target) · T(first→calib).
As can be seen from the above, when the scheme provided by the embodiment of the present invention is applied to navigation and positioning of the surgical robot, the first sensor may respectively acquire the first position and the current position of the target object at the two moments, where the two positions are positions in the first coordinate system, and according to the two positions and the conversion relationship between the first coordinate system and the target coordinate system, the position offset of the target object may be accurately calculated, so that the target path is corrected based on the position offset, and the accuracy of navigation and positioning of the surgical robot may be improved.
In addition, after the surgical robot completes navigation and positioning, a doctor can perform the surgery on the target object with other surgical instruments based on the positioned surgical robot. The positioning module of the first sensor can be mounted on such a surgical instrument, so that during the surgery the tracking module of the first sensor can acquire the instrument position of the surgical instrument in the first coordinate system. The instrument position in the first coordinate system can then be converted into a position in the three-dimensional coordinate system by using the conversion relations between every two of the four coordinate systems, namely the three-dimensional coordinate system, the calibration coordinate system, the first coordinate system and the target coordinate system, so that the position of the surgical instrument is displayed in the real-time three-dimensional image of the target object for the doctor to check.
For example, when the instrument position is converted, the instrument position may be converted into the target coordinate system based on a conversion relationship between the first coordinate system and the target coordinate system, then the instrument position in the target coordinate system may be converted into each two-dimensional coordinate system again based on a third conversion relationship between each two-dimensional coordinate system and the target coordinate system, and finally the instrument position in each two-dimensional coordinate system may be converted into the three-dimensional coordinate system based on the first conversion relationship between each two-dimensional coordinate system and the three-dimensional coordinate system.
The first sensor will be explained below.
In an embodiment of the invention, the first sensor is an optical tracker.
In this case, the first position and the current position are positions of the positioning module acquired by the tracking module of the optical tracker at the two times, respectively.
Because the positioning accuracy of the optical tracker is high, using the optical tracker as the first sensor can improve the accuracy of the first position and the current position. Calculating the position offset from these more accurate positions in turn improves the accuracy of the position offset, and thus the accuracy of navigation and positioning.
In addition, under the condition that the surgical instrument used by the doctor is a hard instrument, the positioning module of the optical tracker can be installed on the hard instrument so as to realize the real-time display of the position of the hard instrument in the three-dimensional image of the target object.
In another embodiment of the present invention, the first sensor is a magnetic sensor.
In this case, the first position and the current position are positions of the positioning module acquired by the tracking module of the magnetic sensor at the two times, respectively.
The process of obtaining the current position is a continuous process, and the tracking continuity of the magnetic sensor is strong, so that the magnetic sensor is used as the first sensor, the target object can be ensured to be continuously tracked, the current position of the target object in the first coordinate system can be obtained at any time of navigation and positioning, and the reliability of the navigation and positioning of the surgical robot is improved.
In addition, when the surgical instrument used by the doctor is a soft instrument, the positioning module of the magnetic sensor can be installed at the tip of the soft instrument, so that the position of the tip of the soft instrument can be displayed in real time in the three-dimensional image of the target object.
In yet another embodiment of the present invention, the first sensor includes an optical tracker and a magnetic sensor.
Because the optical tracker is easily occluded, and when occluded its tracking module can hardly acquire the position of its positioning module, while the magnetic sensor is easily disturbed by external magnetic fields and has lower positioning accuracy than the optical tracker, both the optical tracker and the magnetic sensor are used as the first sensor. In this way, when one sensor cannot accurately acquire the position of the target object, the other sensor acquires it instead.
Since the accuracy of the optical tracker is higher than that of the magnetic sensor, when both work normally, the position acquired by the optical tracker is usually selected as the position of the target object; when the optical tracker cannot work normally, for example because it is occluded, the position of the target object is acquired by the magnetic sensor.
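The fallback policy above can be sketched as follows; modeling an occluded optical tracker as a reading of None is an illustrative assumption:

```python
def select_position(optical_position, magnetic_position):
    # Prefer the optical tracker; fall back to the magnetic sensor when
    # the optical line of sight is blocked (modeled here as None).
    if optical_position is not None:
        return optical_position, "optical"
    return magnetic_position, "magnetic"

# Both sensors working: the optical reading is used.
pos, source = select_position((1.0, 2.0, 3.0), (1.1, 2.1, 3.1))

# Optical tracker occluded: the magnetic reading is used instead.
pos_blocked, source_blocked = select_position(None, (1.1, 2.1, 3.1))
```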
In the scheme, the optical tracker and the magnetic sensor are used as the first sensor, so that the reliability of obtaining the first position and the current position can be ensured, and the reliability of navigation and positioning of the surgical robot is improved.
During navigation and positioning, uncontrollable factors may cause the actual moving path of the surgical robot to differ from the target path; such factors include, for example, loosening of the surgical robot's own instruments and interference with the navigation and positioning process.
In view of the above situation, in an embodiment of the present invention, referring to fig. 4, a flowchart of a fourth surgical robot navigation and positioning method is provided, and compared with the foregoing embodiment shown in fig. 1, in this embodiment, the step S106 can be implemented through the following steps S106B-S106D.
Step S106B: a second position of the surgical robot in a second coordinate system acquired by a second sensor is obtained.
Wherein, the second coordinate system is: according to the coordinate system established by the second sensor.
The second sensor may be an optical tracker, a magnetic sensor, or a combination of an optical tracker and a magnetic sensor, and, similar to the first sensor, may also include a tracking module and a positioning module.
The tracking module of the second sensor can acquire the position of the positioning module of the second sensor in the second coordinate system, and the position of the surgical robot in the second coordinate system can be obtained by installing the positioning module of the second sensor on the surgical robot.
In the case that the surgical robot includes a robot arm, the positioning module of the second sensor may be mounted at the end of the robot arm, and be configured to acquire a position of the end of the robot arm in the second coordinate system as the second position of the surgical robot.
The accuracy of the obtained second position is high when the second sensor is an optical tracker; the continuous tracking performance for the second position is strong when the second sensor is a magnetic sensor; and the reliability of obtaining the second position is high when the second sensor is a combination of an optical tracker and a magnetic sensor.
Specifically, after the second sensor is mounted on the surgical robot, the second sensor may acquire the position of the surgical robot at the current time in real time. Therefore, the second position can be obtained in real time in the whole navigation and positioning process of the surgical robot, and the real-time position detection of the surgical robot is realized.
Step S106C: and converting the second position into a third position under the target coordinate system according to the conversion relation between the second coordinate system and the target coordinate system obtained in advance.
Similar to the transformation relationship between the first coordinate system and the target coordinate system, the transformation relationship between the second coordinate system and the target coordinate system can be calculated according to the transformation relationship between the second coordinate system and the calibration coordinate system and the transformation relationship between the calibration coordinate system and the target coordinate system, and will not be described herein again.
Specifically, the conversion relationship between the second coordinate system and the target coordinate system may include a rotation parameter and a translation parameter between the two coordinate systems, and therefore, the second position may be rotated and translated according to the rotation parameter and the translation parameter, so as to convert the second position into a third position in the target coordinate system.
In addition, since the second position can be obtained in real time, the third position is a position in the target coordinate system obtained by converting the second position, and when the second position is a position of the surgical robot in the second coordinate system obtained at the current time, the third position can be regarded as the position of the surgical robot in the target coordinate system at the current time.
Step S106D: and navigating and positioning the surgical robot according to the target path and the third position.
Specifically, the target path may be regarded as a moving track of the third position, and a process of performing navigation and positioning by the surgical robot along the target path may be regarded as a process of moving the surgical robot according to the target path. In the moving process of the surgical robot, the third position can be obtained in real time, so that when the surgical robot does not move according to the target path due to the uncontrollable factors in the navigation and positioning process, the position of the surgical robot can be corrected according to the target path and the third position where the surgical robot is located at the current moment, and then the surgical robot is continuously navigated and positioned according to the target path.
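For a straight-line target path, the deviation of the third position from the path can be measured by projecting it onto the segment; the function and the numbers below are an illustrative sketch, not the patent's method:

```python
import numpy as np

def path_deviation(p, a, b):
    # Distance from the robot's current third position p to the straight
    # target path segment from a to b, plus the nearest point on it.
    a, b, p = (np.asarray(x, dtype=float) for x in (a, b, p))
    ab = b - a
    s = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    nearest = a + s * ab
    return np.linalg.norm(p - nearest), nearest

# A robot 2 units off a straight path along the X axis: the deviation
# triggers a correction back to the nearest point on the path.
deviation, nearest = path_deviation([4.0, 2.0, 0.0],
                                    [0.0, 0.0, 0.0],
                                    [10.0, 0.0, 0.0])
```

When the deviation exceeds a tolerance, the robot's position can be corrected toward the nearest point before navigation continues along the target path.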
As can be seen from the above, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the third position of the surgical robot in the target coordinate system is obtained based on the obtained second position and the transformation relationship between the two coordinate systems, namely the second coordinate system and the target coordinate system, so that the position of the surgical robot can be corrected under the condition that the actual moving path of the surgical robot in the navigation and positioning process does not accord with the target path, thereby ensuring that the surgical robot can always perform navigation and positioning according to the target path, and improving the accuracy of the navigation and positioning of the surgical robot.
Corresponding to the surgical robot navigation positioning method, the embodiment of the invention also provides a surgical robot navigation positioning device.
Referring to fig. 5, there is provided a schematic structural diagram of a surgical robot navigation and positioning device, the device including:
an image obtaining module 501, configured to obtain a three-dimensional image acquired by an image acquisition device, where a field of view of the image acquisition device includes: a target object and a calibration plate with a marker;
a path planning module 502 for planning a surgical path for performing a surgery on the target object in the three-dimensional image;
a marker identification module 503, configured to identify the marker in the three-dimensional image, and obtain an image position of the marker in an image coordinate system where the three-dimensional image is located;
a relationship calculation module 504, configured to calculate a second conversion relationship between the image coordinate system and the target coordinate system according to the obtained image position, a pre-obtained calibration position, and a first conversion relationship, where the calibration position is: the position of the marker under a calibration coordinate system, wherein the calibration coordinate system is as follows: according to the coordinate system established by the calibration plate, the target coordinate system is as follows: according to a coordinate system established by the surgical robot, the first conversion relationship is as follows: the conversion relation between the calibration coordinate system and the target coordinate system;
a path conversion module 505, configured to convert the surgical path into a target path in the target coordinate system based on the second conversion relationship;
and a navigation positioning module 506, configured to perform navigation positioning on the surgical robot according to the target path.
As can be seen from the above, when the scheme provided by the embodiment of the present invention is applied to navigation and positioning of a surgical robot, an object in an obtained three-dimensional image includes a target object and a calibration plate, and by identifying a marker of the calibration plate in the three-dimensional image, an image position of the marker in an image coordinate system can be obtained. The image position is a position under an image coordinate system, the calibration position is a position under the calibration coordinate system, and the first conversion relation is a conversion relation between the calibration coordinate system and a target coordinate system, so that the second conversion relation between the image coordinate system and the target coordinate system can be accurately obtained according to the image position, the calibration position and the first conversion relation, and the surgical path can be converted into the target path under the target coordinate system based on the second conversion relation after the surgical path is planned in the three-dimensional image.
In addition, in the scheme, when the second conversion relation between the image coordinate system and the target coordinate system is determined, the position of the marker on the calibration plate under the image coordinate system can be obtained by identifying the marker in the three-dimensional image only by placing the calibration plate in the visual field range of the image acquisition equipment, so that the second conversion relation is calculated. The calibration plate can be placed at any position in the visual field range of the image acquisition equipment, and the second conversion relation can be calculated by utilizing the calibration plate without performing any operation on a target object, so that the navigation and positioning scheme of the surgical robot provided by the embodiment of the invention can enhance the flexibility of the navigation and positioning of the surgical robot and reduce the operation difficulty of the navigation and positioning of the surgical robot.
In an embodiment of the present invention, the navigation positioning module 506 is specifically configured to:
starting from the initial position of the target path, performing navigation positioning on the surgical robot along the target path, detecting the position offset of the target object in real time in the navigation positioning process, and correcting the target path based on the position offset, wherein the position offset is as follows: an offset in the target coordinate system between a position of the target object at a current time and a position at an acquisition time of the three-dimensional image.
Therefore, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the position deviation of the target object is detected in real time in the navigation and positioning process, and the target path is corrected based on the position deviation, so that the target path can be continuously corrected in the navigation and positioning process, the surgical robot can move along the expected path at any time of navigation and positioning, and the accuracy of the navigation and positioning of the surgical robot is improved.
In an embodiment of the present invention, the navigation positioning module 506 is specifically configured to:
obtaining a first position and a current position in a first coordinate system acquired by a first sensor, wherein the first coordinate system is: a coordinate system established according to the first sensor, the first position is: the position of the target object at the acquisition time of the three-dimensional image, and the current position is: the position of the target object at the current time;
calculating the position offset of the target object according to the first position, the current position and a conversion relation between the first coordinate system and a target coordinate system obtained in advance;
based on the position offset, the target path is rectified.
As can be seen from the above, when the scheme provided by the embodiment of the present invention is applied to navigation and positioning of the surgical robot, the first sensor acquires the first position and the current position of the target object at the two respective moments, both expressed in the first coordinate system. From these two positions and the conversion relation between the first coordinate system and the target coordinate system, the position offset of the target object can be accurately calculated, and correcting the target path based on this offset improves the accuracy of the navigation and positioning of the surgical robot.
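Assuming, for illustration, that the pre-obtained conversion relation between the first coordinate system and the target coordinate system is a rigid transform (a rotation `R` plus a translation `t` — a common convention, not one stated in the patent), the position offset could be computed as:

```python
import numpy as np

def position_offset(first_pos, current_pos, R, t):
    """Map both sensor-frame positions into the target coordinate system and
    return their difference; R (3x3) and t (3,) are the first->target transform."""
    p0 = R @ np.asarray(first_pos, dtype=float) + t    # at image acquisition time
    p1 = R @ np.asarray(current_pos, dtype=float) + t  # at the current time
    return p1 - p0                                     # t cancels in the difference
```

Note that the translation `t` cancels in the subtraction, so only the rotational part of the conversion relation affects the computed offset.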
In an embodiment of the invention, the first sensor is an optical tracker.
Because the positioning accuracy of the optical tracker is high, using an optical tracker as the first sensor improves the accuracy of the first position and the current position. Calculating the position offset from these more accurate positions in turn improves the accuracy of the position offset, and thus the accuracy of navigation and positioning.
In another embodiment of the present invention, the first sensor is a magnetic sensor.
Obtaining the current position is a continuous process, and the magnetic sensor provides strong tracking continuity. Using a magnetic sensor as the first sensor therefore ensures that the target object is tracked continuously, so its current position in the first coordinate system can be obtained at any time, improving the reliability of the navigation and positioning of the surgical robot.
In yet another embodiment of the present invention, the first sensor includes an optical tracker and a magnetic sensor.
In the scheme, the optical tracker and the magnetic sensor are used as the first sensor, so that the reliability of obtaining the first position and the current position can be ensured, and the reliability of navigation and positioning of the surgical robot is improved.
In an embodiment of the present invention, the navigation positioning module 506 is specifically configured to:
obtaining a second position of the surgical robot in a second coordinate system acquired by a second sensor, wherein the second coordinate system is: a coordinate system established according to the second sensor;
converting the second position into a third position under a target coordinate system according to a conversion relation between the second coordinate system and the target coordinate system obtained in advance;
and navigating and positioning the surgical robot according to the target path and the third position.
As can be seen from the above, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the third position of the surgical robot in the target coordinate system is obtained from the acquired second position and the conversion relation between the second coordinate system and the target coordinate system. The position of the surgical robot can therefore be corrected whenever its actual path during navigation and positioning deviates from the target path, ensuring that the surgical robot always follows the target path and improving the accuracy of navigation and positioning.
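For illustration, under the assumption that the conversion relation between the second coordinate system and the target coordinate system is expressed as a 4x4 homogeneous matrix (a common robotics convention, not one stated in the patent), converting the second position into the third position might look like:

```python
import numpy as np

def to_target_frame(second_pos, T_second_to_target):
    """Convert a position from the second (sensor) coordinate system into the
    target coordinate system via a 4x4 homogeneous transformation matrix."""
    p = np.append(np.asarray(second_pos, dtype=float), 1.0)  # homogeneous point
    return (T_second_to_target @ p)[:3]                      # the third position
```

The resulting third position can then be compared against the target path to correct the robot's motion.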
In an embodiment of the invention, the second sensor is an optical tracker and/or a magnetic sensor.
When the second sensor is an optical tracker, the accuracy of the obtained second position is high; when it is a magnetic sensor, the second position can be tracked continuously; and when both an optical tracker and a magnetic sensor are used, the reliability of obtaining the second position is high.
In an embodiment of the present invention, the relationship calculating module 504 is specifically configured to:
calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and a calibration position obtained in advance;
and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and a first conversion relation obtained in advance.
Therefore, when the scheme provided by the embodiment of the invention is applied to navigation and positioning of the surgical robot, the third conversion relation between the image coordinate system and the calibration coordinate system can be accurately calculated according to the image position and the calibration position, and then the second conversion relation can be accurately calculated according to the third conversion relation and the first conversion relation. Therefore, the surgical robot navigation positioning scheme provided by the embodiment of the invention can accurately calculate the second conversion relation between the image coordinate system and the target coordinate system, thereby improving the accuracy of navigation positioning.
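One common way to realize these two steps (a sketch under the assumption of rigid transforms, not necessarily the exact method of the patent) is a least-squares (Kabsch) fit of the paired marker positions to estimate the third conversion relation, composed with the known first conversion relation to obtain the second:

```python
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst points.
    Returns a 4x4 homogeneous matrix."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = cd - R @ cs
    return T

def second_conversion(image_pts, calib_pts, T_calib_to_target):
    """Second conversion (image -> target) = first conversion (calibration ->
    target) composed with the third conversion (image -> calibration)."""
    T_image_to_calib = rigid_fit(image_pts, calib_pts)   # third conversion relation
    return T_calib_to_target @ T_image_to_calib          # second conversion relation
```

Here `image_pts` are the marker positions identified in the image coordinate system and `calib_pts` the corresponding pre-obtained calibration positions; at least three non-collinear markers are needed for the fit to be well determined.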
An embodiment of the present invention further provides an electronic device, as shown in fig. 6, including a processor 601, a communication interface 602, a memory 603, and a communication bus 604, where the processor 601, the communication interface 602, and the memory 603 communicate with one another through the communication bus 604;
a memory 603 for storing a computer program;
the processor 601 is configured to implement the following steps when executing the program stored in the memory 603:
obtaining a three-dimensional image acquired by an image acquisition device, wherein the field of view of the image acquisition device comprises: a target object and a calibration plate with a marker;
planning a surgical path for performing a surgery on the target object in the three-dimensional image;
identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image;
calculating a second conversion relation between the image coordinate system and the target coordinate system according to the obtained image position, a pre-obtained calibration position and a first conversion relation, wherein the calibration position is as follows: the position of the marker under a calibration coordinate system, wherein the calibration coordinate system is as follows: the coordinate system established according to the calibration plate, the target coordinate system is the coordinate system established according to the surgical robot, and the first conversion relationship is as follows: the conversion relation between the calibration coordinate system and the target coordinate system;
converting the surgical path into a target path in the target coordinate system based on the second conversion relation;
and navigating and positioning the surgical robot according to the target path.
Besides, the electronic device can also implement the other surgical robot navigation and positioning methods described in the previous embodiments; details are not repeated here.
The communication bus mentioned in the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic equipment and other equipment.
The Memory may include a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as at least one disk Memory. Optionally, the memory may also be at least one memory device located remotely from the processor.
The Processor may be a general-purpose Processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components.
In yet another embodiment of the present invention, a computer-readable storage medium is further provided, in which a computer program is stored, and the computer program, when executed by a processor, implements the steps of any one of the above surgical robot navigation positioning methods.
In yet another embodiment, a computer program product containing instructions is provided, which when run on a computer causes the computer to execute any one of the above-mentioned surgical robot navigation positioning methods.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., Solid State Disk (SSD)), among others.
It is noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
All the embodiments in the present specification are described in a related manner, and the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the apparatus, the electronic device, the computer-readable storage medium, and the computer program product embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiments.
The above description is only for the preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within the protection scope of the present invention.
Claims (16)
1. A surgical robot navigation positioning method, characterized in that the method comprises:
obtaining a three-dimensional image acquired by an image acquisition device, wherein the field of view of the image acquisition device comprises: a target object and a calibration plate with a marker;
planning a surgical path for performing a surgery on the target object in the three-dimensional image;
identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image;
calculating a second conversion relation between the image coordinate system and a target coordinate system according to the obtained image position, a pre-obtained calibration position and a first conversion relation, wherein the calibration position is as follows: the position of the marker under a calibration coordinate system, wherein the calibration coordinate system is as follows: according to the coordinate system established by the calibration plate, the target coordinate system is as follows: according to a coordinate system established by the surgical robot, the first conversion relationship is as follows: the conversion relation between the calibration coordinate system and the target coordinate system;
converting the surgical path into a target path in the target coordinate system based on the second conversion relation;
and navigating and positioning the surgical robot according to the target path.
2. The method of claim 1, wherein said navigating said surgical robot according to said target path comprises:
starting from the initial position of the target path, performing navigation positioning on the surgical robot along the target path, detecting the position offset of the target object in real time in the navigation positioning process, and correcting the target path based on the position offset, wherein the position offset is as follows: an offset in the target coordinate system between a position of the target object at a current time and a position at an acquisition time of the three-dimensional image.
3. The method of claim 2, wherein the detecting a position offset of the target object in real time, and based on the position offset, rectifying the target path comprises:
obtaining a first position and a current position in a first coordinate system acquired by a first sensor, wherein the first coordinate system is: a coordinate system established according to the first sensor, the first position is: the position of the target object at the acquisition time of the three-dimensional image, and the current position is: the position of the target object at the current time;
calculating the position offset of the target object according to the first position, the current position and a conversion relation between the first coordinate system and a target coordinate system obtained in advance;
based on the position offset, the target path is rectified.
4. A method according to claim 3, wherein the first sensor is an optical tracker and/or a magnetic sensor.
5. The method of claim 1, wherein said navigating said surgical robot according to said target path comprises:
obtaining a second position of the surgical robot in a second coordinate system acquired by a second sensor, wherein the second coordinate system is: a coordinate system established according to the second sensor;
converting the second position into a third position under a target coordinate system according to a conversion relation between the second coordinate system and the target coordinate system obtained in advance;
and navigating and positioning the surgical robot according to the target path and the third position.
6. The method of claim 5, wherein the second sensor is an optical tracker and/or a magnetic sensor.
7. The method according to any one of claims 1-6, wherein said calculating a second transformation relationship between the image coordinate system and the target coordinate system based on the image position, a pre-obtained calibration position and a first transformation relationship comprises:
calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and a calibration position obtained in advance;
and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and a first conversion relation obtained in advance.
8. A surgical robotic navigation and positioning device, the device comprising:
the image acquisition module is used for acquiring a three-dimensional image acquired by image acquisition equipment, wherein the field of view of the image acquisition equipment comprises: a target object and a calibration plate with a marker;
a path planning module for planning a surgical path for performing a surgery on the target object in the three-dimensional image;
the marker identification module is used for identifying the marker in the three-dimensional image to obtain the image position of the marker in the image coordinate system of the three-dimensional image;
a relationship calculation module, configured to calculate a second conversion relationship between the image coordinate system and the target coordinate system according to the obtained image position, a pre-obtained calibration position, and a first conversion relationship, where the calibration position is: the position of the marker under a calibration coordinate system, wherein the calibration coordinate system is as follows: according to the coordinate system established by the calibration plate, the target coordinate system is as follows: according to a coordinate system established by the surgical robot, the first conversion relationship is as follows: the conversion relation between the calibration coordinate system and the target coordinate system;
the path conversion module is used for converting the operation path into a target path under the target coordinate system based on the second conversion relation;
and the navigation positioning module is used for navigating and positioning the surgical robot according to the target path.
9. The apparatus according to claim 8, wherein the navigation positioning module is specifically configured to:
starting from the initial position of the target path, performing navigation positioning on the surgical robot along the target path, detecting the position offset of the target object in real time in the navigation positioning process, and correcting the target path based on the position offset, wherein the position offset is as follows: an offset in the target coordinate system between a position of the target object at a current time and a position at an acquisition time of the three-dimensional image.
10. The apparatus according to claim 9, wherein the navigation positioning module is specifically configured to:
obtaining a first position and a current position in a first coordinate system acquired by a first sensor, wherein the first coordinate system is: a coordinate system established according to the first sensor, the first position is: the position of the target object at the acquisition time of the three-dimensional image, and the current position is: the position of the target object at the current time;
calculating the position offset of the target object according to the first position, the current position and a conversion relation between the first coordinate system and a target coordinate system obtained in advance;
based on the position offset, the target path is rectified.
11. The apparatus of claim 10, wherein the first sensor is an optical tracker and/or a magnetic sensor.
12. The apparatus according to claim 8, wherein the navigation positioning module is specifically configured to:
obtaining a second position of the surgical robot in a second coordinate system acquired by a second sensor, wherein the second coordinate system is: a coordinate system established according to the second sensor;
converting the second position into a third position under a target coordinate system according to a conversion relation between the second coordinate system and the target coordinate system obtained in advance;
and navigating and positioning the surgical robot according to the target path and the third position.
13. The device of claim 12, wherein the second sensor is an optical tracker and/or a magnetic sensor.
14. The apparatus according to any one of claims 8-13, wherein the relationship calculation module is specifically configured to:
calculating a third conversion relation between the image coordinate system and the calibration coordinate system according to the image position and a calibration position obtained in advance;
and calculating a second conversion relation between the image coordinate system and the target coordinate system according to the third conversion relation and a first conversion relation obtained in advance.
15. An electronic device, characterized by comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;
a memory for storing a computer program;
a processor for implementing the steps of the surgical robot navigation positioning method of any one of claims 1-7 when executing the program stored in the memory.
16. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the surgical robot navigation positioning method steps of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111662281.3A CN114176779B (en) | 2021-12-31 | 2021-12-31 | Surgical robot navigation positioning method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111662281.3A CN114176779B (en) | 2021-12-31 | 2021-12-31 | Surgical robot navigation positioning method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114176779A true CN114176779A (en) | 2022-03-15 |
CN114176779B CN114176779B (en) | 2023-12-26 |
Family
ID=80606482
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111662281.3A Active CN114176779B (en) | 2021-12-31 | 2021-12-31 | Surgical robot navigation positioning method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114176779B (en) |
Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103356284A (en) * | 2012-04-01 | 2013-10-23 | 中国科学院深圳先进技术研究院 | Surgical navigation method and system |
CN105658167A (en) * | 2013-08-23 | 2016-06-08 | 斯瑞克欧洲控股I公司 | Computer-implemented technique for determining a coordinate transformation for surgical navigation |
CN110169823A (en) * | 2019-04-24 | 2019-08-27 | 艾瑞迈迪科技石家庄有限公司 | Ultrasonic probe scaling method, device, terminal and storage medium |
CN110522514A (en) * | 2019-08-21 | 2019-12-03 | 昆明医科大学第二附属医院 | A kind of hepatobiliary surgery location tracking system |
CN110559077A (en) * | 2018-06-05 | 2019-12-13 | 上海联影医疗科技有限公司 | Coordinate system registration method, robot control method, device, equipment and medium |
CN110711031A (en) * | 2019-10-31 | 2020-01-21 | 武汉联影智融医疗科技有限公司 | Surgical navigation system, coordinate system registration system, method, device, and medium |
CN111839727A (en) * | 2020-07-10 | 2020-10-30 | 哈尔滨理工大学 | Prostate particle implantation path visualization method and system based on augmented reality |
CN112006776A (en) * | 2020-09-27 | 2020-12-01 | 安徽埃克索医疗机器人有限公司 | Surgical navigation system and registration method thereof |
CN112472294A (en) * | 2020-12-15 | 2021-03-12 | 山东威高医疗科技有限公司 | Method for acquiring spatial positions of different ultrasonic equipment probes in electromagnetic navigation system |
CN112618017A (en) * | 2020-12-16 | 2021-04-09 | 苏州微创畅行机器人有限公司 | Navigation surgery system, computer-readable storage medium, and electronic device |
CN112867427A (en) * | 2018-10-04 | 2021-05-28 | 伯恩森斯韦伯斯特(以色列)有限责任公司 | Computerized Tomography (CT) image correction using orientation and orientation (P & D) tracking assisted optical visualization |
CN113008233A (en) * | 2021-02-01 | 2021-06-22 | 北京中医药大学第三附属医院 | Surgical instrument navigation method, device and system and storage medium |
CN113081265A (en) * | 2021-03-24 | 2021-07-09 | 重庆博仕康科技有限公司 | Surgical navigation space registration method and device and surgical navigation system |
CN113100939A (en) * | 2021-04-06 | 2021-07-13 | 德智鸿(上海)机器人有限责任公司 | Orthopedic surgery navigation method, device, computer equipment, system and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN114176779B (en) | 2023-12-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3254621B1 (en) | 3d image special calibrator, surgical localizing system and method | |
CN104000654A (en) | Computer-implemented technique for calculating a position of a surgical device | |
CN112006779A (en) | Precision detection method for surgical navigation system | |
US20150208948A1 (en) | Method and system for surgical instrument guidance and tracking with position and orientation correction | |
US20150125033A1 (en) | Bone fragment tracking | |
CN113081265B (en) | Surgical navigation space registration method and device and surgical navigation system | |
CN113524201B (en) | Active adjusting method and device for pose of mechanical arm, mechanical arm and readable storage medium | |
CN103678837A (en) | Method and device for determining processing remains of target area | |
CN113662665A (en) | Precision detection method and device of knee joint replacement surgical robot system | |
CN115619781B (en) | Precision detection method and device, electronic equipment and storage medium | |
JPWO2018043524A1 (en) | Robot system, robot system control apparatus, and robot system control method | |
CN114246635A (en) | Osteotomy plane positioning method, osteotomy plane positioning system and osteotomy plane positioning device | |
CN114224428B (en) | Osteotomy plane positioning method, system and device | |
CN116473677A (en) | Surgical navigation positioning method and device, electronic equipment and medium | |
CN114209433B (en) | Surgical robot navigation positioning device | |
CN110251209A (en) | A kind of bearing calibration and device | |
CN108420531B (en) | Surgical tool adjusting method, electronic device and clamping device | |
US10991113B2 (en) | Gyroscope-based system and method for assisting in tracking heat source on mechanical arm | |
CN114176779B (en) | Surgical robot navigation positioning method and device | |
CN115533863B (en) | Method, system and device for determining positioning tracer mounting groove and electronic equipment | |
CN110123452A (en) | The navigation methods and systems of robot | |
JP7037810B2 (en) | Image processing device, image processing program, and image processing method | |
KR102307919B1 (en) | Pose estimation method of bendable interventional medical device using single-view x-ray image | |
JP6631225B2 (en) | 3D shape measuring device | |
CN116459008A (en) | Surgical robot navigation positioning method, device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |