CN115229805B - Hand-eye calibration method and device for surgical robot, storage medium and processor - Google Patents
- Publication number
- CN115229805B (application CN202211147000.5A)
- Authority
- CN
- China
- Prior art keywords
- flange
- coordinate system
- tool
- actual acquisition
- surgical robot
- Prior art date
- Legal status
- Active
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1653—Programme controls characterised by the control loop parameters identification, estimation, stiffness, accuracy, error analysis

- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis

- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots

- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery

- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor

- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition

- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2068—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis using pointers, e.g. pointers having reference marks for determining coordinates of body points
Abstract
The invention provides a hand-eye calibration method and device for a surgical robot, a storage medium and a processor. The hand-eye calibration method for a surgical robot includes the following steps: obtaining a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point locations according to the initial point location and initial attitude angle of the flange of the surgical robot, the plurality of actual acquisition point locations, and the tool calibration relation provided with the surgical robot; positioning the flange sequentially at the initial point location and the plurality of actual acquisition point locations, and acquiring, when the flange is at the initial point location and at each actual acquisition point location, the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system; and obtaining the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system. The technical scheme of the application solves the problem that the accuracy of surgical-robot hand-eye calibration cannot be guaranteed in the related art.
Description
Technical Field
The invention relates to the technical field of surgical robot calibration, and in particular to a hand-eye calibration method and device for a surgical robot, a storage medium and a processor.
Background
In medical surgery, the real-time position of a tool mounted on the flange of a surgical robot, i.e., the robot used for the operation, generally needs to be tracked accurately by an optical tracker and displayed on an image through registration of the physical space and the image space, so that the doctor can perform the operation with reference to the image of the patient's lesion site.
The most important step in the whole surgical procedure is registering the patient's actual body position with the image-space position, and hand-eye calibration is an important link in this registration. The purpose of hand-eye calibration is to obtain the conversion relation between the robot base coordinate system and the vision device coordinate system and the conversion relation between the robot flange coordinate system and the tool coordinate system.
In the related art, hand-eye calibration is realized through calibration of images and pose data of surgical instruments, with cooperation between the vision-equipment operator and the surgical-tool personnel. However, this approach often cannot meet the accuracy requirement of hand-eye calibration.
Disclosure of Invention
The main object of the invention is to provide a hand-eye calibration method and device for a surgical robot, a storage medium and a processor, so as to solve the problem that the accuracy of surgical-robot hand-eye calibration cannot be guaranteed in the related art.
In order to achieve the above object, according to a first aspect of the present invention, there is provided a surgical robot hand-eye calibration method, including: obtaining a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point locations according to the initial point location and initial attitude angle of the flange of the surgical robot, the plurality of actual acquisition point locations, and the tool calibration relation provided with the surgical robot; positioning the flange sequentially at the initial point location and the plurality of actual acquisition point locations, with the flange at the actual acquisition attitude angle corresponding to each actual acquisition point location, and acquiring, when the flange is at the initial point location and at each actual acquisition point location, the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system; and obtaining the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system according to the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system when the flange is at the initial point location and at each actual acquisition point location.
Further, the step of obtaining a plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point locations according to the flange initial point location, the initial attitude angle, the plurality of actual acquisition point locations of the surgical robot and the tool calibration relation provided with the surgical robot includes: obtaining the M matrix of the surgical robot; placing the flange of the surgical robot at the initial point location and initial attitude angle, obtaining the pose of the tool in the robot base coordinate system through the tool calibration relation provided with the surgical robot, and obtaining the pose of the tool in the vision acquisition device coordinate system through the vision acquisition device; obtaining a plurality of groups of attitude angle lists corresponding to the plurality of actual acquisition point locations, each group of attitude angle lists containing a plurality of attitude angles corresponding to the plurality of actual acquisition point locations; obtaining the condition number of the M matrix corresponding to each group of attitude angle lists according to the pose of the tool in the robot base coordinate system, the pose of the tool in the vision acquisition device coordinate system, the plurality of groups of attitude angle lists and the M matrix; and determining the plurality of actual acquisition attitude angles according to the attitude angle list corresponding to the minimum value among the condition numbers of all the M matrices.
Further, determining a plurality of groups of attitude angle lists corresponding to the plurality of actual acquisition point locations, where each group of attitude angle lists includes a plurality of attitude angles corresponding to the plurality of actual acquisition point locations, includes: obtaining the plurality of attitude angles in each group of attitude angle lists corresponding to the actual acquisition point locations by random sampling under a preset rule, where the preset rule is that the included angle between the Z axis of the tool coordinate system at each actual acquisition point location and the Z axis of the tool coordinate system at the initial point location is less than or equal to 60 degrees.
Further, the number of actual acquisition point locations is eight, and the eight actual acquisition point locations are eight vertices of a cube centered on the initial point location.
Further, before the step of sequentially positioning the flange at the initial point location and the plurality of actual acquisition point locations, the method for calibrating the hand and the eye of the surgical robot further comprises the following steps: and adjusting the actual acquisition area of the vision acquisition device according to the position of the tool when the flange is located at the initial point position.
Further, the step of adjusting the actual acquisition area of the vision acquisition device according to the position of the tool when the flange is located at the initial point position comprises the following steps: and reducing the acquisition area of the vision acquisition device in all directions to obtain a limited area, so that the tool is positioned in the limited area.
Further, the M matrix is:

$$M = \begin{bmatrix} \mathrm{skew}\left(p_{A_1} + p_{B_1}\right) \\ \vdots \\ \mathrm{skew}\left(p_{A_k} + p_{B_k}\right) \end{bmatrix}, \qquad A_i = {}^{B}T_{F_i}\left({}^{B}T_{F_0}\right)^{-1}, \qquad B_i = {}^{C}T_{T_i}\left({}^{C}T_{T_0}\right)^{-1},$$

where $p_{A_i}$ and $p_{B_i}$ are the modified Rodrigues vectors of the rotation parts of $A_i$ and $B_i$; ${}^{B}T_{F_0}$ is the pose of the flange in the robot base coordinate system when the flange is at the initial point location, and ${}^{B}T_{F_i}$ is the pose of the flange in the robot base coordinate system when the flange is at one of the actual acquisition point locations; ${}^{C}T_{T_0}$ is the pose of the tool in the vision acquisition device coordinate system when the flange is at the initial point location, and ${}^{C}T_{T_i}$ is the pose of the tool in the vision acquisition device coordinate system when the flange is at that actual acquisition point location.
According to a second aspect of the present invention, there is provided a surgical robot hand-eye calibration device, including: an attitude angle acquisition unit, configured to obtain a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point locations according to the flange initial point location, the initial attitude angle, the plurality of actual acquisition point locations of the surgical robot and the tool calibration relation provided with the surgical robot; a pose acquisition unit, configured to position the flange sequentially at the initial point location and the plurality of actual acquisition point locations, with the flange at the actual acquisition attitude angle corresponding to each actual acquisition point location, and to acquire, when the flange is at the initial point location and at each actual acquisition point location, the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system; and a computing unit, configured to obtain the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system according to the pose of the flange in the robot base coordinate system and the pose of the tool in the vision acquisition device coordinate system when the flange is at the initial point location and at each actual acquisition point location.
According to a third aspect of the present invention, there is provided a computer readable storage medium comprising a stored program, wherein the program performs the surgical robot hand-eye calibration method described above.
According to a fourth aspect of the present invention, there is provided a processor configured to run a program, wherein the program, when run, performs the surgical robot hand-eye calibration method described above.
By applying the technical scheme of the invention, a plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point locations is obtained according to the flange initial point location, the initial attitude angle, the plurality of actual acquisition point locations and the tool calibration relation provided with the surgical robot. Because the actual acquisition attitude angles are computed from the robot's own tool calibration relation, singularity problems in the subsequent solution of the conversion relations can be avoided to a certain extent. The required data are then acquired: the flange is positioned sequentially at the initial point location and the plurality of actual acquisition point locations, and, when the flange is at the initial point location and at each actual acquisition point location, the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system are acquired. Finally, since the conversion relation between the tool coordinate system and the flange coordinate system is unique, the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system are derived from the acquired data and the relevant formulas. The technical scheme of the application can thus effectively solve the problem that the accuracy of surgical-robot hand-eye calibration cannot be guaranteed in the related art.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 shows a schematic flow diagram of an embodiment of a surgical robot hand-eye calibration method according to the present invention;
fig. 2 shows a detailed flowchart of step S10 of the surgical robot hand-eye calibration method of fig. 1;
FIG. 3 shows a schematic flow diagram of a preferred embodiment of a surgical robot hand-eye calibration method according to the present invention;
FIG. 4 shows a schematic view of an embodiment of a surgical robotic hand-eye calibration device according to the present invention;
FIG. 5 is a schematic diagram illustrating a coordinate transformation relationship between a vision collecting device and a surgical robot in the hand-eye calibration method of the surgical robot in FIG. 1;
FIG. 6 is a schematic view of an acquisition region of a vision acquisition device in the surgical robot hand-eye calibration method of FIG. 1;
FIG. 7 shows three views of the acquisition area of the visual acquisition apparatus of FIG. 6;
fig. 8 is a schematic view of the Z-axis coordinate direction of the tool coordinate system of each actual acquisition point location in the hand-eye calibration method of the surgical robot in fig. 2.
Wherein the figures include the following reference numerals:
10. mechanical arm; 20. vision acquisition device; 100. attitude angle acquisition unit; 200. pose acquisition unit; 300. computing unit.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The selection of the actual acquisition attitude angle of the surgical robot can cause a singular problem when the conversion relation is solved. In the related art, the actual acquisition attitude angle is generally manually adjusted according to the experience of an operator, which reduces the accuracy of the subsequently calculated conversion relationship, i.e., affects the accuracy of hand-eye calibration. According to the technical scheme, a better actual acquisition attitude angle is found before data acquisition work, and the problems are solved better.
As shown in fig. 1, the method for calibrating the hands and eyes of the surgical robot in the embodiment includes:
step S10: according to the initial point position, the initial attitude angle, the actual acquisition point positions of the flange of the surgical robot and the tool calibration relation of the surgical robotObtaining a plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point positions;
step S30: enabling the flange to be sequentially located at an initial point position and a plurality of actual acquisition point positions, enabling the flange to be located at an actual acquisition attitude angle corresponding to each actual acquisition point position when the flange is located at each actual acquisition point position, and acquiring the pose of the flange in a robot base coordinate system and the pose of a tool installed on the flange in a coordinate system of the vision acquisition device when the flange is located at the initial point position and each actual acquisition point position;
step S40: obtaining the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system according to the pose of the flange in the robot base coordinate system and the pose of the tool in the vision acquisition device coordinate system when the flange is at the initial point location and at each actual acquisition point location.
By applying the technical scheme of this embodiment, a plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point locations is obtained according to the flange initial point location, the initial attitude angle, the plurality of actual acquisition point locations and the tool calibration relation provided with the surgical robot. Because the actual acquisition attitude angles are computed from the robot's own tool calibration relation, singularity problems in the subsequent solution of the conversion relations can be avoided to a certain extent. The required data are then acquired: the flange is positioned sequentially at the initial point location and the plurality of actual acquisition point locations, and, when the flange is at the initial point location and at each actual acquisition point location, the pose of the flange in the robot base coordinate system and the pose of the tool mounted on the flange in the vision acquisition device coordinate system are acquired. Finally, since the conversion relation between the tool coordinate system and the flange coordinate system is unique, the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system are derived from the acquired data and the relevant formulas. The technical scheme of this embodiment can thus effectively solve the problem that the accuracy of surgical-robot hand-eye calibration cannot be guaranteed in the related art.
The tool calibration relation provided with the surgical robot is the conversion relation between the flange coordinate system and the tool coordinate system calibrated at the factory for the mechanical arm 10 of the surgical robot. Its calibration accuracy is poor; the technical scheme of this embodiment nevertheless uses this factory calibration relation to obtain a plurality of actual acquisition attitude angles, that is, to obtain attitude angles that can improve the calibration accuracy.
As shown in fig. 2, in the present embodiment, step S10, obtaining a plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point locations according to the flange initial point location, the initial attitude angle, the plurality of actual acquisition point locations and the tool calibration relation provided with the surgical robot, includes:

step S11: obtaining the M matrix of the surgical robot;

step S12: placing the flange of the surgical robot at the initial point location and initial attitude angle, obtaining the pose of the tool in the robot base coordinate system through the tool calibration relation provided with the surgical robot, and obtaining the pose of the tool in the vision acquisition device coordinate system through the vision acquisition device 20;

step S13: obtaining a plurality of groups of attitude angle lists corresponding to the plurality of actual acquisition point locations, each group of attitude angle lists containing a plurality of attitude angles corresponding to the plurality of actual acquisition point locations, and obtaining the condition number cond(M) of the M matrix corresponding to each group of attitude angle lists according to the pose of the tool in the robot base coordinate system, the pose of the tool in the vision acquisition device coordinate system, the plurality of groups of attitude angle lists and the M matrix;

step S14: determining the plurality of actual acquisition attitude angles according to the attitude angle list corresponding to the minimum value among the condition numbers of all the M matrices.
Step S11 may be performed before step S12, or between step S12 and step S13. In step S11, the M matrix is obtained by formula derivation, and the derivation process is as follows.

For any two poses of the flange, the conversion relation from the flange coordinate system to the tool coordinate system is unique, namely:

$$\left({}^{C}T_{B}\,{}^{B}T_{F_0}\right)^{-1}{}^{C}T_{T_0} = \left({}^{C}T_{B}\,{}^{B}T_{F_i}\right)^{-1}{}^{C}T_{T_i}$$

Therefore, it can be obtained that:

$${}^{B}T_{F_i}\left({}^{B}T_{F_0}\right)^{-1}\;{}^{B}T_{C} = {}^{B}T_{C}\;{}^{C}T_{T_i}\left({}^{C}T_{T_0}\right)^{-1}$$

where ${}^{B}T_{F_0}$ is the pose of the flange in the robot base coordinate system when the flange is at the initial point location, ${}^{B}T_{F_i}$ is the pose of the flange in the robot base coordinate system when the flange is at one of the actual acquisition point locations, ${}^{C}T_{T_0}$ is the pose of the tool in the vision acquisition device coordinate system when the flange is at the initial point location, ${}^{C}T_{T_i}$ is the pose of the tool in the vision acquisition device coordinate system when the flange is at that actual acquisition point location, and ${}^{B}T_{C} = \left({}^{C}T_{B}\right)^{-1}$ is the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system.

Let $A_i = {}^{B}T_{F_i}\left({}^{B}T_{F_0}\right)^{-1}$, $B_i = {}^{C}T_{T_i}\left({}^{C}T_{T_0}\right)^{-1}$ and $X = {}^{B}T_{C}$; the above formula simplifies to $A_i X = X B_i$, which is then converted into an optimal solution problem. Writing $p_{A_i}$ and $p_{B_i}$ for the modified Rodrigues vectors ($p = 2\sin(\theta/2)\,\hat{n}$, with $\theta$ the rotation angle and $\hat{n}$ the rotation axis) of the rotation parts of $A_i$ and $B_i$, each pair yields the linear equation

$$\mathrm{skew}\left(p_{A_i} + p_{B_i}\right)\,p'_X = p_{B_i} - p_{A_i}$$

and stacking the $k$ pairs gives the system $M\,p'_X = b$, from which $p'_X$, and hence the rotation $R_X$, is solved in the least-squares sense.
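The stacked matrix and its condition number can be sketched in NumPy. This is an illustrative Tsai–Lenz-style reconstruction, not the patent's verbatim code; the function names and the modified-Rodrigues helper are assumptions:

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def rodrigues(T):
    """Modified Rodrigues vector p = 2*sin(theta/2)*axis of the rotation
    part of a 4x4 homogeneous transform (theta assumed in (0, pi))."""
    R = T[:3, :3]
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    axis = np.array([R[2, 1] - R[1, 2],
                     R[0, 2] - R[2, 0],
                     R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return 2.0 * np.sin(theta / 2.0) * axis

def build_M(A_list, B_list):
    """Stack skew(p_Ai + p_Bi) over all motion pairs; M is 3k x 3."""
    return np.vstack([skew(rodrigues(A) + rodrigues(B))
                      for A, B in zip(A_list, B_list)])
```

The condition number used in step S13 is then `np.linalg.cond(build_M(A_list, B_list))`; degenerate motions (e.g. all rotation axes parallel) make M rank-deficient and its condition number blow up.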
in step S30, when the flange is located at the initial point location and each actual collection point location, the pose of the flange in the robot base coordinate system can obtain a coordinate system set R, where R includes:
the above-mentioned R set is obtained by a feedback value of a flange of the surgical robot, wherein,the pose of the flange in the robot base coordinate system when the flange is positioned at the initial point position,toThe pose of the flange in the robot base coordinate system when the flange is positioned at each actual acquisition point position;
acquiring the pose of the tool mounted on the flange in the coordinate system of the vision acquisition device to obtain a coordinate system set V, wherein V comprises:
,the pose of the tool in the coordinate system of the vision acquisition device when the flange is positioned at the initial point position,toThe pose of the tool in the coordinate system of the vision acquisition device when the flange is positioned at each actual acquisition point position. The V set is obtained through a visual acquisition device, a visual feature array (such as four luminous balls) is generally arranged on the tool, and the four luminous balls are generally arranged to form a quadrangle.
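For illustration of how a tool pose in the V set can be recovered from the four tracked sphere centers, a Kabsch/Procrustes fit is a standard approach; the tracker performs this internally, and the function name and point layout below are assumptions for the sketch:

```python
import numpy as np

def marker_pose(model_pts, measured_pts):
    """Fit the rigid transform mapping marker positions in the tool frame
    (model_pts) onto their tracked positions in the camera frame
    (measured_pts): Kabsch alignment of the two point sets."""
    P = np.asarray(model_pts, float)      # Nx3, tool frame
    Q = np.asarray(measured_pts, float)   # Nx3, camera frame
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)             # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = qc - R @ pc
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T                              # tool pose in the camera frame
```

Four non-collinear sphere centers are enough to make the fit well posed.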
In step S40, a transformation relationship between the robot base coordinate system and the vision acquisition device coordinate system and a transformation relationship between the tool coordinate system and the flange coordinate system are obtained according to the pose of the flange in the robot base coordinate system and the pose of the tool in the vision acquisition device coordinate system when the flange is located at the initial point location and each acquisition point location.
Specifically, according to the formulas obtained above, pairing the data in the R set and the V set (each actual acquisition point location with the initial point location) yields the corresponding 8 groups of the A set and the B set, so that k = 8. The M matrix can then be determined, and $R_X$ obtained from it.
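The pairing step can be sketched as follows, assuming the R and V sets are Python lists of 4×4 homogeneous poses with the initial pose at index 0 (the function name is illustrative):

```python
import numpy as np

def motion_pairs(R_set, V_set):
    """Pair each acquisition pose with the initial pose (index 0):
    A_i = T_F,i @ inv(T_F,0)  (flange motion in the base frame),
    B_i = T_T,i @ inv(T_T,0)  (tool motion in the camera frame).
    With the initial point plus 8 cube vertices this yields k = 8 pairs."""
    T_F0_inv = np.linalg.inv(R_set[0])
    T_T0_inv = np.linalg.inv(V_set[0])
    A = [T @ T_F0_inv for T in R_set[1:]]
    B = [T @ T_T0_inv for T in V_set[1:]]
    return A, B
```

Each pair satisfies $A_i X = X B_i$ with $X = {}^{B}T_{C}$.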
On the basis of the formula:

$$\left(R_{A_i} - I_3\right)P_X = R_X\,t_{B_i} - t_{A_i}, \qquad i = 1, \ldots, k,$$

where $t_{A_i}$ and $t_{B_i}$ are the translation parts of $A_i$ and $B_i$, $P_X$ is found by least squares. This finally gives the conversion relation ${}^{B}T_{C}$ between the robot base coordinate system and the vision acquisition device coordinate system; further, the conversion relation between the tool coordinate system and the flange coordinate system can be obtained as ${}^{F}T_{T} = \left({}^{B}T_{F_0}\right)^{-1}\,{}^{B}T_{C}\,{}^{C}T_{T_0}$.
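The two-stage solution (rotation, then translation) can be sketched end to end. This illustrative version solves the rotation part of $A_i X = X B_i$ with a Kronecker-product linearization and SVD (Andreff style) instead of the Tsai rotation-recovery formula; the function name is an assumption and noise-free input is assumed:

```python
import numpy as np

def solve_hand_eye(A_list, B_list):
    """Solve A_i X = X B_i for the 4x4 transform X (here ^B T_C).

    Rotation: the right singular vector of M for the smallest singular
    value approximates vec(R_X) (column-major), then is projected back
    onto SO(3). Translation: stack (R_Ai - I3) P_X = R_X t_Bi - t_Ai
    and solve by least squares.
    """
    I3 = np.eye(3)
    M = np.vstack([np.kron(I3, A[:3, :3]) - np.kron(B[:3, :3].T, I3)
                   for A, B in zip(A_list, B_list)])
    _, _, Vt = np.linalg.svd(M)
    RX = Vt[-1].reshape(3, 3, order='F')   # un-vec, column-major
    U, _, Wt = np.linalg.svd(RX)           # project onto orthogonal group
    RX = U @ Wt
    if np.linalg.det(RX) < 0:              # keep a proper rotation
        RX = -RX
    C = np.vstack([A[:3, :3] - I3 for A in A_list])
    d = np.concatenate([RX @ B[:3, 3] - A[:3, 3]
                        for A, B in zip(A_list, B_list)])
    PX, *_ = np.linalg.lstsq(C, d, rcond=None)
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = RX, PX
    return X
```

At least two motion pairs with non-parallel rotation axes are required; with the k = 8 cube pairs the system is well over-determined.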
In step S13, the plurality of actual acquisition point locations is planned in advance. Specifically, in this embodiment the number of actual acquisition point locations is eight, and the eight actual acquisition point locations are the eight vertices of a cube centered on the initial point location. Once the side length of the cube is fixed, the positions of the eight actual acquisition point locations relative to the initial point location are determined. Therefore, when each group of attitude angle lists corresponding to the actual acquisition point locations is determined, the pose of the flange in the robot base coordinate system at each actual acquisition point location can be obtained without moving the mechanical arm.
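The eight-vertex planning above can be sketched as follows; the center coordinates and side length used below are placeholder values, and `cube_vertices` is an assumed name:

```python
import numpy as np

def cube_vertices(center, side):
    """Return the 8 vertices of an axis-aligned cube centered on `center`.

    The vertices serve as the eight actual acquisition point locations;
    the initial point location is the cube's center. `side` is the edge
    length of the cube.
    """
    center = np.asarray(center, dtype=float)
    h = side / 2.0
    offsets = np.array([[sx, sy, sz] for sx in (-h, h)
                                     for sy in (-h, h)
                                     for sz in (-h, h)])
    return center + offsets
```

Every vertex lies at distance `side * sqrt(3) / 2` from the center, which is also the radius of the spherical envelope used later for the acquisition-region check.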
The plurality of attitude angles in each group of attitude angle lists corresponding to the actual acquisition point locations is obtained by random sampling. At each sampling, the Z axis of the tool coordinate system at the actual acquisition point location is rotated by a certain included angle relative to the Z axis of the tool coordinate system at the initial point location. Since this included angle is known, the conversion relation of the tool coordinate system between the actual acquisition point location and the initial point location can be obtained. Therefore the pose of the tool at the actual acquisition point location in the vision acquisition device coordinate system does not need to be measured by the vision acquisition device: it can be computed as the pose of the tool in the vision acquisition device coordinate system when the flange is at the initial point location, multiplied by this tool-coordinate-system transformation. The pose of the tool in the vision acquisition device coordinate system when the flange is at each actual acquisition point location is obtained in this way.
And calculating the condition number of the M matrix corresponding to each group of attitude angle list through the attitude of the flange in the robot base coordinate system, the attitude of the tool in the visual acquisition device coordinate system and the M matrix when the flange is positioned at each actual acquisition point.
For the M matrix, the smaller its condition number, the less the matrix amplifies errors, and the higher the accuracy of the final result. Therefore, in step S14, the plurality of actual acquisition attitude angles is determined according to the attitude angle list corresponding to the minimum value among the condition numbers of all the M matrices. In this way a better set of actual acquisition attitude angles is obtained.
In the present embodiment, step S13, determining a plurality of groups of attitude angle lists corresponding to the actual acquisition point locations, where each group of attitude angle lists includes a plurality of attitude angles corresponding to the actual acquisition point locations, includes:

obtaining the plurality of attitude angles in each group of attitude angle lists corresponding to the actual acquisition point locations by random sampling under a preset rule, where the preset rule is that the included angle between the Z axis of the tool coordinate system at each actual acquisition point location and the Z axis of the tool coordinate system at the initial point location is less than or equal to 60 degrees. This angle takes into account the range of motion of the robot's mechanical arm. Fig. 8 shows a schematic view of the Z coordinate directions of the tool coordinate system at the different acquisition point locations.
The number of groups of attitude angle lists can be determined according to actual needs; in this embodiment, 10 groups are used. At least 1 of the 10 groups must yield an M matrix whose condition number is smaller than a specified threshold; otherwise, the surgical robot is judged to be faulty and must be inspected. Specifically, the attitude angle lists of groups 1 to 10 and the condition numbers of their corresponding M matrices are obtained in sequence. If only 1 group is below the threshold, that attitude angle list is used to determine the plurality of actual acquisition attitude angles; if several groups are below the threshold, the attitude angle list corresponding to the minimum value among the condition numbers of the M matrices is selected to determine the plurality of actual acquisition attitude angles.
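The group-selection logic above can be sketched as follows; the threshold value and the ten condition numbers are illustrative assumptions:

```python
COND_THRESHOLD = 1e3  # assumed value; the embodiment only says "a specified threshold"

def select_attitude_list(cond_numbers, threshold=COND_THRESHOLD):
    """Pick the attitude-angle list whose M matrix has the smallest
    condition number among those below the threshold; if none
    qualifies, report a fault, as in the 10-group procedure above."""
    valid = {i: c for i, c in enumerate(cond_numbers) if c < threshold}
    if not valid:
        raise RuntimeError("surgical robot fault: no M matrix condition "
                           "number below threshold; inspection needed")
    return min(valid, key=valid.get)

# Ten illustrative condition numbers, one per attitude-angle list.
conds = [5e4, 820.0, 1.2e5, 450.0, 9.9e3, 3.1e4, 770.0, 6e5, 2.2e4, 8.8e4]
best = select_attitude_list(conds)
print(best)  # 3
```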
In a preferred embodiment, the surgical robot hand-eye calibration method can collect data automatically and can ensure that every point location of the surgical robot is effectively acquired by the vision acquisition device.
Before the step of sequentially locating the flange at the initial point location and the plurality of actual acquisition point locations, the method for calibrating the hand and the eye of the surgical robot further comprises the following steps:
step S20: and adjusting the acquisition area of the vision acquisition device according to the position of the tool when the flange is positioned at the initial point.
The subsequent motion point locations of the surgical robot are calculated from the initial point location of the surgical robot and the effective working space of the vision acquisition device, so as to judge whether the subsequent point locations can be acquired. The specific steps are as follows:
for the position of the surgical robot, as shown in fig. 6, the motion range of the surgical robot is first set as the 8 vertices of a cube (the plurality of actual acquisition point locations) plus its central point (the initial point location), giving 9 point locations in total, with the motion initial point of the surgical robot set at the centre of the cube. Next, a spherical envelope region is constructed with the distance from the cube centre to a vertex as its radius; as long as this envelope region lies within the effective acquisition region of the vision acquisition device, every motion point location of the surgical robot can be effectively acquired.
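A minimal sketch of the nine motion point locations and the sphere envelope, with an assumed cube centre and edge length (both hypothetical):

```python
from itertools import product
import numpy as np

# Nine motion points: the cube centre is the initial point and the
# eight vertices are the actual acquisition points.
center = np.array([0.0, 0.0, 0.5])   # hypothetical initial point (m)
half_edge = 0.05                      # hypothetical half edge length (m)

vertices = np.array([center + half_edge * np.array(s)
                     for s in product((-1, 1), repeat=3)])
points = np.vstack([center, vertices])          # 9 points in total

# Sphere envelope: radius = distance from the centre to a vertex.
radius = np.linalg.norm(vertices[0] - center)   # half_edge * sqrt(3)

# If this sphere lies inside the vision device's effective region,
# every motion point is guaranteed to be observable.
assert all(np.linalg.norm(p - center) <= radius + 1e-12 for p in points)
```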
In fig. 6, the solid-line polyhedron is the effective detection space of the vision acquisition device, the dashed-line cube vertices are the robot motion point locations, and the dashed-line sphere is the cube's enveloping space.
Further, as shown in fig. 3, step S20: the step of adjusting the actual acquisition area of the vision acquisition device according to the position of the tool when the flange is at the initial point comprises:
and reducing the acquisition area of the visual acquisition device in all directions to obtain a limited area, so that the tool is positioned in the limited area.
Specifically, within the effective working area of the vision acquisition device, the overall acquisition area is reduced in each direction by the radius of the sphere, as shown in fig. 7, to obtain the effective robot motion area for the initial point location of the surgical robot. Therefore, as long as the initial point location of the surgical robot lies within this effective motion area, the vision acquisition device is adjusted after the robot moves to the initial point location so that the tool appears near the virtual centre of the field of view; all positions during the subsequent motion then lie within the effective acquisition region of the vision device.
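The shrinking of the acquisition area can be sketched as follows, assuming a box-shaped effective detection region (the box bounds and margin are illustrative assumptions):

```python
import numpy as np

def shrink_box(lo, hi, margin):
    """Shrink the vision device's effective box inward by `margin`
    in every direction, yielding the limited region."""
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    new_lo, new_hi = lo + margin, hi - margin
    if np.any(new_lo >= new_hi):
        raise ValueError("margin exceeds half the box extent")
    return new_lo, new_hi

# Hypothetical effective detection box of the vision device (metres),
# shrunk by the envelope-sphere radius from the cube construction.
lo, hi = shrink_box([-0.5, -0.5, 0.3], [0.5, 0.5, 1.3], margin=0.0866)

def initial_point_is_valid(p):
    """The initial point qualifies iff the whole envelope sphere fits,
    i.e. the tool lies inside the shrunken (limited) region."""
    p = np.asarray(p, float)
    return bool(np.all(p >= lo) and np.all(p <= hi))

print(initial_point_is_valid([0.0, 0.0, 0.8]))   # True
print(initial_point_is_valid([0.48, 0.0, 0.8]))  # False: too close to a face
```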
In fig. 7, the solid-line parts are three views of the effective detection space of the vision acquisition device, the dashed-line parts are three views of the effective motion space of the surgical robot in the vision acquisition device coordinate system, and the black dot is the position of the tool in the vision acquisition device coordinate system.
According to a second aspect of the present invention, a surgical robot hand-eye calibration device is provided, which is used for executing the surgical robot hand-eye calibration method provided by the embodiments of the present application. The surgical robot hand-eye calibration apparatus of the present embodiment will be described below.
As shown in fig. 4, the hand-eye calibration device for a surgical robot includes:
the attitude angle acquisition unit 100 is configured to obtain a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point positions according to an initial flange point position, an initial attitude angle, a plurality of actual acquisition point positions of the surgical robot, and a tool calibration relationship of the surgical robot;
the pose acquisition unit 200 is used for enabling the flange to be sequentially located at an initial point position and a plurality of actual acquisition point positions, enabling the flange to be located at an actual acquisition pose angle corresponding to each actual acquisition point position when the flange is located at each actual acquisition point position, and acquiring the pose of the flange in a robot base coordinate system and the pose of a tool installed on the flange in a visual acquisition device coordinate system when the flange is located at the initial point position and each actual acquisition point position;
and the calculating unit 300 is configured to obtain a conversion relationship between the robot base coordinate system and the vision acquisition device coordinate system and a conversion relationship between the tool coordinate system and the flange coordinate system according to the pose of the flange in the robot base coordinate system and the pose of the tool in the vision acquisition device coordinate system when the flange is located at the initial point position and each actual acquisition point position.
By applying the technical scheme of this embodiment, the attitude angle obtaining unit 100 is configured to obtain a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point positions according to the flange initial point position, the initial attitude angle, the plurality of actual acquisition point positions of the surgical robot and a tool calibration relationship of the surgical robot. The actual acquisition attitude angles are obtained by calculating the calibration relation of the tools carried by the surgical robot, so that the singular problem in the subsequent conversion relation solving process can be avoided to a certain extent. And then acquiring required data, specifically, a pose acquisition unit 200, configured to enable the flange to be located at the initial point location and the multiple actual acquisition point locations in sequence, and acquire the pose of the flange in the robot base coordinate system when the flange is located at the initial point location and each actual acquisition point location, and the pose of a tool mounted on the flange in the coordinate system of the vision acquisition device. Finally, since the transformation relationship between the tool coordinate system and the flange coordinate system is unique, the calculation unit 300 is configured to obtain the transformation relationship between the robot base coordinate system and the vision acquisition device coordinate system and the transformation relationship between the tool coordinate system and the flange coordinate system according to the data acquired in the above process and the derivation of the related formula. The technical scheme of the embodiment can effectively solve the problem that the precision of the calibration of the hand and the eye of the surgical robot in the related technology cannot be ensured.
According to a third aspect of the present invention, there is provided a computer-readable storage medium comprising a stored program, wherein the program performs the surgical robot hand-eye calibration method described above.
According to a fourth aspect of the present invention, a processor is provided, the processor is configured to execute a program, wherein the program executes the method for calibrating the hands and eyes of the surgical robot.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis, and reference may be made to the related description of other embodiments for parts that are not described in detail in a certain embodiment.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described apparatus embodiments are merely illustrative, and for example, the division of the above-described units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit may be stored in a computer-readable storage medium if it is implemented in the form of a software functional unit and sold or used as a separate product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a computer-readable storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned computer-readable storage media comprise: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (9)
1. A method for calibrating hands and eyes of a surgical robot is characterized by comprising the following steps:
obtaining a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point positions according to an initial point position, an initial attitude angle, a plurality of actual acquisition point positions of a flange of a surgical robot and a tool calibration relation of the surgical robot;
enabling the flange to be sequentially located at the initial point position and a plurality of actual acquisition point positions, enabling the flange to be located at an actual acquisition attitude angle corresponding to each actual acquisition point position when the flange is located at each actual acquisition point position, and acquiring the pose of the flange in a robot base coordinate system and the pose of a tool installed on the flange in a visual acquisition device coordinate system when the flange is located at the initial point position and each actual acquisition point position;
obtaining a conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and a conversion relation between a tool coordinate system and a flange coordinate system according to the pose of the flange in the robot base coordinate system and the pose of a tool mounted on the flange in the vision acquisition device coordinate system when the flange is positioned at the initial point position and each actual acquisition point position;
the method comprises the following steps of obtaining a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point positions according to an initial flange point position, an initial attitude angle, a plurality of actual acquisition point positions of a surgical robot and a tool calibration relation of the surgical robot, wherein the step of obtaining the plurality of actual acquisition attitude angles corresponding to the plurality of actual acquisition point positions comprises the following steps:
obtaining an M matrix of the surgical robot;
enabling a flange of the surgical robot to be in the initial point position and the initial posture angle, obtaining the pose of the tool in a robot base coordinate system through the tool calibration relation of the surgical robot, and obtaining the pose of the tool in a vision acquisition device coordinate system through a vision acquisition device;
obtaining a plurality of groups of attitude angle lists corresponding to a plurality of actual acquisition point positions, wherein each group of attitude angle list comprises a plurality of attitude angles corresponding to the plurality of actual acquisition point positions, and obtaining the condition number of the M matrix corresponding to each group of attitude angle list according to the attitude of the tool in a robot base coordinate system, the attitude of the tool in a visual acquisition device coordinate system, the plurality of groups of attitude angle lists and the M matrix;
and determining a plurality of actual acquisition attitude angles according to the attitude angle list corresponding to the minimum value in the condition numbers of all the M matrixes.
2. The surgical robot hand-eye calibration method according to claim 1, wherein the step of determining a plurality of sets of attitude angle lists corresponding to the plurality of actual acquisition point locations, each set of attitude angle lists including a plurality of attitude angles corresponding to the plurality of actual acquisition point locations comprises:
and obtaining a plurality of attitude angles in each group of attitude angle lists corresponding to the actual acquisition point locations by randomly sampling under a preset rule, wherein the preset rule is that an included angle between a Z axis of a tool coordinate system of each actual acquisition point location and a Z axis of a tool coordinate system of the initial point location is less than or equal to 60 degrees.
3. The surgical robot hand-eye calibration method according to claim 1, wherein the number of the actual acquisition points is eight, and the eight actual acquisition points are eight vertexes of a cube centered on the initial point.
4. The method according to claim 1, wherein the step of sequentially positioning the flange at the initial point location and the plurality of actual collection point locations is preceded by the step of:
and adjusting the actual acquisition area of the vision acquisition device according to the position of the tool when the flange is positioned at the initial point position.
5. A surgical robot hand-eye calibration method according to claim 4, wherein the step of adjusting the actual acquisition area of the vision acquisition device according to the position of the tool when the flange is located at the initial position comprises:
and reducing the acquisition area of the vision acquisition device in all directions to obtain a limited area, so that the tool is positioned in the limited area.
6. The surgical robot hand-eye calibration method according to claim 1, wherein the M matrix is constructed from the following poses:
the pose of the flange in the robot base coordinate system when the flange is located at the initial point location, and the pose of the flange in the robot base coordinate system when the flange is located at one of the actual acquisition point locations;
the pose of the tool in the vision acquisition device coordinate system when the flange is located at the initial point location, and the pose of the tool in the vision acquisition device coordinate system when the flange is located at one of the actual acquisition point locations.
7. A surgical robot hand-eye calibration device is characterized by comprising:
the attitude angle acquisition unit is used for acquiring a plurality of actual acquisition attitude angles corresponding to a plurality of actual acquisition point positions according to the flange initial point position, the initial attitude angle, the plurality of actual acquisition point positions of the surgical robot and the calibration relation of a tool carried by the surgical robot;
the position and posture acquisition unit is used for enabling the flange to be sequentially located at the initial point position and the plurality of actual acquisition point positions, enabling the flange to be at an actual acquisition posture angle corresponding to each actual acquisition point position when the flange is located at each actual acquisition point position, and acquiring the position and posture of the flange in a robot base coordinate system and the position and posture of a tool installed on the flange in a visual acquisition device coordinate system when the flange is located at the initial point position and each actual acquisition point position;
the calculation unit is used for obtaining the conversion relation between the robot base coordinate system and the vision acquisition device coordinate system and the conversion relation between the tool coordinate system and the flange coordinate system according to the position and the posture of the flange in the robot base coordinate system and the position and the posture of the tool in the vision acquisition device coordinate system when the flange is positioned at the initial point position and each actual acquisition point position;
wherein the attitude angle acquisition unit is further configured to:
obtaining an M matrix of the surgical robot;
enabling a flange of the surgical robot to be in the initial point position and the initial posture angle, obtaining the pose of the tool in a robot base coordinate system through the tool calibration relation of the surgical robot, and obtaining the pose of the tool in a vision acquisition device coordinate system through a vision acquisition device;
obtaining a plurality of groups of attitude angle lists corresponding to a plurality of actual acquisition point positions, wherein each group of attitude angle lists comprises a plurality of attitude angles corresponding to the plurality of actual acquisition point positions, and obtaining the condition number of the M matrix corresponding to each group of attitude angle lists according to the attitude of the tool in a robot base coordinate system, the attitude of the tool in a visual acquisition device coordinate system, the plurality of groups of attitude angle lists and the M matrix;
and determining a plurality of actual acquisition attitude angles according to the attitude angle list corresponding to the minimum value in the condition numbers of all the M matrixes.
8. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored program, wherein the program performs the surgical robot hand-eye calibration method of any one of claims 1 to 6.
9. A processor, characterized in that the processor is configured to run a program, wherein the program is configured to perform the surgical robot hand-eye calibration method according to any one of claims 1 to 6 when running.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211147000.5A CN115229805B (en) | 2022-09-21 | 2022-09-21 | Hand-eye calibration method and device for surgical robot, storage medium and processor |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115229805A CN115229805A (en) | 2022-10-25 |
CN115229805B true CN115229805B (en) | 2022-12-09 |
Family
ID=83682272
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211147000.5A Active CN115229805B (en) | 2022-09-21 | 2022-09-21 | Hand-eye calibration method and device for surgical robot, storage medium and processor |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115229805B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115813563B (en) * | 2023-02-20 | 2023-07-14 | 北京壹点灵动科技有限公司 | Surgical robot control device, storage medium, and processor |
CN115990890B (en) * | 2023-03-23 | 2023-06-30 | 深圳广成创新技术有限公司 | Calibration method and device for manipulator, computer equipment and storage medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109465822A (en) * | 2018-10-22 | 2019-03-15 | 江苏集萃微纳自动化系统与装备技术研究所有限公司 | Based on 3D vision hand and eye calibrating method |
CN111070199A (en) * | 2018-10-18 | 2020-04-28 | 杭州海康威视数字技术股份有限公司 | Hand-eye calibration assessment method and robot |
CN113246135A (en) * | 2021-06-03 | 2021-08-13 | 季华实验室 | Robot hand-eye calibration method and device, electronic equipment and storage medium |
CN113510708A (en) * | 2021-07-28 | 2021-10-19 | 南京航空航天大学 | Contact industrial robot automatic calibration system based on binocular vision |
CN114280153A (en) * | 2022-01-12 | 2022-04-05 | 江苏金晟元控制技术有限公司 | Intelligent detection robot for complex curved surface workpiece, detection method and application |
CN114343847A (en) * | 2022-01-06 | 2022-04-15 | 广东工业大学 | Hand-eye calibration method of surgical robot based on optical positioning system |
CN114347027A (en) * | 2022-01-08 | 2022-04-15 | 天晟智享(常州)机器人科技有限公司 | Pose calibration method of 3D camera relative to mechanical arm |
CN114519738A (en) * | 2022-01-24 | 2022-05-20 | 西北工业大学宁波研究院 | Hand-eye calibration error correction method based on ICP algorithm |
CN114886567A (en) * | 2022-05-12 | 2022-08-12 | 苏州大学 | Method for calibrating hands and eyes of surgical robot with telecentric motionless point constraint |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104827480A (en) * | 2014-02-11 | 2015-08-12 | 泰科电子(上海)有限公司 | Automatic calibration method of robot system |
Non-Patent Citations (1)
Title |
---|
Research on Precise Positioning Strategy in Robot Vision-Based Assembly; Wu Guangyu et al.; Machinery & Electronics; 2020-03-24 (No. 03); full text * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115229805B (en) | Hand-eye calibration method and device for surgical robot, storage medium and processor | |
CN113442169B (en) | Method and device for calibrating hands and eyes of robot, computer equipment and readable storage medium | |
US20230172679A1 (en) | Systems and methods for guided port placement selection | |
KR20160070006A (en) | Collision avoidance method, control device, and program | |
CN112603542B (en) | Hand-eye calibration method and device, electronic equipment and storage medium | |
CN111494009A (en) | Image registration method and device for surgical navigation and surgical navigation system | |
WO2023011339A1 (en) | Line-of-sight direction tracking method and apparatus | |
Li et al. | Autonomous multiple instruments tracking for robot-assisted laparoscopic surgery with visual tracking space vector method | |
CN113288358B (en) | Pose information determination method and device, electronic equipment and storage medium | |
US20200035122A1 (en) | Visual guidance system and method for posing a physical object in three dimensional space. | |
CN114886567A (en) | Method for calibrating hands and eyes of surgical robot with telecentric motionless point constraint | |
CN113524201A (en) | Active adjusting method and device for pose of mechanical arm, mechanical arm and readable storage medium | |
CN116019564B (en) | Knee joint operation robot and control method | |
CN115954096B (en) | Image data processing-based cavity mirror VR imaging system | |
CN116363093A (en) | Method and device for searching rotation center of acetabulum, operation planning system and storage medium | |
CN114587593B (en) | Surgical navigation positioning system and use method thereof | |
CN110458886A (en) | A kind of surgical navigational automation registration frame of reference | |
CN113119131B (en) | Robot control method and device, computer readable storage medium and processor | |
WO2021173044A1 (en) | Method for controlling a camera in a robotic surgical system | |
CN113580141A (en) | Pose solving method for 6-axis mechanical arm | |
CN111870346A (en) | Space registration method and device for robot and image equipment and electronic equipment | |
CN111544113A (en) | Target tracking and distance dynamic graphical display method and device in surgical navigation | |
Ho et al. | Supervised control for robot-assisted surgery using augmented reality | |
CN116459013B (en) | Collaborative robot based on 3D visual recognition | |
Leibrandt et al. | Designing robots for reachability and dexterity: Continuum surgical robots as a pretext application |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||