CN113547515A - Coordinate calibration method based on ultrasonic servo surgical robot - Google Patents


Info

Publication number
CN113547515A
CN113547515A
Authority
CN
China
Prior art keywords
coordinate system
probe
tail end
guide frame
mechanical arm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110808016.5A
Other languages
Chinese (zh)
Other versions
CN113547515B (en)
Inventor
赵兴炜
郑果
陶波
凌青
Current Assignee
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN202110808016.5A
Publication of CN113547515A
Application granted
Publication of CN113547515B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • B: Performing operations; transporting
    • B25: Hand tools; portable power-driven tools; manipulators
    • B25J: Manipulators; chambers provided with manipulation devices
    • B25J9/00: Programme-controlled manipulators
    • B25J9/02: characterised by movement of the arms, e.g. cartesian coordinate type
    • B25J9/04: by rotating at least one arm, excluding the head movement itself, e.g. cylindrical or polar coordinate type
    • B25J18/00: Arms
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices
    • B25J19/02: Sensing devices
    • B25J19/021: Optical sensing devices
    • B25J19/023: Optical sensing devices including video camera means
    • B25J9/16: Programme controls
    • B25J9/1679: characterised by the tasks executed
    • B25J9/1694: use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control; multi-sensor controlled systems; sensor fusion
    • B25J9/1697: Vision controlled systems

Abstract

The invention belongs to the technical field of surgical robots and discloses a coordinate calibration method based on an ultrasonic servo surgical robot, comprising the following steps: a stereoscopic vision high-precision positioning camera is arranged obliquely above the surgical execution end and the probe, and its coordinate system (the camera coordinate system) is acquired; the camera photographs the guide frame, the probe, the end of the mechanical arm and the surgical execution end to obtain, in the camera coordinate system, the guide frame, probe, mechanical arm and surgical execution end coordinate systems; from these, the transformations between the probe and the guide frame, between the guide frame and the mechanical arm base, and between the surgical execution end and the mechanical arm end are obtained, and multiplying these transformations step by step yields the transformation between the probe coordinate system and the surgical execution end coordinate system. The method realizes coordinate calibration of the detection end and the surgical execution end intuitively and quickly, unifying the coordinates of the two ends.

Description

Coordinate calibration method based on ultrasonic servo surgical robot
Technical Field
The invention belongs to the technical field of surgical robots, and particularly relates to a coordinate calibration method based on an ultrasonic servo surgical robot.
Background
Benign prostatic hyperplasia is among the most common causes of urinary dysfunction in men, with the highest incidence among the elderly: statistics show a morbidity of about 16-25% in men aged 50-65, rising to 30%-46% in men over 70, the prevalence increasing with age. Transurethral resection of the prostate is currently the main treatment for prostatic hyperplasia, but the operation has problems of safety, effectiveness and consistency. As surgical robot technology gradually matures, the prostate surgical robot provides a brand-new approach to the treatment of prostatic hyperplasia. In the prior art, however, an ultrasonic probe is used to detect images in real time but cannot be well synchronized with the robot during the operation: the detected images cannot be fed back to the robot execution end in time, and the operation of the surgical robot is slowed.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a coordinate calibration method based on an ultrasonic servo surgical robot, which can intuitively and quickly realize coordinate calibration of the detection end and the surgical execution end and unify the coordinates of the two ends.
To achieve the above object, according to one aspect of the present invention, there is provided a coordinate calibration method based on an ultrasonic servo surgical robot. The ultrasonic servo surgical robot includes a mechanical arm, a surgical execution end disposed at the end of the arm, a guide frame, a motor disposed on the guide frame, and a probe driven in rotation by the motor. The method includes: S1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring its coordinate system, namely the camera coordinate system; S2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the camera to obtain, in the camera coordinate system, the guide frame, probe, mechanical arm and surgical execution end coordinate systems; further obtaining the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system.
Preferably, the method further comprises S3: acquiring the transformation between a two-dimensional image coordinate system and the probe coordinate system, where the two-dimensional image coordinate system coincides with a coordinate plane of the probe coordinate system, and multiplying the transformation between the probe coordinate system and the surgical execution end coordinate system by the transformation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system.
Preferably, the step S2 includes:

S21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system at an arbitrary pose, $^{C}T_{P}$, and the guide frame coordinate system, $^{C}T_{G}$, wherein each $T$ is a 4 × 4 homogeneous matrix;

S22: based on the probe coordinate system $^{C}T_{P}$ and the guide frame coordinate system $^{C}T_{G}$, obtaining the transformation between the probe coordinate system and the guide frame coordinate system: $^{G}T_{P} = ({}^{C}T_{G})^{-1} \cdot {}^{C}T_{P}$;

S23: photographing the end of the mechanical arm with the stereoscopic vision high-precision positioning camera to obtain the transformation between the camera coordinate system and the mechanical arm base coordinate system, $^{C}T_{B}$, and further obtaining the transformation between the mechanical arm base coordinate system and the guide frame coordinate system: $^{B}T_{G} = ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{G}$;

S24: photographing the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system $^{C}T_{W}$, and obtaining the transformation between the surgical execution end coordinate system and the mechanical arm end coordinate system: $^{E}T_{W} = ({}^{B}T_{E})^{-1} \cdot ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{W}$, wherein $^{B}T_{E}$, the transformation between the mechanical arm end coordinate system and the mechanical arm base coordinate system, is acquired directly from the geometric structure of the mechanical arm;

S25: according to $^{G}T_{P}$, $^{B}T_{G}$, $^{B}T_{E}$ and $^{E}T_{W}$ above, obtaining the transformation between the probe coordinate system and the surgical execution end coordinate system: $^{W}T_{P} = ({}^{E}T_{W})^{-1} \cdot ({}^{B}T_{E})^{-1} \cdot {}^{B}T_{G} \cdot {}^{G}T_{P}$.
Preferably, the step S3 includes: defining the two-dimensional image coordinate system to coincide with a coordinate plane of the probe coordinate system, thereby obtaining the transformation between the two-dimensional image coordinate system and the probe coordinate system, $^{P}T_{I}$; multiplying the transformation between the probe coordinate system and the surgical execution end coordinate system by the transformation between the two-dimensional image coordinate system and the probe coordinate system yields the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system: $^{W}T_{I} = {}^{W}T_{P} \cdot {}^{P}T_{I}$.
Preferably, step S21 specifically includes: step 1, arranging a first reflective target ball on the probe and a second reflective target ball on the guide frame; step 2, keeping the guide frame stationary, starting the motor to drive the probe through one full rotation, while recording with the stereoscopic vision high-precision positioning camera a plurality of points on the motion trajectory of the first reflective target ball, as well as the positions of the first and second reflective target balls after the rotation; step 3, fitting the recorded points to a circle whose centre serves as the origin of the guide frame coordinate system and of the initial probe coordinate system, and whose axis is their Z axis; then, combining the post-rotation positions of the first and second reflective target balls, obtaining the probe coordinate system at an arbitrary pose in the camera coordinate system, $^{C}T_{P}$, and the guide frame coordinate system, $^{C}T_{G}$.
Preferably, obtaining in step 3 the probe coordinate system $^{C}T_{P}$ at an arbitrary pose in the camera coordinate system and the guide frame coordinate system $^{C}T_{G}$ from the post-rotation positions of the first and second reflective target balls specifically comprises: obtaining the initial probe coordinate system $^{C}T_{P0}$ and the guide frame coordinate system $^{C}T_{G}$ from the origin, the Z axis and the positions of the first and second reflective target balls; from the initial probe coordinate system $^{C}T_{P0}$ and the guide frame coordinate system $^{C}T_{G}$, acquiring the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system when the probe is at its initial pose; and, when the probe moves to a given pose, acquiring the motion variables of its forward/backward translation and its rotation about the Z axis, and correcting these motion variables with the initial rotation angle to obtain the probe coordinate system $^{C}T_{P}$ at an arbitrary pose.
Preferably, step S23 specifically includes: arranging a third reflective target ball at the end of the mechanical arm; photographing it with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the arm end in the camera coordinate system, $^{C}T_{E}$; then, from the transformation $^{B}T_{E}$ between the mechanical arm end coordinate system and the mechanical arm base coordinate system, obtaining the transformation between the camera coordinate system and the mechanical arm base coordinate system: $^{C}T_{B} = {}^{C}T_{E} \cdot ({}^{B}T_{E})^{-1}$; and combining $^{C}T_{B}$ with the guide frame coordinate system $^{C}T_{G}$ to obtain the transformation between the mechanical arm base coordinate system and the guide frame coordinate system: $^{B}T_{G} = ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{G}$.
Preferably, step S24 specifically includes: arranging a fourth reflective target ball at the surgical execution end and photographing it with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system in the camera coordinate system, $^{C}T_{W}$; converting $^{C}T_{W}$ with $^{C}T_{B}$ and $^{B}T_{E}$ to obtain the transformation between the surgical execution end coordinate system and the mechanical arm end coordinate system: $^{E}T_{W} = ({}^{B}T_{E})^{-1} \cdot ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{W}$.
Generally, compared with the prior art, the coordinate calibration method based on the ultrasonic servo surgical robot provided by the invention has the following beneficial effects:
1. A camera is arranged in the working space of the robot and its coordinate system is used as the common reference; the coordinate systems of the different parts of the surgical robot, collected by the camera, are converted into one another, from which the coordinate calibration between the probe and the execution end is obtained.
2. Because the two-dimensional image coordinate system coincides with a plane of the probe coordinate system, the transformation between the two-dimensional image and the surgical execution end is easily obtained from the transformation between the probe and the execution end, realizing unified calibration of the two-dimensional image coordinate system and the surgical execution end coordinate system.
Drawings
FIG. 1 is a schematic diagram of coordinate calibration of an ultrasonic servo-based surgical robot according to the present embodiment;
FIG. 2 is a step diagram of a coordinate calibration method based on an ultrasonic servo surgical robot according to the present embodiment;
fig. 3 is a schematic diagram of the coordinate calibration method according to the embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Referring to fig. 1, the present invention provides a coordinate calibration method based on an ultrasonic servo surgical robot. The ultrasonic servo surgical robot includes a mechanical arm, a surgical execution end disposed at the end of the arm, a guide frame, a motor disposed on the guide frame, and a probe; during operation the motor drives the probe to rotate. The calibration involves a two-dimensional image coordinate system (IMAGE), a probe coordinate system (PROBE), a guide frame coordinate system (GUIDE), a robot base coordinate system (BASE), a robot end coordinate system (END), a continuum execution coordinate system (WORK) and a camera coordinate system (CAMERA); the relationships between these coordinate systems are shown in fig. 1. The two-dimensional image coordinate system coincides with the YZ or XZ plane of the three-dimensional probe coordinate system, so it is not marked separately in the figure.
As shown in FIG. 2, the method includes the following steps S1-S2.
S1: arranging a stereoscopic vision high-precision positioning camera obliquely above the surgical execution end and the probe, and acquiring the coordinate system of the camera, namely the camera coordinate system.
The camera coordinate system is taken as the absolute reference coordinate system.
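All transformations below are 4 × 4 homogeneous matrices expressed, directly or indirectly, relative to this absolute camera frame. As a minimal illustration (not part of the patent; the rotation and translation values are arbitrary placeholders), such a matrix can be assembled and inverted analytically with NumPy:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def inv_T(T):
    """Invert a rigid transform analytically: inv([R t; 0 1]) = [R^T  -R^T t; 0 1]."""
    R, t = T[:3, :3], T[:3, 3]
    return make_T(R.T, -R.T @ t)

# Example: a 90-degree rotation about Z plus a translation, standing in for a
# camera-frame pose such as C_T_P (the probe frame observed by the camera).
Rz = np.array([[0.0, -1.0, 0.0],
               [1.0,  0.0, 0.0],
               [0.0,  0.0, 1.0]])
T = make_T(Rz, np.array([0.1, 0.2, 0.3]))
assert np.allclose(T @ inv_T(T), np.eye(4))
```

The analytic inverse avoids a general matrix inversion and keeps the result exactly rigid, which matters when many such transforms are chained.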
S2: photographing the guide frame, the probe, the end of the mechanical arm and the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the guide frame, probe, mechanical arm and surgical execution end coordinate systems; further obtaining the transformations between the probe and guide frame coordinate systems, between the guide frame and mechanical arm base coordinate systems, and between the surgical execution end and mechanical arm end coordinate systems; and multiplying these transformations step by step to obtain the transformation between the probe coordinate system and the surgical execution end coordinate system.
Specifically, the method comprises steps S21-S25.
S21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system at an arbitrary pose, $^{C}T_{P}$, and the guide frame coordinate system, $^{C}T_{G}$, where each $T$ is a 4 × 4 homogeneous matrix. This specifically comprises the following steps 1-3.
Step 1, respectively arranging a first reflective target ball and a second reflective target ball on the probe and the guide frame;
step 2, keeping the guide frame from moving back and forth, starting a motor to drive the probe to rotate for a circle, simultaneously recording a plurality of points on the motion trail of the first reflective target ball by using the stereoscopic vision high-precision positioning camera, and recording the positions of the first reflective target ball and the second reflective target ball after rotation by using the stereoscopic vision high-precision positioning camera;
step 3, fitting the recorded points to a circle whose centre serves as the origin of the guide frame coordinate system and of the initial probe coordinate system, and whose axis is their Z axis; then, combining the post-rotation positions of the first and second reflective target balls, obtaining the probe coordinate system at an arbitrary pose in the camera coordinate system, $^{C}T_{P}$, and the guide frame coordinate system, $^{C}T_{G}$.
Specifically, the initial probe coordinate system $^{C}T_{P0}$ and the guide frame coordinate system $^{C}T_{G}$ are obtained from the origin, the Z axis and the positions of the first and second reflective target balls: since the origin, the target ball positions and the Z axis are all determined, both frames are fully determined.
From the initial probe coordinate system $^{C}T_{P0}$ and the guide frame coordinate system $^{C}T_{G}$, the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system at the probe's initial pose is acquired; when the probe moves to a given pose, the motion variables of its forward/backward translation and its rotation about the Z axis are acquired and corrected with the initial rotation angle, giving the probe coordinate system $^{C}T_{P}$ at an arbitrary pose.
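The circle fitting in step 3 can be sketched as follows (an illustrative least-squares approach, not an algorithm prescribed by the patent): fit a plane to the recorded target-ball points via SVD, project the points into that plane, and fit a 2D circle algebraically; the plane normal gives the Z axis and the circle centre gives the origin.

```python
import numpy as np

def fit_circle_3d(points):
    """Fit a circle to 3D points lying approximately in one plane.
    Returns (center, normal): the circle centre (frame origin) and its axis (Z axis)."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # The plane normal is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(P - centroid)
    normal = Vt[2]
    # Build an orthonormal in-plane basis (u, v) and project the points into it.
    u, v = Vt[0], Vt[1]
    xy = np.column_stack(((P - centroid) @ u, (P - centroid) @ v))
    # Algebraic 2D circle fit: x^2 + y^2 = 2*a*x + 2*b*y + c.
    A = np.column_stack((2 * xy, np.ones(len(xy))))
    b = (xy ** 2).sum(axis=1)
    (a_, b_, _), *_ = np.linalg.lstsq(A, b, rcond=None)
    center = centroid + a_ * u + b_ * v
    return center, normal
```

With exact points on a circle of radius 1 centred at (2, 3, 5) in the plane z = 5, the fit recovers that centre and a normal parallel to the z axis.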
S22: based on the probe coordinate system $^{C}T_{P}$ and the guide frame coordinate system $^{C}T_{G}$, the transformation between the probe coordinate system and the guide frame coordinate system is obtained: $^{G}T_{P} = ({}^{C}T_{G})^{-1} \cdot {}^{C}T_{P}$.
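The product in S22 is the standard way of relating two frames that are both observed in the same camera frame. A sketch (the matrix names and the pure-translation poses are this illustration's own, not measured values):

```python
import numpy as np

def relative_transform(C_T_A, C_T_B):
    """Given two frames A and B, both expressed in camera frame C,
    return A_T_B, i.e. the pose of frame B seen from frame A."""
    return np.linalg.inv(C_T_A) @ C_T_B

# Guide frame and probe poses as the camera might observe them (illustrative).
C_T_G = np.eye(4); C_T_G[:3, 3] = [0.5, 0.0, 0.2]
C_T_P = np.eye(4); C_T_P[:3, 3] = [0.5, 0.1, 0.2]
G_T_P = relative_transform(C_T_G, C_T_P)  # probe pose in the guide frame
```

Here the probe sits 0.1 along the guide frame's Y axis, so `G_T_P` is a pure translation of (0, 0.1, 0).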
S23: photographing the end of the mechanical arm with the stereoscopic vision high-precision positioning camera to obtain the transformation between the camera coordinate system and the mechanical arm base coordinate system, $^{C}T_{B}$, and further the transformation between the mechanical arm base coordinate system and the guide frame coordinate system, $^{B}T_{G}$. Specifically, a third reflective target ball is arranged at the end of the mechanical arm and photographed with the camera, giving the coordinates of the arm end in the camera coordinate system, $^{C}T_{E}$; from the transformation $^{B}T_{E}$ between the mechanical arm end coordinate system and the mechanical arm base coordinate system, the camera-to-base transformation is obtained as $^{C}T_{B} = {}^{C}T_{E} \cdot ({}^{B}T_{E})^{-1}$; combining it with the guide frame coordinate system then gives $^{B}T_{G} = ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{G}$.
S24: photographing the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system $^{C}T_{W}$, and obtaining the transformation between the surgical execution end coordinate system and the mechanical arm end coordinate system, $^{E}T_{W}$, where $^{B}T_{E}$, the transformation between the mechanical arm end coordinate system and the mechanical arm base coordinate system, is acquired directly from the link parameters of the mechanical arm. Specifically, a fourth reflective target ball is arranged at the surgical execution end and photographed with the camera, giving the surgical execution end coordinate system $^{C}T_{W}$ in the camera coordinate system; converting $^{C}T_{W}$ with $^{C}T_{B}$ and $^{B}T_{E}$ yields $^{E}T_{W} = ({}^{B}T_{E})^{-1} \cdot ({}^{C}T_{B})^{-1} \cdot {}^{C}T_{W}$.
S25: according to $^{G}T_{P}$, $^{B}T_{G}$, $^{B}T_{E}$ and $^{E}T_{W}$ above, the transformation between the probe coordinate system and the surgical execution end coordinate system is obtained: $^{W}T_{P} = ({}^{E}T_{W})^{-1} \cdot ({}^{B}T_{E})^{-1} \cdot {}^{B}T_{G} \cdot {}^{G}T_{P}$.
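The stepwise products of S21-S25 can be sketched end to end. With synthetic poses standing in for the measured ones (all names here are this sketch's own), the stepwise chain probe → guide → base → arm end → execution end reduces to the direct camera-frame result, which serves as a self-check of the algebra:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_pose():
    """Random rigid transform: rotation from a QR factorisation (det forced to +1)."""
    Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(Q) < 0:
        Q[:, 0] = -Q[:, 0]
    T = np.eye(4)
    T[:3, :3] = Q
    T[:3, 3] = rng.normal(size=3)
    return T

inv = np.linalg.inv

# Synthetic camera-frame observations and arm kinematics.
C_T_P, C_T_G, C_T_E, C_T_W = (random_pose() for _ in range(4))
B_T_E = random_pose()                       # from the arm's link parameters
C_T_B = C_T_E @ inv(B_T_E)                  # S23: camera <-> base
G_T_P = inv(C_T_G) @ C_T_P                  # S22
B_T_G = inv(C_T_B) @ C_T_G                  # S23
E_T_W = inv(B_T_E) @ inv(C_T_B) @ C_T_W     # S24
# S25: stepwise chain probe -> guide -> base -> end -> work.
W_T_P = inv(E_T_W) @ inv(B_T_E) @ B_T_G @ G_T_P
assert np.allclose(W_T_P, inv(C_T_W) @ C_T_P)
```

The intermediate frames cancel pairwise, which is exactly what "multiplying the transformations step by step" relies on.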
The method of the present application further includes converting the transformation between the probe coordinate system and the surgical execution end coordinate system into the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system, i.e. the following step S3.
S3: acquiring the transformation between a two-dimensional image coordinate system and the probe coordinate system, where the two-dimensional image coordinate system coincides with a coordinate plane of the probe coordinate system, and multiplying the transformation between the probe coordinate system and the surgical execution end coordinate system by the transformation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation between the two-dimensional image coordinate system and the surgical execution end coordinate system.
The coordinate system conversion relations are calibrated by representing the same points in different coordinate systems. As shown in fig. 3, given two known coordinate systems A and B, the coordinates of N identical spatial points are recorded in each: $\{{}^{A}p_i\}$ and $\{{}^{B}p_i\}$, $i = 1 \ldots N$. The recorded point sets are used to construct an auxiliary coordinate system C, and from the mapping relations of the point sets the transformations between coordinate systems A, B and the auxiliary coordinate system C, $^{A}T_{C}$ and $^{B}T_{C}$, are obtained. According to the formula $^{A}T_{B} = {}^{A}T_{C} \cdot ({}^{B}T_{C})^{-1}$, the transformation between coordinate systems A and B to be calibrated, $^{A}T_{B}$, is obtained.
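The point-set mapping described above is, in effect, a rigid registration between the two recordings. A standard way to realise it (a sketch under that interpretation, not the patent's prescribed construction) is the SVD-based Kabsch least-squares fit, which yields $^{A}T_{B}$ directly from the paired points:

```python
import numpy as np

def register_points(pts_A, pts_B):
    """Rigid transform A_T_B mapping coordinates of points expressed in frame B
    to their coordinates in frame A (SVD / Kabsch least-squares fit)."""
    A = np.asarray(pts_A, dtype=float)
    B = np.asarray(pts_B, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    # Cross-covariance of the centred point sets.
    H = (B - cb).T @ (A - ca)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[2] = -Vt[2]
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = ca - R @ cb
    return T
```

Feeding it the same points expressed in two frames recovers the relative rotation and translation, so the auxiliary frame C never needs to be built explicitly.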
After a point in the two-dimensional image coordinate system is obtained from a medical image, it can be converted into the coordinate system of the surgical execution end using the above calibration, which then guides the surgical execution end to complete the corresponding operation.
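Since the two-dimensional image plane coincides with a coordinate plane of the probe frame, an image pixel (after scaling to physical units) lifts to a 3D probe-frame point with one coordinate zero, and the calibrated transform carries it into the execution-end frame. A hypothetical sketch (the X=0 plane choice, the scale factor and the matrix value are placeholders, not calibrated data):

```python
import numpy as np

def image_point_to_work(u, v, mm_per_px, W_T_I):
    """Map an image pixel (u, v) to execution-end coordinates, assuming the
    image plane coincides with the X = 0 plane of the probe frame."""
    p_img = np.array([0.0, u * mm_per_px, v * mm_per_px, 1.0])  # homogeneous
    return (W_T_I @ p_img)[:3]

# Placeholder calibration result: a pure translation, for illustration only.
W_T_I = np.eye(4)
W_T_I[:3, 3] = [10.0, 0.0, 5.0]
target = image_point_to_work(100, 50, 0.1, W_T_I)  # -> array([10., 10., 10.])
```

The returned point is what would be handed to the execution-end controller as the surgical target.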
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (8)

1. A coordinate calibration method based on an ultrasonic servo surgical robot is characterized in that the ultrasonic servo surgical robot comprises a mechanical arm, a surgical execution tail end arranged at the tail end of the mechanical arm, a guide frame, a motor arranged on the guide frame and a probe, wherein the motor drives the probe to rotate, and the method comprises the following steps:
s1: arranging a stereoscopic vision high-precision positioning camera at the operation execution tail end and obliquely above the probe, and acquiring a coordinate system of the stereoscopic vision high-precision positioning camera, namely a camera coordinate system;
s2: and the stereoscopic vision high-precision positioning camera is adopted to photograph the guide frame, the probe, the tail end of the mechanical arm and the operation execution tail end respectively, a guide frame coordinate system, a probe coordinate system, a mechanical arm coordinate system and an operation execution tail end coordinate system under a camera coordinate system are obtained, the transformation relations between the probe coordinate system and the guide frame coordinate system, between the guide frame coordinate system and the mechanical arm base coordinate system, between the operation execution tail end coordinate system and the mechanical arm tail end coordinate system are further obtained respectively, and the transformation relations are multiplied step by step to obtain the transformation relations between the probe coordinate system and the operation execution tail end coordinate system.
2. The method of claim 1, further comprising:
s3: and acquiring a transformation relation between a two-dimensional image coordinate system and the probe coordinate system, wherein the two-dimensional image coordinate system is superposed with a coordinate plane of the probe coordinate system, and multiplying the transformation relation between the probe coordinate system and the operation execution tail end coordinate system and the transformation relation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation relation between the two-dimensional image coordinate system and the operation execution tail end coordinate system.
3. The method according to claim 2, wherein step S2 comprises:
S21: photographing the probe and the guide frame with the stereoscopic vision high-precision positioning camera to obtain, in the camera coordinate system, the probe coordinate system ${}^{C}_{P}T$ at an arbitrary pose and the guide frame coordinate system ${}^{C}_{G}T$, wherein each T is a 4 × 4 homogeneous matrix, the superscript denoting the reference coordinate system (camera C) and the subscript the described coordinate system (probe P, guide frame G, mechanical arm base B, mechanical arm end E, surgical execution end S);
S22: based on the probe coordinate system ${}^{C}_{P}T$ and the guide frame coordinate system ${}^{C}_{G}T$, obtaining the transformation relation between the probe coordinate system and the guide frame coordinate system: ${}^{G}_{P}T = ({}^{C}_{G}T)^{-1}\,{}^{C}_{P}T$;
S23: photographing the end of the mechanical arm with the stereoscopic vision high-precision positioning camera to obtain the transformation relation ${}^{C}_{B}T$ between the camera coordinate system and the mechanical arm base coordinate system, and further obtaining the transformation relation between the mechanical arm base coordinate system and the guide frame coordinate system: ${}^{G}_{B}T = ({}^{C}_{G}T)^{-1}\,{}^{C}_{B}T$;
S24: photographing the surgical execution end with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system ${}^{C}_{S}T$, and obtaining the transformation relation between the surgical execution end coordinate system and the mechanical arm end coordinate system: ${}^{E}_{S}T = ({}^{B}_{E}T)^{-1}\,({}^{C}_{B}T)^{-1}\,{}^{C}_{S}T$, wherein ${}^{B}_{E}T$, the transformation relation between the mechanical arm end coordinate system and the mechanical arm base coordinate system, is obtained directly from the link parameters of the mechanical arm;
S25: from the above transformation relations, obtaining the transformation relation between the probe coordinate system and the surgical execution end coordinate system: ${}^{S}_{P}T = ({}^{E}_{S}T)^{-1}\,({}^{B}_{E}T)^{-1}\,({}^{G}_{B}T)^{-1}\,{}^{G}_{P}T$.
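A sketch of the step-by-step chain described in S22–S25, using assumed frame labels (C, P, G, B, E, S) and purely illustrative poses; the chained result must reduce algebraically to the direct camera measurement, which the final assertion checks:

```python
import numpy as np

def rot_z(theta):
    """Rotation about Z by theta, as a 4x4 homogeneous transform."""
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

inv = np.linalg.inv

# Illustrative camera-frame measurements and arm kinematics (all values assumed):
T_C_P = rot_z(0.3); T_C_P[:3, 3] = [0.10, 0.00, 0.30]  # probe
T_C_G = rot_z(0.1); T_C_G[:3, 3] = [0.10, 0.00, 0.25]  # guide frame
T_C_B = rot_z(0.0); T_C_B[:3, 3] = [0.50, 0.20, 0.00]  # arm base
T_C_S = rot_z(0.2); T_C_S[:3, 3] = [0.20, 0.05, 0.30]  # surgical end
T_B_E = rot_z(0.4); T_B_E[:3, 3] = [0.30, 0.00, 0.40]  # arm end from link parameters

# S22-S24: relative transforms
T_G_P = inv(T_C_G) @ T_C_P                   # probe w.r.t. guide frame
T_G_B = inv(T_C_G) @ T_C_B                   # base w.r.t. guide frame
T_E_S = inv(T_B_E) @ inv(T_C_B) @ T_C_S      # surgical end w.r.t. arm end

# S25: chain the relative transforms step by step
T_S_P = inv(T_E_S) @ inv(T_B_E) @ inv(T_G_B) @ T_G_P

# The intermediate frames cancel, so the chain equals the direct measurement:
assert np.allclose(T_S_P, inv(T_C_S) @ T_C_P)
```

The self-consistency check is useful in practice: a mismatch indicates that one of the intermediate calibrations drifted between photographs.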
4. The method according to claim 3, wherein step S3 comprises:
defining the two-dimensional image coordinate system to coincide with a coordinate plane of the probe coordinate system, thereby obtaining the transformation relation ${}^{P}_{I}T$ between the two-dimensional image coordinate system and the probe coordinate system; and multiplying the transformation relation between the probe coordinate system and the surgical execution end coordinate system by the transformation relation between the two-dimensional image coordinate system and the probe coordinate system to obtain the transformation relation between the two-dimensional image coordinate system and the surgical execution end coordinate system: ${}^{S}_{I}T = {}^{S}_{P}T\,{}^{P}_{I}T$.
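An illustrative sketch of what the image-to-surgical-end transform enables: lifting a pixel in the ultrasound image into the surgical execution end's frame. The pixel scales, frame labels and poses below are assumptions, not values from the patent:

```python
import numpy as np

# Pixel-to-metric scale of the ultrasound image (assumed, in metres per pixel):
SX, SY = 0.0002, 0.0002

def image_to_probe(u, v):
    """Lift an image pixel (u, v) into the probe frame, assuming the image plane
    coincides with the probe's XY coordinate plane (as defined in step S3)."""
    return np.array([u * SX, v * SY, 0.0, 1.0])

# Assumed probe pose in the surgical-end frame (identity rotation for brevity):
T_S_P = np.eye(4)
T_S_P[:3, 3] = [-0.1, 0.0, 0.0]

# A target seen at pixel (200, 100), expressed in the surgical-end frame:
target_S = T_S_P @ image_to_probe(200, 100)
print(target_S[:3])  # (-0.06, 0.02, 0.0)
```

With this mapping, a target marked on the ultrasound image can be handed directly to the surgical execution end as a 3D goal point.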
5. The method according to claim 3, wherein step S21 specifically comprises:
step 1: arranging a first reflective target ball on the probe and a second reflective target ball on the guide frame;
step 2: keeping the guide frame from moving back and forth, starting the motor to drive the probe through one full rotation, recording a plurality of points on the motion trajectory of the first reflective target ball with the stereoscopic vision high-precision positioning camera, and recording the positions of the first and second reflective target balls after the rotation with the same camera;
step 3: fitting a circle to the plurality of points, taking the fitted circle center as the origin of the guide frame coordinate system and of the initial probe coordinate system, and the fitted circle axis as the Z axis of both; and combining the post-rotation positions of the first and second reflective target balls to obtain, in the camera coordinate system, the probe coordinate system ${}^{C}_{P}T$ at an arbitrary pose and the guide frame coordinate system ${}^{C}_{G}T$.
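The circle fit of step 3 can be sketched as follows: fit a plane to the recorded target-ball points by SVD, express the points in an in-plane basis, and solve a linear least-squares problem for the circle center. The function name and the synthetic trajectory are illustrative, not from the patent:

```python
import numpy as np

def fit_circle_3d(points):
    """Fit a circle to 3D points; returns (center, axis), i.e. the origin and
    Z axis of the guide-frame coordinate system in the camera frame."""
    P = np.asarray(points, dtype=float)
    centroid = P.mean(axis=0)
    # Plane normal = right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(P - centroid)
    normal = Vt[2]
    u, v = Vt[0], Vt[1]                       # orthonormal in-plane basis
    Q = (P - centroid) @ np.stack([u, v]).T   # 2D coordinates in the plane
    # Circle fit: x^2 + y^2 = 2ax + 2by + c is linear in (a, b, c).
    A = np.column_stack([2 * Q, np.ones(len(Q))])
    sol, *_ = np.linalg.lstsq(A, (Q ** 2).sum(axis=1), rcond=None)
    center = centroid + sol[0] * u + sol[1] * v
    return center, normal

# Synthetic trajectory of the first reflective target ball: radius 0.05 m,
# axis along Z, center at (1.0, 2.0, 0.5) in the camera frame.
theta = np.linspace(0, 2 * np.pi, 40, endpoint=False)
pts = np.stack([0.05 * np.cos(theta) + 1.0,
                0.05 * np.sin(theta) + 2.0,
                np.full_like(theta, 0.5)], axis=1)
center, axis = fit_circle_3d(pts)
print(np.round(center, 6))  # ≈ (1.0, 2.0, 0.5)
```

Because the probe's target ball sweeps a true circle about the motor axis, the fitted center and normal recover the frame origin and Z axis even when the individual camera samples are noisy.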
6. The method of claim 5, wherein, in step 3, obtaining the probe coordinate system ${}^{C}_{P}T$ at an arbitrary pose and the guide frame coordinate system ${}^{C}_{G}T$ in the camera coordinate system by combining the post-rotation positions of the first and second reflective target balls specifically comprises:
obtaining the initial probe coordinate system ${}^{C}_{P_0}T$ and the guide frame coordinate system ${}^{C}_{G}T$ from the origin, the Z axis, and the positions of the first and second reflective target balls;
obtaining, from the initial probe coordinate system ${}^{C}_{P_0}T$ and the guide frame coordinate system ${}^{C}_{G}T$, the initial rotation angle between the initial probe coordinate system and the guide frame coordinate system when the probe is at its initial pose; and
when the probe moves to a given pose, acquiring its motion variables of back-and-forth translation and rotation about the Z axis, and correcting these motion variables with the initial rotation angle to obtain the probe coordinate system ${}^{C}_{P}T$ of the probe at an arbitrary pose.
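A minimal sketch of the correction above, under the assumption that the probe frame differs from the guide frame by a travel d along Z and a rotation about Z, with the motor's reported angle offset by the initial rotation angle; the function and all values are illustrative:

```python
import numpy as np

def probe_pose(T_C_G, d, theta_motor, theta_init):
    """Probe frame in the camera frame, from the guide-frame pose, the
    back-and-forth travel d along Z, and the motor angle corrected by
    the initial rotation angle."""
    theta = theta_motor + theta_init
    c, s = np.cos(theta), np.sin(theta)
    T_G_P = np.array([[c, -s, 0.0, 0.0],
                      [s,  c, 0.0, 0.0],
                      [0.0, 0.0, 1.0, d],
                      [0.0, 0.0, 0.0, 1.0]])
    return T_C_G @ T_G_P

# Guide frame taken as the camera frame for simplicity; probe advanced 20 mm
# and rotated a quarter turn from a zero initial angle:
T = probe_pose(np.eye(4), d=0.02, theta_motor=np.pi / 2, theta_init=0.0)
print(np.round(T[:3, 3], 3))  # translation (0, 0, 0.02)
```

Only two scalar readings (travel and motor angle) are needed per pose once the initial angle has been calibrated, which is why the camera need not re-photograph the probe at every pose.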
7. The method according to claim 3, wherein step S23 specifically comprises:
arranging a third reflective target ball at the end of the mechanical arm, and photographing the third reflective target ball with the stereoscopic vision high-precision positioning camera to obtain the coordinates of the end of the mechanical arm in the camera coordinate system; meanwhile, obtaining the transformation relation ${}^{C}_{B}T$ between the camera coordinate system and the mechanical arm base coordinate system from the transformation relation between the mechanical arm end coordinate system and the mechanical arm base coordinate system; and multiplying the camera-to-base transformation ${}^{C}_{B}T$ by the inverse of the guide frame coordinate system ${}^{C}_{G}T$ to obtain the transformation relation between the mechanical arm base coordinate system and the guide frame coordinate system: ${}^{G}_{B}T = ({}^{C}_{G}T)^{-1}\,{}^{C}_{B}T$.
8. The method according to claim 3, wherein step S24 specifically comprises:
arranging a fourth reflective target ball at the surgical execution end, and photographing the fourth reflective target ball with the stereoscopic vision high-precision positioning camera to obtain the surgical execution end coordinate system ${}^{C}_{S}T$ in the camera coordinate system; and converting the surgical execution end coordinate system ${}^{C}_{S}T$ with ${}^{C}_{B}T$ and ${}^{B}_{E}T$ to obtain the transformation relation between the surgical execution end coordinate system and the mechanical arm end coordinate system: ${}^{E}_{S}T = ({}^{B}_{E}T)^{-1}\,({}^{C}_{B}T)^{-1}\,{}^{C}_{S}T$.
CN202110808016.5A 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot Active CN113547515B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110808016.5A CN113547515B (en) 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot


Publications (2)

Publication Number Publication Date
CN113547515A true CN113547515A (en) 2021-10-26
CN113547515B CN113547515B (en) 2022-07-12

Family

ID=78131970

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110808016.5A Active CN113547515B (en) 2021-07-16 2021-07-16 Coordinate calibration method based on ultrasonic servo surgical robot

Country Status (1)

Country Link
CN (1) CN113547515B (en)


Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005342832A (en) * 2004-06-02 2005-12-15 Fanuc Ltd Robot system
CN103706568A (en) * 2013-11-26 2014-04-09 中国船舶重工集团公司第七一六研究所 System and method for machine vision-based robot sorting
CN104858870A (en) * 2015-05-15 2015-08-26 江南大学 Industrial robot measurement method based on tail end numbered tool
JP2016007277A (en) * 2014-06-23 2016-01-18 公立大学法人公立はこだて未来大学 Surgery support device and surgery support system
CN105716525A (en) * 2016-03-30 2016-06-29 西北工业大学 Robot end effector coordinate system calibration method based on laser tracker
CN108778179A (en) * 2016-02-26 2018-11-09 思想外科有限公司 Method and system for instructing user positioning robot
CN109199586A (en) * 2018-11-09 2019-01-15 山东大学 A kind of laser bone-culting operation robot system and its paths planning method
CN110202560A (en) * 2019-07-12 2019-09-06 易思维(杭州)科技有限公司 A kind of hand and eye calibrating method based on single feature point
CN110225720A (en) * 2017-01-13 2019-09-10 株式会社卓越牵引力 Operation auxiliary device and its control method, program and surgical assistant system
CN110279467A (en) * 2019-06-19 2019-09-27 天津大学 Ultrasound image under optical alignment and information fusion method in the art of puncture biopsy needle
CN111070199A (en) * 2018-10-18 2020-04-28 杭州海康威视数字技术股份有限公司 Hand-eye calibration assessment method and robot
US20200198145A1 (en) * 2018-12-19 2020-06-25 Industrial Technology Research Institute Method and apparatus of non-contact tool center point calibration for a mechanical arm, and a mechanical arm system with said calibration function
CN112525074A (en) * 2020-11-24 2021-03-19 杭州素问九州医疗科技有限公司 Calibration method, calibration system, robot, computer device and navigation system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
LIU Bojian: "Control of a Prostate Interventional Surgical Robot Based on Multimodal Perception", China Masters' Theses Full-text Database, Engineering Science and Technology II *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114137085A (en) * 2021-12-01 2022-03-04 仲恺农业工程学院 Ultrasonic flaw detection robot based on vision-assisted positioning and detection method thereof
CN115153782A (en) * 2022-08-12 2022-10-11 哈尔滨理工大学 Puncture robot space registration method under ultrasonic guidance
CN116473681A (en) * 2023-03-28 2023-07-25 北京维卓致远医疗科技发展有限责任公司 Control system and method of surgical robot
CN116473681B (en) * 2023-03-28 2024-02-20 北京维卓致远医疗科技发展有限责任公司 Control system and method of surgical robot

Also Published As

Publication number Publication date
CN113547515B (en) 2022-07-12

Similar Documents

Publication Publication Date Title
CN113547515B (en) Coordinate calibration method based on ultrasonic servo surgical robot
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN110728715A (en) Camera angle self-adaptive adjusting method of intelligent inspection robot
CN109974584A (en) The calibration system and scaling method of a kind of auxiliary laser bone-culting operation robot
CN111012506B (en) Robot-assisted puncture surgery end tool center calibration method based on stereoscopic vision
CN111156925A (en) Three-dimensional measurement method for large component based on line structured light and industrial robot
CN109794963B (en) Robot rapid positioning method facing curved surface component
CN108098762A (en) A kind of robotic positioning device and method based on novel visual guiding
CN109719438A (en) A kind of industrial welding robot welding line automatic tracking method
CN111300384B (en) Registration system and method for robot augmented reality teaching based on identification card movement
CN113133832B (en) Calibration method and system for double-arm robot puncture system
CN113208731B (en) Binocular vision system-based hand and eye calibration method for surgical puncture robot
CN114289934B (en) Automatic welding system and method for large structural part based on three-dimensional vision
WO2020063058A1 (en) Calibration method for multi-degree-of-freedom movable vision system
CN114211484B (en) Front-end tool pose synchronization method, electronic equipment and storage medium
CN114777676A (en) Self-adaptive terahertz three-dimensional tomography device and method
CN112958974A (en) Interactive automatic welding system based on three-dimensional vision
CN110421565B (en) Robot global positioning and measuring system and method for practical training
CN112743546B (en) Robot hand-eye calibration pose selection method and device, robot system and medium
CN114046889A (en) Automatic calibration method of infrared camera
CN112022343B (en) Intelligent laser speckle removing system
US20230181263A1 (en) Dynamic 3d scanning robotic laparoscope
CN110051433B (en) Method for keeping track of target and application thereof in image-guided surgery
Yang et al. Research on Positioning of Robot based on Stereo Vision
CN115488889A (en) Welding method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant