CN117618111A - Visual field control device and visual field control method based on instrument tracking - Google Patents


Info

Publication number
CN117618111A
Authority
CN
China
Prior art keywords: endoscope, instrument, robot, reference area, RCM
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311637181.4A
Other languages
Chinese (zh)
Inventor
黎斌
刘云辉
张德康
石照辉
钟仿洵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Hong Kong Institute Of Innovation Chinese University Of Hong Kong Futian
Original Assignee
Shenzhen Hong Kong Institute Of Innovation Chinese University Of Hong Kong Futian
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Hong Kong Institute Of Innovation Chinese University Of Hong Kong Futian filed Critical Shenzhen Hong Kong Institute Of Innovation Chinese University Of Hong Kong Futian
Priority to CN202311637181.4A
Publication of CN117618111A


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002: Operational features of endoscopes
    • A61B 1/00043: Operational features of endoscopes provided with output arrangements
    • A61B 1/00045: Display arrangement
    • A61B 1/00059: Operational features of endoscopes provided with identification means for the endoscope
    • A61B 1/00147: Holding or positioning arrangements
    • A61B 1/00149: Holding or positioning arrangements using articulated arms
    • A61B 1/00163: Optical arrangements
    • A61B 1/00174: Optical arrangements characterised by the viewing angles
    • A61B 1/04: Endoscopes combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • A61B 34/00: Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20: Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30: Surgical robots
    • A61B 34/32: Surgical robots operating autonomously
    • A61B 34/70: Manipulators specially adapted for use in surgery
    • A61B 2034/2046: Tracking techniques
    • A61B 2034/301: Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Robotics (AREA)
  • Endoscopes (AREA)

Abstract

The invention discloses a visual field control device based on instrument tracking, comprising an endoscope, a data processing and control device, and a robot, and also provides a visual field control method based on instrument tracking, comprising the following steps: S1, identify and track the instrument; S2, judge whether the instrument is within the ideal field-of-view region; S3, calculate the robot execution speed; S4, robot action; S5, judge whether the operation continues; S6, end. Beneficial effects of the invention: during endoscope adjustment, endoscope movement stops as soon as the instrument returns from the outer planar reference area to the inner planar reference area, which enables fine adjustment of the endoscope, effectively alleviates lens instability, and prevents misorientation of the endoscope view.

Description

Visual field control device and visual field control method based on instrument tracking
Technical Field
The invention relates to the technical field of endoscope visual field control, in particular to a visual field control device and a visual field control method based on instrument tracking.
Background
In recent years, designing robots to manipulate endoscopes has become a popular trend, yet these systems still need to be operated by a separate scope-holding assistant or driven by external signals from the surgeon, such as eye-gaze tracking or voice control. A dedicated scope-holding assistant, however, introduces miscommunication problems: the surgeon must constantly communicate with the assistant to adjust the field of view, which distracts the surgeon and increases the surgical burden.
Chinese patent application publication CN113143461A discloses a human-machine-cooperative minimally invasive endoscope-holding robot system in the field of endoscopes. The system obtains the endoscope view and the robot pose; acquires the states of all robot joints and solves the endoscope camera pose according to the robot's forward kinematics; detects the positions of surgical-instrument tips in the endoscope view using the YOLOv3 algorithm; computes the distance from each surgical-instrument tip to the view center based on the tip positions in the view; derives a visual tracking vector from the tip positions, their distances to the view center, and the camera parameters; derives a distance-constraint vector for the endoscope's in-vivo insertion; and from the visual tracking vector and the insertion distance-constraint vector acquires the joint velocities of the cooperative manipulator. With the tip positions obtained by YOLOv3 and the camera pose solved through forward kinematic modeling, and with known camera parameters, the tracking controller outputs a camera motion vector. To ensure that the surgical instrument always remains in the endoscope view, the tracking controller decides whether to change the camera pose according to the distance between the instrument tip and the view center; to keep the view stable, the endoscope is not moved if the distance between the tip center and the view center is below a threshold r.
Otherwise, the endoscope is moved according to the tip's position in the view and its distance from the view center. Because that system re-centers the view on every adjustment, it easily moves the endoscope too frequently; if the threshold r is too small, observation becomes inconvenient, and because each adjustment covers too large a distance and angle in the endoscope's field of view, the viewing angle is difficult to bring to a suitable position, making field-of-view adjustment inconvenient. Moreover, it cannot resolve the view-direction misorientation caused by rotation during robot motion, so the operating direction of the surgical instrument becomes staggered relative to the endoscope's viewing direction.
Disclosure of Invention
In order to solve the above technical problems, an object of the present invention is to provide a visual field control device based on instrument tracking, which includes an endoscope, a data processing and control device, and a robot, as well as a visual field control method based on instrument tracking, comprising the following steps: S1, identify and track the instrument; S2, judge whether the instrument is within the ideal field-of-view region; S3, calculate the robot execution speed; S4, robot action; S5, judge whether the operation continues; S6, end. The device and method have the advantages of making the endoscope field of view convenient to adjust and preventing misorientation of the endoscope view.
In order to achieve the above purpose, the technical scheme adopted by the invention is as follows:
the visual field control device based on instrument tracking comprises an endoscope, a data processing and controlling device and a robot, wherein the data processing and controlling device is respectively in communication connection with the endoscope and the robot, and the data processing and controlling device is provided with a planner;
the endoscope is used for collecting image data and sending the image data to the data processing and control equipment;
the data processing and control equipment is used for receiving image data, the image data comprises view pictures, and the identification and tracking of the instrument are completed through the image data; the planner adds an inner plane reference area and an outer plane reference area in a view picture, wherein the inner plane reference area is positioned in the outer plane reference area; the data processing and control device executes a robot control algorithm based on the identification and tracking of the instrument; linear velocity of endoscope relative to RCM coordinate systemAnd the angular velocity of the endoscope relative to the RCM coordinate system +.>The following calculation formula is satisfied:
wherein K is r Representing control RCM error e r Positive fixed gain matrix, K of (t) s Representing control instrument distance planning center frame error e p Positive definite gain matrix of (t), J fov (t)=J img ·J d ∈R 2×4 Jacobian matrix representing control field of view, I 4 Representing a 4x4 identity matrix,representative matrix J * In the form of a pseudo-inverse of (c),based on the Z-axis as the basis vector e 3 And translation vector from endoscope to RCM frame +.> Indicating the speed of controlling the corresponding depth direction of the z-axis,/-, for example>E for controlling the angular velocity of rotation about the z-axis for counteracting the viewing-direction misdirection r (t) RCM error representing distance of distal stationary point from endoscope, e r (t) =0, the data processing and control device is controlled by +.>Calculating the end execution speed of the robot, converting the end execution speed into instruction information containing the speed of each joint of the robot through inverse kinematics of the robot, and transmitting the instruction information to the robot;
the robot is used for moving according to instruction information and driving the endoscope to move.
By such arrangement: endoscope movement stops as soon as the instrument returns from the outer planar reference area to the inner planar reference area, which enables fine adjustment of the endoscope, effectively alleviates lens instability, and makes it easier to bring the lens angle to a suitable direction, achieving convenient field-of-view control.
A remote-stationary-point constraint is established for the endoscope's movement, so safety is effectively improved while the robot drives the endoscope under automatic control, giving the advantage of higher safety. In addition, the primary objective of the visual servo controller is to adjust the two-dimensional position of the instrument in image space; the secondary objective is to correct the depth error (i.e., the zoom-level error) and the orientation error produced when robot motion rotates the endoscope.
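The two-objective structure just described (a primary image-space task with secondary depth and orientation corrections) is the standard task-priority, null-space form of resolved-rate control. The sketch below is a generic illustration of that form, not the patent's exact control law, whose closed-form expression appears only as an image in the filing; the Jacobian, gains, and secondary wish vector are invented example values.

```python
import numpy as np

def task_priority_velocity(J_primary, e_primary, K_p, z_secondary):
    """Resolved-rate control with null-space projection.

    The primary task (e.g. the instrument's 2D image-space error) is
    servoed through the pseudo-inverse of its Jacobian; the secondary
    objective (e.g. depth / rotation correction) is projected into the
    null space of the primary task so it cannot disturb it.
    """
    J_pinv = np.linalg.pinv(J_primary)           # J^+
    n = J_primary.shape[1]
    N = np.eye(n) - J_pinv @ J_primary           # null-space projector
    return J_pinv @ (-K_p @ e_primary) + N @ z_secondary

# Toy example: a 2D image-space task in a 4-DOF velocity space.
J = np.array([[1.0, 0.0, 0.2, 0.0],
              [0.0, 1.0, 0.0, 0.2]])
e = np.array([0.1, -0.05])          # instrument offset from planned center
K = np.diag([2.0, 2.0])
z = np.array([0.0, 0.0, 0.3, 0.1])  # secondary depth/rotation wish

v = task_priority_velocity(J, e, K, z)
# The primary task rate J @ v equals -K @ e regardless of z.
```

The projector (I - J^+ J) is what lets the depth and orientation corrections run "for free" without perturbing the instrument-centering task.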
The field of view control method based on instrument tracking adopts a field of view control device based on instrument tracking, wherein the field of view control device based on instrument tracking comprises an endoscope, data processing and control equipment and a robot, and the data processing and control equipment is provided with a planner;
the method comprises the following steps:
s1, identifying and tracking the instrument: acquiring image data by using the endoscope and sending the image data to a control module, wherein the control module is used for identifying and tracking the instrument through the image data;
s2, judging whether the instrument is positioned in an ideal visual field area: the image data comprises a view picture, the planner adds an inner plane reference area and an outer plane reference area in the view picture, the inner plane reference area is positioned in the outer plane reference area, if the instrument is positioned outside the outer plane reference area, the planner plans the movement path of the endoscope based on the instrument position and enters step S3; if the instrument is located in the external plane reference area, step S5 is performed;
s3, calculating the execution speed of the robot:
linear velocity of endoscope relative to RCM coordinate systemAnd the angular velocity of the endoscope relative to the RCM coordinate system +.>The following calculation formula is satisfied:
wherein K is r Representing control RCM error e r Positive fixed gain matrix, K of (t) s Representing control instrument distance planning center frame error e p Positive definite gain matrix of (t), J fov (t)=J img ·J d ∈R 2×4 Jacobian matrix representing control field of view (fov), I 4 Units representing 4x4The matrix is formed by a matrix of,representative matrix J * In the form of a pseudo-inverse of (c),based on the Z-axis as the basis vector e 3 And translation vector from endoscope to RCM frame +.> Indicating the speed of controlling the corresponding depth direction of the z-axis,/-, for example>E for controlling the angular velocity of rotation about the z-axis for counteracting the viewing-direction misdirection r (t) RCM error representing distance of distal stationary point from endoscope, e r (t) =0 byCalculating the end execution speed of the robot;
s4, robot action: the data processing and controlling equipment converts the execution speed of the tail end of the robot into the speed q of each joint of the robot through inverse kinematics of the robot j And sending the motion information to the robot to enable the robot to move until the instrument in the view picture moves into the internal plane reference area;
s5, judging whether the operation is continued or not: if the operation is continued, the step S1 is carried out, and if the operation is finished, the step S6 is carried out;
s6, ending.
By such arrangement: endoscope movement stops as soon as the instrument returns from the outer planar reference area to the inner planar reference area, which enables fine adjustment of the endoscope, effectively alleviates lens instability, and makes it easier to bring the lens angle to a suitable direction, achieving convenient field-of-view control.
A remote-stationary-point constraint is established for the endoscope's movement, so safety is effectively improved while the robot drives the endoscope under automatic control, giving the advantage of higher safety. In addition, the primary objective of the visual servo controller is to adjust the two-dimensional position of the instrument in image space; the secondary objective is to correct the depth error (i.e., the zoom-level error) and the orientation error produced when robot motion rotates the endoscope.
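The S1-S6 flow above can be sketched as a small loop. The function below is a hypothetical illustration (the step labels and iterator-based inputs are invented for the example) of how tracking, the outer-area check, and the move-until-inner-area behavior chain together:

```python
def run_fov_loop(instrument_outside_outer, surgery_continues):
    """Drive the S1-S6 flow of the method.

    instrument_outside_outer: iterator of booleans, one per cycle,
        answering the S2 check "is the instrument outside the outer area?"
    surgery_continues: iterator of booleans, one per cycle, answering
        the S5 check "does the operation continue?"
    """
    steps = []
    while True:
        steps.append("S1:track")                 # identify and track
        if next(instrument_outside_outer):       # S2: outside outer area?
            steps.append("S3:compute_velocity")  # robot execution speed
            steps.append("S4:move_until_inner")  # move until inner area
        if not next(surgery_continues):          # S5: continue?
            steps.append("S6:end")
            return steps

# One cycle needing a move, then one quiet cycle before the surgery ends.
trace = run_fov_loop(iter([True, False]), iter([True, False]))
# trace == ["S1:track", "S3:compute_velocity", "S4:move_until_inner",
#           "S1:track", "S6:end"]
```

Note that S3/S4 are skipped entirely while the instrument stays inside the outer reference area, which is exactly the hysteresis that keeps the lens still.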
Preferably, in the step S2, the method further includes the steps of:
the size scale=max (scale x ,scale y )
Wherein the method comprises the steps ofx e Representing the distance of the instrument from the external planar reference area in the x-axis direction, y e Representing the distance of the instrument from the outer planar reference area in the y-axis direction.
By such arrangement: when the instrument is a small distance from the outer planar reference area, the distance the instrument travels across the lens picture is reduced, improving picture stability; when that distance is large, the amplitude of the endoscope motion returning the instrument to the inner planar reference area increases, quickly re-orienting the picture. This improves endoscope control efficiency and makes the field of view convenient to control.
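As an illustration of this distance-proportional behavior, the following sketch assumes scale_x and scale_y simply grow linearly with x_e and y_e; the actual formulas appear only as image formulas in the source, so the linear form and the gain parameter are assumptions.

```python
def adaptive_scale(x_e, y_e, gain=1.0):
    """Hypothetical stand-in for the source's image formulas: the farther
    the instrument sits outside the outer planar reference area
    (x_e, y_e >= 0, zero when inside), the larger the commanded motion
    scale. The overall scale is the larger of the two axis scales, as in
    scale = max(scale_x, scale_y).
    """
    scale_x = gain * x_e
    scale_y = gain * y_e
    return max(scale_x, scale_y)

# Inside the outer area both distances are zero, so the scope holds still;
# a large excursion on one axis dominates the commanded scale.
```

Taking the max of the two axis terms means a single large excursion is enough to trigger a fast correction, while small excursions on both axes still produce only gentle motion.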
Preferably, in the step S2, the method further includes the steps of:
the planner adds an internal depth reference area and an external depth reference area, wherein the internal depth reference area is positioned between the external depth reference area and the endoscope, and if the instrument is positioned outside the external depth reference area, the planner plans the movement path of the endoscope based on the instrument position and enters step S3; if the instrument is located in the external depth reference area, proceeding to step S5;
in the step S4, the method further includes the steps of:
until the instrument moves into the interior depth reference zone.
By such arrangement: lens instability is effectively reduced, and the lens angle can be brought to a suitable direction more conveniently, achieving convenient field-of-view control.
Preferably, in the step S2, the method further includes the steps of:
interval length of the external depth reference region
Interval length of the internal depth reference region
Wherein Depth is m Representing intermediate values of the inner depth reference region and the outer depth reference region, R representing a length ratio of the inner depth reference region and the outer depth reference region, and R representing a length of the outer depth reference region.
By such arrangement: when the instrument is a small distance from the outer depth reference region, the distance the instrument travels across the lens picture is reduced, improving picture stability; when that distance is large, the amplitude of the endoscope motion returning the instrument to the inner depth reference region increases, quickly re-orienting the picture. This improves endoscope control efficiency and makes the field of view convenient to control.
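A minimal sketch of the two depth regions follows, under the assumption that both intervals are centered on Depth_m, with the outer one of length R and the inner one of length r·R. The source gives the interval formulas only as image formulas (and elsewhere describes the inner region as lying nearer the endoscope), so this symmetric nested layout is only a guess used to illustrate the hysteresis check.

```python
def depth_regions(depth_m, R, r):
    """ASSUMED layout: outer region of length R and inner region of
    length r * R, both centered on depth_m. Returns (inner, outer) as
    (low, high) interval tuples."""
    outer = (depth_m - R / 2, depth_m + R / 2)
    inner = (depth_m - r * R / 2, depth_m + r * R / 2)
    return inner, outer

def in_region(depth, region):
    """True when the measured instrument depth lies inside the interval."""
    lo, hi = region
    return lo <= depth <= hi

inner, outer = depth_regions(depth_m=50.0, R=20.0, r=0.5)
# outer spans (40.0, 60.0); inner spans (45.0, 55.0)
```

With r < 1 the inner interval is strictly nested in the outer one, so a depth correction triggered at the outer boundary only stops once the instrument is well inside, mirroring the planar-area hysteresis.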
Preferably, in the step S3, the method further includes the steps of:
the data processing and control device calculates the end execution speed of the robot through the following calculation methods
Wherein the method comprises the steps ofRepresenting the rotation matrix from the RCM coordinate system to the UR5 basic coordinate system, +.>Representing the translational component of the UR5 end effector to the RCM coordinate system, I 3 Represents a 3x3 identity matrix, skew (·) represents the form of a vector in a skewed symmetric matrix, +.>Represents the linear velocity of the endoscope camera relative to the RCM coordinate system, < >>The angular velocity of the endoscope camera relative to the RCM coordinate system is shown.
By such arrangement: the data processing and control device computes the end-effector execution speed, ensuring safety while keeping the field of view convenient to control.
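The transformation described above has the shape of the standard rigid-body velocity (twist) transform built from the rotation matrix, skew(·), and I_3. The sketch below is written under that assumption and is not the patent's exact matrix, which appears only as an image formula in the filing.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix such that skew(a) @ b == np.cross(a, b)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def twist_to_base(R_b_rcm, t, v_rcm, w_rcm):
    """Map a camera twist expressed in the RCM frame into a base-frame
    twist, shifting the reference point by the translation t (end
    effector to RCM) and rotating by R_b_rcm."""
    v_b = R_b_rcm @ (v_rcm + skew(w_rcm) @ t)   # lever-arm term ω × t
    w_b = R_b_rcm @ w_rcm
    return v_b, w_b

# Pure rotation about z with a unit lever arm along x produces a
# tangential linear velocity along y at the shifted reference point.
v_b, w_b = twist_to_base(np.eye(3), np.array([1.0, 0.0, 0.0]),
                         np.zeros(3), np.array([0.0, 0.0, 1.0]))
```

The skew(ω) @ t term is the lever-arm contribution: angular motion about the RCM shows up as linear motion at the end effector, which is what lets the controller realize camera pivoting through joint velocities.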
Preferably, in the step S3, the method further includes the steps of:
the soft RCM mechanism is adopted, the endoscope moves around a far-end fixed point through kinematic control, the origin of a robot reference coordinate system is recorded as b, the end point is e, the endoscope rod clamping point s, the position c of an endoscope camera and the far-end fixed point r, and the pose of the end point of the robot is calculated through positive kinematicsCalibrating relative pose of endoscope camera from end point of robot>
Calculating the pose of an endoscope cameraClamping point pose->Wherein->From a rotation matrix of a unit array and a length of a z direction of l laparoscope Is composed of reference vectors of (2);
and the same applies to the length of the endoscope rod rcm Pose of stationary pointObtaining the distance of the stationary point from the endoscope shaft as RCM error +.>
By such arrangement: the remote-stationary-point constraint is enforced, achieving the effect of improved safety.
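The RCM error, the distance of the stationary point from the endoscope shaft, is a point-to-line distance. A minimal sketch (point and axis names are illustrative):

```python
import numpy as np

def rcm_error(p_shaft, axis_dir, p_trocar):
    """Distance of the remote stationary point (trocar) from the
    endoscope shaft line: the norm of the component of
    (p_trocar - p_shaft) perpendicular to the shaft axis. Driving this
    to zero enforces the soft RCM constraint."""
    d = np.asarray(axis_dir, dtype=float)
    d = d / np.linalg.norm(d)
    w = np.asarray(p_trocar, dtype=float) - np.asarray(p_shaft, dtype=float)
    return float(np.linalg.norm(w - (w @ d) * d))

# Shaft through the origin along z; a trocar at (3, 4, 10) is a classic
# 3-4-5 triangle away from the axis, so the error is 5.
```

Because the shaft component (w @ d) d is subtracted out, sliding the endoscope along its own axis leaves the error unchanged, which is exactly why insertion depth can be servoed independently of the RCM constraint.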
Preferably, in the step S3, the method further includes the steps of:
the motion of the endoscope meets the rotation angle constraint of the following calculation formula:
wherein the method comprises the steps ofIndicating the reference direction at the beginning of the endoscope movement, < >>Is the direction of the axial rotation angle theta of the endoscope, A θ ∈R 2×2 ,t θ ∈R 2×1 Representing affine matrix and 2D image displacement at the endoscope axial rotation angle θ, respectively.
By such arrangement: a minimized-rotation constraint based on bionic mapping is realized, minimizing visual-servo misorientation during robot motion.
Preferably, in the step S3, the method further includes the steps of:
A_θ = U·D·V^T = (U·V^T)(V·D·V^T) = R(φ)·(V·D·V^T)
wherein U, D, V^T are the three matrices of the singular value decomposition of A_θ: U is a 2×2 unitary matrix, D is a 2×2 non-negative real diagonal matrix, and V^T is the conjugate transpose of V; R(φ) represents the azimuthal offset caused by the endoscope axial rotation angle θ.
By such arrangement: minimization of visual servo misorientation can be achieved by finding an optimal θ.
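The factorization A_θ = (U·V^T)(V·D·V^T) is a polar decomposition obtained via the SVD: R(φ) = U·V^T is the nearest rotation, and φ can be read directly from it. A sketch (the reflection guard for det < 0 is a standard extra step, not stated in the source):

```python
import numpy as np

def rotation_offset(A):
    """Extract the rotation angle phi from a 2x2 affine matrix A using
    its SVD, A = U D V^T = (U V^T)(V D V^T): U V^T is the rotation
    factor R(phi) and V D V^T is a symmetric stretch."""
    U, D, Vt = np.linalg.svd(A)
    R = U @ Vt
    # Guard against a reflection (det = -1): flip the last column of U.
    if np.linalg.det(R) < 0:
        U[:, -1] *= -1
        R = U @ Vt
    return float(np.arctan2(R[1, 0], R[0, 0]))

# A rotation by 0.3 rad composed with an anisotropic stretch still
# yields phi = 0.3, since the stretch lands in the symmetric factor.
c, s = np.cos(0.3), np.sin(0.3)
A = np.array([[c, -s], [s, c]]) @ np.diag([2.0, 0.5])
```

Searching this φ over candidate axial angles θ is what the θ* = argmin_θ |φ| step below exploits: the symmetric factor absorbs scaling, so |φ| isolates pure view-direction misorientation.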
Preferably, in the step S3, the method further includes the steps of:
according to theta * =argmin θ Phi finds a theta * To minimize |phi|.
By such arrangement: the minimization of |phi| is achieved so that an optimal theta can be found to minimize visual servo misdirection.
Compared with the prior art, the invention has the beneficial technical effects that:
1. When the instrument is only a small distance beyond the outer planar reference area, the angle of the endoscope needs only a small adjustment. During endoscope adjustment, the endoscope stops moving as soon as the instrument returns from the outer planar reference area to the inner planar reference area, which enables fine adjustment of the endoscope, effectively alleviates lens instability, and makes it easier to bring the lens angle to a suitable direction, achieving convenient field-of-view control.
2. Since the endoscope is held by the robot, a specific robot controller strategy is required to acquire the control targets, including the ideal position and the depth error, from the images. Image-based visual servoing obtains the camera-plane motion velocity from the difference between the instrument position tracked by the front stage and the target position planned in real time; considering that the endoscope's motion is restricted to the RCM, a method based on a soft RCM constraint and the robot coordinate transformation relation passes the camera-plane velocity to the robot for velocity control. A remote-stationary-point constraint is thereby established for the endoscope's movement, effectively improving safety while the robot drives the endoscope under automatic control, giving the advantage of higher safety. In addition, the primary objective of the visual servo controller is to adjust the two-dimensional position of the instrument in image space; the secondary objective is to correct the depth error (i.e., the zoom-level error) and the orientation error produced when robot motion rotates the endoscope.
3. When the instrument is a small distance from the outer planar reference area, the distance the instrument travels across the lens picture is reduced, improving picture stability; when that distance is large, the amplitude of the endoscope motion returning the instrument to the inner planar reference area increases, quickly re-orienting the picture. This improves endoscope control efficiency and makes the field of view convenient to control.
Drawings
FIG. 1 is a schematic diagram of a system architecture of a field of view control device based on instrument tracking according to an embodiment of the present invention;
FIG. 2A is a schematic illustration of an inner planar reference area and an outer planar reference area in an embodiment of the invention;
FIG. 2B is a schematic view of an instrument positioned within an external planar reference area in an embodiment of the present invention;
FIG. 2C is a schematic view of an instrument positioned outside of an external planar reference area in an embodiment of the present invention;
FIG. 2D is a schematic diagram of an inner depth reference region and an outer depth reference region in an embodiment of the invention;
fig. 3 is another isometric view of a saw blade cutter according to an embodiment of the present invention.
The technical features indicated by each reference sign are as follows:
11. a display; 12. a remote control device; 13. a data processing and controlling device; 14. a robot; 15. an endoscope; 16. a holder; 17. an instrument; 21. an interior planar reference region; 22. an outer planar reference region; 23. an internal depth reference region; 24. an external depth reference region.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail with reference to the following examples; the scope of the invention, however, is not limited to these specific examples.
Referring to fig. 1, a visual field control device based on instrument tracking includes an endoscope 15, a data processing and control device 13, and a robot 14; the data processing and control device 13 is communicatively connected to the endoscope 15 and the robot 14 respectively, and is provided with a planner. The data processing and control device 13 is a computer that is communicatively connected to the display 11 and the remote control device 12; the computer shows the view picture on the display 11, and instructions are input to the computer through the remote control device 12 to control it. The robot 14 is coupled to the endoscope 15 via the gripper 16 so that the robot 14 can move the endoscope 15.
The endoscope 15 is used to collect image data and transmit the image data to the data processing and control device 13.
The data processing and control device 13 is used for receiving the image data, which includes the view picture; an instrument 17 target-detection model is obtained from the pre-trained deep-learning model YOLO, and identification and tracking of the instrument 17 are completed from the image data. The planner adds an inner planar reference area 21 and an outer planar reference area 22 to the view, the inner planar reference area 21 lying inside the outer planar reference area 22. The data processing and control device 13 calculates the end-effector execution speed of the robot 14, converts it through the inverse kinematics of the robot 14 into command information containing each joint velocity, and sends the command information to the robot 14.
The robot 14 is used to move and drive the endoscope 15 to move according to the instruction information.
A visual field control method based on instrument tracking adopts a visual field control device based on instrument tracking, and the method comprises the following steps:
S1, identifying and tracking the instrument: image data is acquired using the endoscope 15 and sent to the control module, which completes the identification and tracking of the instruments 17 by means of the image data. If there are a plurality of instruments 17, one of them is set as the master instrument 17, and the subsequent field of view is controlled according to the position of this master instrument 17 in the image.
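The master-instrument selection in step S1 can be sketched as follows. The detection format and the `pick_master_instrument` helper are illustrative assumptions, not the patented implementation; a simple policy of keeping the highest-confidence detection is shown:

```python
# Illustrative sketch: choose a "master" instrument from detector output.
# Each detection is assumed to be (class_name, confidence, (cx, cy)) —
# a hypothetical stand-in for the YOLO model's output format.

def pick_master_instrument(detections):
    """Return the highest-confidence instrument detection, or None."""
    instruments = [d for d in detections if d[0] == "instrument"]
    if not instruments:
        return None
    return max(instruments, key=lambda d: d[1])

dets = [("instrument", 0.62, (310, 240)),
        ("instrument", 0.91, (402, 188)),
        ("gauze", 0.88, (120, 300))]
master = pick_master_instrument(dets)
print(master)  # the 0.91-confidence instrument
```

In practice the selection could also be pinned to a user-designated instrument; the field of view is then servoed on that single detection.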
S2, judging whether the instrument is positioned in an ideal visual field area: the image data includes the view picture, in which the planner adds an inner planar reference area 21 and an outer planar reference area 22, the inner planar reference area 21 being located within the outer planar reference area 22. The size of the inner planar reference area 21 is scale = max(scale_x, scale_y), where scale_x and scale_y are determined by x_e and y_e respectively, x_e representing the distance of the instrument 17 from the outer planar reference area 22 in the x-axis direction and y_e representing the distance of the instrument 17 from the outer planar reference area 22 in the y-axis direction;
the planner adds an inner depth reference region 23 and an outer depth reference region 24, the inner depth reference region 23 being located between the outer depth reference region 24 and the endoscope 15; the interval length of the outer depth reference region 24 and the interval length of the inner depth reference region 23 are both defined with respect to Depth_m, where Depth_m represents the intermediate value of the inner depth reference region 23 and the outer depth reference region 24, r represents the length ratio of the inner depth reference region 23 to the outer depth reference region 24, and R represents the length of the outer depth reference region 24;
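The interval formulas themselves are not reproduced in this text; one plausible construction, assuming both intervals are centered on Depth_m with the inner interval scaled by the ratio r (an assumption for illustration only), is:

```python
# Hypothetical construction of the depth reference intervals:
# both centered on depth_m; outer has length R, inner has length r*R.
def depth_intervals(depth_m, R, r):
    outer = (depth_m - R / 2.0, depth_m + R / 2.0)
    inner = (depth_m - r * R / 2.0, depth_m + r * R / 2.0)
    return inner, outer

inner, outer = depth_intervals(depth_m=80.0, R=20.0, r=0.5)
print(inner, outer)  # (75.0, 85.0) (70.0, 90.0)
```

Any centering or scaling convention consistent with Depth_m, R and r would serve the same hysteresis purpose.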
if the instrument 17 is located outside the outer planar reference area 22 and/or if the instrument 17 is located outside the outer depth reference area 24, planning a path of movement of the endoscope 15 based on the position of the instrument 17 and proceeding to step S3; if the instrument 17 is located in the outer planar reference area 22 and the instrument 17 is located in the outer depth reference area 24, step S5 is entered.
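The branching logic of step S2 can be sketched as follows (function and variable names are illustrative; the instrument position and depth are treated as already expressed relative to the reference areas):

```python
# Sketch of step S2: decide whether the field of view needs adjustment.
# A point is "inside" a rectangular area if it lies within its bounds;
# the depth check uses a 1-D interval. All names here are illustrative.

def inside_rect(p, rect):
    (x, y), (x0, y0, x1, y1) = p, rect
    return x0 <= x <= x1 and y0 <= y <= y1

def inside_interval(d, interval):
    lo, hi = interval
    return lo <= d <= hi

def needs_adjustment(instrument_xy, depth, outer_rect, outer_depth):
    """True -> go to step S3 (move endoscope); False -> go to step S5."""
    return (not inside_rect(instrument_xy, outer_rect)
            or not inside_interval(depth, outer_depth))

outer_rect = (100, 100, 540, 380)   # pixels
outer_depth = (70.0, 90.0)          # depth units
print(needs_adjustment((560, 200), 80.0, outer_rect, outer_depth))  # True
print(needs_adjustment((320, 240), 80.0, outer_rect, outer_depth))  # False
```

Note that only the *outer* areas trigger motion; the inner areas serve as the stopping condition in step S4, which is what gives the scheme its hysteresis.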
S3, calculating the execution speed of the robot: as shown in fig. 3, the axes x'_i, y'_i represent the adjustment direction under the minimized rotation constraint. When the motion of the endoscope 15 is constrained by a distal stationary point during minimally invasive surgery, there is always a visual misalignment between the coordinate systems x_0 o y_0 and x_i o y_i, where x_0, y_0 represent the doctor's view direction at the beginning and x_i, y_i represent the conventional view direction under visual servo control. A minimized rotation constraint based on biomimetic mapping is adopted to solve the problem of inconsistent view direction: the motion of the endoscope 15 conforms to a rotation angle constraint relating the reference direction d_0 of the initial movement of the endoscope 15 to the direction d_θ at the axial rotation angle θ of the endoscope 15 through the affine matrix A_θ ∈ R^{2×2} and the 2D image displacement t_θ ∈ R^{2×1} at the axial rotation angle θ, where the displacement t_θ can be achieved by moving the endoscope 15 in its plane;
for A_θ the following singular value decomposition is performed:

A_θ = U D V^T = (U V^T)(V D V^T) = R(φ)·(V D V^T)

where U, D, V^T are the three matrices of the singular value decomposition of A_θ: U is a 2×2 unitary matrix, D is a 2×2 non-negative real diagonal matrix, and V^T is the conjugate transpose of V; R(φ) represents the azimuthal offset caused by the axial rotation angle θ of the endoscope 15. In order to minimize the misalignment related to camera rotation, the goal is to find a θ* that minimizes |φ|, i.e. θ* = argmin_θ |φ(θ)|;
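For a 2×2 matrix, the rotation factor R(φ) = U V^T of the SVD coincides with the rotation factor of the polar decomposition, whose angle has a closed form. A minimal pure-Python sketch (the closed-form formula assumes det A_θ > 0):

```python
import math

# Rotation angle phi of the polar/SVD rotation factor R(phi) = U V^T
# of a 2x2 matrix A (valid when det(A) > 0).
def rotation_offset(a):
    (a11, a12), (a21, a22) = a
    return math.atan2(a21 - a12, a11 + a22)

# Example: A = R(30 deg) * diag(2, 1) -> recovered phi should be 30 deg.
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
A = [[2 * c, -s],
     [2 * s,  c]]
phi = math.degrees(rotation_offset(A))
print(round(phi, 6))  # 30.0
```

This is the standard orthogonal-Procrustes result; searching θ* = argmin_θ |φ(θ)| then reduces to evaluating this angle over candidate rotations.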
the endoscope 15 moves around a far-end fixed point by adopting a soft RCM mechanism through kinematic control, the origin of a reference coordinate system of the robot 14 is recorded as b, the end point is e, the clamping point s of a rod of the endoscope 15, the position c of a camera of the endoscope 15, the far-end fixed point r and the pose of the end point of the robot 14 are calculated through positive kinematicsCalibrating relative pose of camera of endoscope 15 away from end point of robot 14 +.>
Calculating the pose of the camera of the endoscope 15Clamping point position and posture/>Wherein->From a rotation matrix of a unit array and a length of a z direction of l laparoscope Is composed of reference vectors of (2);
and the same applies to the length l of the rod along the endoscope 15 rcm Pose of stationary pointObtain the distance of the stationary point from the shaft of the endoscope 15 as RCM error +.>Wherein the three-dimensional space position of the far-end fixed point is r #>Vector of distal stationary point r pointing to endoscope 15 shaft holding point s +.>For the vector of the distal stationary point r pointing towards the camera position c of the endoscope 15,is the length of the endoscope 15 rod clamping point s from the camera position c of the endoscope 15.
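The RCM error — the distance of the stationary point r from the line through the clamping point s and the camera position c — is the standard point-to-line distance and can be sketched in pure Python (vector names follow the text above):

```python
import math

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def norm(a): return math.sqrt(sum(x*x for x in a))

def rcm_error(r, s, c):
    """Distance of stationary point r from the endoscope shaft line s-c."""
    rs, rc = sub(s, r), sub(c, r)
    return norm(cross(rs, rc)) / norm(sub(c, s))

# Stationary point exactly on the shaft -> zero error.
print(rcm_error([0, 0, 5], [0, 0, 0], [0, 0, 10]))   # 0.0
# Stationary point 1 unit off a z-axis shaft -> error 1.
print(rcm_error([1, 0, 5], [0, 0, 0], [0, 0, 10]))   # 1.0
```

Driving this scalar to zero is what keeps the endoscope shaft pivoting through the trocar point.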
The linear velocity v_c of the endoscope 15 relative to the RCM coordinate system and the angular velocity ω_c of the endoscope 15 relative to the RCM coordinate system are obtained from a control law in which K_r represents the positive definite gain matrix controlling the RCM error e_r(t), K_s represents the positive definite gain matrix controlling the error e_p(t) of the instrument 17 from the planned center, J_fov(t) = J_img · J_d ∈ R^{2×4} represents the Jacobian matrix controlling the field of view (fov), I_4 represents the 4×4 identity matrix, J*^+ represents the pseudo-inverse of the matrix J*, J_d is constructed from the z-axis basis vector e_3 and the translation vector from the endoscope 15 to the RCM frame, v_z denotes the speed controlling the depth direction corresponding to the z-axis, ω_z is the angular velocity of rotation about the z-axis used to counteract the view-direction misalignment, and e_r(t) represents the RCM error of the distance of the distal stationary point from the endoscope 15. Through motion control such that e_r(t) = 0, the constraint of the endoscope 15 motion at the stationary point is ensured while the misalignment of the visual servoing during the motion of the robot 14 is minimized; from v_c and ω_c the end execution speed of the robot 14 is calculated;
the data processing and control device 13 calculates the end execution speed of the robot 14 by the following calculation method
Wherein the method comprises the steps ofRepresenting the rotation matrix from the RCM coordinate system to the UR5 basic coordinate system, +.>Representing the translational component of the UR5 end effector to the RCM coordinate system, 1 3 Represents a 3x3 identity matrix, skew (·) represents the form of a vector in a skewed symmetric matrix, +.>Representing the linear speed of the camera of the endoscope 15 relative to the RCM coordinate system, ±>The angular velocity of the camera of endoscope 15 relative to the RCM coordinate system is shown.
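A hedged sketch of how a camera-frame twist can be mapped into another frame using a rotation matrix and the skew-symmetric form of the translation. The patent's exact transform is not reproduced; the adjoint-style mapping below is a standard rigid-body construction, and the frame names and sign convention (skew(t)·ω = t × ω) are illustrative:

```python
# Standard rigid-body twist transform (illustrative, not the patent's exact
# formula): given rotation R (camera -> base) and translation t of the
# camera-frame origin, map (v_c, w_c) to base-frame velocities.
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def skew(t):
    return [[0, -t[2], t[1]],
            [t[2], 0, -t[0]],
            [-t[1], t[0], 0]]

def transform_twist(R, t, v_c, w_c):
    w_b = matvec(R, w_c)                       # angular part rotates
    Rv = matvec(R, v_c)
    tw = matvec(skew(t), w_b)                  # t x w contribution
    return [Rv[i] + tw[i] for i in range(3)], w_b

# Identity rotation, offset along x, pure rotation about z:
v_b, w_b = transform_twist([[1, 0, 0], [0, 1, 0], [0, 0, 1]],
                           [1.0, 0.0, 0.0],
                           [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
print(v_b, w_b)  # [0.0, -1.0, 0.0] [0.0, 0.0, 1.0]
```

The resulting base-frame twist is what gets fed to the inverse kinematics in step S4.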
S4, robot action: the data processing and control device 13 converts the end execution speed of the robot 14 into the velocity q_i of each joint of the robot 14 through the inverse kinematics of the robot 14 and sends it to the robot 14 to move the robot 14, until the instrument 17 moves into the inner planar reference area 21 and the inner depth reference area 23 in the view picture.
S5, judging whether the operation is continued or not: if the operation is continued, the process proceeds to step S1, and if the operation is completed, the process proceeds to step S6.
S6, ending.
The invention provides a null-space-based multitask visual servo control strategy, which keeps the view as consistent as possible with the surgeon's actual orientation while realizing proper field-of-view control, reducing the extra operating burden caused by view misalignment. Whether the current view is appropriate is determined by considering the image position and depth of the instrument 17, here with reference to the outer planar reference area 22 and the outer depth reference area 24. If the position of the instrument 17 in the image or the depth between the instrument 17 and the endoscope 15 exceeds the reference area, the automatic control strategy will move the endoscope 15 back to a proper position and depth range relative to the instrument 17. In order to avoid the lens instability caused by high-frequency automatic control being triggered when the instrument 17 operates on either side of the reference-area boundary, an inner planar reference area 21 and an inner depth reference area 23 are added and combined with the aforementioned outer planar reference area 22 and outer depth reference area 24 as control references.
As shown in figs. 2A, 2B and 2C, two-dimensional position control is carried out in the image space. If the instrument 17 is within the outer planar reference area 22, the endoscope 15 remains stationary; but once the instrument 17 leaves the outer planar reference area 22, the endoscope 15 is controlled so that the instrument 17 returns to the inner planar reference area 21 by the minimum distance, so the lens-control instability problem described above is avoided. As shown in fig. 2D, the depth direction is controlled in a manner consistent with the logic of the plane direction: if the depth between the instrument 17 and the endoscope 15 is inside the outer depth reference area 24, the endoscope 15 remains stationary; but once the depth is outside the outer depth reference area 24, the endoscope 15 is controlled to return the depth to the inner depth reference area 23 by the minimum distance. The endoscope 15 will only be controlled again when the instrument 17 is again outside the outer depth reference area 24.
The shapes of the outer planar reference area and the outer depth reference area can be set individually according to the use requirements: they can be rectangular, square, elliptical, etc. Their number is not limited to one; a plurality of outer planar reference areas and outer depth reference areas can be arranged at different positions, so as to meet the requirement of simultaneous operation of multiple instruments 17 at different positions.
This embodiment has the following advantages:
when instrument 17 is only a small distance beyond outer planar reference area 22, it is only necessary to adjust the angle of endoscope 15 by a small amount. In the process of adjusting the endoscope 15, the endoscope 15 is stopped to move after the instrument 17 returns to the inner plane reference area 21, so that the fine adjustment of the endoscope 15 can be realized, the problem of unstable lens is effectively solved, the angle of the lens can be more conveniently adjusted to a proper direction, and the advantage of conveniently controlling the visual field is achieved.
Since the endoscope 15 is held by the robot 14, a specific controller strategy for the robot 14 is required to obtain the control targets, including the ideal position and depth errors, from the images. Based on image visual servoing, the movement speed of the camera plane is obtained from the difference between the instrument 17 position tracked in the previous stage and the target position planned in real time; considering that the motion of the endoscope 15 is restricted by the RCM, a method based on the soft RCM constraint and the kinematic coordinate transformation of the robot 14 is adopted to transmit this speed to the robot 14 for velocity control. Establishing the distal-stationary-point constraint for the motion of the endoscope 15 effectively improves safety while the robot 14 automatically drives the endoscope 15, giving the advantage of higher safety. In addition, the primary objective of the visual servo controller is to adjust the two-dimensional position of the instrument 17 in the image space, while the auxiliary objectives are to adjust the depth error (i.e. the zoom-level error) and the orientation error generated when the motion of the robot 14 drives the endoscope 15 to rotate, by controlling the speed in the depth direction corresponding to the z-axis and counteracting the view-direction misalignment through rotation about the z-axis; the invention thereby minimizes the rotation-induced view-direction misalignment of the endoscope 15 during the motion of the robot 14, achieving the advantage of convenient field-of-view control.
In actual use, the following relationship exists between the movement amplitude of the endoscope 15 and the position of the master instrument 17 on the screen: when the master instrument 17 moves near the center of the picture and exceeds the preset outer planar reference area 22 by only a small distance, the endoscope 15 only needs a fine adjustment; when the master instrument 17 moves at an edge farther from the center of the screen, the endoscope 15 must move by a large amplitude to return the instrument 17 toward the center of the picture in order to improve the field-of-view adjustment efficiency. The size of the inner planar reference area 21 is therefore associated with the distance between the instrument 17 and the outer planar reference area 22: when that distance is small, the inner planar reference area 21 is large, so the movement amplitude of the endoscope 15 is reduced, the distance the instrument 17 moves in the frame is reduced, and the stability of the frame is improved; when that distance is large, the inner planar reference area 21 is small, so the movement amplitude of the endoscope 15 that brings the instrument 17 back to the inner planar reference area 21 increases, and the endoscope 15 can move by a large amplitude when the instrument 17 is far from the center of the picture, quickly adjusting the picture orientation. The control efficiency of the endoscope 15 is thereby improved, achieving the advantage of convenient field-of-view control.
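The inverse relationship between the inner-region size and the instrument's distance from the outer region can be sketched with a hypothetical mapping. The linear falloff and its constants below are illustrative assumptions; the patent's scale_x/scale_y formulas are not reproduced here:

```python
# Hypothetical adaptive sizing: the farther the instrument overshoots the
# outer planar reference area, the smaller the inner reference area becomes,
# so the endoscope makes a larger corrective move. Constants are illustrative.
def inner_region_scale(x_e, y_e, s_max=0.8, s_min=0.2, falloff=0.01):
    e = max(x_e, y_e)                      # worst-axis overshoot (pixels)
    s = s_max - falloff * e                # linear decrease with distance
    return max(s_min, min(s_max, s))

near = inner_region_scale(x_e=10, y_e=5)    # small overshoot -> large region
far = inner_region_scale(x_e=70, y_e=40)    # large overshoot -> small region
print(near, far)
```

Any monotonically decreasing, clipped mapping reproduces the behavior described above: gentle corrections near the boundary, aggressive recentering far from it.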
When the instrument 17 exceeds the outer depth reference area 24 by only a small distance, the position of the endoscope 15 only needs a small adjustment. During the adjustment, the endoscope 15 stops moving once the instrument 17 has returned to the inner depth reference area 23, so fine adjustment of the endoscope 15 is realized, the lens-instability problem is effectively solved, and the lens can be adjusted to a proper position more conveniently, giving the advantage of convenient field-of-view control.
The size of the inner depth reference area 23 is likewise associated with the distance between the instrument 17 and the outer depth reference area 24: when that distance is small, the inner depth reference area 23 is large, so the movement amplitude of the endoscope 15 is reduced, the distance the instrument 17 moves in the frame is reduced, and the stability of the frame is improved; when that distance is large, the inner depth reference area 23 is small, so the movement amplitude of the endoscope 15 that brings the instrument 17 back to the inner depth reference area 23 increases, and the endoscope 15 can move by a large amplitude when the instrument 17 is far from the ideal depth, quickly adjusting the view. The control efficiency of the endoscope 15 is thereby improved, achieving the advantage of convenient field-of-view control.
Variations and modifications to the above will be apparent to persons skilled in the art from the foregoing description and teachings. Therefore, the invention is not limited to the specific embodiments disclosed and described above; modifications and changes of this kind are also intended to fall within the scope of the claims of the invention. In addition, although specific terms are used in the present specification, they are for convenience of description only and do not constitute any limitation on the invention.

Claims (10)

1. A visual field control device based on instrument tracking, comprising an endoscope (15), a data processing and control device (13), and a robot (14), the data processing and control device (13) being in communication connection with the endoscope (15) and the robot (14), respectively, characterized in that: the data processing and control device (13) is provided with a planner;
the endoscope (15) is used for acquiring image data and sending the image data to the data processing and control device (13);
the data processing and control device (13) is used for receiving image data, wherein the image data includes a view picture, and identification and tracking of the instrument (17) are completed through the image data; the planner adds an inner planar reference area (21) and an outer planar reference area (22) in the view picture, the inner planar reference area (21) being located within the outer planar reference area (22); the data processing and control device (13) performs a control algorithm for the robot (14) based on the identification and tracking of the instrument (17); the linear velocity v_c of the endoscope (15) relative to the RCM coordinate system and the angular velocity ω_c of the endoscope (15) relative to the RCM coordinate system are obtained from a control law,

wherein K_r represents the positive definite gain matrix controlling the RCM error e_r(t), K_s represents the positive definite gain matrix controlling the error e_p(t) of the instrument (17) from the planned center, J_fov(t) = J_img · J_d ∈ R^{2×4} represents the Jacobian matrix controlling the field of view, I_4 represents the 4×4 identity matrix, J*^+ represents the pseudo-inverse of the matrix J*, J_d is constructed from the z-axis basis vector e_3 and the translation vector from the endoscope (15) to the RCM frame, v_z denotes the speed controlling the depth direction corresponding to the z-axis, ω_z is the angular velocity of rotation about the z-axis used to counteract the view-direction misalignment, e_r(t) represents the RCM error of the distance of the distal stationary point from the endoscope (15), and e_r(t) = 0; the data processing and control device (13) calculates, from v_c and ω_c, the end execution speed of the robot (14), converts it by inverse kinematics of the robot (14) into instruction information containing the speeds of the joints of the robot (14), and transmits the instruction information to the robot (14);
the robot (14) is used for moving and driving the endoscope (15) to move according to the instruction information.
2. A visual field control method based on instrument tracking, characterized by adopting a visual field control device based on instrument tracking, wherein the visual field control device comprises an endoscope (15), a data processing and control device (13), and a robot (14), the data processing and control device (13) being provided with a planner;
the method comprises the following steps:
S1, identifying and tracking the instrument: acquiring image data using the endoscope (15) and transmitting the image data to a control module, which completes identification and tracking of the instrument (17) by means of the image data;
S2, judging whether the instrument is positioned in an ideal visual field area: the image data includes a view picture; the planner adds an inner planar reference area (21) and an outer planar reference area (22) in the view picture, the inner planar reference area (21) being located within the outer planar reference area (22); if the instrument (17) is located outside the outer planar reference area (22), the planner plans the movement path of the endoscope (15) based on the position of the instrument (17) and the method proceeds to step S3; if the instrument (17) is located within the outer planar reference area (22), the method proceeds to step S5;
S3, calculating the execution speed of the robot:
the linear velocity v_c of the endoscope (15) relative to the RCM coordinate system and the angular velocity ω_c of the endoscope (15) relative to the RCM coordinate system are obtained from a control law,

wherein K_r represents the positive definite gain matrix controlling the RCM error e_r(t), K_s represents the positive definite gain matrix controlling the error e_p(t) of the instrument (17) from the planned center, J_fov(t) = J_img · J_d ∈ R^{2×4} represents the Jacobian matrix controlling the field of view (fov), I_4 represents the 4×4 identity matrix, J*^+ represents the pseudo-inverse of the matrix J*, J_d is constructed from the z-axis basis vector e_3 and the translation vector from the endoscope (15) to the RCM frame, v_z denotes the speed controlling the depth direction corresponding to the z-axis, ω_z is the angular velocity of rotation about the z-axis used to counteract the view-direction misalignment, e_r(t) represents the RCM error of the distance of the distal stationary point from the endoscope (15), and e_r(t) = 0; from v_c and ω_c the end execution speed of the robot (14) is calculated;
S4, robot action: the data processing and control device (13) converts the end execution speed of the robot (14) into the velocity q_i of each joint of the robot (14) through inverse kinematics of the robot (14) and sends it to the robot (14) to move the robot (14), until the instrument (17) in the view picture moves into the inner planar reference area (21);
S5, judging whether the operation is continued: if the operation is continued, the method proceeds to step S1; if the operation is finished, the method proceeds to step S6;

S6, ending.
3. The instrument tracking-based field of view control method according to claim 2, further comprising, in the step S2, the steps of:
the size of the inner planar reference area (21) is scale = max(scale_x, scale_y),

wherein scale_x and scale_y are determined by x_e and y_e respectively, x_e representing the distance of the instrument (17) from the outer planar reference area (22) in the x-axis direction and y_e representing the distance of the instrument (17) from the outer planar reference area (22) in the y-axis direction.
4. The instrument tracking-based field of view control method according to claim 2, further comprising, in the step S2, the steps of:
the planner adds an inner depth reference region (23) and an outer depth reference region (24), wherein the inner depth reference region (23) is located between the outer depth reference region (24) and the endoscope (15); if the instrument (17) is located outside the outer depth reference region (24), the movement path of the endoscope (15) is planned based on the position of the instrument (17) and the method proceeds to step S3; if the instrument (17) is located within the outer depth reference region (24), the method proceeds to step S5;
in the step S4, the method further includes the steps of:
until the instrument (17) moves into the internal depth reference area (23).
5. The instrument tracking-based field of view control method according to claim 4, further comprising, in the step S2, the steps of:
the interval length of the outer depth reference region (24) and the interval length of the inner depth reference region (23) are both defined with respect to Depth_m,

wherein Depth_m represents the intermediate value of the inner depth reference region (23) and the outer depth reference region (24), r represents the length ratio of the inner depth reference region (23) to the outer depth reference region (24), and R represents the length of the outer depth reference region (24).
6. The instrument tracking-based field of view control method according to claim 2, further comprising, in the step S3, the steps of:
the data processing and control device (13) calculates the end execution speed of the robot (14) from v_c and ω_c,

wherein ^bR_rcm represents the rotation matrix from the RCM coordinate system to the UR5 base coordinate system, t represents the translational component from the UR5 end effector to the RCM coordinate system, I_3 represents the 3×3 identity matrix, skew(·) represents the skew-symmetric matrix form of a vector, v_c represents the linear velocity of the camera of the endoscope (15) relative to the RCM coordinate system, and ω_c represents the angular velocity of the camera of the endoscope (15) relative to the RCM coordinate system.
7. The instrument tracking-based field of view control method according to claim 2, further comprising, in the step S3, the steps of:
the method comprises the steps of enabling an endoscope (15) to move around a far-end fixed point through kinematic control by adopting a soft RCM mechanism, recording the origin of a reference coordinate system of a robot (14) as b, recording the end point as e, clamping a rod of the endoscope (15) as a clamping point s, enabling the position of a camera of the endoscope (15) to be c, enabling the far-end fixed point r and calculating the pose of the end point of the robot (14) through positive kinematicsCalibrating relative pose of a camera of an endoscope (15) from the tail end point of a robot (14)>
Calculating the pose of the camera of the endoscope (15)Clamping point pose->Wherein->From a rotation matrix of a unit array and a length of a z direction of l laparoscope Is composed of reference vectors of (2);
and the same applies to the length l of the rod of the endoscope (15) rcm Pose of stationary pointObtain the distance of the stationary point from the shaft of the endoscope (15) as RCM error +.>
8. The instrument tracking-based field of view control method according to claim 2, further comprising, in the step S3, the steps of:
the movement of the endoscope (15) conforms to a rotation angle constraint relating the reference direction d_0 of the initial movement of the endoscope (15) to the direction d_θ at the axial rotation angle θ of the endoscope (15),

wherein A_θ ∈ R^{2×2} and t_θ ∈ R^{2×1} represent the affine matrix and the 2D image displacement of the endoscope (15) at the axial rotation angle θ, respectively.
9. The instrument tracking-based field of view control method according to claim 8, further comprising, in the step S3, the steps of:
A_θ = U D V^T = (U V^T)(V D V^T) = R(φ)·(V D V^T)

wherein U, D, V^T are the three matrices of the singular value decomposition of A_θ: U is a 2×2 unitary matrix, D is a 2×2 non-negative real diagonal matrix, and V^T is the conjugate transpose of V; R(φ) represents the azimuthal offset caused by the axial rotation angle θ of the endoscope (15).
10. The instrument tracking-based field of view control method according to claim 9, further comprising, in the step S3, the steps of:
according to θ* = argmin_θ |φ|, a θ* is found to minimize |φ|.
CN202311637181.4A 2023-12-01 2023-12-01 Visual field control device and visual field control method based on instrument tracking Pending CN117618111A (en)

Publications (1)

Publication Number Publication Date
CN117618111A true CN117618111A (en) 2024-03-01


