CN117297773A - Surgical instrument control method, surgical robot, and storage medium - Google Patents


Info

Publication number
CN117297773A
CN117297773A (application CN202210715323.3A)
Authority
CN
China
Prior art keywords
surgical instrument, instrument, distal end, real-time, joint
Prior art date
Legal status
Pending
Application number
CN202210715323.3A
Other languages
Chinese (zh)
Inventor
Name withheld at applicant's request
王家寅
Current Assignee
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd
Priority to CN202210715323.3A
Publication of CN117297773A

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/30 Surgical robots
    • A61B 34/37 Master-slave robots
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B 2034/302 Surgical robots specifically adapted for manipulations within body cavities, e.g. within abdominal or thoracic cavities

Abstract

The invention provides a surgical instrument control method, a surgical robot and a storage medium. The method includes: in an initialization stage, controlling the distal end of the instrument to move to a target configuration, and acquiring an in-vivo scene image of the patient captured by an endoscope; acquiring actual pose information of the distal end of the instrument from the in-vivo scene image; acquiring pose deviation information of the distal end of the instrument from the actual pose information and target pose information of the distal end; determining whether the pose deviation of the distal end of the instrument is greater than a first preset threshold; and if so, in a control stage, acquiring real-time pose information of the distal end of the instrument based on real-time in-vivo scene images of the patient captured by the endoscope. The invention enables accurate control of the surgical instrument.

Description

Surgical instrument control method, surgical robot, and storage medium
Technical Field
The invention relates to the technical field of surgical robots, in particular to a surgical instrument control method, a surgical robot and a storage medium.
Background
Surgical robots are designed to perform complex surgical operations accurately and in a minimally invasive manner. Where traditional surgery faces various limitations, surgical robots have been developed to replace it: they overcome the limitations of the human eye and use stereoscopic imaging to clearly present internal organs to the operator. In areas a human hand cannot reach, the robot can complete 360-degree rotation, translation, pitch and grasping while avoiding tremor. Robotic surgery offers small incisions, less bleeding and fast recovery; it can greatly shorten the patient's postoperative hospital stay and significantly improve postoperative survival and recovery rates. As high-end medical devices, surgical robots are welcomed by doctors and patients alike and have been widely used in various clinical operations.
A laparoscopic surgical robot includes a doctor console and a patient trolley. The doctor console is operated by the surgeon to generate and transmit the necessary signals; the patient trolley receives the signals from the doctor console and performs the actual operation on the patient. A master control arm is mounted on the doctor console; by manipulating it, the surgeon can control the mechanical arm mounted on the patient trolley, and the surgical instrument and endoscope mounted at the distal end of the mechanical arm, to perform corresponding motions.
In the prior art, the driver that actuates the surgical instrument is mounted far from the instrument's joints, and force and displacement are transmitted between the driver and each joint through wire transmission. Because the space at the instrument tip is narrow, sensors for measuring tip motion cannot be integrated there; instead, the prior art identifies the parameters of a wire-transmission model and uses it to estimate each joint's position. However, the transmission wires on the surgical instrument creep as the number of uses increases, making the wire-transmission model inaccurate. Controlling the instrument tip pose with data provided by the driver therefore causes deviation between the actual tip pose and the commanded position, ultimately degrading the control accuracy of the instrument tip.
It should be noted that the information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosure of Invention
The invention aims to provide a surgical instrument control method, a surgical robot and a storage medium that solve the prior-art problem that controlling the pose of the instrument tip with data provided by the driver degrades the control accuracy of the distal end of the surgical instrument.
In order to solve the above technical problems, the present invention provides a surgical instrument control method applied to a surgical robot, wherein the surgical robot includes at least one mechanical arm, a surgical instrument and an endoscope being mounted at the distal end of at least one mechanical arm. The surgical instrument control method includes:
in an initialization stage, controlling the distal end of the surgical instrument to move to a target configuration, and acquiring an in-vivo scene image of the patient captured by the endoscope;
acquiring actual pose information of the distal end of the surgical instrument from the in-vivo scene image;
acquiring pose deviation information of the distal end of the surgical instrument from the actual pose information and target pose information of the distal end of the surgical instrument;
determining whether the pose deviation of the distal end of the surgical instrument is greater than a first preset threshold;
and if so, in a control stage, acquiring real-time pose information of the distal end of the surgical instrument based on real-time in-vivo scene images of the patient captured by the endoscope, so as to control the surgical instrument.
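The threshold decision above can be sketched as a short routine. This is a minimal illustration under assumed names and units; the threshold value, the tuple representation of positions, and the mode strings are all hypothetical, not taken from the patent:

```python
import math

FIRST_PRESET_THRESHOLD = 0.5  # illustrative units, e.g. millimeters of tip position error

def pose_deviation(actual_pos, target_pos):
    """Euclidean distance between the actual and target tip positions."""
    return math.dist(actual_pos, target_pos)

def select_control_mode(actual_pos, target_pos, threshold=FIRST_PRESET_THRESHOLD):
    """Initialization stage: choose the pose source used during the control stage."""
    if pose_deviation(actual_pos, target_pos) > threshold:
        # Transmission model judged inaccurate: track the tip in endoscope images.
        return "image"
    # Transmission model still trusted: use the drive-shaft data.
    return "drive_shaft"
```

With a deviation of 1 unit against a 0.5-unit threshold, the sketch selects image-based control, mirroring the "if so" branch of the method.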
Optionally, the surgical instrument includes an instrument cartridge and an instrument tip, the instrument cartridge including at least one drive shaft in driving connection with the instrument tip, and the control method further includes:
if the pose deviation of the distal end of the surgical instrument is less than or equal to the first preset threshold, acquiring, in the control stage, real-time pose information of the instrument tip based on the real-time pose information of each drive shaft, so as to control the surgical instrument.
Optionally, the acquiring actual pose information of the distal end of the surgical instrument from the in-vivo scene image includes:
performing feature-point recognition on the in-vivo scene image to identify at least three target feature points located on the distal end of the surgical instrument;
and acquiring the actual pose information of the distal end of the surgical instrument from the actual position information of the target feature points.
Optionally, the acquiring the actual pose information of the distal end of the surgical instrument from the actual position information of the target feature points includes:
acquiring the actual position information of the target feature points in a reference coordinate system from their actual position information in the endoscope coordinate system and the mapping between the endoscope coordinate system and the reference coordinate system;
and acquiring the actual pose information of the distal end of the surgical instrument in the reference coordinate system from the actual position information of the target feature points in the reference coordinate system and pre-acquired geometric parameter information of the distal end of the surgical instrument.
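The coordinate-system mapping in the step above amounts to applying a homogeneous transform to each feature point. The sketch below assumes a 4x4 row-major transform and, purely for illustration, a pure-translation mapping; in practice the endoscope-to-reference mapping would come from calibration:

```python
def transform_point(T, p):
    """Apply a 4x4 homogeneous transform T (row-major nested lists) to a 3D point p."""
    x, y, z = p
    v = (x, y, z, 1.0)
    return tuple(sum(T[i][j] * v[j] for j in range(4)) for i in range(3))

# Illustrative mapping from the endoscope frame to the reference frame:
# a pure translation of (10, 0, 5), standing in for the calibrated relationship.
T_ref_from_cam = [
    [1, 0, 0, 10.0],
    [0, 1, 0, 0.0],
    [0, 0, 1, 5.0],
    [0, 0, 0, 1.0],
]

p_cam = (1.0, 2.0, 3.0)                         # feature point in the endoscope frame
p_ref = transform_point(T_ref_from_cam, p_cam)  # same point in the reference frame
```

Once at least three feature points are expressed in the reference frame, the tip pose follows from the known geometry of the distal end, as the claim describes.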
Optionally, the surgical instrument includes an instrument cartridge and an instrument tip, the instrument cartridge includes at least one drive shaft in driving connection with the instrument tip, the instrument tip includes at least one joint, and each drive shaft is in driving connection with the corresponding joint;
the acquiring actual pose information of the distal end of the surgical instrument from the in-vivo scene image includes:
performing feature-point recognition on the in-vivo scene image to identify the target feature points on each joint of the instrument tip;
acquiring the actual pose information of each joint of the instrument tip from the actual position information of the target feature points on each joint and pre-acquired geometric parameter information of the instrument tip;
and the acquiring pose deviation information of the distal end of the surgical instrument from the actual pose information and the target pose information includes:
for each joint of the instrument tip, acquiring the pose deviation information of that joint from its actual pose information and target pose information.
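The per-joint deviation step can be illustrated as follows, assuming for simplicity that each joint's pose reduces to a single angle; the joint names and values are invented for the example:

```python
def joint_deviations(actual, target):
    """Absolute pose deviation for each named joint (angles in radians)."""
    return {name: abs(actual[name] - target[name]) for name in target}

# Invented example values: actual poses measured from the endoscope image,
# target poses from the commanded configuration.
actual = {"pitch": 0.52, "yaw": -0.11}
target = {"pitch": 0.50, "yaw": -0.10}
deviations = joint_deviations(actual, target)
```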
Optionally, the determining whether the pose deviation of the distal end of the surgical instrument is greater than a first preset threshold includes:
determining, for each joint of the instrument tip, whether the pose deviation of that joint is greater than the first preset threshold;
and the step of, if so, acquiring in the control stage real-time pose information of the instrument tip based on the real-time in-vivo scene images captured by the endoscope includes:
if the pose deviation of at least one joint is greater than the first preset threshold, acquiring, in the control stage, the real-time pose information of the instrument tip based on the real-time in-vivo scene images of the patient captured by the endoscope.
Optionally, the target feature point on a joint is an end point of that joint in its axis direction.
Optionally, the acquiring real-time pose information of the instrument tip based on the real-time in-vivo scene images captured by the endoscope, so as to control the surgical instrument, includes:
acquiring the real-time pose information of each joint of the instrument tip from the real-time in-vivo scene images of the patient captured by the endoscope;
and, for each joint of the instrument tip, acquiring real-time pose deviation information of that joint from its real-time pose information and commanded pose information, and controlling the joint to move accordingly based on its real-time pose deviation information.
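A hedged sketch of this per-joint closed loop, assuming a simple proportional correction; the gain and the one-angle-per-joint representation are illustrative assumptions, not the patent's control law:

```python
def control_step(commanded, measured, gain=0.5):
    """One closed-loop update: move each joint by a fraction (gain) of its
    real-time pose deviation (commanded minus image-measured value)."""
    return {joint: gain * (commanded[joint] - measured[joint]) for joint in commanded}

commanded = {"pitch": 0.50, "yaw": 0.00}
measured  = {"pitch": 0.40, "yaw": 0.10}   # from the real-time endoscope image
corrections = control_step(commanded, measured)
```

Each control cycle, the corrections would be converted to drive-shaft motions, and the next image measurement closes the loop.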
Optionally, the surgical instrument control method further includes:
acquiring the real-time in-vivo scene images of the patient captured by the endoscope while the instrument tip is being moved to the target configuration;
for each drive shaft, iteratively updating the transmission model parameters between that drive shaft and the instrument tip according to the real-time in-vivo scene images until a preset iteration end condition is met, so as to obtain updated transmission model parameters;
and the acquiring real-time pose information of the instrument tip based on the real-time pose information of each drive shaft includes:
acquiring the real-time pose information of the instrument tip based on the updated transmission model parameters corresponding to each drive shaft and the real-time pose information of each drive shaft.
Optionally, the preset iteration end condition is that the linear distance between the transmission model parameters obtained in the current iteration and those obtained in the previous iteration is less than a second preset threshold.
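One way to realize such an iteration, sketched with a deliberately simplified scalar transmission model joint_angle ≈ theta * shaft_angle; the model form, the gradient update rule, and all numbers are assumptions for illustration, since the patent does not specify the update rule:

```python
def update_transmission_parameter(theta0, observations, eps=1e-6, step=0.1, max_iter=10000):
    """
    Iteratively refine a (scalar, for illustration) transmission-model parameter
    theta such that joint_angle ~= theta * shaft_angle, using gradient steps on the
    mean squared error over observed (shaft_angle, joint_angle) pairs. Iteration
    stops when successive estimates are closer than eps, playing the role of the
    'second preset threshold' on the distance between consecutive parameters.
    """
    theta = theta0
    for _ in range(max_iter):
        grad = sum(2.0 * (theta * s - q) * s for s, q in observations) / len(observations)
        new_theta = theta - step * grad
        if abs(new_theta - theta) < eps:   # linear distance between iterations
            return new_theta
        theta = new_theta
    return theta

# Pairs observed while the tip moves to the target configuration (invented data
# generated with a true ratio of 0.8 between joint angle and shaft angle).
pairs = [(1.0, 0.8), (2.0, 1.6)]
theta = update_transmission_parameter(0.0, pairs)
```

In the method, one such parameter set would be maintained per drive shaft and then used to map real-time shaft positions to the instrument tip pose.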
Optionally, the surgical instrument control method further includes:
determining whether the distal end of the surgical instrument is outside the working field of view of the endoscope;
if so, entering an adjustment mode until the instrument tip is within the working field of view of the endoscope; and/or entering a locking mode so that the surgical instrument is locked until its distal end is within the working field of view of the endoscope.
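The field-of-view check can be illustrated as a bounds test on the tip's pixel position in the endoscope image; the margin and image size below are assumed values, not from the patent:

```python
def in_working_view(tip_px, image_size, margin=20):
    """True if the instrument tip's pixel position lies inside the endoscope
    image minus a safety margin (all values in pixels)."""
    u, v = tip_px
    w, h = image_size
    return margin <= u <= w - margin and margin <= v <= h - margin
```

A False result would trigger the adjustment mode and/or the locking mode described above.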
Optionally, the surgical instrument control method further includes:
if the instrument tip is occluded by tissue in the patient, locking the posture joints of the master control arm on the doctor console until the distal end of the surgical instrument is no longer occluded; or
acquiring the real-time pose information of the distal end of the surgical instrument based on the real-time pose information of each drive shaft on the surgical instrument, so as to control the surgical instrument.
To solve the above technical problems, the present invention further provides a surgical robot including a doctor console, a patient trolley and a controller in signal connection with one another, wherein the doctor console and the patient trolley have a master-slave control relationship, the patient trolley includes at least one mechanical arm, a surgical instrument and an endoscope are mounted at the distal end of at least one mechanical arm, and the controller is configured to implement the surgical instrument control method described above.
To solve the above technical problems, the present invention further provides a readable storage medium storing a computer program which, when executed by a processor, implements the surgical instrument control method described above.
Compared with the prior art, the surgical instrument control method, surgical robot and storage medium provided by the invention have the following advantages:
the surgical instrument control method first controls the distal end of the surgical instrument to move to a target configuration in an initialization stage and acquires an in-vivo scene image of the patient captured by the endoscope; it then acquires the actual pose information of the distal end of the surgical instrument from that image, and acquires the pose deviation information of the distal end from the actual pose information and the target pose information; finally it determines whether the pose deviation is greater than a first preset threshold, and if so, acquires, in the control stage, real-time pose information of the distal end based on real-time in-vivo scene images captured by the endoscope, so as to control the surgical instrument. A pose deviation greater than the first preset threshold indicates that the actual pose of the instrument tip deviates substantially from the target pose, i.e., that the transmission model between the drive shafts and the instrument tip is no longer accurate. In that case, acquiring the real-time pose information of the instrument tip from the real-time endoscope images achieves higher control accuracy.
Therefore, the surgical instrument control method provided by the invention effectively solves the prior-art problem that controlling the instrument tip pose with data provided by the driver degrades the control accuracy of the instrument tip.
Further, the surgical instrument includes an instrument cartridge and an instrument tip, the instrument cartridge including at least one drive shaft in driving connection with the instrument tip, and the control method further includes: if the pose deviation of the distal end of the surgical instrument is less than or equal to the first preset threshold, acquiring, in the control stage, the real-time pose information of the instrument tip based on the real-time pose information of each drive shaft, so as to control the surgical instrument. A pose deviation at or below the first preset threshold means that the actual pose of the instrument tip is close to the target pose, i.e., that the transmission model between the drive shafts and the instrument tip is still accurate; in that case, controlling the surgical instrument from the real-time pose information of each drive shaft also achieves high control accuracy.
Since the surgical robot and the storage medium provided by the invention share the same inventive concept as the surgical instrument control method, they enjoy all of its advantages, which are not repeated here.
Drawings
Fig. 1 is a schematic view of an application scenario of a surgical robot according to an embodiment of the present invention;
FIG. 2 is a schematic view of a patient trolley according to an embodiment of the present invention;
FIG. 3 is a schematic illustration of multiple instruments of a single-port endoscopic robot provided in an embodiment of the present invention;
FIG. 4 is a schematic view of a doctor console according to an embodiment of the present invention;
FIG. 5 is a schematic view of an installation of a surgical instrument according to an embodiment of the present invention;
FIG. 6 is a schematic view of the structure of an instrument tip according to one embodiment of the present invention;
FIG. 7 is a schematic diagram of a wire drive assembly according to one embodiment of the present invention;
FIG. 8 is a schematic view of an endoscope according to an embodiment of the present invention;
FIG. 9 is a flow chart of a method for controlling a surgical instrument according to an embodiment of the present invention;
FIG. 10 is a schematic diagram illustrating a principle of locating a target feature point according to an embodiment of the present invention;
FIG. 11 is a schematic diagram of acquiring the pose deviation of the instrument tip according to an embodiment of the present invention;
FIG. 12 is a schematic diagram of a target feature point according to an embodiment of the present invention;
FIG. 13 is a schematic diagram showing the identification result of target feature points according to an embodiment of the present invention;
FIG. 14 is a schematic illustration of interaction of a surgical instrument control scheme provided in accordance with one embodiment of the present invention;
FIG. 15 is a schematic diagram of a surgical instrument control using an image control mode according to an embodiment of the present invention;
FIG. 16 is a diagram illustrating updating transmission model parameters according to an embodiment of the present invention;
FIG. 17 is a schematic diagram of surgical instrument control using a decoupling control mode according to one embodiment of the present invention;
FIG. 18 is a schematic view of the instrument tip being occluded according to one embodiment of the present invention;
FIG. 19 is a flow chart of a method for determining whether the distal end of the instrument is beyond the working field of view of the endoscope, in accordance with one embodiment of the present invention.
Wherein, the reference numerals are as follows:
doctor console-100; master control arm-110; second display unit-120;
patient trolley-200; base-210; mechanical arm-220;
image trolley-300; first display unit-310;
surgical instrument-400; instrument cartridge-410; drive shaft-411; instrument tip-420; tip base-421; actuator base-422; end effector-423; pitch joint-424; yaw joint-425; instrument rod-430; wire drive assembly-440; driving wheel-441; drive wire-442; guide wheel-443; driven wheel-444;
endoscope-500; connecting rod-510; lens-520; imaging fiber-530; cold light source-540;
tool trolley-600;
auxiliary trolley-700;
power box-800;
controller-900.
Detailed Description
The surgical instrument control method, surgical robot and storage medium of the present invention are described in further detail below with reference to the accompanying drawings and specific embodiments, from which their advantages and features will become more apparent. It should be noted that the drawings are greatly simplified and not to precise scale, and serve only to aid in describing the embodiments conveniently and clearly. The structures, proportions and sizes shown in the drawings are provided only to aid understanding of the disclosure and do not limit the scope of the invention, which is defined by the appended claims; any structural modification, change of proportion or adjustment of size that attains the same or similar effects and objectives as the present invention still falls within the scope of the disclosed technical content.
It is noted that relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to it. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus that comprises it.
In the description of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "axial", "radial", "circumferential", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings are merely for convenience in describing the present invention and to simplify the description, and do not indicate or imply that the device or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and therefore should not be construed as limiting the present invention.
In the description of the present invention, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "secured" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally formed; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communicated with the inside of two elements or the interaction relationship of the two elements. The specific meaning of the above terms in the present invention will be understood in specific cases by those of ordinary skill in the art.
In the present invention, unless expressly stated or limited otherwise, a first feature being "above" or "below" a second feature may include the two features being in direct contact, or being in contact only through an additional feature between them. Moreover, a first feature being "above," "over" or "on" a second feature includes the first feature being directly above or obliquely above the second feature, or simply indicates that the first feature is at a higher level than the second feature; a first feature being "under," "below" or "beneath" a second feature includes the first feature being directly below or obliquely below the second feature, or simply indicates that the first feature is at a lower level than the second feature.
Furthermore, in the description herein, reference to the terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, the various embodiments or examples described in this specification and the features of the various embodiments or examples may be combined and combined by those skilled in the art without contradiction.
The invention provides a surgical instrument control method, a surgical robot and a storage medium, which are used for solving the problem that in the prior art, the control accuracy of the tail end of a surgical instrument is affected by adopting data provided by a driver to realize the pose control of the tail end of the surgical instrument.
As used herein, the proximal end refers to the end close to the operator, the distal end refers to the end far from the operator, and the instrument tip refers to the distal end of the surgical instrument. In addition, although the surgical instrument control method is described herein as applied to a surgical robot with a master-slave control relationship, those skilled in the art will appreciate that the method provided by the present invention may also be applied to surgical robots without such a relationship, and the invention is not limited in this respect.
Example 1
To achieve the above-described idea, the present embodiment provides a surgical instrument control method applied to a surgical robot. Referring to fig. 1, a schematic view of an application scenario of a surgical robot according to an embodiment of the present invention is schematically shown, and as shown in fig. 1, the surgical robot includes a doctor console 100, a patient trolley 200, and an image trolley 300 that are in communication with each other.
With continued reference to fig. 2, a schematic structural diagram of a patient trolley according to an embodiment of the present invention is shown. As shown in fig. 2, the patient trolley 200 includes a base 210 and at least one mechanical arm 220 mounted on the base 210; a surgical instrument 400 is mounted at the distal end of at least one mechanical arm 220, and an endoscope 500 is mounted at the distal end of at least one mechanical arm 220. It should be noted that, as those skilled in the art will appreciate, when only one mechanical arm 220 is provided on the base 210, the surgical instrument 400 and the endoscope may be mounted on the same mechanical arm 220; when a plurality of mechanical arms 220 are provided on the base 210, the surgical instrument 400 and the endoscope may be mounted on different mechanical arms 220. Referring to fig. 3, a schematic diagram of multiple instruments of a single-port endoscopic robot according to an embodiment of the present invention is shown. As shown in fig. 3, the single-port endoscopic robot includes a mechanical arm, with a flexible endoscope 500 and at least one serpentine flexible surgical instrument 400 mounted at its distal end.
In particular, the surgical instrument 400 and the endoscope 500 may extend into the patient's body through a puncture hole in the patient's body surface. The endoscope 500 may be used to acquire in-vivo scene images of the patient, specifically including image information of the surgical scene such as human organs, the surgical instrument 400, blood vessels, and body fluids. The acquired in-vivo scene images of the patient can be transmitted to the first display unit 310 of the image trolley 300 (for convenience of distinction, the display part on the image trolley 300 is denoted as the first display unit 310, and the display part on the doctor console 100 is denoted as the second display unit 120).
With continued reference to fig. 4, a schematic structural diagram of a doctor console according to an embodiment of the present invention is schematically shown. As shown in fig. 4, the doctor console 100 includes at least one master control arm 110. During the operation, an operator (i.e., a doctor) seated at the doctor console 100 can control the surgical instrument 400 and the endoscope 500 on the mechanical arms 220 to perform various operations by manipulating the master control arm 110, thereby performing the operation on the patient. In actual use, the operator views the returned in-vivo scene images through the second display unit 120 on the doctor console 100 and, by manipulating the master control arm 110, controls the movement of the surgical instrument 400 and the endoscope 500 on the mechanical arms 220.
With continued reference to fig. 1, in one exemplary embodiment, as shown in fig. 1, the surgical robot further includes a tool trolley 600 for storing surgical instruments 400 and auxiliary trolleys 700 (including a ventilator and an anesthesia machine) for use during surgery. It should be noted that, as those skilled in the art can understand, the auxiliary trolleys 700 may be selected and configured according to the prior art, so they will not be further described herein. In addition, for further details of the working principle of the surgical robot, reference may be made to the prior art, and a detailed description thereof is omitted here.
With continued reference to fig. 5, a schematic view of the installation of a surgical instrument according to an embodiment of the present invention is schematically shown. As shown in fig. 5, the surgical instrument 400 includes an instrument box 410 and an instrument end 420 connected by an instrument rod 430. The instrument box 410 includes at least one driving shaft 411 in driving connection with the instrument end 420, whereby the instrument box 410 can provide a driving force to the instrument end 420 to drive the instrument end 420 in motion. Specifically, the instrument end 420 includes at least one joint, the instrument box 410 further includes a base (not shown), and each driving shaft 411 is rotatably disposed on the base and drivingly connected to a joint of the instrument end 420.
With continued reference to fig. 6, a schematic diagram of the instrument end 420 according to an embodiment of the present invention is schematically illustrated. As shown in fig. 6, the instrument end 420 includes an end base 421, an actuator base 422, an end effector 423, a pitch joint 424, and two yaw joints 425, wherein the end base 421 is rotatable about a first axis L1; the actuator base 422 is disposed on the end base 421 through the pitch joint 424 and is rotatable about a second axis L2; and the two deflection pieces of the end effector 423 are respectively disposed on the actuator base 422 through a yaw joint 425 and can rotate about a third axis L3. Preferably, the third axis L3 is perpendicular to and does not intersect the second axis L2. It should be noted that, although the present embodiment takes an end effector 423 with clamping capability as an example, the specific type of the end effector 423 is not particularly limited in this embodiment and may be selected according to actual needs; for example, the end effector 423 may be an electric hook (in which case only one yaw joint 425 may be needed) or a flushing tube (in which case the end effector 423 is fixedly disposed on the actuator base 422). Furthermore, as those skilled in the art can understand, the number and distribution of the joints of the instrument end 420 are not limited to the above description and can be set according to actual requirements, which is not limited in this embodiment.
Specifically, four driving shafts 411 are disposed in the instrument box 410: two driving shafts 411 are respectively used to drive the two deflection pieces of the end effector 423 to rotate about the third axis L3, each through one yaw joint 425; one driving shaft 411 is used to drive the actuator base 422 to rotate about the second axis L2 through the pitch joint 424; and one driving shaft 411 is used to drive the end base 421 to rotate about the first axis L1. It should be noted that, as will be understood by those skilled in the art, a radio frequency chip is further disposed in the instrument box 410, and the radio frequency chip is configured to record information about the surgical instrument 400, including its unique identifier, instrument type, number of uses, and the like.
Further, the driving shafts 411 are drivingly coupled to the joints of the instrument end 420 via wire transmission assemblies 440. Referring to fig. 7, a transmission schematic diagram of a wire transmission assembly 440 according to an embodiment of the present invention is schematically shown. As shown in fig. 7, the wire transmission assembly 440 includes a driving wheel 441, a transmission wire 442, a plurality of guide wheels 443, and a driven wheel 444. The driving wheel 441 is mounted on the driving shaft 411 and drives the transmission wire 442 to move; the driven wheel 444 is mounted on a joint of the instrument end 420; the transmission wire 442 on one side of the driven wheel 444 is in a tensioned state while the transmission wire 442 on the other side of the driven wheel 444 is in a loose state; and the guide wheels 443 are disposed between the driving wheel 441 and the driven wheel 444 to guide the wire. Therefore, the driving wheel 441 can drive the driven wheel 444 to rotate through the transmission wire 442 so as to realize the movement of the joint. As the operating time of the surgical instrument 400 increases, the transmission wire 442 repeatedly tightens and loosens, which tends to cause creep of the transmission wire 442: the initial tension becomes insufficient to overcome the compliance error, the motion map (i.e., the transmission model parameters) between the driving wheel 441 and the driven wheel 444 changes, and the instrument control pointing error ultimately increases.
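The creep-induced change in the motion map described above can be pictured with a toy transmission model. The affine/dead-zone form below, and all names in it, are illustrative assumptions, not the patent's actual drive model:

```python
def driven_joint_angle(drive_angle, ratio, backlash=0.0):
    """Simplified wire-transmission model: joint (driven wheel) angle as a
    function of the drive-shaft (driving wheel) angle.

    ratio    -- driving/driven wheel radius ratio (the nominal motion map)
    backlash -- dead zone that grows as the transmission wire creeps and
                initial tension is lost (illustrative creep effect)
    """
    if abs(drive_angle) <= backlash:
        return 0.0                       # wire too loose: joint does not move
    sign = 1.0 if drive_angle > 0 else -1.0
    return ratio * (drive_angle - sign * backlash)
```

With zero backlash the map is the nominal linear one; as creep grows the same drive-shaft motion produces less joint motion, which is exactly why the transmission model parameters must be re-estimated later in the method.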
Further, as shown in fig. 5, a power box 800 is mounted at the end of the mechanical arm 220, and at least one driver (not shown) is disposed in the power box 800, the drivers being in one-to-one correspondence with the driving shafts 411. Specifically, the proximal end of the instrument box 410 is provided with a transmission interface (not shown in the figure), and the power box 800 is provided with a mating transmission interface (not shown in the figure), so that each driving shaft 411 and its driver can be in transmission connection through the transmission interfaces. Preferably, each driver is a motor with a reduction gearbox.
With continued reference to fig. 8, a schematic diagram of an endoscope 500 according to an embodiment of the present invention is schematically shown. As shown in fig. 8, the binocular lenses 520 of the endoscope 500 are located at the distal end of the link 510 of the endoscope 500, and the proximal end of the endoscope 500 is provided with an imaging optical fiber 530 and a cold light source 540. The imaging optical fiber 530 is used to transmit the in-vivo scene images acquired by the endoscope 500 to the image trolley 300, the doctor console 100, and the controller 900 described below, and the cold light source 540 provides illumination for imaging by the endoscope 500 in the patient's abdominal cavity. It should be noted that, as will be understood by those skilled in the art, the link 510 is hollow so that the cable and the light source fiber can pass through.
With continued reference to fig. 9, a schematic flow chart of a method for controlling a surgical instrument according to an embodiment of the present invention is schematically shown. As shown in fig. 9, the surgical instrument control method includes the steps of:
Step S100, in an initialization phase, control the instrument end 420 to move to a target configuration and acquire an in-vivo scene image of the patient collected by the endoscope 500.
Step S200, acquire actual pose information of the instrument end 420 according to the in-vivo scene image of the patient.
Step S300, acquire pose deviation information of the instrument end 420 according to the actual pose information and the target pose information of the instrument end 420.
Step S400, judge whether the pose deviation of the instrument end 420 is greater than a first preset threshold.
If the determination result is yes, execute step S500; if the determination result is no, execute step S600.
Step S500, in the control stage, acquire real-time pose information of the instrument end 420 based on the real-time in-vivo scene images of the patient collected by the endoscope 500, so as to control the surgical instrument 400.
Step S600, in the control stage, acquire real-time pose information of the instrument end 420 based on the real-time pose information of each driving shaft 411, so as to control the surgical instrument 400.
When the pose deviation of the instrument end 420 is smaller than or equal to the first preset threshold, the actual pose of the instrument end 420 is relatively close to the target pose; that is, the transmission model between the driving shafts 411 driving the instrument end 420 and the instrument end 420 is relatively accurate. In this case, acquiring the real-time pose information of the instrument end 420 based on the real-time pose information of each driving shaft 411 to control the surgical instrument 400 achieves high control accuracy. When the pose deviation of the instrument end 420 is greater than the first preset threshold, the actual pose of the instrument end 420 differs greatly from the target pose; that is, the transmission model is not sufficiently accurate. In this case, controlling the surgical instrument 400 based on the real-time pose information of the instrument end 420 acquired from the real-time in-vivo scene images of the patient collected by the endoscope 500 achieves higher control accuracy. Therefore, the surgical instrument control method provided by the invention effectively solves the problem in the prior art that relying solely on data provided by the drivers (motors) for pose control affects the control accuracy of the instrument end 420 of the surgical instrument 400.
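The branch between steps S500 and S600 can be sketched as follows. This is a minimal illustration with hypothetical names: poses are simplified to position vectors in the reference frame, and the threshold value is arbitrary:

```python
import numpy as np

FIRST_PRESET_THRESHOLD = 0.5  # illustrative value; real units depend on the pose metric


def choose_control_mode(actual_pose, target_pose, threshold=FIRST_PRESET_THRESHOLD):
    """Initialization-phase decision between the two control modes.

    The deviation between the actual and target pose of the instrument end
    (here the Euclidean distance between position vectors) is compared with
    the first preset threshold.
    """
    deviation = np.linalg.norm(np.asarray(actual_pose, dtype=float)
                               - np.asarray(target_pose, dtype=float))
    if deviation > threshold:
        return "image_control"       # step S500: endoscope-image feedback
    return "decoupled_control"       # step S600: drive-shaft pose feedback
```

For example, a 0.1-unit deviation stays below the illustrative threshold, so the decoupled mode would be kept; a 1.0-unit deviation would switch the instrument to image control.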
Specifically, after the instruments (including the surgical instrument 400 and the endoscope 500) are installed in the preparation phase of the operation, a target configuration command is sent to the drivers during the initialization (self-test) process of the surgical instrument 400, for example, 30° of yaw (i.e., each deflection piece rotates 15° about the third axis L3) and 45° of pitch (i.e., the actuator base 422 rotates 45° about the second axis L2). Under the action of the drivers, the instrument end 420 moves to the target configuration; after the instrument end 420 reaches the target configuration and settles for a period of time, an in-vivo scene image of the patient with the instrument in the target configuration is acquired by the endoscope 500.
In an exemplary embodiment, the acquiring actual pose information of the instrument tip 420 from the in-patient scene image includes:
identifying feature points of the scene image in the patient to identify at least three target feature points located on the instrument tip 420;
and acquiring the actual pose information of the instrument end 420 according to the actual position information of the target feature points.
Thus, by identifying feature points of the scene image in the patient, at least three target feature points located on the distal end 420 of the instrument can be identified, and thus, the actual pose information of the distal end 420 of the instrument can be obtained according to the actual position information of the target feature points.
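At least three non-collinear points determine a rigid-body pose, which is why three target feature points suffice. Below is a minimal sketch of building a tip frame (rotation matrix and origin) from three such points; the choice of the first point as origin and the axis convention are assumptions for illustration, not the patent's actual geometric model:

```python
import numpy as np


def frame_from_points(p1, p2, p3):
    """Construct a pose (rotation matrix R and origin) for the instrument end
    from three non-collinear target feature points, by orthonormalization."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    x = p2 - p1
    x /= np.linalg.norm(x)               # x-axis along the first edge
    z = np.cross(x, p3 - p1)
    z /= np.linalg.norm(z)               # z-axis normal to the point plane
    y = np.cross(z, x)                   # y-axis completes the right-handed frame
    R = np.column_stack((x, y, z))
    return R, p1
```

In the real system the three points would be the identified target feature points expressed in the reference coordinate system, combined with the instrument end's geometric parameters.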
In an exemplary embodiment, the identifying feature points of the scene image in the patient includes:
and identifying characteristic points of the scene image in the patient by adopting a pre-trained neural network model.
Because the neural network model has been trained in advance, it can achieve high recognition accuracy; therefore, by identifying feature points of the in-patient scene image with a pre-trained neural network model, the feature points on the instrument end 420 can be identified quickly and accurately. It should be noted that, as those skilled in the art can understand, the network structure of the neural network model includes, but is not limited to, TCNN (Tweaked Convolutional Neural Networks), DAN (Deep Alignment Network), and the like.
In an exemplary embodiment, the acquiring the actual pose information of the instrument end 420 according to the actual position information of the target feature points includes:
acquiring the actual position information of the target feature points in a reference coordinate system according to the actual position information of the target feature points in the endoscope 500 coordinate system and the mapping relation between the endoscope 500 coordinate system and the reference coordinate system;
and acquiring the actual pose information of the instrument end 420 in the reference coordinate system according to the actual position information of the target feature points in the reference coordinate system and the pre-acquired geometric parameter information of the instrument end 420.
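The first of the two steps above is a rigid transform. It can be sketched as follows, assuming the mapping relation between the two coordinate systems is given as a rotation matrix R and a translation t (how the real system represents the mapping is not specified beyond the rotation matrix mentioned later):

```python
import numpy as np


def to_reference_frame(p_cam, R, t):
    """Map a target feature point's position from the endoscope coordinate
    system into the reference coordinate system:  p_ref = R @ p_cam + t."""
    return R @ np.asarray(p_cam, dtype=float) + np.asarray(t, dtype=float)
```

The rotation R and translation t would themselves come from the kinematics of the mechanical arm carrying the endoscope, as described below.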
Specifically, since the endoscope 500 has two lenses 520 of the same specification distributed at a fixed interval, the conditions of the binocular positioning principle are satisfied, and two in-vivo scene images of the patient can be collected simultaneously through the binocular lenses 520. By identifying the two images separately, each target feature point can be identified in both images, and for each target feature point the actual position information in the endoscope 500 coordinate system can be obtained based on the basic principle of binocular positioning. Referring to fig. 10, a schematic diagram of the positioning principle of a target feature point provided by an embodiment of the present invention is schematically shown. As shown in fig. 10, the distance between the optical axis centers of the binocular lenses 520 is $L$, and the distance from the binocular lenses 520 to the imaging plane is $f$. When the binocular lenses 520 simultaneously observe the target feature point $P$, the horizontal coordinate observed by the "left eye" is $x_l$ and that observed by the "right eye" is $x_r$; the "parallax" is defined as $d = x_l - x_r$. Because the imaging planes of the binocular lenses 520 lie in the same baseline plane, the observed vertical coordinates are equal (both $y$), and according to the triangle similarity principle the following relationship holds:

$$x_c = \frac{L\,x_l}{d}, \qquad y_c = \frac{L\,y}{d}, \qquad z_c = \frac{f\,L}{d} \tag{1}$$

wherein $(x_l, y)$ represents the position coordinates of the target feature point $P$ in the in-vivo scene image taken by the left lens 520, $(x_r, y)$ represents its position coordinates in the image taken by the right lens 520, and $(x_c, y_c, z_c)$ represents the actual position of the target feature point $P$ in the endoscope 500 coordinate system.
Thus, for each target feature point, the actual position information of the target feature point in the endoscope 500 coordinate system can be obtained by the above formula (1). It should be noted that, as will be understood by those skilled in the art, the pose information of the endoscope 500 in the base coordinate system (i.e., the robot coordinate system) can be obtained from the poses of the joints of the mechanical arm 220 on which the endoscope 500 is mounted, thereby obtaining a rotation matrix characterizing the mapping relationship between the endoscope 500 coordinate system and the reference coordinate system. Therefore, for each target feature point, the actual position information in the reference coordinate system can be obtained from its actual position information in the endoscope 500 coordinate system and this mapping relationship; then, from the actual position information of each target feature point in the reference coordinate system and the pre-acquired geometric parameter information of the instrument end 420, the actual pose information of the instrument end 420 in the reference coordinate system can be obtained. In addition, as will be appreciated by those skilled in the art, the target pose information of the instrument end 420 can be obtained from the target configuration and the pose information of each joint of the mechanical arm 220 on which the surgical instrument 400 is mounted.
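Formula (1) can be evaluated directly once matched left/right image coordinates for a feature point are available. A minimal sketch (the coordinate values in the test below are arbitrary):

```python
def triangulate(x_l, x_r, y, f, L):
    """Recover the 3-D position of a feature point in the endoscope frame
    from binocular image coordinates, per formula (1).

    x_l, x_r -- horizontal image coordinates seen by the left/right lens
    y        -- common vertical image coordinate (same baseline plane)
    f        -- lens-to-imaging-plane distance; L -- baseline between lenses
    """
    d = x_l - x_r                        # parallax
    if d == 0:
        raise ValueError("zero parallax: point at infinity or mismatched features")
    x_c = L * x_l / d
    y_c = L * y / d
    z_c = f * L / d
    return x_c, y_c, z_c
```

Note that depth resolution degrades as parallax shrinks, which is why the two lenses must be separated by a sufficient interval.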
Referring to fig. 11, a schematic diagram of the pose deviation of the instrument end provided by an embodiment of the present invention is schematically shown. As shown in fig. 11, let $P_{\mathrm{real}}$ represent the actual pose of the instrument end 420 in the base coordinate system and $P_{\mathrm{target}}$ represent the target pose of the instrument end 420 in the base coordinate system; the pose deviation $e$ of the instrument end 420 may then be expressed as:

$$e = \left\| P_{\mathrm{real}} - P_{\mathrm{target}} \right\|$$
in an exemplary embodiment, the identifying feature points of the scene image within the patient to identify at least one target feature point located on the instrument tip 420 includes:
the identification of feature points is performed on the in-patient scene image to identify target feature points on each of the joints of the instrument tip 420.
Correspondingly, the obtaining the actual pose information of the instrument end 420 according to the actual position information of the target feature point includes:
and acquiring the actual pose information of each joint of the instrument end 420 according to the actual position information of the target feature points on each joint of the instrument end 420.
The obtaining the pose deviation information of the instrument end 420 according to the actual pose information and the target pose information of the instrument end 420 includes:
for each joint of the instrument end 420, obtaining the pose deviation information of the joint according to the actual pose information and the target pose information of that joint.
Thus, by identifying the target feature points on each joint of the instrument end 420, the actual pose information of each joint in the reference coordinate system can be obtained from the actual position information of the target feature points on each joint in the reference coordinate system together with the geometric parameter information of the instrument end 420. Then, for each joint, the pose deviation information of the joint is obtained according to the actual pose information and the target pose information of the joint in the reference coordinate system.
In an exemplary embodiment, the target feature point on the joint is an end point in the axial direction of the joint.
Thus, by selecting the axial end points of each joint on the instrument end 420 as target feature points, the actual pose information of each joint (in the reference coordinate system) can be calculated more easily from the actual position information of each target feature point (in the reference coordinate system), effectively reducing the amount of calculation. Referring to fig. 12, a schematic diagram of the target feature points provided in one implementation of the present embodiment is schematically shown. As shown in fig. 12, points A and B represent the end points of the two axial ends of the pitch joint 424 (i.e., the target feature points on the pitch joint 424), points C and D represent the axial end points of the two yaw joints 425 (i.e., the target feature points on the two yaw joints 425), and points E and F represent the end points of the two deflection pieces of the end effector 423. Thereby, the actual pose information of the pitch joint 424 of the instrument end 420 can be calculated from the actual position information of target feature points A, B, C, and D in combination with the geometric parameter information of the instrument end 420; the actual pose information of one of the yaw joints 425 can be calculated from the actual position information of target feature points C, D, and E in combination with the geometric parameter information of the instrument end 420; and the actual pose information of the other yaw joint 425 can be calculated from the actual position information of target feature points C, D, and F in combination with the geometric parameter information of the instrument end 420.
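Computing a joint's pose from its two axial end points reduces to simple vector arithmetic. A sketch, where taking the midpoint as the joint position is an illustrative convention:

```python
import numpy as np


def joint_axis_pose(p_start, p_end):
    """Estimate a joint's position and axis direction in the reference frame
    from its two axial end points (e.g. points A and B of the pitch joint).

    Returns (center, unit_axis)."""
    p_start = np.asarray(p_start, dtype=float)
    p_end = np.asarray(p_end, dtype=float)
    center = (p_start + p_end) / 2.0     # joint position: midpoint of the axis
    axis = p_end - p_start
    norm = np.linalg.norm(axis)
    if norm == 0:
        raise ValueError("coincident end points cannot define an axis")
    return center, axis / norm
```

With the axis direction of each joint known, the remaining orientation information follows from the instrument end's geometric parameters.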
Preferably, in order to facilitate the identification of the target feature points in the acquired in-vivo scene image of the patient, the target feature points may be marked on the instrument end 420 in advance with highly distinguishable colors or markings; in particular, different colors and/or markings may be used for the target feature points of different joints.
In other embodiments of the invention, the target feature points may also be markers disposed at other locations on the instrument end, provided they can be identified from an in-vivo scene image, acquired by the endoscope, that includes the instrument end.
Please continue to refer to fig. 13, which schematically shows the identification result of the target feature points provided in an embodiment of the present invention. As shown in fig. 13, after the surgical instrument 400 and the endoscope 500 are mounted to the distal ends of the mechanical arms 220 on the patient trolley 200 and extend into the patient's body, the endoscope 500 can collect an in-vivo scene image including the instrument end 420. By identifying this image, each target feature point on the instrument end 420 can be recognized, and the three-dimensional position coordinates of each target feature point can be calculated according to the above formula (1); the first display unit 310 on the image trolley 300 and the second display unit 120 on the doctor console 100 can display the identification results and the three-dimensional position calculation results in real time. It should be noted that, as will be appreciated by those skilled in the art, the three-dimensional positions of the target feature points change in real time as the pose of the instrument end 420 changes. Furthermore, in the initialization phase, the instrument end 420 may be commanded to move to a specific configuration so that as many target feature points as possible can be acquired. It should be further noted that some feature points of the instrument end 420 may be occluded during operation (i.e., the control phase, such as the master-slave control phase); however, as long as at least one target feature point can be acquired for each joint, interpolation can be performed according to the geometric parameter information of the instrument end 420 to calculate the position of each joint of the instrument end 420.
It should be noted that, as those skilled in the art will understand, the three-dimensional position coordinates of the point a, the point D, and the point F in fig. 13 are only illustrative, and do not limit the present invention.
In an exemplary embodiment, the determining whether the pose deviation of the instrument tip 420 is greater than a first preset threshold includes:
judging, for each joint of the instrument end 420, whether the pose deviation of the joint is greater than the first preset threshold;
if so, in a control phase, acquiring real-time pose information of the instrument tip 420 based on real-time in-patient scene images acquired by the endoscope 500, including:
if the pose deviation of at least one of the joints is greater than the first preset threshold, then in a control phase, real-time pose information of the instrument tip 420 is acquired based on real-time in-patient scene images acquired by the endoscope 500.
Thus, in the initialization stage, the pose deviation of each joint of the instrument end 420 is first calculated, and it is then judged whether the pose deviation of each joint is greater than the first preset threshold. If, according to the judgment results, the pose deviation of any joint is greater than the first preset threshold, real-time pose information of the instrument end 420 is acquired in the control stage based on the real-time in-vivo scene images of the patient collected by the endoscope 500, so as to control the surgical instrument 400 (i.e., the surgical instrument 400 is controlled in the image control mode). If the pose deviations of all the joints of the instrument end 420 are smaller than or equal to the first preset threshold, the real-time pose information of the instrument end 420 is acquired in the control stage based on the real-time pose information of each driving shaft 411 (i.e., the surgical instrument 400 is controlled in the decoupled control mode). This further improves the control accuracy of the surgical instrument control method provided in this embodiment.
Referring to fig. 14, an interaction schematic diagram of the control mode selection for a surgical instrument according to an embodiment of the present invention is schematically shown. As shown in fig. 14, a plurality of surgical instruments 400 may be mounted on the mechanical arms 220 of the patient trolley 200. For each surgical instrument 400, in the initialization stage, the surgical instrument 400 is controlled to move to its corresponding target configuration, the actual pose information of its instrument end 420 is acquired through the in-vivo scene image of the patient collected by the endoscope 500, the pose deviation information of the instrument end 420 is then calculated, and the control mode of the surgical instrument 400 is determined according to the comparison between the pose deviation of the instrument end 420 and the first preset threshold (when the pose deviation is greater than the first preset threshold, the image control mode is recommended; when it is smaller than or equal to the first preset threshold, the decoupled control mode is recommended).
In an exemplary embodiment, the acquiring real-time pose information of the instrument tip 420 based on real-time in-patient scene images acquired by the endoscope 500 to enable control of the surgical instrument 400 includes:
Acquiring real-time pose information of each joint of the instrument tip 420 according to real-time in-vivo scene images of the patient acquired by the endoscope 500;
for each joint of the instrument end 420, acquiring real-time pose deviation information of the joint according to the real-time pose information and the instruction pose information of the joint, and controlling the joint to perform corresponding movement according to the real-time pose deviation information of the joint.
In particular, for how to obtain the real-time pose information of each joint of the instrument end 420 based on the real-time in-vivo scene images of the patient acquired by the endoscope 500, reference may be made to the related description above, which will not be repeated here. With continued reference to fig. 15, a schematic diagram of surgical instrument control using the image control mode according to an embodiment of the present invention is schematically shown. As shown in fig. 15, the image control mode uses a dual-loop architecture: the inner loop is an existing joint control method, including but not limited to PID (proportional-integral-derivative) control, LQR (linear quadratic regulator) control, and the like; the outer loop is an image-based position feedback control method. For each joint of the instrument end 420, the pose deviation between the commanded position of the joint and the real-time pose of the joint, acquired from the real-time in-vivo scene images collected by the endoscope 500, is used as the input signal of the joint control module of that joint, so as to control the joint to move to the commanded position, thereby realizing accurate control of the surgical instrument 400. It should be noted that, as those skilled in the art will understand, the joint control module in this embodiment is one of the functional modules of the controller 900 described below.
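The dual-loop structure can be sketched as follows: the outer loop forms the image-based pose error, and the inner loop is a conventional joint controller. Here the inner loop is a bare-bones PID with illustrative gains; the actual controller may equally be LQR or another method:

```python
class PID:
    """Inner-loop joint controller (simplified PID; gains are illustrative)."""

    def __init__(self, kp, ki=0.0, kd=0.0, dt=0.001):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv


def image_control_step(commanded, measured_from_image, controller):
    """Outer loop: the deviation between the commanded joint position and the
    joint position measured from the endoscope image is fed to the inner-loop
    joint controller, whose output drives that joint."""
    pose_error = commanded - measured_from_image
    return controller.step(pose_error)
```

One step per image frame: each new in-vivo scene image yields a fresh joint measurement, the outer loop recomputes the error, and the inner loop converts it into a drive command.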
In an exemplary embodiment, the surgical instrument control method further comprises:
acquiring real-time in-patient scene images acquired by the endoscope 500 during control of the instrument tip 420 to move to a target configuration;
for each driving shaft 411, according to the real-time in-patient scene image, iteratively updating the transmission model parameters between the driving shaft 411 and the instrument end 420 until a preset iteration end condition is met, so as to obtain updated transmission model parameters.
Correspondingly, the acquiring real-time pose information of the instrument tip 420 based on the real-time pose information of each of the drive shafts 411 includes:
based on the updated transmission model parameters corresponding to each of the drive shafts 411 and the real-time pose information of each of the drive shafts 411, the real-time pose information of the instrument tip 420 is obtained.
Specifically, the acquiring real-time pose information of the instrument tip 420 based on the updated transmission model parameters corresponding to each driving shaft 411 and the real-time pose information of each driving shaft 411 includes:
for each driving shaft 411, the real-time pose information of the corresponding joint of the instrument tip 420 is obtained according to the updated transmission model parameters corresponding to the driving shaft 411 and the real-time pose information of the driving shaft 411.
Thus, in the process of controlling the instrument tip 420 to move to the target configuration, the real-time in-vivo scene image of the patient acquired by the endoscope 500 is used to update the transmission model parameters between each driving shaft 411 and the instrument tip 420 (i.e., between the driving shaft 411 and the corresponding joint, and thus between the driver and the corresponding joint). In this way, transmission model parameters that truly reflect the transmission relationship between the driving shaft 411 and the instrument tip 420 can be obtained, so that acquiring the real-time pose information of the instrument tip 420 in the control stage from the updated transmission model parameters and the real-time pose information of each driving shaft 411 (i.e., of each driver) effectively improves the accuracy of controlling the surgical instrument 400 in the decoupling control mode.
Further, the preset iteration ending condition is that the linear distance between the transmission model parameter obtained by the current iteration and the transmission model parameter obtained by the last iteration is smaller than a second preset threshold.
Specifically, please refer to fig. 16, which schematically illustrates updating the transmission model parameters according to an embodiment of the present invention. As shown in fig. 16, in the initialization stage, after the target configuration of the surgical instrument 400 is determined, the motion trajectory of each joint on the instrument tip 420 is planned according to the target configuration. For each joint on the instrument tip 420, the corresponding joint control module may calculate the target pose of the driver at each step from the motion trajectory of the joint combined with the transmission model parameter between the driver and the joint (i.e., between the corresponding driving shaft 411 and the joint). For example, assuming that the transmission model parameter between the driver and the joint is J, and the target pose of the joint at a certain step is Y₀, the target pose X₀ of the driver at that step is:
X₀ = J⁻¹Y₀  (2)
The driver is thereby able to move to the corresponding target pose under the action of the joint control module, driving the corresponding joint on the instrument tip 420 to make the corresponding movement. After the driver moves to the target pose, the real-time in-vivo scene image of the patient acquired by the endoscope 500 is acquired and its feature points are identified, so that each target feature point can be recognized and the actual pose Y of the joint can be estimated; the actual pose X of the driver can be obtained from a position encoder mounted on the driver. The transmission model parameter J between the driver and the joint can then be updated according to the actual pose Y of the joint and the actual pose X of the driver as follows:
J = YX⁻¹  (3)
The linear distance between the transmission model parameter updated at the current step and the transmission model parameter of the previous step is then calculated and compared with the second preset threshold. If it is smaller, the iterative updating stops, and the currently updated transmission model parameter is taken as the final updated transmission model parameter of the driving shaft 411. If not, the updated transmission model parameter J is substituted into formula (2) above to calculate the target pose of the driver at the next step, the driver is controlled to move to that target pose, the in-vivo scene image of the patient acquired by the endoscope 500 is acquired again, and its feature points are identified so that each target feature point can be recognized again to estimate the actual pose of the joint at that moment. The transmission model parameter J between the driver and the joint is then updated again according to formula (3) above, using the actual pose of the driver obtained from the position encoder mounted on the driver. Whether the linear distance between the currently updated transmission model parameter and the previously updated one is smaller than the second preset threshold is judged again: if so, the iterative updating stops and the currently updated transmission model parameter is taken as the final updated transmission model parameter of the driving shaft 411; if not, the updated transmission model parameter J is substituted into formula (2) and the iterative updating continues, until the linear distance between the transmission model parameter obtained by the current iteration and that obtained by the previous iteration is smaller than the second preset threshold.
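For illustration only, the iteration described above can be sketched for a single drive-shaft/joint pair with a scalar transmission parameter j, so that formula (2) becomes x₀ = y₀ / j and formula (3) becomes j = y / x. The function names are hypothetical, and the driver is assumed to reach its commanded pose exactly:

```python
def update_transmission_ratio(j_init, joint_targets, drive_to_joint, threshold):
    """Iteratively refine the scalar transmission parameter j between one
    driver (drive shaft) and its joint, stopping when successive estimates
    differ by less than the second preset threshold."""
    j = j_init
    for y_target in joint_targets:
        x_cmd = y_target / j                  # formula (2): x0 = y0 / j
        x_actual = x_cmd                      # encoder reading (ideal driver assumed)
        y_actual = drive_to_joint(x_actual)   # joint pose estimated from the endoscope image
        j_new = y_actual / x_actual           # formula (3): j = y / x
        if abs(j_new - j) < threshold:        # preset iteration-end condition
            return j_new
        j = j_new
    return j
```

With a true (unknown) ratio of, say, 1.3 and an initial guess of 1.0, the first measurement already recovers the true ratio and the second iteration confirms convergence, so the loop terminates on the distance criterion.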
With continued reference to fig. 17, which schematically shows surgical instrument control using a decoupling control mode according to an embodiment of the present invention. As shown in fig. 17, in the initialization stage, the updated transmission model parameter J between each driving shaft 411 (driver) and the corresponding joint of the instrument tip 420 can be obtained. In the control stage (e.g., a master-slave control stage), for each joint of the instrument tip 420, the joint control module of the joint may calculate the joint moment of the joint according to the commanded pose of the joint, and may further calculate the output moment of the driver from the joint moment through the transpose of the corresponding updated transmission model parameter J, so as to control the corresponding driver to output the corresponding moment and act on the instrument tip 420. Meanwhile, the real-time pose of the driver (i.e., of the driving shaft 411 connected to the driver) can be obtained from the encoder mounted on the driver, and the real-time pose of the joint can be calculated through the corresponding updated transmission model parameter J; the real-time pose deviation of the joint can then be calculated from the commanded pose and the real-time pose of the joint and used as the feedback signal of the joint control module of that joint, so as to control the joint to move to the corresponding commanded pose, thereby realizing accurate control of the surgical instrument 400.
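The two mappings used in the decoupling control mode, the moment map through the transpose of J and the pose map through J itself, can be sketched as follows. This is an illustrative sketch with hypothetical names, using plain lists for an n-by-n parameter matrix J:

```python
def decoupled_control_step(J, joint_torque_cmd, driver_pose):
    """One decoupling-control step: the driver output moment is obtained from
    the joint moment through the transpose of J, and the real-time joint pose
    is recovered from the driver encoder reading through J (since Y = J X)."""
    n = len(J)
    # driver output moment: tau_driver = J^T * tau_joint
    driver_torque = [sum(J[i][k] * joint_torque_cmd[i] for i in range(n))
                     for k in range(n)]
    # real-time joint pose from driver encoders: y_joint = J * x_driver
    joint_pose = [sum(J[i][k] * driver_pose[k] for k in range(n))
                  for i in range(n)]
    return driver_torque, joint_pose
```

For a diagonal J the joints decouple completely, which is the motivation for updating J until it truly reflects the transmission relationship.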
In an exemplary embodiment, the surgical instrument control method further comprises:
during the control phase, if the instrument tip 420 is occluded by tissue in the patient, locking the attitude joint of the master control arm 110 on the doctor console 100 until the instrument tip 420 is no longer occluded by the tissue; or
Real-time pose information of the instrument tip 420 is acquired based on the real-time pose information of each of the drive shafts 411 to enable control of the surgical instrument 400.
Referring to fig. 18, which schematically shows the instrument tip occluded by tissue according to an embodiment of the present invention. As shown in fig. 18, during a surgical procedure there may be instances where the instrument tip 420 is occluded by tissue. When the instrument tip 420 is occluded, it is not fully displayed in the in-vivo scene image of the patient acquired by the endoscope 500, and the target feature points on one or several joints are blocked and cannot be identified; the real-time pose information of each joint of the instrument tip 420 therefore cannot be obtained from the real-time in-vivo scene image, which affects the control of the surgical instrument 400. Therefore, in the instrument control method provided by this embodiment, when the instrument tip 420 is occluded, prompt information for entering the adjustment mode of the endoscope 500 is sent to the doctor console 100, so that the doctor can adjust the position of the endoscope 500 until the instrument tip 420 is no longer blocked; in the control stage, the real-time pose information of each joint of the instrument tip 420 can then be obtained smoothly from the real-time in-vivo scene image acquired by the endoscope 500, thereby realizing accurate control of the surgical instrument 400. Alternatively, when the instrument tip 420 is blocked, the decoupling control mode is adopted directly, so that the real-time pose information of the instrument tip 420 is obtained directly from the real-time pose information of each driving shaft 411, likewise realizing accurate control of the surgical instrument 400.
Further, when the instrument tip 420 is occluded by tissue, the occluded portion of the instrument tip 420 may be reconstructed by an AR (augmented reality) reconstruction technique.
In an exemplary embodiment, the surgical instrument control method further comprises:
during the control phase, determining whether the instrument tip 420 is beyond the operating field of view of the endoscope 500;
if so, an adjustment mode is entered until the instrument tip 420 is within the operating field of view of the endoscope 500.
Therefore, by judging whether the instrument tip 420 is beyond the working field of view of the endoscope 500 and entering the adjustment mode when it is, the doctor at the doctor console 100 is reminded to adjust the position of the endoscope 500 so that the instrument tip 420 lies within the working field of view of the endoscope 500. This ensures that every target feature point on the instrument tip 420 is displayed in the in-vivo scene image of the patient acquired by the endoscope 500, and thus that the real-time pose information of the instrument tip 420 can be successfully obtained from the real-time in-vivo scene image.
In an exemplary embodiment, if the instrument tip 420 is beyond the working field of view of the endoscope 500, the surgical instrument control method further comprises:
a locking mode is entered to bring the instrument tip 420 into a locked state until the instrument tip 420 is within the operating field of view of the endoscope 500.
Thus, by locking the attitude joint of the master control arm 110 on the doctor console 100 so that the instrument tip 420 enters the locked state whenever the instrument tip 420 is outside the working field of view of the endoscope 500, the doctor is effectively prevented from mis-operating the surgical instrument 400 before the endoscope 500 has been adjusted to a proper position in which the instrument tip 420 lies within its working field of view.
With continued reference to fig. 19, which provides a schematic flow chart for determining whether the instrument tip is beyond the working field of view of the endoscope according to an embodiment of the present invention. As shown in fig. 19, in the control stage (e.g., a master-slave control stage), the in-vivo scene image of the patient is acquired in real time by the endoscope 500 and its feature points are identified to recognize each target feature point; when the two-dimensional coordinate of any target feature point falls outside the working field of view of the endoscope 500, the instrument tip 420 is determined to be beyond the working range of the endoscope 500. At this time, prompt information for entering the adjustment mode of the endoscope 500 is sent to the doctor console 100, and the attitude joint of the master control arm 110 is locked by a feedback force. After the endoscope 500 adjustment mode is entered, the position joints of the master control arm 110 are released while the attitude joints remain locked, and by applying a feedback force to the position joints the doctor can be guided in adjusting the working field of view of the endoscope 500. When the instrument tip 420 is completely within the field of view of the endoscope 500, the attitude joints of the master control arm 110 are released and the master-slave operation mode is entered.
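The field-of-view test in fig. 19 amounts to checking whether the two-dimensional coordinate of any target feature point lies outside the image bounds. A minimal sketch (names and bounds are illustrative; a real implementation would use the endoscope's calibrated image region):

```python
def tip_outside_view(feature_points_2d, image_width, image_height):
    """Return True when ANY target feature point's 2-D coordinate (u, v)
    falls outside the endoscope image, i.e. the instrument tip is judged
    to be beyond the working field of view."""
    return any(not (0 <= u < image_width and 0 <= v < image_height)
               for u, v in feature_points_2d)
```

A single out-of-bounds feature point is enough to trigger the adjustment mode, matching the "any target feature point" condition in the text.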
Embodiment Two
Corresponding to the above surgical instrument control method, this embodiment provides a surgical robot; please refer to fig. 1. As shown in fig. 1, the surgical robot includes, in addition to the doctor console 100, the patient trolley 200 and the image trolley 300 described above, a controller 900; the doctor console 100, the patient trolley 200 and the image trolley 300 are communicatively connected to the controller 900, and the controller 900 is configured to implement the surgical instrument control method described above. Because the controller 900 can implement that method, the surgical robot provided by this embodiment can effectively avoid the large control errors caused by the wire-creep phenomenon of the wire transmission system of the surgical instrument 400, so as to realize accurate control of the surgical instrument 400. In particular, when the surgical robot provided by this embodiment is a single-port endoscopic robot, the plurality of instruments (including the surgical instrument 400 and the endoscope 500) can avoid interfering with one another in a limited space while still cooperating accurately.
It should be noted that the controller 900 may be integrated with any one or more of the devices of the surgical robot described above; for example, the controller 900 may be provided at the doctor console 100, at the patient trolley 200, or at the image trolley 300, etc. In still other embodiments, the controller 900 may be provided separately. In addition, as those skilled in the art will appreciate, the controller 900 may be implemented as a hardware unit, a software unit, or a combination of hardware and software; the present invention does not limit the specific configuration of the controller 900.
Embodiment Three
The present embodiment provides a readable storage medium having stored therein a computer program which, when executed by a processor, can implement the surgical instrument control method described above. Since the readable storage medium provided in this embodiment and the surgical instrument control method provided in the first embodiment belong to the same inventive concept, the readable storage medium has all the advantages of that method; those advantages are therefore not repeated here.
The readable storage medium of the present embodiment may employ any combination of one or more computer readable media. The readable medium may be a computer readable signal medium or a computer readable storage medium. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
In summary, compared with the prior art, the surgical instrument control method, the surgical robot and the storage medium provided by the invention have the following advantages:
the surgical instrument control method provided by the invention comprises: first, in an initialization stage, controlling the distal end of the surgical instrument to move to a target configuration and acquiring an in-vivo scene image of the patient acquired by the endoscope; acquiring actual pose information of the distal end of the surgical instrument according to the in-vivo scene image; then acquiring pose deviation information of the distal end of the surgical instrument according to the actual pose information and the target pose information of the distal end; and finally judging whether the pose deviation of the distal end of the surgical instrument is larger than a first preset threshold. If it is, then in the control stage the real-time pose information of the distal end of the surgical instrument is acquired based on the real-time in-vivo scene image of the patient acquired by the endoscope, so as to realize control of the surgical instrument. When the pose deviation of the distal end of the surgical instrument is larger than the first preset threshold, the actual pose of the distal end deviates considerably from the target pose, that is, the transmission model between the driving shaft used to drive the distal end and the distal end itself is not very accurate; in this case, acquiring the real-time pose information of the distal end based on the real-time in-vivo scene image of the patient acquired by the endoscope to control the surgical instrument achieves higher control accuracy.
Therefore, the surgical instrument control method provided by the invention effectively solves the problem in the prior art that controlling the pose of the distal end of the surgical instrument purely from the data provided by the driver limits the control accuracy of the distal end.
Further, the surgical instrument includes an instrument cartridge and an instrument tip, the instrument cartridge including at least one drive shaft in driving connection with the instrument tip, and the control method further comprises: if the pose deviation of the distal end of the surgical instrument is smaller than or equal to the first preset threshold, acquiring real-time pose information of the instrument tip based on the real-time pose information of each drive shaft in the control stage, so as to realize control of the surgical instrument. Since a pose deviation smaller than or equal to the first preset threshold means that the actual pose of the distal end is close to the target pose, that is, the transmission model between the driving shaft and the distal end is fairly accurate, acquiring the real-time pose information of the distal end from the real-time pose information of each drive shaft in this case also achieves high control accuracy.
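The mode selection summarized above reduces to a single threshold comparison performed once in the initialization stage; a minimal sketch, with illustrative names:

```python
def select_control_mode(pose_deviation, first_threshold):
    """Choose image control when the initialization-stage pose deviation exceeds
    the first preset threshold (inaccurate transmission model); otherwise rely on
    the drive-shaft data in the decoupling control mode."""
    if pose_deviation > first_threshold:
        return "image_control"
    return "decoupling_control"
```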
Because the surgical robot and the storage medium provided by the invention belong to the same inventive concept as the surgical instrument control method provided by the invention, they have all the advantages of that method; those advantages are therefore not repeated one by one.
It should be noted that computer program code for carrying out operations of the present invention may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that the apparatus and methods disclosed in the embodiments herein may be implemented in other ways. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments herein. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. In addition, the functional modules in the embodiments herein may be integrated together to form a single part, or the modules may exist alone, or two or more modules may be integrated to form a single part.
The above description is only illustrative of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention, and any alterations and modifications made by those skilled in the art based on the above disclosure shall fall within the scope of the present invention. It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, the present invention is intended to include such modifications and alterations insofar as they come within the scope of the invention or the equivalents thereof.

Claims (14)

1. A surgical instrument control method applied to a surgical robot, the surgical robot including at least one mechanical arm, wherein a surgical instrument and an endoscope are mounted at a distal end of at least one mechanical arm, the surgical instrument control method comprising:
in an initialization stage, controlling the tail end of the surgical instrument to move to a target configuration, and acquiring an in-vivo scene image of a patient acquired by the endoscope;
acquiring actual pose information of the tail end of the surgical instrument according to the in-vivo scene image of the patient;
acquiring pose deviation information of the tail end of the surgical instrument according to the actual pose information and the target pose information of the tail end of the surgical instrument;
Judging whether the pose deviation of the tail end of the surgical instrument is larger than a first preset threshold value or not;
if yes, in a control stage, acquiring real-time pose information of the tail end of the surgical instrument based on the real-time in-vivo scene image of the patient acquired by the endoscope so as to control the surgical instrument.
2. A surgical instrument control method according to claim 1, wherein the surgical instrument comprises an instrument cartridge and an instrument tip, the instrument cartridge comprising at least one drive shaft in driving connection with the instrument tip, the control method further comprising:
and if the pose deviation of the tail end of the surgical instrument is smaller than or equal to the first preset threshold value, acquiring real-time pose information of the tail end of the instrument based on the real-time pose information of each driving shaft in the control stage so as to realize control of the surgical instrument.
3. The method according to claim 1, wherein the acquiring actual pose information of the distal end of the surgical instrument from the in-patient scene image includes:
identifying feature points of the scene image in the patient to identify at least three target feature points located on the distal end of the surgical instrument;
And acquiring the actual pose information of the tail end of the surgical instrument according to the actual position information of the target feature points.
4. The surgical instrument control method according to claim 3, wherein the acquiring actual pose information of the distal end of the surgical instrument based on the actual position information of the target feature point includes:
acquiring actual position information of the target feature point under the reference coordinate system according to the actual position information of the target feature point under the endoscope coordinate system and the mapping relation between the endoscope coordinate system and the reference coordinate system;
and acquiring actual pose information of the tail end of the surgical instrument under the reference coordinate system according to the actual position information of the target feature point under the reference coordinate system and the geometric parameter information of the tail end of the surgical instrument acquired in advance.
5. A surgical instrument control method according to claim 1, wherein the surgical instrument comprises an instrument cartridge and an instrument tip, the instrument cartridge comprising at least one drive shaft in driving connection with the instrument tip, the instrument tip comprising at least one joint in driving connection with the corresponding joint;
The step of obtaining the actual pose information of the tail end of the surgical instrument according to the in-vivo scene image of the patient comprises the following steps:
identifying feature points of the scene image in the patient to identify target feature points on each of the joints of the instrument tip;
acquiring actual pose information of each joint of the instrument tail end according to the actual position information of the target feature points on each joint of the instrument tail end and the geometric parameter information of the instrument tail end acquired in advance;
the step of obtaining pose deviation information of the surgical instrument end according to the actual pose information and the target pose information of the surgical instrument end comprises the following steps:
and aiming at each joint at the tail end of the instrument, acquiring pose deviation information of the joint according to the actual pose information and target pose information of the joint.
6. The surgical instrument control method according to claim 5, wherein the determining whether the pose deviation of the distal end of the surgical instrument is greater than a first preset threshold includes:
judging whether the pose deviation of each joint of the tail end of the instrument is larger than a first preset threshold value or not;
If yes, in a control stage, acquiring real-time pose information of the tail end of the surgical instrument based on the real-time in-vivo scene image of the patient acquired by the endoscope, wherein the real-time pose information comprises the following steps:
and if the pose deviation of at least one joint is larger than the first preset threshold, acquiring real-time pose information of the tail end of the surgical instrument based on the real-time in-vivo scene image of the patient acquired by the endoscope in a control stage.
7. A surgical instrument control method according to claim 5, wherein the target feature point on the joint is an end point in the axial direction of the joint.
8. The surgical instrument control method according to claim 5, wherein the acquiring real-time pose information of the instrument tip based on the real-time in-patient scene image acquired by the endoscope to realize control of the surgical instrument includes:
acquiring real-time pose information of each joint at the tail end of the instrument according to the real-time in-vivo scene image of the patient acquired by the endoscope;
and aiming at each joint at the tail end of the instrument, acquiring real-time pose deviation information of the joint according to the real-time pose information and the instruction pose information of the joint, and controlling the joint to perform corresponding movement according to the real-time pose deviation information of the joint.
9. The surgical instrument control method according to claim 2, wherein the surgical instrument control method further comprises:
acquiring real-time in-vivo scene images of the patient, collected by the endoscope, while the distal end of the instrument is controlled to move to a target configuration; and
for each driving shaft, iteratively updating transmission model parameters between the driving shaft and the distal end of the instrument according to the real-time in-vivo scene images until a preset iteration-end condition is met, so as to obtain updated transmission model parameters;
and wherein the acquiring real-time pose information of the distal end of the instrument based on the real-time pose information of each driving shaft comprises:
acquiring real-time pose information of the distal end of the instrument based on the updated transmission model parameters corresponding to each driving shaft and the real-time pose information of each driving shaft.
10. The surgical instrument control method according to claim 9, wherein the preset iteration-end condition is that the Euclidean distance between the transmission model parameters obtained in the current iteration and those obtained in the previous iteration is smaller than a second preset threshold.
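As a rough sketch of claims 9 and 10, the transmission-model parameters can be refined iteratively until the parameter change between successive iterations falls below the second preset threshold. The gradient-style update, the error callback, and all numeric values below are assumptions for illustration; the patent does not specify the update rule.

```python
import math

SECOND_THRESHOLD = 1e-3  # illustrative convergence tolerance, not from the patent

def update_transmission_model(params, image_error_gradient, step=0.1, max_iter=100):
    """Iteratively update transmission-model parameters; stop when the
    Euclidean distance between the parameters of the current and previous
    iterations drops below SECOND_THRESHOLD (claim 10's end condition)."""
    for _ in range(max_iter):
        # Gradient of an image-based error, e.g. the reprojection error of
        # the instrument tip in the in-vivo scene image (assumed available).
        grad = image_error_gradient(params)
        new_params = [p - step * g for p, g in zip(params, grad)]
        change = math.dist(new_params, params)
        params = new_params
        if change < SECOND_THRESHOLD:
            break
    return params
```

With the updated parameters, the distal-end pose would then be computed from the drive-shaft positions through the refined model, matching the last step of claim 9.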
11. The surgical instrument control method according to claim 1, wherein the surgical instrument control method further comprises:
determining whether the distal end of the surgical instrument has moved beyond the working field of view of the endoscope; and
if yes, entering an adjustment mode until the distal end of the surgical instrument returns to the working field of view of the endoscope; and/or entering a locking mode, in which the distal end of the surgical instrument is locked, until the distal end of the surgical instrument returns to the working field of view of the endoscope.
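The field-of-view supervision of claim 11 amounts to a bounds check on the projected tip position followed by a mode switch. The pixel-rectangle test and the callback names below are illustrative assumptions, not the patent's method.

```python
def in_working_fov(tip_uv, image_size, margin=0):
    """True if the projected instrument tip lies inside the endoscope's
    working field of view, modeled here as the image rectangle shrunk by
    an optional pixel margin (an assumed simplification)."""
    u, v = tip_uv
    w, h = image_size
    return margin <= u < w - margin and margin <= v < h - margin

def supervise_fov(tip_uv, image_size, enter_adjustment_mode, enter_locking_mode):
    """Claim 11: if the tip leaves the working field of view, enter the
    adjustment mode and/or the locking mode until it is visible again."""
    if in_working_fov(tip_uv, image_size):
        return "in_view"
    enter_adjustment_mode()
    enter_locking_mode()
    return "out_of_view"
```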
12. The surgical instrument control method according to claim 1, wherein the surgical instrument control method further comprises:
if the distal end of the surgical instrument is blocked by tissue inside the patient, locking the posture joint of the master control arm of the doctor console until the distal end of the surgical instrument is no longer blocked by the tissue; or
acquiring real-time pose information of the distal end of the surgical instrument based on the real-time pose information of each driving shaft of the surgical instrument, so as to control the surgical instrument.
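The fallback branch of claim 12 — estimating the distal-end pose from the drive shafts alone — reduces to evaluating the transmission model on the shaft readings. A linear coupling matrix is assumed here purely for illustration; real cable-driven instrument transmissions may be nonlinear.

```python
def tip_joint_values(shaft_positions, transmission_matrix):
    """Map real-time drive-shaft positions to distal-end joint values
    through an assumed linear transmission model (claim 12 fallback,
    used when the tip is occluded by tissue in the endoscope image)."""
    return [sum(a * q for a, q in zip(row, shaft_positions))
            for row in transmission_matrix]
```

The joint values obtained this way would then feed the instrument's forward kinematics to recover the tip pose without any image input.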
13. A surgical robot, comprising a doctor console, a patient trolley, and a controller that are communicatively connected, the doctor console and the patient trolley having a master-slave control relationship, the patient trolley comprising at least one robotic arm, wherein a surgical instrument and an endoscope are mounted at the distal end of at least one of the robotic arms, and the controller is configured to implement the surgical instrument control method of any one of claims 1 to 12.
14. A readable storage medium, characterized in that the readable storage medium has stored therein a computer program which, when executed by a processor, implements the surgical instrument control method of any one of claims 1 to 12.
CN202210715323.3A 2022-06-22 2022-06-22 Surgical instrument control method, surgical robot, and storage medium Pending CN117297773A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210715323.3A CN117297773A (en) 2022-06-22 2022-06-22 Surgical instrument control method, surgical robot, and storage medium


Publications (1)

Publication Number Publication Date
CN117297773A 2023-12-29

Family

ID=89283638

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210715323.3A Pending CN117297773A (en) 2022-06-22 2022-06-22 Surgical instrument control method, surgical robot, and storage medium

Country Status (1)

Country Link
CN (1) CN117297773A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117653007A (en) * 2024-01-31 2024-03-08 浙江华诺康科技有限公司 Parallax adjustment method, system and computer equipment of endoscope


Similar Documents

Publication Publication Date Title
CN110944595B (en) System for mapping an endoscopic image dataset onto a three-dimensional volume
US8657781B2 (en) Automated alignment
US9283049B2 (en) Control system configured to compensate for non-ideal actuator-to-joint linkage characteristics in a medical robotic system
CN110215284B (en) Visualization system and method
KR20210052475A (en) Manual and robot controllable medical devices
CN113613580A (en) System and method for aligning inputs on a medical instrument
US20090012533A1 (en) Robotic instrument control system
KR102410247B1 (en) Systems and methods for displaying an instrument navigator in a teleoperated system
US20140343569A1 (en) Grip force normalization for surgical instrument
US20090259099A1 (en) Image-based control systems
WO2022188352A1 (en) Augmented-reality-based interventional robot non-contact teleoperation system, and calibration method therefor
US11622672B2 (en) Method for positioning an endoscope with flexible shaft
CN113180828A (en) Operation robot constrained motion control method based on rotation theory
US20210354286A1 (en) Systems and methods for master/tool registration and control for intuitive motion
EP3939001A1 (en) Systems and methods for connecting segmented structures
CN117297773A (en) Surgical instrument control method, surgical robot, and storage medium
CN110662507A (en) Robotic surgical system with automatic guidance
US20200246084A1 (en) Systems and methods for rendering alerts in a display of a teleoperational system
KR102225448B1 (en) Master device for manipulating active steering catheter and catheter system capability controlling bidirection of active steering catheter and master device
Ahmad et al. Shared Control of an Automatically Aligning Endoscopic Instrument Based on Convolutional Neural Networks
WO2023018684A1 (en) Systems and methods for depth-based measurement in a three-dimensional view
WO2023018685A1 (en) Systems and methods for a differentiated interaction environment
CN116076984A (en) Endoscope visual field adjusting method, control system and readable storage medium
WO2023233280A1 (en) Generating imaging pose recommendations
CN117813631A (en) System and method for depth-based measurement in three-dimensional views

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination