CN116492064A - Master-slave motion control method based on pose identification and surgical robot system - Google Patents


Info

Publication number: CN116492064A
Application number: CN202210059153.8A
Authority: CN (China)
Legal status: Pending
Prior art keywords: pose, joint, coordinate system, tool, handle
Other languages: Chinese (zh)
Inventors: 徐凯, 吴百波, 王龙飞, 姬利永, 李茂林
Assignee (current and original): Beijing Surgerii Robot Co Ltd
Application filed by Beijing Surgerii Robot Co Ltd


Classifications

    • A — HUMAN NECESSITIES; A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/37 Master-slave robots
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/70 Manipulators specially adapted for use in surgery
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2055 Optical tracking systems
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 2034/2068 Tracking using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2034/2072 Reference field transducer attached to an instrument or patient

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Robotics (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Manipulator (AREA)

Abstract

The present disclosure relates to the field of robots, and discloses a master-slave motion control method based on pose identifiers, comprising: acquiring a positioning image; identifying, in the positioning image, a plurality of pose identifiers located on a slave tool, the plurality of pose identifiers comprising different pose identifier patterns; determining a current pose of the slave tool relative to a reference coordinate system based on the plurality of pose identifiers; determining a target pose of a handle of a master manipulator based on the current pose of the slave tool; and generating a control signal for the master manipulator based on the target pose of the handle of the master manipulator.

Description

Master-slave motion control method based on pose identification and surgical robot system
Technical Field
The disclosure relates to the field of robots, in particular to a master-slave motion control method based on pose identification and a surgical robot system.
Background
With the development of science and technology, the use of medical robots to assist medical staff in performing operations has developed rapidly. Medical robots not only help medical staff complete a series of medical diagnoses and auxiliary treatments, but can also effectively relieve shortages of medical resources.
Generally, a medical robot includes a slave tool for performing an operation and a master manipulator for controlling movement of the slave tool. In a practical scenario, the slave tool is arranged to be able to enter the operating area, and the medical staff controls the movement of the slave tool in the operating area by teleoperating the master manipulator to carry out the medical operation.
However, the number of slave tools to be teleoperated may be greater than the number of master manipulators, so during surgery the slave tool controlled by a given master manipulator may change. Moreover, at the beginning of or during an operation, the master manipulator needs to first establish a mapping with the slave tool before master-slave control can be performed. Because the attitude of the master manipulator has not previously been matched with that of the slave tool it is to control, there may be an attitude (e.g., orientation or angle) mismatch between the two. If master-slave mapping is established directly in this state, the control accuracy of the slave tool is reduced and the human-machine interaction experience of the medical staff (e.g., surgeons) is degraded. Therefore, after the master manipulator is paired with the slave tool and before teleoperation, the attitude of the master manipulator needs to be matched with the attitude of the slave tool, so as to improve the accuracy with which the master manipulator controls the attitude of the slave tool.
Disclosure of Invention
In some embodiments, the present disclosure provides a method for controlling master-slave motion, comprising: acquiring a positioning image; identifying, in the positioning image, a plurality of pose identifiers located on a slave tool, the plurality of pose identifiers comprising different pose identifier patterns; determining a current pose of the slave tool relative to a reference coordinate system based on the plurality of pose identifiers; determining a target pose of a handle of a master manipulator based on the current pose of the slave tool; and generating a control signal for the master manipulator based on the target pose of the handle of the master manipulator.
In some embodiments, the present disclosure provides a robotic system comprising: a master manipulator including a multi-degree-of-freedom mechanical arm, a handle disposed on the multi-degree-of-freedom mechanical arm, and at least one motor and at least one master manipulator sensor disposed at at least one joint of the multi-degree-of-freedom mechanical arm, the at least one master manipulator sensor being used to acquire joint information of the at least one joint; a slave tool including an operating arm and an end instrument disposed at a distal end of the operating arm; an image collector for collecting positioning images; and a control device communicatively coupled to the image collector and the master manipulator, the control device configured to perform a master-slave motion control method of some embodiments of the present disclosure.
In some embodiments, the present disclosure provides a computer device comprising: a memory for storing at least one instruction; and a processor coupled with the memory and configured to execute at least one instruction to perform a control method of master-slave motion of some embodiments of the present disclosure.
In some embodiments, the present disclosure provides a computer-readable storage medium storing at least one instruction that, when executed by a computer, causes a robotic system to implement a master-slave motion control method of some embodiments of the present disclosure.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings required for describing the embodiments are briefly introduced below. The drawings in the following description illustrate only some embodiments of the disclosure; other embodiments may be obtained by those of ordinary skill in the art from the contents of the disclosure and these drawings without inventive effort.
FIG. 1 illustrates a schematic diagram of a robotic system according to some embodiments of the present disclosure;
FIG. 2 illustrates a schematic block diagram of an operating arm according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic structural view of an operating arm according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of a tag including multiple pose identifications according to some embodiments of the present disclosure;
FIG. 5 illustrates a schematic view of a label disposed on the circumference of the distal end of an operating arm and formed in a cylindrical shape according to some embodiments of the present disclosure;
FIG. 6 illustrates a flow chart of a method of controlling master-slave motion according to some embodiments of the present disclosure;
FIG. 7 illustrates a flowchart of a method of determining three-dimensional coordinates of a plurality of pose identifiers relative to a slave tool coordinate system in accordance with some embodiments of the present disclosure;
FIG. 8 illustrates a flow chart of a method of determining three-dimensional coordinates of a plurality of pose identifiers relative to a slave tool coordinate system in accordance with further embodiments of the present disclosure;
FIG. 9 illustrates a flowchart of a method of identifying pose identifiers according to some embodiments of the present disclosure;
FIG. 10 illustrates a schematic diagram of a pose identification pattern according to some embodiments of the present disclosure;
FIG. 11 illustrates a flowchart of a method for searching for pose identifications according to some embodiments of the present disclosure;
FIG. 12 illustrates a schematic diagram of searching for a pose identification in accordance with some embodiments of the present disclosure;
FIG. 13 illustrates a flowchart of a method for searching for a second pose identification according to some embodiments of the present disclosure;
FIG. 14 illustrates a flowchart of a method for searching for pose identifications according to some embodiments of the present disclosure;
FIG. 15 illustrates a schematic diagram of a primary operator according to some embodiments of the present disclosure;
FIG. 16 illustrates a schematic block diagram of a computer device in accordance with some embodiments of the present disclosure;
FIG. 17 illustrates a schematic view of a surgical robotic system according to some embodiments of the present disclosure;
FIG. 18 illustrates a schematic view of a surgical tool according to some embodiments of the present disclosure;
FIG. 19 illustrates a schematic diagram of a master trolley according to some embodiments of the present disclosure;
FIG. 20 illustrates a schematic view of a surgical trolley according to some embodiments of the present disclosure.
Detailed Description
To make the technical problems solved, the technical solutions adopted, and the technical effects achieved by the present disclosure clearer, the technical solutions of the embodiments of the present disclosure are described in further detail below with reference to the accompanying drawings. The described embodiments are merely exemplary embodiments of the present disclosure, not all embodiments.
In the description of the present disclosure, it should be noted that the directions or positional relationships indicated by the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc. are based on the directions or positional relationships shown in the drawings. They are used merely for convenience in describing the present disclosure and simplifying the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation; they should therefore not be construed as limiting the present disclosure. Furthermore, the terms "first", "second", and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. It should also be noted that, unless explicitly specified and limited otherwise, the terms "mounted", "connected", and "coupled" are to be construed broadly: a connection may, for example, be fixed or removable; mechanical or electrical; direct, or indirect through an intermediate medium; or a communication between the interiors of two elements. The specific meaning of these terms in this disclosure will be understood by those of ordinary skill in the art as the case may be. In this disclosure, the end close to the operator (e.g., physician) is defined as proximal or posterior, and the end close to the surgical patient is defined as distal or anterior. Those skilled in the art will appreciate that embodiments of the present disclosure may be used with medical instruments or surgical robots, as well as with other non-medical devices.
In this disclosure, the term "position" refers to the location of an object or a portion of an object in three-dimensional space (e.g., three translational degrees of freedom, which may be described using changes in the Cartesian X, Y, and Z coordinates, such as translations along the Cartesian X, Y, and Z axes, respectively). The term "attitude" refers to the rotational placement of an object or a portion of an object (e.g., three rotational degrees of freedom, which may be described using roll, pitch, and yaw). The term "pose" refers to the combination of the position and attitude of an object or a portion of an object, which may be described, for example, using six parameters for the six degrees of freedom mentioned above. In the present disclosure, the pose of the handle of the master manipulator may be represented by a set of joint information of the master manipulator's joints (e.g., a one-dimensional matrix composed of this joint information). The joint information of a joint may include the angle through which the joint has rotated about its joint axis, or the distance it has moved relative to its initial position.
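As a concrete illustration of the position/attitude/pose terminology above, the following NumPy sketch packs a position and a set of roll-pitch-yaw angles into a single 4×4 homogeneous pose matrix. This is our own hypothetical example, not code from the patent; the function names and conventions are assumptions.

```python
# Illustrative only: position (x, y, z) + attitude (roll, pitch, yaw) = pose.
import numpy as np

def rpy_to_matrix(roll, pitch, yaw):
    """Rotation matrix from roll (about x), pitch (about y), yaw (about z)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx  # extrinsic x-y-z composition

def make_pose(position, rpy):
    """A pose combines position and attitude in one 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = rpy_to_matrix(*rpy)
    T[:3, 3] = position
    return T

# position 0.1 m / 0.0 m / 0.3 m, attitude: 90-degree yaw
T = make_pose([0.1, 0.0, 0.3], [0.0, 0.0, np.pi / 2])
```

The six numbers fed to `make_pose` correspond to the six degrees of freedom mentioned in the text.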
In the present disclosure, the reference coordinate system may be understood as a coordinate system capable of describing the pose of an object. According to the actual positioning requirement, the reference coordinate system can select the origin of the virtual reference object or the origin of the physical reference object as the origin of the coordinate system. In some embodiments, the reference coordinate system may be a world coordinate system or a camera coordinate system or the operator's own perception coordinate system, or the like. In the present disclosure, an object may be understood as an object or target that needs to be positioned, such as an operating arm of a slave tool or an operating arm tip or tip instrument. In this disclosure, the pose of the slave tool or a portion thereof refers to the pose of the slave tool coordinate system defined by the slave tool or a portion thereof relative to the reference coordinate system.
Fig. 1 illustrates a schematic diagram of a robotic system 100 according to some embodiments of the present disclosure. As shown in fig. 1, the robotic system 100 may include: an image acquisition device 110, a control means 120, at least one slave tool (150 a,150 b) and a master manipulator 180. In some embodiments, the image capture device 110 and the primary operator 180 are each communicatively coupled to the control 120.
In some embodiments, the image acquisition device 110 may be used to acquire a positioning image. The positioning image may include a partial or complete image of the slave tools (150a, 150b). In some embodiments, positioning identifiers are provided on the slave tools (150a, 150b). In some embodiments, the positioning identifiers include pose identifiers, based on which a position or pose of the slave tool (150a, 150b) may be determined. In some embodiments, the plurality of pose identifiers includes different pose identifier patterns (as described in detail below). As shown in fig. 1, the slave tools (150a, 150b) are within the field of view of the image acquisition device 110, and a partial image of the slave tools (150a, 150b) may be included in the acquired positioning image. In some embodiments, the image acquisition device 110 may include, but is not limited to, a dual-lens or single-lens image acquisition device, such as a binocular or monocular camera. Depending on the application scenario, the image acquisition device 110 may be an industrial camera, an underwater camera, a miniature electronic camera, an endoscopic camera, etc. In some embodiments, the image acquisition device 110 may be fixed or variable in position, for example an industrial camera fixed at a monitoring location or an endoscopic camera whose position or attitude is adjustable. In some embodiments, the image acquisition device 110 may implement at least one of visible-light imaging, infrared imaging, CT (computed tomography) imaging, acoustic imaging, and the like. In some embodiments, the image acquisition device 110 may be, for example, the imaging module 2060b shown in fig. 20.
In some embodiments, the control device 120 is configured to execute at least one instruction to perform some or all of the steps in the methods of the present disclosure, such as some or all of the steps in the methods disclosed in figs. 6, 7, 8, 9, 11, 13, and 14. In some embodiments, the control device 120 may receive the positioning image from the image acquisition device 110 and process it. For example, the control device 120 may identify, in the positioning image, a plurality of positioning identifiers located on at least one slave tool (150a, 150b). In some embodiments, the control device 120 may determine the pose of the slave tool (150a, 150b) based on the positioning image, e.g., determine the current pose of the slave tool (150a, 150b) relative to the reference coordinate system. In some embodiments, the control device 120 may also determine the pose of the master manipulator 180 based on the pose of the slave tool (150a, 150b), e.g., determine the target pose of the handle of the master manipulator 180 based on the current pose of the slave tool (150a, 150b); the target pose of the handle has a mapping relationship with the current pose of the slave tool (150a, 150b). In some embodiments, the control device 120 may also generate a control signal for the master manipulator 180 based on the target pose of its handle, and may transmit this control signal to the driving motors of the joints of the master manipulator 180.
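The control flow just described (positioning image → slave-tool pose → target handle pose → control signal) can be sketched as the following hypothetical Python pipeline. The pose-estimation step is replaced by a fixed placeholder transform, and every function name and numeric value here is illustrative, not the patent's actual implementation.

```python
# Hypothetical sketch of the master-slave control pipeline; all values are
# fabricated placeholders.
import numpy as np

def estimate_tool_pose(positioning_image):
    """Stand-in for identifier detection + pose estimation: returns the
    slave tool's pose relative to the reference frame as a 4x4 matrix."""
    T = np.eye(4)
    T[:3, 3] = [0.02, 0.00, 0.15]   # fabricated example position
    return T

def target_handle_pose(tool_pose, ref_to_master=np.eye(4)):
    """Apply the master-slave mapping to obtain the handle's target pose."""
    return ref_to_master @ tool_pose

def handle_control_signal(target_pose, current_pose, kp=2.0):
    """Proportional command on the positional part of the pose error,
    standing in for the per-joint motor commands of the master arm."""
    return kp * (target_pose[:3, 3] - current_pose[:3, 3])

tool_T = estimate_tool_pose(None)           # would consume a real image
goal_T = target_handle_pose(tool_T)
cmd = handle_control_signal(goal_T, np.eye(4))
```

A real system would close this loop continuously, re-estimating the tool pose from each new positioning image.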
In some embodiments, the master manipulator 180 comprises a multi-degree-of-freedom robotic arm (e.g., a six-degree-of-freedom robotic arm) with joint sensors disposed at some or all of the joints of the arm, the joint sensors generating joint information (e.g., joint angle data). In some embodiments, the joint sensors employ potentiometers and/or encoders. In some embodiments, the master manipulator may be, for example, the master manipulator 1500 shown in fig. 15. In some embodiments, a controller may be provided in the master manipulator 180; the controller may calculate pose data of the master manipulator 180 from the joint information obtained by the respective joint sensors and transmit the calculated pose data to the control device 120. In other embodiments, the control device 120 may itself calculate the pose data of the master manipulator from the joint information sent by the joint sensors.
Take the slave tool 150a as an example. As shown in fig. 1, the slave tool 150a includes an operating arm 140. In some embodiments, the slave tool 150a further includes an end instrument 160 disposed at the distal end 130 of the operating arm. In some embodiments, the operating arm 140 may be a rigid arm or a deformable arm. In some embodiments, the operating arm 140 may comprise a continuum deformable arm, for example the operating arm 300 shown in fig. 3. In some embodiments, the operating arm 140 may comprise a multi-degree-of-freedom arm composed of multiple joints, for example an arm with 4 to 7 degrees of freedom (e.g., 6 degrees of freedom). In some embodiments, the end instrument 160 may include, but is not limited to, forceps, a scalpel, an electric hook, and the like.
Fig. 2 illustrates a schematic view of a structural segment 200 of an operating arm according to some embodiments of the present disclosure. In some embodiments, the operating arm of the slave tool may include at least one deformable structural segment 200. As shown in fig. 2, the structural segment 200 includes a fixed disk 210 and a plurality of structural bones 220. The plurality of structural bones 220 have first ends fixedly coupled to the fixed disk 210 and second ends coupled to a driving unit (not shown). In some embodiments, the fixed disk 210 may take various shapes, including but not limited to an annular structure or a disk-like structure, and may be circular, rectangular, polygonal, etc. in cross-section. In some embodiments, the driving unit deforms the structural segment 200 by driving the structural bones 220; for example, the driving unit places the structural segment 200 in the curved state shown in fig. 2. In some embodiments, the second ends of the plurality of structural bones 220 are coupled to the driving unit through a base disk 230. In some embodiments, similar to the fixed disk 210, the base disk 230 may take various shapes, including but not limited to an annular structure or a disk-like structure, and may be circular, rectangular, polygonal, etc. in cross-section. The driving unit may comprise a linear motion mechanism, a driving structural segment, or a combination of both. A linear motion mechanism may be coupled to the structural bones 220 to push or pull them and thereby bend the structural segment 200. The driving structural segment may include its own fixed disk and a plurality of structural bones, one end of each of which is fixedly connected to that fixed disk; the other ends are connected to, or integrally formed with, the structural bones 220, so that bending of the driving structural segment drives bending of the structural segment 200.
In some embodiments, a spacer disc 240 is also included between the fixation disc 210 and the base disc 230, with the plurality of structural bones 220 passing through the spacer disc 240. Similarly, the drive mechanism may also include a spacer disc.
Fig. 3 illustrates a schematic structural view of an operating arm 300 according to some embodiments of the present disclosure. As shown in fig. 3, the operating arm 300 is a deformable operating arm and may include an operating arm tip 310 and an operating arm body 320. The operating arm body 320 may include one or more structural segments, such as a first structural segment 3201 and a second structural segment 3202. In some embodiments, the first and second structural segments 3201, 3202 may be similar in structure to the structural segment 200 shown in fig. 2. In some embodiments, as shown in fig. 3, the operating arm body 320 further includes a first straight segment 3203 between the first and second structural segments 3201, 3202. The first straight segment 3203 is connected at a first end to the base disk of the second structural segment 3202 and at a second end to the fixed disk of the first structural segment 3201. In some embodiments, as shown in fig. 3, the operating arm body 320 further includes a second straight segment 3204, the first end of which is connected to the base disk of the first structural segment 3201. As shown in fig. 3, each structural segment (the first structural segment 3201 and the second structural segment 3202) may include a base disk, a fixed disk, and a plurality of structural bones extending through the base disk and the fixed disk; the structural bones may be fixedly connected with the fixed disk and slidably connected with the base disk. The continuum deformable arm and the structural segments it contains can be described by a kinematic model (as described in more detail below).
In some embodiments, each structural segment of the operating arm 300 may be configured as the structural segment 200 shown in fig. 2. As shown in fig. 2, the following coordinate systems are defined for the t-th continuum segment (t = 1, 2, 3, …). The base disk coordinate system {tb} is attached to the base disk of the t-th segment, with its origin at the center of the base disk and its XY plane coincident with the base disk plane; its X axis points from the center of the base disk toward the first structural bone (the first structural bone being whichever one of the plurality of structural bones is arbitrarily designated as the reference). The bending plane coordinate system 1 {t1} has its origin coincident with the origin of the base disk coordinate system, its XY plane coincident with the bending plane, and its Z axis coincident with the Z axis of {tb}. The fixed disk coordinate system {te} is attached to the fixed disk of the t-th segment, with its origin at the center of the fixed disk and its XY plane coincident with the fixed disk plane; its X axis points from the center of the fixed disk toward the first structural bone. The bending plane coordinate system 2 {t2} has its origin at the center of the fixed disk, its XY plane coincident with the bending plane, and its Z axis coincident with the Z axis of {te}.
A single structural segment 200 as shown in fig. 2 may be represented by a kinematic model. The position ^{tb}P_{te} and attitude ^{tb}R_{te} of the end of the t-th segment (the fixed disk coordinate system {te}) relative to the base disk coordinate system {tb} can be determined based on the following equations (1) and (2):

^{tb}P_{te} = ^{tb}R_{t1} · (L_t/θ_t) [1 − cos θ_t, 0, sin θ_t]^T    (1)

^{tb}R_{te} = ^{tb}R_{t1} ^{t1}R_{t2} ^{t2}R_{te}    (2)

where L_t is the length of the virtual structural bone of the t-th segment (e.g., the virtual structural bone 221 shown in fig. 2); θ_t is the bending angle of the t-th segment, i.e., the angle through which ẑ_{t1} must rotate to coincide with ẑ_{t2}; ^{tb}R_{t1} is the attitude of the bending plane coordinate system 1 {t1} of the t-th segment relative to the base disk coordinate system {tb}; ^{t1}R_{t2} is the attitude of the bending plane coordinate system 2 {t2} of the t-th segment relative to the bending plane coordinate system 1 {t1}; and ^{t2}R_{te} is the attitude of the fixed disk coordinate system {te} of the t-th segment relative to the bending plane coordinate system 2 {t2}.

^{tb}R_{t1}, ^{t1}R_{t2} and ^{t2}R_{te} can be determined based on the following equations (3), (4) and (5):

^{tb}R_{t1} = Rot_z(δ_t)    (3)

^{t1}R_{t2} = Rot_y(θ_t)    (4)

^{t2}R_{te} = Rot_z(−δ_t)    (5)

where Rot_z(·) and Rot_y(·) denote rotations about the Z and Y axes, respectively, and δ_t is the angle, in the t-th segment, between the bending plane and x̂_{tb}.
The joint parameters ψ_t of a single structural segment 200 as shown in fig. 2 can be determined based on the following equation (6):

ψ_t = [θ_t, δ_t]^T    (6)
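Under one common constant-curvature convention (an assumption on our part; the patent's own figures may use different sign conventions), the single-segment kinematics of equations (1)–(6) can be evaluated as follows. Treat this as an illustrative sketch, not the patent's implementation.

```python
# Sketch of single-segment constant-curvature kinematics: pose of the fixed
# disk {te} relative to the base disk {tb} from (L, theta, delta).
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def segment_fk(L, theta, delta):
    """Attitude R and position p of {te} relative to {tb} for one segment."""
    R = rot_z(delta) @ rot_y(theta) @ rot_z(-delta)   # cf. eqs. (2)-(5)
    if abs(theta) < 1e-9:                             # straight segment limit
        p = np.array([0.0, 0.0, L])
    else:
        r = L / theta                                 # arc radius
        p = rot_z(delta) @ np.array([r * (1 - np.cos(theta)),
                                     0.0,
                                     r * np.sin(theta)])  # cf. eq. (1)
    return R, p

# joint parameters psi_t = [theta_t, delta_t], plus segment length L_t
R, p = segment_fk(L=0.1, theta=np.pi / 2, delta=0.0)
```

With θ_t → 0 the segment degenerates to a straight link of length L_t, which is why the straight-segment limit is handled separately.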
In some embodiments, the driving amounts of the plurality of structural bones have a known mapping relationship with the joint parameters. Based on the target joint parameters of a structural segment and this mapping relationship, the driving amounts of the plurality of structural bones can be determined. The driving amount of a structural bone can be understood as the length by which the bone is pushed or pulled when the segment bends from an initial state (e.g., θ_t = 0) to the target bending angle. In some embodiments, the mapping between the driving amounts and the joint parameters may be determined based on the following equation (7):

q_{i_tool} ≈ −r_{ti_tool} θ_t cos(δ_t + β_{ti_tool})    (7)

where r_{ti_tool} is the distance between the i-th structural bone in the t-th segment and the virtual structural bone, β_{ti_tool} is the angle between the i-th structural bone and the first structural bone in the t-th segment, and q_{i_tool} is the driving amount of the i-th structural bone. A driving signal for the driving unit may be determined based on the driving amount of the i-th structural bone.
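Equation (7) can be evaluated numerically as below. The bone layout (four bones at 90° spacing) and the pitch radius are made-up example values, not figures from the patent.

```python
# Drive-amount mapping of eq. (7): q_i = -r_i * theta * cos(delta + beta_i).
import numpy as np

def drive_amounts(theta, delta, r, betas):
    """Push/pull length of each structural bone for a target (theta, delta).

    r     : distance of each bone from the virtual (central) structural bone
    betas : angle of each bone relative to the first structural bone
    """
    betas = np.asarray(betas)
    return -r * theta * np.cos(delta + betas)

# example: four bones at 90-degree spacing, pitch radius 2.5 mm,
# target bend of 60 degrees in the delta = 0 plane
betas = np.deg2rad([0, 90, 180, 270])
q = drive_amounts(theta=np.pi / 3, delta=0.0, r=2.5e-3, betas=betas)
```

Note that bones on opposite sides of the disk receive equal and opposite driving amounts, as expected for a push-pull actuated segment.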
In some embodiments, the entire deformable arm may be described by a kinematic model. As shown in fig. 3, transformations may be performed between a plurality of coordinate systems located at a plurality of positions of the deformable arm. For example, the pose of the end instrument of the continuum deformable arm in the world coordinate system {w} may be determined based on the following equation (8):

^{W}T_{tip} = ^{W}T_{1b} ^{1b}T_{1e} ^{1e}T_{2b} ^{2b}T_{2e} ^{2e}T_{tip}    (8)

where ^{W}T_{tip} is the homogeneous transformation matrix of the end instrument of the continuum deformable arm relative to the world coordinate system; ^{W}T_{1b} is the homogeneous transformation matrix of the base disk of the first continuum segment relative to the world coordinate system; ^{1b}T_{1e} is that of the fixed disk of the first continuum segment relative to its base disk; ^{1e}T_{2b} is that of the base disk of the second continuum segment relative to the fixed disk of the first continuum segment; ^{2b}T_{2e} is that of the fixed disk of the second continuum segment relative to its base disk; and ^{2e}T_{tip} is that of the end instrument relative to the fixed disk of the second continuum segment. In some embodiments, the end instrument is fixedly disposed on the fixed disk, so ^{2e}T_{tip} is known or predetermined.
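The chain of equation (8) is simply a product of 4×4 homogeneous matrices. The sketch below uses translation-only placeholder transforms; in practice each factor would come from the segment kinematics or a fixed calibration.

```python
# Composing the transform chain of eq. (8) with homogeneous matrices.
import numpy as np

def homogeneous(R, p):
    """Pack a rotation R and translation p into a 4x4 homogeneous matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

def chain(*transforms):
    """Left-to-right product: w_T_tip = w_T_1b @ 1b_T_1e @ ... @ 2e_T_tip."""
    T = np.eye(4)
    for t in transforms:
        T = T @ t
    return T

# translation-only placeholders for each factor of eq. (8)
w_T_1b   = homogeneous(np.eye(3), [0, 0, 0.05])
b1_T_e1  = homogeneous(np.eye(3), [0, 0, 0.04])
e1_T_b2  = homogeneous(np.eye(3), [0, 0, 0.03])
b2_T_e2  = homogeneous(np.eye(3), [0, 0, 0.04])
e2_T_tip = homogeneous(np.eye(3), [0, 0, 0.02])

w_T_tip = chain(w_T_1b, b1_T_e1, e1_T_b2, b2_T_e2, e2_T_tip)
```

With all factors being pure translations along Z, the tip simply ends up at the summed offset, which makes the composition easy to sanity-check.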
Those skilled in the art will appreciate that the deformable arm has different joint parameters in different working states. For example, the operating arm 300 shown in FIG. 3 has at least four working states, as follows:
The first working state: only the second structural segment 3202 participates in pose control of the end instrument (e.g., only the second structural segment 3202 has entered the workspace). In this case, the joint parameters of the operating arm 300 may be determined based on the following equation (9):
where $\psi_{c1}$ is the joint parameter of the operating arm 300 in the first working state, $\varphi$ is the pivot angle of the operating arm 300, and $L_2$, $\theta_2$ and $\delta_2$ have the same physical meaning as $L_t$, $\theta_t$ and $\delta_t$ of the structural section 200 shown in FIG. 2.
The second working state: the second structural segment 3202 and the first straight segment 3203 participate in pose control of the end instrument (e.g., the second structural segment 3202 has fully entered the workspace and the first straight segment 3203 has partially entered the workspace). In this case, the joint parameters of the operating arm 300 may be determined based on the following equation (10):
where $\psi_{c2}$ is the joint parameter of the operating arm 300 in the second working state, and $L_r$ is the feed of the first straight segment 3203.
The third working state: the second structural segment 3202, the first straight segment 3203 and the first structural segment 3201 participate in pose control of the end instrument (e.g., the second structural segment 3202 has fully entered the workspace, the first straight segment 3203 has fully entered the workspace, and the first structural segment 3201 has partially entered the workspace). In this case, the joint parameters of the operating arm 300 may be determined based on the following equation (11):
where $\psi_{c3}$ is the joint parameter of the operating arm 300 in the third working state, and $L_1$, $\theta_1$ and $\delta_1$ have the same physical meaning as $L_t$, $\theta_t$ and $\delta_t$ of the structural section 200 shown in FIG. 2.
The fourth working state: the second structural segment 3202, the first straight segment 3203, the first structural segment 3201 and the second straight segment 3204 participate in pose control of the end instrument (e.g., the second structural segment 3202, the first straight segment 3203 and the first structural segment 3201 have fully entered the workspace, and the second straight segment 3204 has partially entered the workspace). In this case, the joint parameters of the operating arm 300 may be determined based on the following equation (12):
where $\psi_{c4}$ is the joint parameter of the operating arm 300 in the fourth working state, and $L_s$ is the feed of the second straight segment 3204.
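Equations (9) through (12) appear as figures in the original filing, so their exact contents are not reproduced here. The sketch below only illustrates the general idea that a different joint-parameter vector is assembled per working state; the composition, ordering, and names are hypothetical:

```python
def joint_parameters(state, phi, L2, theta2, delta2,
                     Lr=None, L1=None, theta1=None, delta1=None, Ls=None):
    """Assemble a joint-parameter vector psi_c for one of the four working
    states. HYPOTHETICAL composition: equations (9)-(12) are figures in the
    original filing and their exact form is not shown in this text."""
    if state == 1:   # only structural segment 3202 in the workspace
        return [phi, L2, theta2, delta2]
    if state == 2:   # plus the feed Lr of straight segment 3203
        return [phi, Lr, L2, theta2, delta2]
    if state == 3:   # plus the parameters of structural segment 3201
        return [phi, L1, theta1, delta1, L2, theta2, delta2]
    if state == 4:   # plus the feed Ls of straight segment 3204
        return [phi, Ls, L1, theta1, delta1, L2, theta2, delta2]
    raise ValueError("unknown working state")
```

The point of the dispatch is that the inverse-kinematics solver sees a longer parameter vector as more of the arm enters the workspace.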
In some embodiments, a plurality of pose identifiers are provided on the slave tool. For example, a plurality of pose identifiers are distributed on the operating arm of the slave tool. In some embodiments, the plurality of pose identifiers are disposed on an outer surface of the columnar portion of the operating arm. For example, the plurality of pose identifiers are circumferentially distributed on the operating arm tip 310, e.g., on the outer surface of its columnar portion. In some embodiments, the pose of the slave tool may be determined based on images of the plurality of pose identifiers. In some embodiments, a positioning tag (e.g., the tag 400 shown in FIG. 4 or the tag 500 shown in FIG. 5) is provided on the outer surface of the columnar portion of the operating arm. The positioning tag includes a plurality of pose identifiers, comprising a plurality of different pose identification patterns distributed along the circumferential direction of the columnar portion and the pattern corner points within those patterns.
In some embodiments, a pose identification may include a pose identification pattern and the pattern corner points within it. In some embodiments, the pose identification pattern may be provided on a label attached to the distal end of the operating arm, may be printed on the distal end of the operating arm, or may be a pattern formed by the physical configuration of the distal end itself (for example, depressions, protrusions, or combinations thereof). In some embodiments, the pose identification pattern may be formed in brightness, gray scale, color, and the like. In some embodiments, the pose identification pattern may provide information detectable by the image acquisition device either actively (e.g., self-luminous) or passively (e.g., by reflected light). Those skilled in the art will appreciate that, in some embodiments, the pose of a pose identification or of its pattern may be represented by the pose of the coordinate system of the pattern corner point. In some embodiments, the pose identification pattern is provided on an area of the distal end of the operating arm adapted to be imaged by the image acquisition device, e.g., an area likely to be covered by the field of view of the image acquisition device during operation, or an area not easily disturbed or occluded during operation.
FIG. 4 illustrates a schematic diagram of a tag 400 including multiple pose identifications, according to some embodiments. FIG. 5 shows a schematic view of a tag 500 provided on the peripheral side of the distal end of the operating arm and formed into a cylindrical shape. It will be appreciated that, for simplicity, the tag 400 and the tag 500 may include the same pose identification patterns.
Referring to FIG. 4, the plurality of pose identifiers may include a plurality of different pose identification patterns 410, as well as a plurality of pattern corner points $P_4$ within those patterns (pattern corner points are marked with a dedicated symbol in the figures of the present disclosure). In some embodiments, a pose identification may be determined by recognizing the pose identification pattern 410 or the pattern corner point $P_4$ within it.
Referring to FIG. 5, in the circumferentially disposed state, the tag 400 becomes the tag 500, spatially configured in a cylindrical shape. In some embodiments, the pivot angle (or roll angle) of a pose identification may be represented by the pivot angle identified by its pattern or pattern corner point. The pivot angle identified by each pose identification pattern or pattern corner point is known or predetermined. In some embodiments, based on the distribution of the plurality of pose identifications (e.g., of the patterns or pattern corner points), the pivot angle identified by each pose identification may be determined. In some embodiments, the plurality of pose identifications may be evenly distributed (e.g., the pattern corner points in the tag 400 are equally spaced, and the pattern corner points in the tag 500 are equally distributed along the circumference). In other embodiments, the plurality of pose identifications may be unevenly distributed. In some embodiments, according to the distribution of the plurality of pose identifications, each pattern may identify a particular pivot angle, with a one-to-one correspondence between patterns and identified pivot angles. In the present disclosure, the pivot angle (around-axis angle or roll angle) refers to an angle about a Z-axis (e.g., the Z-axis of the driven tool coordinate system {wm}). In some embodiments, the operating arm is a deformable operating arm, and the Z-axis is tangential to the operating arm.
As shown in FIG. 5, the plurality of different pose identification patterns 510 in the tag 500 are uniformly distributed along the circumferential direction of the cylindrical structure, and the plurality of pattern corner points are uniformly distributed on a cross-sectional circle 520 lying in the XY plane of the driven tool coordinate system {wm}, so that the distribution angle between any two adjacent pattern corner points (e.g., the angle $\alpha_0$) is equal. Let $P_5$ be the pattern corner point pointed to by the X axis, and take $P_5$ as the reference corner point identifying a pivot angle of 0 degrees (the pattern in which $P_5$ is located serves as the reference pattern). The pivot angle identified by any other pattern corner point can then be determined from its distribution angle relative to $P_5$. In some embodiments, the pivot angle identified by a pattern corner point may be determined based on the following equation (13):
$$\alpha_m = \alpha_0\,(m-1) \tag{13}$$
where $\alpha_m$ is the pivot angle identified by the $m$-th pose identification pattern corner point, counted in the clockwise direction of the cross-sectional circle 520 with the corner point $P_5$ as the first.
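Equation (13) is trivial to compute, but a sketch makes the indexing convention explicit (the reference corner point $P_5$ is $m = 1$ and identifies 0 degrees):

```python
def pivot_angle_deg(m, alpha0_deg):
    """Pivot angle identified by the m-th corner point (equation (13)),
    counted clockwise with the reference corner P5 as m = 1."""
    return alpha0_deg * (m - 1)
```

For example, with eight evenly spaced patterns the distribution angle is 45 degrees, so the third corner point identifies a 90-degree pivot angle.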
Some embodiments of the present disclosure provide a method of controlling master-slave motion. Fig. 6 illustrates a flow chart of a method 600 of master-slave motion control according to some embodiments of the present disclosure. The method 600 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. Some or all of the steps in the method 600 may be performed by a control device (e.g., the control device 120) of the robotic system 100. The control means 120 may be configured on a computing device. Method 600 may be implemented by software, firmware, and/or hardware. In some embodiments, method 600 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium.
Referring to FIG. 6, in step 601, a positioning image is acquired. In some embodiments, the positioning image includes a portion of the slave tool together with a plurality of pose identifiers and at least one angle identifier on the slave tool. For example, the positioning image includes a portion of the operating arm and a plurality of pose identifiers and at least one angle identifier on the operating arm. In some embodiments, the positioning image may be received from an image acquisition device such as the image acquisition device 110 shown in FIG. 1. For example, the control device 120 may receive a positioning image actively transmitted by the image acquisition device 110. Alternatively, the control device 120 may send an image request instruction to the image acquisition device 110, and the image acquisition device 110 may send the positioning image to the control device 120 in response.
With continued reference to FIG. 6, in step 603, a plurality of pose identifiers located on the slave tool are identified in the positioning image, the plurality of pose identifiers including different pose identification patterns. Exemplary methods of identifying the pose identifiers include the methods shown in FIGS. 9, 11, 13 and 14. In some embodiments, the control device may identify some or all of the pose identifiers in the positioning image by an image processing algorithm. In some embodiments, the image processing algorithm may include a feature recognition algorithm that extracts or recognizes features of the pose identifiers. For example, the image processing algorithm may comprise a corner detection algorithm for detecting pose identification pattern corner points; the corner detection algorithm may be, but is not limited to, one of gray-scale-based corner detection, binary-image-based corner detection, and contour-curve-based corner detection. As another example, the image processing algorithm may be a color feature extraction algorithm for detecting color features of the pose identification pattern, or a contour detection algorithm for detecting contour features of the pose identification pattern. In some embodiments, the control device may identify some or all of the pose identifiers in the positioning image by means of a recognition model.
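As an illustration of gray-scale-based corner detection, the sketch below implements a minimal Harris-style corner response in pure Python on a synthetic image. It is a teaching sketch only; a production system would use an optimized library routine, and the window size, gradient scheme, and sensitivity constant here are illustrative choices:

```python
def harris(img, k=0.04):
    """Minimal Harris corner response on a 2-D grayscale image (nested
    lists). Returns R = det(M) - k * trace(M)^2, with the structure
    tensor M summed over a 3x3 window around each interior pixel."""
    h, w = len(img), len(img[0])
    Ix = [[0.0] * w for _ in range(h)]
    Iy = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            Ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2.0  # central diff
            Iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2.0
    R = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            a = b = c = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    gx, gy = Ix[y + dy][x + dx], Iy[y + dy][x + dx]
                    a += gx * gx
                    b += gx * gy
                    c += gy * gy
            R[y][x] = (a * c - b * b) - k * (a + c) ** 2
    return R

# Synthetic image: a bright quadrant whose corner sits at (x, y) = (4, 4).
img = [[1.0 if (y >= 4 and x >= 4) else 0.0 for x in range(9)]
       for y in range(9)]
R = harris(img)
```

The response is positive at the true corner and negative along the straight edge, which is exactly the discrimination a corner detector needs.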
With continued reference to FIG. 6, at step 605, a current pose of the slave tool relative to the reference frame is determined based on the plurality of pose identifiers. In some embodiments, the method 600 further comprises: determining two-dimensional coordinates of the plurality of pose identifiers in the positioning image; and determining a current pose of the slave tool relative to the reference coordinate system based on the two-dimensional coordinates of the plurality of pose identifiers in the positioning image and the three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system. In some embodiments, the coordinates of the pose identification may be represented by the coordinates of the pose identification pattern corner points. For example, the pose identification two-dimensional coordinates in the positioning image and the three-dimensional coordinates in the slave tool coordinate system may be represented by the coordinates of the pose identification pattern corner points. In some embodiments, the pose of the slave tool coordinate system relative to the reference coordinate system may be determined as the current pose of the slave tool relative to the reference coordinate system based on the two-dimensional coordinates of the plurality of pose identification pattern corner points in the positioning image and the three-dimensional coordinates in the slave tool coordinate system.
In some embodiments, the method 600 may further comprise: and determining the gesture of the slave tool coordinate system relative to the reference coordinate system based on the two-dimensional coordinates of the plurality of gesture identification pattern corner points in the positioning image, the three-dimensional coordinates of the plurality of gesture identification pattern corner points in the slave tool coordinate system and the transformation relation of the camera coordinate system relative to the reference coordinate system. In some embodiments, the transformation of the camera coordinate system with respect to the reference coordinate system may be known. For example, the reference coordinate system is a world coordinate system, and the transformation relationship between the camera coordinate system and the world coordinate system can be determined according to the pose of the camera. In other embodiments, the reference coordinate system may be the camera coordinate system itself, according to actual requirements. In some embodiments, based on the camera imaging principle and the projection model, the pose of the slave tool coordinate system relative to the camera coordinate system is determined based on the two-dimensional coordinates of the plurality of pose identification pattern corner points in the positioning image and the three-dimensional coordinates of the plurality of pose identification pattern corner points in the slave tool coordinate system. Based on the transformation relation between the pose of the slave tool coordinate system relative to the camera coordinate system and the transformation relation between the camera coordinate system relative to the reference coordinate system, the pose of the slave tool coordinate system relative to the reference coordinate system can be obtained. In some embodiments, camera intrinsic parameters may also be considered. 
For example, the camera intrinsic parameters may be those of the camera of the image acquisition device 110 shown in FIG. 1; they may be known or calibrated in advance. In some embodiments, the camera coordinate system may be understood as a coordinate system whose origin is at the camera, for example at the optical center of the camera or at the center of its lens. When the camera is a binocular camera, the origin of the camera coordinate system may be the center of the left lens, the center of the right lens, or any point on the line connecting the left and right lens centers (e.g., the midpoint of that line).
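The camera imaging principle referenced above reduces, for a pinhole model, to transforming a 3-D point into the camera frame and projecting it through the intrinsics. A minimal sketch follows; the intrinsic values in the example are illustrative, not calibrated parameters of the device 110:

```python
def project(K, R, p, X):
    """Pinhole projection: x_cam = R @ X + p maps a point X from the tool
    frame into the camera frame, then u = fx*x/z + cx, v = fy*y/z + cy."""
    xc = [sum(R[i][k] * X[k] for k in range(3)) + p[i] for i in range(3)]
    u = K[0][0] * xc[0] / xc[2] + K[0][2]
    v = K[1][1] * xc[1] / xc[2] + K[1][2]
    return u, v

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
K = [[100.0, 0.0, 50.0],   # fx, 0, cx  (illustrative values)
     [0.0, 100.0, 50.0],   # 0, fy, cy
     [0.0, 0.0, 1.0]]
u, v = project(K, I3, [0.0, 0.0, 2.0], [1.0, 0.0, 0.0])
```

Pose estimation from the corner points inverts this mapping: given several 2-D/3-D correspondences, it solves for the (R, p) that best reproduces the observed pixel coordinates.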
In some embodiments, the pose of the slave tool coordinate system { wm } relative to a reference coordinate system (e.g., world coordinate system) may be determined based on the following equation (14):
$$\begin{aligned} {}^{w}R_{wm} &= {}^{w}R_{lens}\;{}^{lens}R_{wm} \\ {}^{w}P_{wm} &= {}^{w}R_{lens}\;{}^{lens}P_{wm} + {}^{w}P_{lens} \end{aligned} \tag{14}$$

where ${}^{w}R_{wm}$ is the orientation of the slave tool coordinate system relative to the world coordinate system, ${}^{w}P_{wm}$ is the position of the slave tool coordinate system relative to the world coordinate system, ${}^{w}R_{lens}$ is the orientation of the camera coordinate system relative to the world coordinate system, ${}^{w}P_{lens}$ is the position of the camera coordinate system relative to the world coordinate system, ${}^{lens}R_{wm}$ is the orientation of the slave tool coordinate system relative to the camera coordinate system, and ${}^{lens}P_{wm}$ is the position of the slave tool coordinate system relative to the camera coordinate system.
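The frame chaining in equation (14) can be sketched in a few lines of pure Python; the rotation and position values in the usage example are placeholders:

```python
def compose(R_ab, p_ab, R_bc, p_bc):
    """Chain poses as in equation (14): given frame b expressed in frame a
    and frame c expressed in frame b, return frame c expressed in frame a:
    R_ac = R_ab @ R_bc,  p_ac = R_ab @ p_bc + p_ab."""
    R_ac = [[sum(R_ab[i][k] * R_bc[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]
    p_ac = [sum(R_ab[i][k] * p_bc[k] for k in range(3)) + p_ab[i]
            for i in range(3)]
    return R_ac, p_ac

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
# world<-camera pose chained with camera<-tool pose (placeholder values)
R_w_wm, p_w_wm = compose(I3, [1.0, 0.0, 0.0], I3, [0.0, 1.0, 0.0])
```

With identity rotations the composed position is simply the sum of the two offsets, which makes the convention easy to verify.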
Some embodiments of the present disclosure provide methods of determining three-dimensional coordinates of a plurality of pose identifiers relative to a slave tool coordinate system. In some embodiments, three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system are determined based on a distribution of the plurality of pose identifiers. For example, based on the distribution of the plurality of pose identification pattern corner points, three-dimensional coordinates of the plurality of pose identification pattern corner points in the slave tool coordinate system are determined.
In some embodiments, the current pose of the slave tool is the current pose of the slave tool relative to the base coordinate system of the slave tool. The slave tool includes an operating arm and an end instrument provided at the tip of the operating arm, and the current pose of the slave tool includes the pose of the end instrument, or of the operating arm tip, relative to the base coordinate system of the slave tool. In some embodiments, the base coordinate system of the slave tool may be the coordinate system of the base on which the slave tool is mounted (e.g., the motion arm tip of the surgical robot), the coordinate system of the sheath through which the slave tool passes (e.g., the sheath exit coordinate system), the coordinate system of the remote center of motion (RCM) of the slave tool, and the like. For example, the base coordinate system of the slave tool may be set at the sheath exit location and is fixed during teleoperation. The current pose of the end instrument can be transformed to obtain poses relative to other coordinate systems. In some embodiments, the current pose of the slave tool is the current pose of the image of the slave tool in the display relative to the world coordinate system. In some embodiments, the world coordinate system may be the coordinate system of the space in which the operator or the master manipulator is located; thus, the pose of the image of the slave tool in the display relative to the world coordinate system is the pose perceived by the operator. The slave tools include surgical tools and vision tools. During surgery, a surgical tool performs the operation in the patient, while a vision tool acquires images in the patient using a camera and transmits them to the surgical trolley.
The image is processed by a video processing module in the operation trolley and then displayed on a display of the main control trolley. The operator obtains the current pose of the slave tool from the image in the display. In some embodiments, the current pose of the image of the slave tool in the display relative to the world coordinate system may be derived by coordinate transformation. For example, the current pose of the image of the slave tool in the display relative to the world coordinate system may be obtained based on the base coordinate system of the slave tool, the coordinate system of the camera of the vision tool, the base coordinate system of the vision tool, the coordinate system of the display, and the world coordinate system.
In step 607, a target pose of the handle of the master manipulator is determined based on the current pose of the slave tool. In some embodiments, the current pose of the slave tool is the current pose relative to the base coordinate system of the slave tool, or the current pose of the image of the slave tool in the display relative to the world coordinate system. The target pose of the handle of the master manipulator is a pose relative to the base coordinate system of the master manipulator, which may be the coordinate system of the base to which the master manipulator is connected. In some embodiments, the base coordinate system of the master manipulator has a determined transformation relationship with the base coordinate system of the slave tool.
In some embodiments, the current pose of the driven tool matches, e.g., is the same as, proportional to, or has a fixed difference from, the target pose of the handle. For example, before teleoperation, the current posture of the driven tool is kept unchanged, the current posture of the driven tool is taken as the target posture of the handle, and the current posture of the handle is adjusted to the target posture, so that the posture of the handle is matched with the posture of the driven tool.
In step 609, a control signal for the master manipulator is generated based on the target pose of the handle of the master manipulator. In some embodiments, the method 600 further comprises: determining a current pose of the handle of the master manipulator; and generating the control signal based on the target pose and the current pose of the handle. The current pose of the handle is a pose relative to the base coordinate system of the master manipulator. In some embodiments, a control signal corresponding to the handle moving from its current pose to the target pose is determined based on the current pose and the target pose of the handle.
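The exact control signal depends on the master manipulator's drive electronics, which the text does not detail. One common ingredient of such a controller is the magnitude of the orientation error between the current and target handle poses, sketched here as a hypothetical helper (not the disclosed controller):

```python
import math

def orientation_error_angle(R_cur, R_tgt):
    """Angle of the rotation R_err = R_tgt @ R_cur^T that carries the
    handle's current orientation to the target orientation."""
    Rt = [[R_cur[j][i] for j in range(3)] for i in range(3)]  # transpose
    R_err = [[sum(R_tgt[i][k] * Rt[k][j] for k in range(3))
              for j in range(3)] for i in range(3)]
    tr = R_err[0][0] + R_err[1][1] + R_err[2][2]
    c = max(-1.0, min(1.0, (tr - 1.0) / 2.0))  # clamp for numerical safety
    return math.acos(c)

I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
Rz90 = [[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]]  # 90 deg about Z
err = orientation_error_angle(I3, Rz90)
```

A motor-level controller would then drive this error toward zero, e.g. along the corresponding rotation axis.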
FIG. 7 illustrates a flowchart of a method 700 of determining three-dimensional coordinates of a plurality of pose identifiers relative to a slave tool coordinate system according to some embodiments of the present disclosure. Some or all of the steps in method 700 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 700 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 700 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 700 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium.
Referring to FIG. 7, in step 701, the pivot angles of the plurality of pose identifiers about the Z-axis of the driven tool coordinate system are determined based on the distribution of the plurality of pose identifiers. In some embodiments, the pivot angles may be determined based on the plurality of pose identification patterns. For example, each pose identification pattern may identify a particular pivot angle, with different patterns corresponding one-to-one to the identified pivot angles; based on a recognized pattern and this correspondence, the pivot angle it identifies can be determined. It should be appreciated that the distribution of the pose identification patterns is known or predetermined. In some embodiments, the distribution of the plurality of pose identification patterns or pattern corner points may be the distribution shown in FIG. 5. In some embodiments, the pivot angle identified by each pattern corner point may also be determined based on equation (13).
Referring to FIG. 7, in step 703, the three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system are determined based on their pivot angles. In some embodiments, as shown in FIG. 5, each pose identification pattern corner point lies on the circumference of the cross-sectional circle 520, whose center and radius $r$ are known. Taking the pattern corner point $P_5$ as the reference corner point, its three-dimensional coordinates in the slave tool coordinate system {wm} are $(r, 0, 0)$. In some embodiments, the three-dimensional coordinates of each pattern corner point in {wm} may be determined based on the following equation (15):
$$C_m = \begin{bmatrix} r\cos\alpha_m & r\sin\alpha_m & 0 \end{bmatrix}^{T} \tag{15}$$
where $C_m$ is the three-dimensional coordinate, in the slave tool coordinate system {wm}, of the $m$-th pose identification pattern corner point, counted in the clockwise direction of the cross-sectional circle 520 with $P_5$ as the first corner point, and $\alpha_m$ is the pivot angle identified by that corner point.
In some embodiments, the pivot angle $\alpha_m$ identified by the $m$-th pattern corner point is determined based on equation (13), and the three-dimensional coordinate $C_m$ is then determined from $\alpha_m$ and equation (15).
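Combining equations (13) and (15), the corner coordinates follow directly. In this sketch, `alpha0_deg` and `r` stand for the known distribution angle and cross-section radius:

```python
import math

def corner_coordinates(m, alpha0_deg, r):
    """3-D coordinates C_m of the m-th pattern corner point in {wm},
    per equations (13) and (15); the reference corner P5 is m = 1."""
    a = math.radians(alpha0_deg * (m - 1))      # equation (13)
    return (r * math.cos(a), r * math.sin(a), 0.0)  # equation (15)
```

For the reference corner this returns $(r, 0, 0)$, matching the reference coordinates stated above; a corner two steps away at a 90-degree spacing lands diametrically opposite on the cross-sectional circle.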
FIG. 8 illustrates a flow chart of a method 800 of determining three-dimensional coordinates of a plurality of pose identifiers relative to a slave tool coordinate system according to further embodiments of the present disclosure. Some or all of the steps in method 800 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 800 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 800 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 800 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium.
Referring to FIG. 8, in step 801, the arrangement order of the plurality of pose identifiers is determined based on at least two of them. In some embodiments, the arrangement order of the plurality of pose identifiers may be represented by the arrangement order of their patterns. In some embodiments, the arrangement order is determined by recognizing any two pose identification patterns. It should be appreciated that the plurality of pose identifiers include different patterns; once any two patterns are recognized, the arrangement order of the pose identifiers in the positioning image (e.g., clockwise or counterclockwise) can be determined from the known distribution of the patterns (e.g., the distribution of the different patterns in the tag 400 shown in FIG. 4, or in the tag 500 shown in FIG. 5).
Referring to FIG. 8, in step 803, the three-dimensional coordinates of the plurality of pose identifiers are determined based on their arrangement order. In some embodiments, based on the known distribution of the pose identifiers, the three-dimensional coordinates of each pose identifier in the slave tool coordinate system may be determined; these may be represented by the three-dimensional coordinates of the pattern corner points, each pattern corresponding to one coordinate point in the slave tool coordinate system. After the arrangement order of the patterns has been determined, the remaining patterns can be inferred from the recognized ones, and the three-dimensional coordinates of each pattern in the slave tool coordinate system can then be determined. In some embodiments, a plurality of pattern corner points are identified in the positioning image, and the patterns corresponding to any two of them are determined; the arrangement order of the corner points is determined from these two recognized patterns, so that the three-dimensional coordinates of each corner point in the slave tool coordinate system can be determined. In addition, once the arrangement order is known, the distribution of all patterns is determined, so that the pattern at each corresponding position in the positioning image can be matched with a specific pattern-matching template, improving data processing speed.
In some embodiments, pattern matching at pose pattern corner points with pose identification pattern templates may be implemented similarly to step 903 in method 900.
In some embodiments, the end instrument is disposed at the end of the manipulator arm, so that the position of the end instrument is known or determinable. The pose transformation relationship of the end instrument relative to the slave tool coordinate system is also known or predetermined. In some embodiments, taking the reference coordinate system as an example of a world coordinate system, the pose of the end instrument of the slave tool relative to the reference coordinate system may be determined based on the following equation (16):
$$\begin{aligned} {}^{w}R_{tip} &= {}^{w}R_{wm}\;{}^{wm}R_{tip} \\ {}^{w}P_{tip} &= {}^{w}R_{wm}\;{}^{wm}P_{tip} + {}^{w}P_{wm} \end{aligned} \tag{16}$$

where ${}^{w}R_{tip}$ is the orientation of the end instrument relative to the world coordinate system, ${}^{w}P_{tip}$ is the position of the end instrument relative to the world coordinate system, ${}^{wm}R_{tip}$ is the orientation of the end instrument relative to the slave tool coordinate system, and ${}^{wm}P_{tip}$ is the position of the end instrument relative to the slave tool coordinate system.
In some embodiments, the orientation ${}^{w}R_{wm}$ and position ${}^{w}P_{wm}$ of the slave tool coordinate system relative to the world coordinate system are determined based on equation (14); the orientation ${}^{w}R_{tip}$ and position ${}^{w}P_{tip}$ of the end instrument relative to the world coordinate system are then determined based on ${}^{w}R_{wm}$, ${}^{w}P_{wm}$ and equation (16).
Some embodiments of the present disclosure provide methods of identifying pose identifiers. Fig. 9 illustrates a flowchart of a method 900 of identifying pose identifications according to some embodiments of the present disclosure. Some or all of the steps in method 900 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 900 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 900 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 900 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium.
Referring to fig. 9, in step 901, a plurality of candidate pose identifications are determined from a localization image. In some embodiments, a pose identification may include a pose identification pattern corner in the pose identification pattern, and the coordinates of a candidate pose identification (or the origin of its coordinate system) can be represented by its candidate pose identification pattern corner. In some embodiments, the candidate pose identification pattern corners may refer to possible pose identification pattern corners obtained through preliminary processing or preliminary recognition of the localization image.
In some embodiments, the method 900 may further include determining a region of interest (ROI) in the localization image. For example, the ROI may first be cropped from the localization image, and the plurality of candidate pose identifications may be determined within the ROI. The ROI may be the whole localization image or a partial region of it. For example, the ROI of the current frame may be cropped based on a region around the pose identification pattern corners determined from the previous frame image (e.g., the localization image of the previous image processing cycle). For a localization image that is not the first frame, the ROI may be a region within a certain distance range centered on a virtual point formed by the coordinates of the pose identification pattern corners of the previous image processing cycle. The certain distance range may be a fixed multiple, for example twice, of the average separation distance of those pose identification pattern corners. It should be appreciated that the multiple may also be a variable multiple of the average separation distance of the plurality of candidate pose identification pattern corners in the previous image processing cycle.
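The ROI selection described above (a window centered on the previous frame's corners, sized by a multiple of their average separation) can be sketched as follows. All names and the exact windowing policy are illustrative assumptions, not the disclosure's implementation.

```python
import numpy as np

def roi_from_previous_corners(image, corners, multiple=2.0):
    """Crop a square ROI centered on the centroid of the previous frame's
    corner points; half-width = `multiple` x their average spacing."""
    corners = np.asarray(corners, float)          # (N, 2) array of (x, y)
    center = corners.mean(axis=0)                 # virtual center point
    # average separation distance between consecutive corner points
    d = np.linalg.norm(np.diff(corners, axis=0), axis=1).mean()
    r = int(round(multiple * d))
    h, w = image.shape[:2]
    x0, x1 = max(0, int(center[0]) - r), min(w, int(center[0]) + r)
    y0, y1 = max(0, int(center[1]) - r), min(h, int(center[1]) + r)
    return image[y0:y1, x0:x1], (x0, y0)          # crop and its offset
```

Returning the offset alongside the crop lets corner coordinates found in the ROI be mapped back to full-image coordinates.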
In some embodiments, the method 900 may further include determining corner likelihood values (Corner Likelihood, CL) for each pixel point in the positioning image. In some embodiments, the corner likelihood values for the pixel points may be numerical values that characterize the likelihood of the pixel points as feature points (e.g., corner points). In some embodiments, the positioning image may be preprocessed before computing the corner likelihood values for each pixel, after which the corner likelihood values for each pixel in the preprocessed image are determined. The preprocessing of the image may include, for example: at least one of image graying, image denoising and image enhancement. For example, image preprocessing may include: and cutting the ROI from the positioning image, and converting the ROI into a corresponding gray image.
In some embodiments, determining the corner likelihood value of each pixel in the ROI may include, for example, performing a convolution operation on each pixel in the ROI to obtain the first-order and/or second-order derivatives of each pixel, and then obtaining the corner likelihood value of each pixel from the first-order and/or second-order derivatives of the pixels within the ROI. Illustratively, the corner likelihood value of each pixel may be determined based on the following equation (17):
where τ is a set constant, for example set to 2; I_x, I_45, I_y, and I_n45 are the first-order derivatives of the pixel in the 0, π/4, π/2, and -π/4 directions, respectively; I_xy and I_45_45 are the second-order derivatives of the pixel in the (0, π/2) and (π/4, -π/4) direction pairs, respectively.
In some embodiments, the method 900 may further include dividing the ROI into a plurality of sub-regions. For example, for non-maximum suppression, the ROI may be equally divided into a plurality of sub-images. In some embodiments, the ROI may be divided equally into multiple sub-images of 5×5 pixels. The above embodiments are exemplary and not limiting; it should be appreciated that the localization image or ROI may also be divided into multiple sub-images of other sizes, for example, 9×9 pixels.
In some embodiments, method 900 may further include determining the pixel with the largest corner likelihood value in each sub-region to form a set of pixels. In some embodiments, this set of pixels is determined as the plurality of candidate pose identifications from the localization image. For example, the pixel with the largest CL value in each sub-image may be determined, and that pixel may be compared with a first threshold so as to retain the set of pixels whose CL values are greater than the first threshold. In some embodiments, the first threshold may be set to 0.06. It should be appreciated that the first threshold may also be set to other values.
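The per-sub-image selection with the first threshold can be sketched as below; the 5×5 block size and the 0.06 threshold follow the text, while the function layout and names are assumptions.

```python
import numpy as np

def select_candidates(cl, block=5, threshold=0.06):
    """Keep the highest-CL pixel of each block x block sub-image,
    retaining it only if its CL value exceeds `threshold`."""
    h, w = cl.shape
    candidates = []
    for y in range(0, h - h % block, block):
        for x in range(0, w - w % block, block):
            sub = cl[y:y + block, x:x + block]
            dy, dx = np.unravel_index(np.argmax(sub), sub.shape)
            if sub[dy, dx] > threshold:
                # store (x, y, CL) in full-image coordinates
                candidates.append((x + dx, y + dy, sub[dy, dx]))
    return candidates
```

This is the non-maximum-suppression idea from the text: at most one candidate survives per sub-image, so nearby high responses do not flood the candidate set.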
Referring to fig. 9, in step 903, a first pose identification is identified from the candidate pose identifications based on a plurality of different pose pattern matching templates. In some embodiments, the plurality of different pose pattern matching templates are respectively matched with the patterns at the candidate pose identification pattern corners to identify the first pose identification. For example, a candidate pose identification pattern corner reaching a preset pose pattern matching degree standard is determined as the first pose identification pattern corner. In some embodiments, the pose pattern matching template has the same or similar features as the pattern of the region near a pose identification pattern corner. If the matching degree between the pose pattern matching template and the pattern of the region near a candidate pose identification pattern corner reaches the preset pose pattern matching degree standard (for example, the matching degree is higher than a threshold), the pattern near the candidate corner can be considered to have the same or similar features as the template, and the current candidate pose identification pattern corner can then be considered to be a pose identification pattern corner.
In some embodiments, a pixel point with the largest CL value in the pixel set is determined as a candidate pose identification pattern corner. For example, all pixels in the pixel set may be ordered in order of CL values from high to low, and the pixel with the highest CL value may be used as the candidate pose identification pattern corner. In some embodiments, after determining the candidate pose identification pattern corner, matching the pose pattern matching template with the pattern at the candidate pose identification pattern corner, and if a preset pose pattern matching degree standard is reached, determining the candidate pose identification pattern corner as the identified first pose identification pattern corner.
In some embodiments, method 900 may further include determining, in response to a match failure, a pixel of the set of pixels having a greatest likelihood value for a corner of the remaining pixels as a candidate pose identification pattern corner. For example, if the candidate pose identification pattern corner does not reach the preset matching degree standard, selecting a pixel point with a secondary CL value (a pixel point with a second largest CL value) as the candidate pose identification pattern corner, matching the pose pattern matching template with a pattern at the candidate pose identification pattern corner, and so on until the first pose identification pattern corner is identified.
In some embodiments, the pose identification patterns may be black-and-white alternating patterns (e.g., checkerboard patterns), so the pose pattern matching templates may be patterns of the same kind, and matching may be performed using the correlation coefficient (CC) between the gray distribution G_M of the pose pattern matching template and the pixel-neighborhood gray distribution G_image at the pixel corresponding to the candidate pose identification pattern corner. The pixel-neighborhood gray distribution G_image of a pixel is the gray distribution of the pixels within a certain range (for example, 10×10 pixels) centered on that pixel. The correlation coefficient may be determined based on the following equation (18):

CC = Cov(G_M, G_image) / √(Var(G_M) · Var(G_image))    (18)
Where Var () is a variance function and Cov () is a covariance function. In some embodiments, when the correlation coefficient is smaller than 0.8, the correlation between the gray distribution in the pixel domain and the pose pattern matching template is lower, and then the candidate pose identification pattern corner with the largest corner likelihood value is judged to be not the pose identification pattern corner, otherwise, the candidate pose identification pattern corner with the largest corner likelihood value is considered to be the pose identification pattern corner.
In some embodiments, the method 900 may further include determining an edge direction of the candidate pose identification pattern corner. For example, as shown in fig. 10, the candidate pose identification pattern corner is corner P1001 in pose identification pattern 1000; the edge direction of corner P1001 may refer to the directions of the edges that form corner P1001, as indicated by the dashed arrows in fig. 10.
In some embodiments, the edge direction may be determined from the first-order derivative values (I_x and I_y), along the X and Y directions of the planar coordinate system, of each pixel in a neighborhood of a certain range (e.g., 10×10 pixels) centered on the candidate pose identification pattern corner. For example, the edge direction may be calculated based on the following equation (19):
where the first-order derivatives (I_x and I_y) can be obtained by performing a convolution operation on each pixel within the neighborhood range. In some embodiments, a clustering calculation is performed on the edge directions I_angle of the pixels in each neighborhood together with the corresponding weights I_weight, and the I_angle corresponding to the class with the largest total weight I_weight is selected as the edge direction. If multiple edge directions exist, the I_angle values corresponding to the several classes with the largest total weights are selected as the edge directions.
In some embodiments, the method used for the clustering calculation may be any one of the K-means method, the BIRCH method (Balanced Iterative Reducing and Clustering using Hierarchies), the DBSCAN method (Density-Based Spatial Clustering of Applications with Noise), and the GMM (Gaussian Mixture Model) method.
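As a simplified stand-in for the clustering step, the sketch below computes per-pixel gradient angles I_angle and weights I_weight over a neighborhood, bins the angles into a weighted histogram, and returns the heaviest bin's center as the dominant edge direction. A K-means/DBSCAN/GMM pass could replace the histogram; everything here (gradient-based angle/weight, bin count, names) is an illustrative assumption.

```python
import numpy as np

def dominant_edge_direction(Ix, Iy, bins=18):
    """Weighted-histogram estimate of the dominant edge direction
    from first-derivative images Ix and Iy of a corner neighborhood."""
    angle = np.arctan2(Iy, Ix).ravel() % np.pi     # fold into [0, pi)
    weight = np.hypot(Ix, Iy).ravel()              # gradient magnitude
    hist, edges = np.histogram(angle, bins=bins, range=(0, np.pi),
                               weights=weight)
    k = np.argmax(hist)                            # heaviest class
    return 0.5 * (edges[k] + edges[k + 1])         # its center angle
```

For multiple edge directions (as at a checkerboard corner), the top several bins could be returned instead of only the heaviest one.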
In some embodiments, method 900 may further include rotating the pose pattern matching template based on the edge direction. Rotating the pose pattern matching template according to the edge direction allows the template to be aligned with the image at the candidate pose identification pattern corner; the edge direction of the candidate pose identification pattern corner may be used to determine the orientation, in the localization image, of the image at the candidate pose identification pattern corner. In some embodiments, to facilitate image matching, the pose pattern matching template may be rotated according to the edge direction so that its orientation is the same as, or close to, that of the image at the candidate pose identification pattern corner.
Referring to fig. 9, step 905 searches for a pose identifier starting from a first pose identifier. For example, fig. 11 illustrates a flow chart of a method 1100 for searching for pose identification according to some embodiments of the present disclosure. Some or all of the steps in method 1100 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 1100 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 1100 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 1100 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium.
Referring to fig. 11, in step 1101, a second pose identification is searched for using the first pose identification as a starting point. In some embodiments, the second pose identification pattern corner is searched for in a set search direction with the first pose identification pattern corner as the starting point. In some embodiments, the set search direction may include at least one of: directly ahead of the first pose identification (corresponding to an angular direction of 0°), directly behind it (corresponding to an angular direction of 180°), directly above it (an angular direction of 90°), directly below it (an angular direction of -90°), and oblique directions (e.g., angular directions of ±45°).
In some embodiments, n search directions are set. For example, when searching in 8 directions, each search direction v_sn can be determined based on the following equation (20):

v_sn = [cos(n·π/4), sin(n·π/4)], (n = 1, 2, …, 8)    (20)
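Equation (20) translates directly into code; this tiny sketch (illustrative names) generates the eight unit search directions.

```python
import numpy as np

def search_directions(n_dirs=8):
    """Equation (20): unit search directions at 45-degree increments."""
    return [np.array([np.cos(n * np.pi / 4), np.sin(n * np.pi / 4)])
            for n in range(1, n_dirs + 1)]
```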
in some embodiments, the search direction set in the current step may be determined according to a deviation angle between adjacent pose identification pattern corner points among the pose identification pattern corner points determined in the previous frame. Illustratively, the predetermined search direction is determined based on the following equation (21):
where (x_j, y_j) are the two-dimensional coordinates of the pose identification pattern corners determined from the previous frame (or the previous image processing cycle); n_last is the number of pose identification pattern corners determined from the previous frame; v_s1 is the first set search direction; and v_s2 is the second set search direction.
In some embodiments, as shown in fig. 12, the first pose identification pattern corner P1201 is used as the search starting point, and searching in the set search direction for the coordinate position of the second pose identification pattern corner P1202 may specifically include: with the first pose identification pattern corner P1201 as the starting point, moving a search box (e.g., the dashed box in fig. 12) along the set search direction V1201 with a certain search step, and searching for pose identification pattern corners within the box. If at least one candidate pose identification pattern corner exists in the search box, the candidate with the largest corner likelihood value in the search box is preferentially selected as the second pose identification pattern corner P1202. With the search box limited to a suitable size, when searching for the second pose identification pattern corner P1202 from the starting point P1201, the candidate pose identification pattern corner with the largest likelihood value among those appearing in the search box is the most likely to be a true pose identification pattern corner. Therefore, that candidate can be taken directly as the second pose identification pattern corner P1202, which increases the data processing speed.
In other embodiments, to improve the accuracy of identifying pose identification pattern corners, in the case where at least one candidate pose identification pattern corner exists in the search box, the candidate with the largest corner likelihood value among those appearing in the search box is further verified to determine whether it is indeed a pose identification pattern corner. For example, the pose pattern matching template is matched with the image within a certain range at that candidate corner; a candidate meeting the preset pose pattern matching degree standard can be regarded as the searched second pose identification pattern corner P1202.
In some embodiments, with continued reference to fig. 12, the size of the search box may be increased in steps, such that the search range is increased in steps. The search step size may be varied in synchronization with the side length of the search box. In other embodiments, the size of the search box may be a fixed size.
In some embodiments, the pose identification pattern may be a black and white checkerboard pattern, and pattern matching may be performed based on the correlation coefficient in equation (18). If the correlation coefficient is larger than the threshold value, the candidate pose identification pattern corner with the maximum likelihood value of the corner is considered to be the pose identification pattern corner, and the candidate pose identification pattern corner is marked as the second pose identification pattern corner.
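The directional search with a growing search box, preferring the highest-likelihood candidate inside the box, can be sketched as follows. The step size, box growth factor, and all names are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

def search_next_corner(start, direction, candidates, cl_values,
                       step=10.0, box=8.0, grow=1.2, max_steps=5):
    """Step a square search box along `direction` from `start`; among
    candidate corners inside the box, return the one with the largest
    corner likelihood value. The box widens each step (`grow`)."""
    start = np.asarray(start, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    pts = np.asarray(candidates, float)
    for k in range(1, max_steps + 1):
        center = start + k * step * d
        half = box * grow ** (k - 1) / 2.0
        inside = np.all(np.abs(pts - center) <= half, axis=1)
        if inside.any():
            idx = np.flatnonzero(inside)
            best = idx[np.argmax(np.asarray(cl_values)[idx])]
            return tuple(candidates[best])
    return None                                    # nothing found in range
```

A template-matching check (as in the more accurate variant above) could be applied to the returned candidate before accepting it.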
Fig. 13 illustrates a flowchart of a method 1300 for searching for a second pose identification according to some embodiments of the present disclosure. Some or all of the steps in method 1300 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 1300 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 1300 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 1300 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium. In some embodiments, step 1101 in method 1100 may be implemented similarly to method 1300.
Referring to fig. 13, in step 1301, candidate pose identification pattern corners of the second pose identification are searched for using the first pose identification as a starting point. In some embodiments, searching for the candidate pose identification pattern corners of the second pose identification may be implemented similarly to the search for the second pose identification pattern corner P1202 shown in fig. 12.
In step 1303, a first pose pattern matching template and a second pose pattern matching template are determined based on the distribution of the plurality of pose identifiers, the first pose pattern matching template and the second pose pattern matching template corresponding to pose identifiers adjacent to the first pose identifier. In some embodiments, step 1303 may be performed before or after step 1301, and step 1303 may also be performed in synchronization with step 1301. In some embodiments, the pose identification patterns included in the pose identifications adjacent to the first pose identification may be determined based on the pose identification pattern included in the first pose identification and the distribution of the plurality of pose identification patterns, thereby determining a first pose pattern matching template and a second pose pattern matching template.
In step 1305, the first pose pattern matching template and/or the second pose pattern matching template is matched with the pattern at the candidate pose identification pattern corner position of the second pose identification to identify the second pose identification. In some embodiments, the first pose pattern matching template and/or the second pose pattern matching template may be matched with the pattern at the candidate pose identification pattern corner location of the second pose identification based on the correlation coefficient in equation (18). If the correlation coefficient is greater than the threshold value, determining candidate pose identification pattern corner points of the second pose identification as pose identification pattern corner points of the second pose identification, and determining a pose pattern matching template (a first pose pattern matching template or a second pose pattern matching template) with the correlation coefficient greater than the threshold value as the pose identification pattern of the second pose identification.
Referring to fig. 11, in step 1103, a search direction is determined based on the first pose identification and the second pose identification. In some embodiments, the search directions include a first search direction and a second search direction. The first search direction may be a direction that takes the coordinate position of the first pose identification pattern corner as a starting point and points away from the second pose identification pattern corner. The second search direction may be a direction that takes the coordinate position of the second pose identification pattern corner as a starting point and points away from the first pose identification pattern corner, for example, the search direction V1202 shown in fig. 12.
Referring to fig. 11, in step 1105, a pose identification is searched for in the search direction with the first pose identification or the second pose identification as the starting point. In some embodiments, if the first pose identification pattern corner is taken as the new starting point, the first search direction in the above embodiments is used as the search direction for searching pose identification pattern corners. If the second pose identification pattern corner is taken as the new starting point, the second search direction in the above embodiments is used as the search direction. In some embodiments, searching for a new pose identification pattern corner (e.g., the third pose identification pattern corner P1203 in fig. 12) may be performed similarly to step 1101 in method 1100 or to method 1400. In some embodiments, the search step may be the distance L1 between the first pose identification pattern corner P1201 and the second pose identification pattern corner P1202.
Fig. 14 illustrates a flowchart of a method 1400 for searching for pose identification according to some embodiments of the present disclosure. Some or all of the steps in method 1400 may be performed by a control device (e.g., control device 120 shown in fig. 1). Some or all of the steps in method 1400 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 1400 may be used in a robotic system, such as the robotic system 100 shown in fig. 1 or the surgical robotic system 1700 shown in fig. 17. In some embodiments, method 1400 may be implemented as computer-readable instructions. These instructions may be read and executed by a general purpose processor or a special purpose processor. In some embodiments, these instructions may be stored on a computer readable medium. In some embodiments, step 1105 in method 1100 may be implemented similarly to method 1400.
Referring to fig. 14, in step 1401, candidate pose identification pattern corners of the third pose identification are searched for using the first pose identification or the second pose identification as a starting point. In some embodiments, searching for the candidate pose identification pattern corners of the third pose identification may be implemented similarly to the search for the third pose identification pattern corner P1203 shown in fig. 12.
In step 1403, a third pose pattern matching template is determined based on the distribution of the plurality of pose identifications, the third pose pattern matching template corresponding to a pose identification adjacent to the first pose identification or to the second pose identification. In some embodiments, based on the pose identification pattern included in the first or second pose identification and the distribution of the plurality of pose identification patterns, the pose identification pattern included in the adjacent pose identification can be determined, and the third pose pattern matching template can thereby be determined.
In step 1405, the third pose pattern matching template is matched with the pattern at the candidate pose identification pattern corner position of the third pose identification to identify the third pose identification. In some embodiments, step 1405 may be implemented similarly to step 1305.
In some embodiments, in response to the search distance being greater than a search distance threshold, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a candidate pose identification pattern corner, and the plurality of different pose pattern matching templates are respectively matched with the patterns at the candidate pose identification pattern corner positions so as to identify the first pose identification. In some embodiments, after the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as the new candidate pose identification pattern corner, a new first pose identification may be identified based on a method similar to step 903. In some embodiments, the search distance being greater than the search distance threshold may be understood as the search distance in some or all of the search directions being greater than the search distance threshold. In some embodiments, the search distance threshold may be a set multiple of the distance between the (N-1)-th and (N-2)-th pose identification pattern corners, where N ≥ 3.
For example, the search distance threshold may be twice the distance between the first two pose identification pattern corners. In this way, the maximum search distance for the third pose identification pattern corner is twice the distance between the first and second pose identification pattern corners; if this search distance is reached in the search direction without finding a pose identification pattern corner, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a new candidate pose identification pattern corner, a new first pose identification is identified, and the current search process is stopped accordingly. In some embodiments, similar to method 900, the new first pose identification pattern corner may be re-determined, and, similar to method 1100, the remaining pose identification pattern corners may be searched for using the new pose identification pattern corner as the search starting point.
In some embodiments, in response to the number of identified pose identification pattern corners being greater than or equal to a pose identification number threshold, the current relative pose of the operating arm with respect to the reference coordinate system may be determined based on the searched pose identifications, and the search for pose identification pattern corners stops accordingly. For example, when four pose identification pattern corners have been identified, the search for pose identification pattern corners is stopped.
In some embodiments, in response to the identified number of pose identifications being less than the threshold number of pose identifications, determining a pixel in the set of pixels having a maximum likelihood value for a corner of the remaining pixels as a candidate pose identification pattern corner; and matching the plurality of different pose pattern matching templates with the patterns at the corner positions of the candidate pose identification patterns respectively so as to identify the first pose identification. In some embodiments, if the total number of recognized pose identifiers (e.g., pose identifier pattern corner points) is less than the set pose identifier number threshold, the search based on the first pose identifier in the above step is considered to fail. In some embodiments, in the event of a search failure, the pixel with the largest likelihood value for the corner of the remaining pixels in the set of pixels is determined as the new candidate pose identification pattern corner, after which the new first pose identification may be identified based on a method similar to step 903. In some embodiments, similar to method 900, the new first pose identification pattern corner may be redetermined, and similar to method 1100, the remaining pose identification pattern corner may be searched using the new pose identification pattern corner as a search starting point.
In some embodiments, after the pose identification pattern corner is searched or identified, the determined pose identification pattern corner can be subjected to sub-pixel positioning so as to improve the position accuracy of the pose identification pattern corner.
In some embodiments, CL values of the pixel points may be fitted based on a model to determine coordinates of the sub-pixel located pose identification pattern corner points. For example, the fitting function of CL values for each pixel point in the ROI may be a quadric function, whose extreme points are sub-pixel points. The fitting function may be determined based on the following formulas (22) and (23):
S(x, y) = ax² + by² + cx + dy + exy + f    (22)

x_c = (ed - 2bc) / (4ab - e²), y_c = (ec - 2ad) / (4ab - e²)    (23)

where S(x, y) is the CL-value fitting function over the pixels in the ROI; a, b, c, d, e, f are coefficients; x_c is the x-coordinate of the sub-pixel pose identification pattern corner; and y_c is its y-coordinate. Equation (23) gives the extreme point of S(x, y), obtained by setting both partial derivatives of S to zero.
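A hedged sketch of the sub-pixel step: fit the quadric of equation (22) to a window of CL values by least squares, then solve for the point where the gradient of S vanishes. The window handling, solver choice, and names are illustrative assumptions.

```python
import numpy as np

def subpixel_corner(cl_window, x0=0.0, y0=0.0):
    """Fit S(x, y) = a x^2 + b y^2 + c x + d y + e x y + f to a window
    of CL values and return the extremum in (x0, y0)-offset coordinates."""
    h, w = cl_window.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = xs.ravel().astype(float)
    y = ys.ravel().astype(float)
    A = np.column_stack([x * x, y * y, x, y, x * y, np.ones_like(x)])
    a, b, c, d, e, f = np.linalg.lstsq(A, cl_window.ravel(), rcond=None)[0]
    # Gradient of S is zero at the extremum:
    #   2a*xc + e*yc = -c,   e*xc + 2b*yc = -d
    xc, yc = np.linalg.solve(np.array([[2 * a, e], [e, 2 * b]]),
                             np.array([-c, -d]))
    return x0 + xc, y0 + yc
```

On CL values sampled from an exact quadric with a peak at (2.3, 1.7), the fit recovers that sub-pixel location.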
Fig. 15 illustrates a schematic diagram of a primary operator 1500 of some embodiments of the present disclosure. As shown in fig. 15, in some embodiments, the main manipulator 1500 includes a multi-degree-of-freedom mechanical arm 1510 and a handle 1520, the mechanical arm 1510 including a plurality of joints (15101-15107). The joints of the mechanical arm 1510 include position joints and posture joints. The posture joints serve as the orientation module of the main manipulator 1500: one or more posture joints control the handle 1520 to reach a target posture. The position joints serve as the positioning module of the main manipulator 1500: one or more position joints control the handle 1520 to reach a target position.
In some embodiments, determining the current pose of the handle of the primary manipulator comprises: acquiring joint information of at least one gesture joint; and determining a current pose of the primary manipulator based on joint information of the at least one pose joint.
In some embodiments, a main manipulator sensor is disposed at each posture joint of the multi-degree-of-freedom mechanical arm and is configured to acquire the joint information (such as an angle) corresponding to that posture joint; the current posture of the handle of the main manipulator relative to the base coordinate system of the main manipulator is determined from the acquired joint information. In some embodiments, the joint information is acquired by the main manipulator sensor of each posture joint, and the current posture of the main manipulator is calculated based on a forward kinematics algorithm. In some embodiments, the master manipulator comprises at least one posture joint for controlling the posture of the handle of the master manipulator, and the control signal comprises a control signal for controlling one or more of the at least one posture joint. Adjusting one or more posture joints adjusts the posture of the handle of the master manipulator, thereby matching the posture of the handle to the posture of the slave tool.
In some embodiments, the control signals include control signals for controlling one or more of the at least one posture joint, wherein the one or more posture joints comprise uncoupled posture joints. A coupled joint may refer to a joint that adjusts both the position and the posture of the main manipulator. An uncoupled joint may refer to a joint that can only adjust the position (referred to in this disclosure as an uncoupled position joint) or only the posture (referred to in this disclosure as an uncoupled posture joint) of the main manipulator. In some embodiments, the main manipulator may comprise at least one coupled joint. For example, in the main manipulator 1500 shown in fig. 15, the first joint 15101, the second joint 15102, and the third joint 15103 are position joints, and the first joint 15101, the second joint 15102, the fifth joint 15105, the sixth joint 15106, and the seventh joint 15107 are posture joints. The first joint 15101 and the second joint 15102 are coupled joints that can adjust both the position and the posture of the main manipulator 1500, while the fifth joint 15105, the sixth joint 15106, and the seventh joint 15107 are uncoupled posture joints that can adjust only the posture of the main manipulator 1500. In some embodiments, the posture adjustment of the handle 1520 of the main manipulator 1500 may be achieved by calculating the control signals of the uncoupled posture joints (e.g., the fifth joint 15105, the sixth joint 15106, and the seventh joint 15107), so that the posture of the handle 1520 matches the posture of the slave tool in preparation for subsequent teleoperation.
In some embodiments, the base coordinate system of the main manipulator is b and the coordinate system of the handle is d. In some embodiments, the base coordinate system b is a coordinate system established by virtually treating the base as a point, and its orientation can be determined based on the physical configuration of the base. Similarly, the coordinate system d of the handle is a coordinate system established by virtually treating the handle as a point, and its orientation can be determined based on the physical configuration of the handle. In some embodiments, the origin of the coordinate system d of the handle may coincide with the origins of the coordinate systems of the fifth, sixth, and seventh joints. It will be appreciated by those skilled in the art that the position and attitude of the coordinate system d of the handle with respect to the base coordinate system of the main manipulator can be determined from the joint information of the first to seventh joints.
In some embodiments, the main manipulator sensor obtains joint information q_j_mp of the main manipulator (j is the number of the joint). In some embodiments, the j-th joint information q_j_mp may include the angle value θ_j_mp of the corresponding joint. For example, joint information q_1_mp of the first joint, q_2_mp of the second joint, q_3_mp of the third joint, q_4_mp of the fourth joint, q_5_mp of the fifth joint, q_6_mp of the sixth joint, and q_7_mp of the seventh joint are acquired. In some embodiments, the fourth joint is a follower joint of the third joint, and the joint angle of the fourth joint has the same direction and the same absolute value as the joint angle of the third joint. Thus, the angles of the six independent joints of the main manipulator are represented as a 6×1 matrix q_manipulator, and the joint angle of the fourth joint need not appear in q_manipulator. Each joint information q_j_mp can be expressed as θ_j_mp, and the main manipulator has six degrees of freedom, as in equation (24):
q_manipulator = (q_1_mp  q_2_mp  q_3_mp  q_5_mp  q_6_mp  q_7_mp)^T (24)
The first, second, and third joints are position joints; q_1_mp, q_2_mp, and q_3_mp determine the position of the handle of the main manipulator. The first, second, fifth, sixth, and seventh joints are posture joints; q_1_mp, q_2_mp, q_5_mp, q_6_mp, and q_7_mp determine the posture of the handle. In some embodiments, determining the posture of the handle of the main manipulator need not consider the position controlled by the first, second, and third joints, but only the posture (e.g., direction) determined by the first, second, fifth, sixth, and seventh joints. In some embodiments, during motor driving, the first, second, and third joints are kept stationary; q_5_mp, q_6_mp, and q_7_mp corresponding to the fifth, sixth, and seventh joints are determined based on the target posture, and a control signal is calculated from q_5_mp, q_6_mp, and q_7_mp to realize the posture adjustment of the handle.
Those skilled in the art will appreciate that a multi-joint main manipulator generally has many joint solutions that achieve a given target posture. In some embodiments, one or more of the at least one posture joints may be adjusted to adjust the posture of the main manipulator handle. For example, in one embodiment, the coupled first and second joints and the uncoupled-position third joint may be kept unchanged, and the posture of the main manipulator handle adjusted by adjusting the uncoupled fifth, sixth, and seventh posture joints.
In some embodiments, the method 600 further comprises: obtaining joint information of the posture joints other than the one or more posture joints to be adjusted; and determining a transformation matrix of those other posture joints based on their joint information. For example, the joint information of the other posture joints is acquired by the main manipulator sensor, and their transformation matrix is determined from it; in particular, joint information of the coupled first and second joints can be obtained and the transformation matrix calculated. In some embodiments, the uncoupled posture joints among the one or more posture joints (e.g., the fifth, sixth, and seventh joints) may be adjusted without adjusting other posture joints such as the coupled joints (e.g., the first and second joints). In some embodiments, the transformation matrix of the other posture joints (e.g., the transformation matrix ^0R_4 of the other posture joints relative to the joint starting point 0) may be determined based on q_1_mp and q_2_mp corresponding to the first and second joints. In some embodiments, the method 600 further comprises generating the control signal of the main manipulator based on the target posture of the handle of the main manipulator and the transformation matrix of the other posture joints. For example, the control signal of the main manipulator is generated based on the target posture and the transformation matrix ^0R_4, as in equations (25) to (27).
In some embodiments, because the third joint is an uncoupled position joint and the fourth joint follows the third joint, the transformation matrix ^0R_4 of the other posture joints determined from q_1_mp, q_2_mp, and q_3_mp is consistent with the transformation matrix ^0R_4 determined from q_1_mp and q_2_mp alone.
^4R_7 = (^0R_4)^T · (^bR_0)^T · ^bR_d · (^7R_d)^T (25)
In equation (25), the transformation matrix ^0R_4 is determined from the input q_1_mp, q_2_mp (or q_1_mp, q_2_mp, q_3_mp); b is the base coordinate system of the main manipulator and d is the coordinate system of the handle of the main manipulator; ^bR_d is the attitude of the main manipulator handle relative to the main manipulator base coordinate system; ^bR_0 is the fixed angular relationship between the base and the joint starting point, a structural constant; and ^7R_d is the fixed angular relationship between the seventh joint and the handle, also a structural constant.
^4R_7 = ^4R_5 · ^5R_6 · ^6R_7 (26)
R(q_5_mp, q_6_mp, q_7_mp) = (^0R_4)^T · (^bR_0)^T · R_t · (^7R_d)^T (27)
In formula (27), R_t is the current posture of the driven tool and plays the same role as ^bR_d; in formula (26), ^4R_5, ^5R_6, and ^6R_7 correspond respectively to the quantities q_5_mp, q_6_mp, and q_7_mp to be solved. A control signal is determined based on the obtained q_5_mp, q_6_mp, and q_7_mp, and the posture of the main manipulator is adjusted based on the control signal to realize master-slave posture matching. As will be appreciated by those skilled in the art, R_t may be the current pose of the end instrument of the slave tool relative to the base coordinate system of the slave tool, or the current pose of the image of the end instrument of the slave tool in the display relative to the world coordinate system. R_t may be consistent with ^bR_d, e.g., identical or related by a particular ratio or difference. In some embodiments, joint target values of the one or more posture joints of the handle are determined based on the control signal, converted into driving amounts, and sent to the driving device. The driving device drives the motors of the one or more posture joints of the main manipulator so that those joints move, matching the posture of the handle of the main manipulator to the posture of the end instrument of the driven tool.
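The inverse solve of equations (25)-(27) amounts to extracting three joint angles from a known 3×3 rotation ^4R_7. The factorization depends on the actual wrist axis layout, which this excerpt does not spell out; the sketch below therefore assumes, purely for illustration, a Z-Y-Z arrangement for the fifth through seventh joints (all names are hypothetical):

```python
import numpy as np

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def wrist_angles(r47):
    """Solve q5, q6, q7 from ^4R_7 = Rz(q5) @ Ry(q6) @ Rz(q7),
    taking the q6 > 0 branch (hypothetical Z-Y-Z wrist layout)."""
    q6 = np.arctan2(np.hypot(r47[0, 2], r47[1, 2]), r47[2, 2])
    q5 = np.arctan2(r47[1, 2], r47[0, 2])
    q7 = np.arctan2(r47[2, 1], -r47[2, 0])
    return q5, q6, q7

def target_r47(r0_4, rb_0, r_t, r7_d):
    """Compose the right-hand side of equation (27):
    (^0R_4)^T · (^bR_0)^T · R_t · (^7R_d)^T."""
    return r0_4.T @ rb_0.T @ r_t @ r7_d.T
```

Given R_t (the current posture of the slave tool) and the structural constants ^bR_0 and ^7R_d, `target_r47` yields the wrist rotation that `wrist_angles` decomposes into the joint targets q_5_mp, q_6_mp, q_7_mp.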
In some embodiments, the mathematical structural model of the main manipulator may be constructed based on a D-H parametric method or an exponential product representation. For example, a D-H matrix corresponding to a joint of the primary manipulator is determined, and a mathematical structural model of the primary manipulator is determined based on the D-H matrix of the joint. The D-H matrix for each joint of the primary manipulator is represented as equation (28).
The correspondence between the D-H matrix and the joint information is shown in Table 1.
Table 1 correspondence between D-H matrix and joint information
In equation (28), Rot(x, α_j_mp) denotes a rotation about the x-axis by the angle α_j_mp, Rot(z, θ_j_mp) a rotation about the z-axis by θ_j_mp, Trans(x, a_j_mp) a translation by a_j_mp along the x direction, and Trans(z, d_j_mp) a translation by d_j_mp along the z direction. For the main manipulator 1500 shown in fig. 15, with the z-axis taken as the rotation axis of each joint and the x-axis pointing to the next joint, the y-axis direction can be determined according to the left-/right-hand rule of the Cartesian coordinate system. Each of the Rot and Trans factors is a fourth-order homogeneous matrix representing a rotation about a direction by a certain angle or a translation along a direction by a certain distance.
In some embodiments, the mathematical structural model of the primary manipulator is described by D-H matrix multiplication of all joints, as in equation (29):
^0T_{7_mp} = ^0T_{1_mp} · ^{1_mp}T_{2_mp} · ^{2_mp}T_{3_mp} · ^{3_mp}T_{4_mp} · ^{4_mp}T_{5_mp} · ^{5_mp}T_{6_mp} · ^{6_mp}T_{7_mp} (29)
in some embodiments, the D-H matrix for the joint in equation (29) may be determined based on equation (28).
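The per-joint D-H matrix of equation (28) and the chain product of equation (29) can be sketched in Python as follows. The factor order Rot(x, α)·Trans(x, a)·Rot(z, θ)·Trans(z, d) follows the modified D-H convention and is an assumption here, as are the example parameters; the manipulator's actual α, a, d values would come from Table 1:

```python
import numpy as np

def dh_matrix(alpha, a, theta, d):
    """One joint's D-H matrix from the four elementary transforms
    named in equation (28); modified D-H factor order is assumed."""
    rot_x = np.array([[1, 0, 0, 0],
                      [0, np.cos(alpha), -np.sin(alpha), 0],
                      [0, np.sin(alpha),  np.cos(alpha), 0],
                      [0, 0, 0, 1]])
    trans_x = np.eye(4)
    trans_x[0, 3] = a
    rot_z = np.array([[np.cos(theta), -np.sin(theta), 0, 0],
                      [np.sin(theta),  np.cos(theta), 0, 0],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]])
    trans_z = np.eye(4)
    trans_z[2, 3] = d
    return rot_x @ trans_x @ rot_z @ trans_z

def forward_kinematics(dh_params):
    """Chain the joint matrices as in equation (29): ^0T_n = product of ^{j-1}T_j."""
    t = np.eye(4)
    for alpha, a, theta, d in dh_params:
        t = t @ dh_matrix(alpha, a, theta, d)
    return t
```

For instance, a planar two-link arm with link lengths 1 and 2 and a 90° first joint places the end point at (0, 3, 0), matching the familiar planar-arm result.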
Those skilled in the art can understand that when teleoperation is started, if the gesture (such as the direction or the angle) of the handle is inconsistent with the gesture (such as the direction or the angle) of the corresponding controlled driven tool, the man-machine interaction experience of an operator (such as a surgeon) in the operation process is poor, and the operation precision of the driven tool is affected. Therefore, after the main operator is connected with the driven tool in a matching way and before the main operator teleoperations the driven tool (for example, when an operator holds the handle of the main operator to obtain the control right of the corresponding driven tool but does not start the master-slave teleoperation yet), the gesture of the handle and the gesture of the driven tool are adjusted in a matching way. When the gesture of the two is consistent, teleoperation of the master manipulator on the slave tool can be executed, and the precision and experience of follow-up teleoperation can be improved.
In some embodiments, the method 600 further comprises: in response to a predetermined condition being met, determining a degree of posture matching between the handle of the master manipulator and the slave tool. In some embodiments, the predetermined condition comprises the triggering of teleoperational control rights. In some embodiments, the triggering of teleoperational control may be achieved by a triggering device, such as a switch arranged on the main manipulator or on the display where it is convenient for an operator to approach, touch, press, or toggle. Triggering actions include, but are not limited to, approaching, touching, toggling, clicking, or long pressing. For example, the trigger may be approaching a proximity sensor, toggling a switch on the main manipulator, touching a sensing position on the main manipulator, long-pressing or clicking a key on the main manipulator, stepping on a pedal of the main console, or operating the display screen of the main console. In some embodiments, matching means that the posture of the handle and the posture of the slave tool meet a preset relationship (e.g., agreement), and the posture matching degree refers to the degree of matching between the current posture of the handle and the current posture of the slave tool. In some embodiments, the posture matching degree between the master manipulator and the slave tool is determined based on the current posture of the handle of the master manipulator and the current posture of the slave tool. In response to the posture matching degree being lower than a preset threshold, a control signal for adjusting the current posture of the handle of the main manipulator is generated so that the posture matching degree becomes higher than or equal to the preset threshold.
Thus, when the postures of the two are not matched, posture adjustment can be automatically carried out so as to achieve consistency of the postures of the two. When the current postures of the two are consistent or basically consistent (the posture matching degree is higher than or equal to a preset threshold value), a master-slave mapping between the master manipulator and the slave tool is established in response to the posture matching degree being higher than or equal to the preset threshold value, so that the next teleoperation flow can be executed.
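The posture matching degree can, for example, be derived from the geodesic angle between the two current orientations. The metric and the threshold value below are illustrative assumptions, since this excerpt does not fix a particular formula:

```python
import numpy as np

def posture_match_degree(r_handle, r_tool):
    """Map the rotation angle between the handle orientation and the
    slave-tool orientation to [0, 1]; 1.0 means the postures coincide.
    (An illustrative metric, not the patent's own definition.)"""
    cos_ang = (np.trace(r_handle.T @ r_tool) - 1.0) / 2.0
    angle = np.arccos(np.clip(cos_ang, -1.0, 1.0))  # geodesic distance on SO(3)
    return 1.0 - angle / np.pi

def needs_adjustment(r_handle, r_tool, threshold=0.98):
    """True if a control signal should be generated to re-align the handle."""
    return posture_match_degree(r_handle, r_tool) < threshold
```

When `needs_adjustment` returns False, the postures are consistent or substantially consistent, and the master-slave mapping can be established.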
In some embodiments, adjusting the posture of the handle of the master manipulator to the posture of the slave tool comprises: keeping the current posture of the slave tool unchanged and adjusting the posture of the handle of the master manipulator so that the posture of the handle of the master manipulator coincides with the posture of the slave tool.
In some embodiments, the target gesture of the handle of the master manipulator is consistent with the current gesture of the slave tool, and a master-slave mapping is established between the master manipulator and the slave tool, so that teleoperation of the master manipulator on the slave tool can be performed, and the operating precision of teleoperation and the experience of teleoperation are improved. As will be appreciated by those skilled in the art, consistent attitude means that the attitude is substantially consistent, there may be some error between the target attitude of the handle of the master manipulator and the current attitude of the slave tool, but the range of error is within acceptable limits.
In some of the embodiments described above, the posture of the handle is matched to the posture of the slave tool prior to teleoperation, so that when the operator begins to operate (e.g., presses the clamp button of the handle of the master manipulator), a master-slave mapping can be quickly established and the master manipulator and slave tool enter teleoperation mode. In addition, because only the current posture of the driven tool needs to be matched, the operator can still move the position of the handle of the main manipulator in the non-operation state, move to a suitable position, and then perform teleoperation matching, which greatly increases the movement space of the handle of the main manipulator. The master-slave motion control method provided by the present disclosure can be applied to slave ends of different principles and forms; the calculation process is targeted and has a small calculation amount, and the driving amount needed to adjust the handle of the master manipulator to the target posture is reduced.
In some of the above embodiments, a connection is established between the master manipulator and the slave tool and control rights are transferred; in this state, the degree of posture matching between the handle of the master manipulator and the slave tool is determined. If the posture matching degree meets the preset threshold condition, a master-slave mapping between the master manipulator and the slave tool is established and the teleoperation step is executed. If the posture matching degree does not meet the preset threshold condition, the posture of the handle of the master manipulator is first adjusted to be consistent with the current posture of the slave tool, then the master-slave mapping is established and teleoperation is performed through the handle of the master manipulator. Adjusting the posture of the handle of the main manipulator to be consistent with the posture of the driven tool before the teleoperation relationship is established makes the master-slave mapping between the handle and the driven tool accurate, improves the operator's experience during teleoperation, achieves high-precision matching between the operation action and the actual action, and avoids the operation limitation caused by inconsistent motion control boundaries of the master manipulator and the slave tool.
In some of the above embodiments, when the control object (e.g., the slave tool) of the master manipulator is changed, the orientation of the distal end of the slave tool inside the abdomen is likely to differ from the current orientation of the handle of the master manipulator. According to the method provided by the present disclosure, before the master manipulator and the slave tool establish a master-slave mapping relationship, the posture of the handle of the master manipulator is adjusted to be consistent with the current posture of the slave tool before the operator actually operates. This provides a good operating experience for the operator, achieves high-precision matching between the intended and actual actions, and avoids the operation limitation caused by inconsistent motion control boundaries of the master manipulator and the slave tool.
In some embodiments of the present disclosure, the present disclosure also provides a computer device including a memory and a processor. The memory may be used to store at least one instruction and the processor coupled to the memory for executing the at least one instruction to perform some or all of the steps in the methods of the present disclosure, such as some or all of the steps in the methods disclosed in fig. 6, 7, 8, 9, 11, 13, and 14.
Fig. 16 illustrates a schematic block diagram of a computer device 1600 in accordance with some embodiments of the present disclosure. Referring to fig. 16, the computer device 1600 may include a Central Processing Unit (CPU) 1601, a system memory 1604 including a Random Access Memory (RAM) 1602 and a Read-Only Memory (ROM) 1603, and a system bus 1605 connecting these components. The computer device 1600 may also include input/output devices 1606, and a mass storage device 1607 for storing an operating system 1613, application programs 1614, and other program modules 1615. The input/output devices 1606 consist essentially of a display 1608 and an input device 1609, which are connected through an input/output controller 1610.
The mass storage device 1607 is connected to the central processing unit 1601 by a mass storage controller (not shown) connected to the system bus 1605. The mass storage device 1607 or computer-readable medium provides non-volatile storage for computer devices. The mass storage device 1607 may include a computer-readable medium (not shown) such as a hard disk or a compact disk-read Only Memory (CD-ROM) drive.
Computer readable media may include computer storage media and communication media without loss of generality. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes read-only memory, random-access memory, flash memory, or other solid state memory technology, optical read-only disks, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will recognize that computer storage media are not limited to the ones described above. The above-described system memory and mass storage devices may be collectively referred to as memory.
The computer device 1600 may connect to the network 1612 through a network interface unit 1611 connected to the system bus 1605.
The system memory 1604 or mass storage device 1607 is also used to store one or more instructions. The central processing unit 1601 implements all or part of the steps of the methods in some embodiments of the disclosure by executing the one or more instructions.
In some embodiments of the present disclosure, the present disclosure also provides a computer-readable storage medium having stored therein at least one instruction that is executable by a processor to cause a computer to perform some or all of the steps in the methods of some embodiments of the present disclosure, such as some or all of the steps in the methods disclosed in fig. 6, 7, 8, 9, 11, 13, and 14. Examples of computer readable storage media include memory of computer programs (instructions), such as read-only memory, random-access memory, compact discs read-only, magnetic tapes, floppy discs, optical data storage devices, and the like.
Fig. 17 illustrates a schematic diagram of a surgical robotic system 1700 according to some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 17, surgical robotic system 1700 may include surgical tool 1701, master trolley 1702, and surgical trolley 1703. The operation trolley 1703 is provided with a driving module for driving the operation tool 1701, and the operation tool 1701 is mounted on the operation trolley 1703 and connected with the driving module. The master trolley 1702 is communicatively connected to the surgical trolley 1703 for controlling the surgical tool 1701 to perform a surgical operation. In some embodiments, the controller in master trolley 1702 or the controller in surgical trolley 1703 may be used to perform some or all of the steps in the methods of some embodiments of the present disclosure, such as some or all of the steps in the methods disclosed in fig. 6, 7, 8, 9, 11, 13, and 14. In some embodiments, the master cart 1702 and the operation cart 1703 are connected by a wired transmission or a wireless transmission. For example, the master cart 1702 and the operation cart 1703 may be connected by a cable.
In some embodiments, the surgical tool 1701 includes an operating arm and an end instrument disposed at an end of the operating arm. In some embodiments, surgical robotic system 1700 may include a surgical trolley 1703. In some embodiments, surgical robotic system 1700 may include at least two surgical carts 1703, with one surgical tool 1701 mounted to each surgical cart 1703. In some embodiments, the surgical robotic system 1700 may also include an imaging tool 1704. The imaging tool 1704 may include an operating arm and an imaging module disposed at a distal end of the operating arm. The imaging tool 1704 may be disposed on the surgical trolley 1703 and driven by a corresponding drive module. The image of the manipulator arm of the surgical tool 1701 and its end instrument acquired by the imaging module may be transmitted to the master trolley 1702. In some embodiments, portions of the surgical tool 1701 or portions of the imaging tool 1704 may act as a slave tool. In some embodiments, the master trolley 1702 includes a master manipulator for teleoperating the surgical tool 1701 or the imaging tool 1704. In some embodiments, surgical tool 1701 is, for example, surgical tool 1800 shown in fig. 18. In some embodiments, master trolley 1702 is, for example, master trolley 1900 shown in fig. 19. In some embodiments, the surgical trolley 1703 is, for example, the surgical trolley 2000 shown in fig. 20.
Fig. 18 illustrates a schematic diagram of a surgical tool 1800 in accordance with some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 18, the surgical tool 1800 includes a drive transmission 1890, an operating arm 1840, and an end instrument 1860 disposed at the end of the operating arm. In some embodiments, the drive transmission 1890 is configured to connect with the drive module; the driving force of the drive module is transmitted through the drive transmission 1890 to the operating arm 1840, driving the operating arm 1840 in multi-degree-of-freedom motion. The drive module may also control the end instrument 1860 to perform a surgical procedure. In some embodiments of the present disclosure, the end instrument 1860 may include, but is not limited to, bipolar curved split-jaw actuators, bipolar elbow grasper actuators, monopolar curved scissors actuators, monopolar electric hook actuators, bipolar grasper actuators, needle holder actuators, and tissue grasper actuators. In some embodiments, the surgical tool 1800 may be mounted, for example, on the surgical trolley 1703 shown in fig. 17 or the surgical trolley 2000 shown in fig. 20.
Fig. 19 illustrates a schematic diagram of a master trolley 1900 of some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 19, the master trolley 1900 includes: a controller (which may be configured on a computer device disposed within the master trolley 1900), a main manipulator 1901, master trolley displays (e.g., displays 1902-1904), and pedals (e.g., pedals 1905-1907). The controller is communicatively connected with the main manipulator 1901, the master trolley displays, and the pedals respectively, performs signal interaction with them, and generates corresponding control instructions based on the collected control information. In some embodiments, the controller is also communicatively coupled to a surgical trolley, such as the surgical trolley 1703 shown in fig. 17, to control the surgical tool 1701 to perform a surgical operation or to control the imaging tool 1704 to operate. In some embodiments, the controller of the master trolley 1900 may also be used to perform some or all of the steps in the methods of some embodiments of the present disclosure, such as some or all of the steps in the methods disclosed in figs. 6, 7, 8, 9, 11, 13, and 14.
In some embodiments, the main manipulator 1901 generally includes a left main manipulator (e.g., for controlling a first operating arm) and a right main manipulator (e.g., for controlling a second operating arm), corresponding respectively to the left-hand and right-hand operation of a medical staff member. In a practical scenario, the master manipulator 1901 collects the operation inputs of the medical staff member, who, by teleoperating the master manipulator 1901, controls the movement of a surgical tool or an imaging tool in the operation area to carry out a medical operation. In some embodiments, the master manipulator 1901 comprises a multi-degree-of-freedom mechanical arm 19011; a master manipulator sensor is disposed at each joint of the mechanical arm 19011, and joint information (e.g., joint angle data) is generated by the sensor of each joint. In some embodiments, the multi-degree-of-freedom mechanical arm 19011 has six degrees of freedom. In some embodiments, the pose of the main manipulator 1901 may be represented by a set of joint information of the main manipulator joints (e.g., a one-dimensional matrix composed of such joint information). In some embodiments, the master manipulator 1901 further includes a clamp 19012, which can be used to control the opening and closing angle of the end instrument. In some embodiments, the main manipulator 1901 may specifically be the main manipulator 1500 shown in fig. 15. In some embodiments, the master trolley display includes a stereoscopic display 1902, a master external display 1903, and a master touch display 1904. The stereoscopic display 1902 and the master external display 1903 display the image of the operation site and system status prompts, and the touch display 1904 displays the software user interface of the master trolley 1900.
In some embodiments, the image displayed by the stereoscopic display 1902 or the master external display 1903 may be determined based on the image acquired by the imaging module, such as the imaging module 2060b shown in fig. 20. In some embodiments, the master trolley pedals collect input from both feet of the medical staff member and include an electrotome pedal 1905, an electrocoagulation pedal 1906, a clutch pedal 1907, and the like.
Fig. 20 illustrates a schematic diagram of a surgical trolley 2000 of some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 20, the surgical trolley 2000 includes: a controller (which may be configured on a computer device disposed inside the surgical trolley 2000), a surgical trolley chassis 2002, a surgical trolley case 2003, a system status display 2005, a primary column 2006, a primary cross member 2007, positioning arms 2008, drive modules 2009, and the like. The surgical trolley chassis 2002 provides the movement and fixation functions of the surgical trolley 2000. The surgical trolley case 2003 integrates the electrical components of the surgical trolley. The system status display 2005 displays the surgical trolley system user interface and receives user input. The primary column 2006 is liftable, and its top end is fixed to the primary cross member 2007. The end of the primary cross member 2007 is provided with a cross-member holder, to whose lower end a plurality of positioning arms 2008 are fixed. Each positioning arm 2008 carries a drive module 2009, which is used to mount the surgical tool 2001 or the imaging tool 2004 (the imaging tool 2004 may be, for example, a 3D electronic endoscope). In some embodiments, the surgical trolley 2000 integrates multiple positioning arms 2008, each positioning arm 2008 having multiple motion joints. In some embodiments, the surgical trolley 2000 integrates a plurality of surgical tools 2001 and an imaging tool 2004; portions of the operating arms 2040a and end instruments 2060a of the surgical tools 2001 and portions of the operating arm 2040b and imaging module 2060b of the imaging tool 2004 enter the workspace through the sheath 2010.
In some embodiments, the controller of the surgical trolley 2000 may also be used to perform some or all of the steps in the methods of some embodiments of the present disclosure, such as some or all of the steps in the methods disclosed in fig. 6, 7, 8, 9, 11, 13, and 14.
Note that the above are merely exemplary embodiments of the present disclosure and the technical principles applied therein. Those skilled in the art will appreciate that the present disclosure is not limited to the particular embodiments described herein, and that various obvious changes, rearrangements, and substitutions can be made without departing from the scope of the disclosure. Therefore, while the present disclosure has been described in connection with the above embodiments, it is not limited to them and may include many other equivalent embodiments without departing from its spirit, the scope of which is determined by the appended claims.

Claims (20)

1. A method of master-slave motion control, comprising:
acquiring a positioning image;
identifying a plurality of pose identifiers on the slave tool in the positioning image, wherein the plurality of pose identifiers comprise different pose identifier patterns;
determining a current pose of the slave tool relative to a reference coordinate system based on the plurality of pose identifiers;
determining a target pose of a handle of a master manipulator based on the current pose of the slave tool; and
generating a control signal for the master manipulator based on the target pose of the handle of the master manipulator.
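Outside the claim language, the control loop recited in claim 1 can be sketched in a few lines, assuming (as in claim 17) that the target handle pose is taken to be the slave tool's current pose and that the control signal is a simple proportional correction; the function name `align_handle` and the gain `k` are illustrative assumptions, not part of the disclosure:

```python
def align_handle(slave_pose, handle_pose, k=1.0):
    """Claim-1 sketch: the target handle pose is the slave tool's
    current pose (cf. claim 17); the control signal is a proportional
    correction toward that target, computed per pose component."""
    target = slave_pose  # target pose of the handle := current slave pose
    # control signal: per-component error scaled by a hypothetical gain k
    return [k * (t - h) for t, h in zip(target, handle_pose)]

# When the handle already matches the slave tool, no correction is issued.
signal = align_handle([0.1, 0.2, 0.3], [0.1, 0.2, 0.3])
```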
2. The control method according to claim 1, further comprising:
determining wrap angles of the plurality of pose identifiers about a Z axis of a slave tool coordinate system based on the distribution of the plurality of pose identifiers; and
determining three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system based on the wrap angles of the plurality of pose identifiers.
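For illustration, if the pose identifiers are assumed to lie on a cylindrical tool shaft of known radius (a modeling assumption, not stated in the claim), the mapping in claim 2 from wrap angles about the tool Z axis to three-dimensional coordinates in the slave tool coordinate system reduces to basic trigonometry:

```python
import math

def marker_coordinates(wrap_angles_deg, radius, z_height):
    """Map wrap angles about the tool Z axis to 3D coordinates in the
    slave tool coordinate system, assuming the markers sit on a
    cylindrical shaft of the given radius at the given axial height."""
    coords = []
    for theta in wrap_angles_deg:
        t = math.radians(theta)
        coords.append((radius * math.cos(t), radius * math.sin(t), z_height))
    return coords
```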
3. The control method according to claim 2, further comprising:
determining two-dimensional coordinates of the plurality of pose identifiers in the positioning image; and
determining the current pose of the slave tool relative to the reference coordinate system based on the two-dimensional coordinates of the plurality of pose identifiers in the positioning image and the three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system.
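The 2D-3D correspondence used in claim 3 is the classical perspective-n-point (PnP) setup: a pose solver inverts the pinhole projection below, which maps a marker's three-dimensional coordinates in the slave tool coordinate system to its two-dimensional coordinates in the positioning image. The intrinsic matrix `K` and the example values are illustrative assumptions:

```python
def project(K, R, t, X):
    """Pinhole projection: a 3D point X in the slave tool coordinate
    system is mapped into the positioning image by the camera pose
    (R, t) and the intrinsic matrix K. A PnP-style pose solver inverts
    this relation from several 2D-3D marker correspondences."""
    # camera-frame coordinates: x_cam = R · X + t
    xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
    # perspective division, then intrinsics (focal lengths and principal point)
    u = K[0][0] * xc[0] / xc[2] + K[0][2]
    v = K[1][1] * xc[1] / xc[2] + K[1][2]
    return u, v
```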
4. The control method according to claim 1, further comprising:
determining a plurality of candidate pose identifiers from the positioning image;
identifying a first pose identifier from the plurality of candidate pose identifiers based on a plurality of different pose pattern matching templates; and
searching for the remaining pose identifiers using the first pose identifier as a starting point.
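The identification-then-search strategy of claim 4 can be illustrated with pose identifier patterns reduced to bit tuples (a deliberate simplification of real image patches); `identify_first_marker` and the outward search order in `search_from` are hypothetical names and choices, not the disclosed implementation:

```python
def identify_first_marker(candidates, templates):
    """Match each candidate patch against the known pattern templates;
    return (candidate_index, template_id) of the first exact match."""
    for i, patch in enumerate(candidates):
        for tid, tpl in templates.items():
            if patch == tpl:
                return i, tid
    return None

def search_from(first_index, n_markers):
    """Hypothetical search order: scan outward from the first
    identified marker toward its neighbors."""
    order = [first_index]
    for step in range(1, n_markers):
        for idx in (first_index - step, first_index + step):
            if 0 <= idx < n_markers:
                order.append(idx)
    return order
```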
5. The control method of claim 4, wherein the pose identifiers comprise corner points of the pose identification patterns, the method further comprising:
determining a region of interest in the positioning image;
dividing the region of interest into a plurality of sub-regions;
determining, in each sub-region, the pixel with the maximum corner likelihood value to form a pixel set;
determining the pixel with the maximum corner likelihood value in the pixel set as a candidate pose identification pattern corner point; and
matching the plurality of different pose pattern matching templates against the patterns at the candidate pose identification pattern corner point positions, so as to identify the first pose identifier.
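The sub-region selection of claim 5 can be sketched as a two-stage maximum over a corner-likelihood map: a per-sub-region maximum forms the pixel set, and the maximum over that set is the candidate pattern corner. The grid split below is an illustrative choice:

```python
def candidate_corner(likelihood, n_rows=2, n_cols=2):
    """Claim-5 sketch: split the region of interest into sub-regions,
    keep the best pixel of each sub-region (the 'pixel set'), then take
    the overall best pixel as the candidate pattern corner point."""
    h, w = len(likelihood), len(likelihood[0])
    pixel_set = []
    for br in range(n_rows):
        for bc in range(n_cols):
            cells = [(likelihood[r][c], (r, c))
                     for r in range(br * h // n_rows, (br + 1) * h // n_rows)
                     for c in range(bc * w // n_cols, (bc + 1) * w // n_cols)]
            pixel_set.append(max(cells))  # best pixel in this sub-region
    return max(pixel_set)[1]  # best pixel over the whole pixel set
```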
6. The control method according to claim 4, further comprising:
determining an arrangement sequence of the plurality of pose identifiers based on at least two of the plurality of pose identifiers; and
determining three-dimensional coordinates of the plurality of pose identifiers relative to the slave tool coordinate system based on the arrangement sequence of the plurality of pose identifiers.
7. The control method according to any one of claims 1 to 6, characterized by further comprising:
determining a current pose of the handle of the master manipulator; and
generating the control signal for the master manipulator based on the target pose and the current pose of the handle of the master manipulator.
8. The control method according to claim 7, wherein the master manipulator includes at least one pose joint for controlling a pose of the handle, the control method comprising:
acquiring joint information of the at least one pose joint; and
determining the current pose of the handle of the master manipulator based on the joint information of the at least one pose joint.
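As a sketch of claim 8, joint information from the pose joints can be composed by forward kinematics into the handle orientation. A hypothetical yaw-pitch-roll wrist (Z-Y-X rotation order) is assumed here; the actual joint layout of the master manipulator may differ:

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def handle_orientation(yaw, pitch, roll):
    """Compose the pose-joint rotations (hypothetical Z-Y-X order)
    into the handle's orientation matrix."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))
```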
9. The control method of any one of claims 1-6, wherein the slave tool includes an operating arm and an end instrument disposed at a distal end of the operating arm, and determining the current pose of the slave tool relative to the reference coordinate system includes:
determining a current pose of the end instrument relative to a base coordinate system of the slave tool; or
determining a current pose, relative to a world coordinate system, of an image of the end instrument in a display.
10. The control method according to any one of claims 1-6, wherein the master manipulator includes at least one pose joint for controlling a pose of the handle of the master manipulator, and the control signal includes a control signal for controlling one or more of the at least one pose joint.
11. The control method of claim 10, wherein the one or more of the at least one pose joint comprise an uncoupled pose joint, the control method further comprising:
obtaining joint information of the pose joints of the at least one pose joint other than the one or more pose joints; and
determining a transformation matrix of the other pose joints based on the joint information of the other pose joints.
12. The control method according to claim 11, characterized by further comprising:
generating the control signal for the master manipulator based on the target pose of the handle of the master manipulator and the transformation matrix of the other pose joints.
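Claims 11 and 12 can be illustrated as follows: if the handle orientation factors as R_target = R_other · R_driven (an assumed factorization order), the rotation that the driven pose joints must realize is obtained by multiplying the target by the transpose (inverse) of the other joints' transformation matrix:

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(R):
    return [list(row) for row in zip(*R)]

def driven_joint_rotation(R_other, R_target):
    """Factor out the non-driven pose joints:
    R_target = R_other · R_driven  =>  R_driven = R_other^T · R_target
    (for a rotation matrix, the transpose equals the inverse)."""
    return matmul(transpose(R_other), R_target)
```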
13. The control method according to any one of claims 1 to 6, characterized by further comprising:
determining a degree of pose matching between the handle of the master manipulator and the slave tool in response to a predetermined condition being satisfied, the predetermined condition comprising the triggering of teleoperation control.
14. The control method according to claim 13, characterized by further comprising:
determining the degree of pose matching between the handle of the master manipulator and the slave tool based on the current pose of the handle of the master manipulator and the current pose of the slave tool.
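One common way to quantify the degree of pose matching in claims 13-14 (an illustrative choice, not specified in the claims) is the angle of the relative rotation between the handle pose and the slave tool pose, optionally normalized to a [0, 1] matching degree:

```python
import math

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose(R):
    return [list(row) for row in zip(*R)]

def pose_matching_angle(R_handle, R_tool):
    """Angle of the relative rotation between the handle pose and the
    slave tool pose: theta = arccos((trace(R_handle^T · R_tool) - 1) / 2)."""
    M = matmul(transpose(R_handle), R_tool)
    tr = M[0][0] + M[1][1] + M[2][2]
    # clamp against floating-point drift before arccos
    return math.acos(max(-1.0, min(1.0, (tr - 1.0) / 2.0)))

def matching_degree(R_handle, R_tool):
    """Map the relative angle to [0, 1]: 1.0 means identical poses."""
    return 1.0 - pose_matching_angle(R_handle, R_tool) / math.pi
```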
15. The control method according to claim 13, characterized by further comprising:
generating the control signal for the handle of the master manipulator in response to the degree of pose matching being lower than a preset threshold, so that the degree of pose matching becomes higher than or equal to the preset threshold.
16. The control method according to claim 13, characterized by further comprising:
establishing a master-slave mapping between the master manipulator and the slave tool in response to the degree of pose matching being higher than or equal to a preset threshold.
17. The control method according to any one of claims 1-6, characterized in that the target pose of the handle of the master manipulator is consistent with the current pose of the slave tool.
18. A robotic system, comprising:
a master manipulator comprising a mechanical arm, a handle disposed on the mechanical arm, and at least one motor and at least one master manipulator sensor disposed at at least one joint of the mechanical arm;
a slave tool including an operating arm and an end instrument disposed at a distal end of the operating arm;
an image collector for collecting positioning images; and
a control apparatus in communication with the image collector and the master manipulator, the control apparatus being configured to perform the control method of any one of claims 1-17.
19. A computer device, the computer device comprising:
a memory for storing at least one instruction; and
a processor coupled with the memory and configured to execute the at least one instruction to perform the control method of any one of claims 1-17.
20. A computer-readable storage medium storing at least one instruction that, when executed by a computer, causes a robotic system to implement the control method of any one of claims 1-17.
CN202210059153.8A 2022-01-19 2022-01-19 Master-slave motion control method based on pose identification and surgical robot system Pending CN116492064A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210059153.8A CN116492064A (en) 2022-01-19 2022-01-19 Master-slave motion control method based on pose identification and surgical robot system


Publications (1)

Publication Number Publication Date
CN116492064A true CN116492064A (en) 2023-07-28

Family

ID=87323633

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210059153.8A Pending CN116492064A (en) 2022-01-19 2022-01-19 Master-slave motion control method based on pose identification and surgical robot system

Country Status (1)

Country Link
CN (1) CN116492064A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117598787A (en) * 2024-01-08 2024-02-27 上海卓昕医疗科技有限公司 Medical instrument navigation method, device, equipment and medium based on medical image


Similar Documents

Publication Publication Date Title
US20230200923A1 (en) Systems, methods, and computer-readable program products for controlling a robotically delivered manipulator
US10730187B2 (en) Tool position and identification indicator displayed in a boundary area of a computer display screen
WO2018214840A1 (en) Surgical robot system, and method for displaying position of surgical instrument
CN112472297B (en) Pose monitoring system, pose monitoring method, surgical robot system and storage medium
US11806090B2 (en) System and method for image based registration and calibration
CN112618026B (en) Remote operation data fusion interactive display system and method
CN113876434A (en) Master-slave motion control method, robot system, device, and storage medium
CN112384339B (en) System and method for host/tool registration and control for intuitive motion
US20220401178A1 (en) Robotic surgical navigation using a proprioceptive digital surgical stereoscopic camera system
US20220415006A1 (en) Robotic surgical safety via video processing
CN114343847A (en) Hand-eye calibration method of surgical robot based on optical positioning system
CN114536399B (en) Error detection method based on multiple pose identifications and robot system
CN116492064A (en) Master-slave motion control method based on pose identification and surgical robot system
CN116492062A (en) Master-slave movement control method based on composite identification and surgical robot system
CN116492063A (en) Master-slave motion control method based on positioning image and surgical robot system
WO2022249163A1 (en) System and method of gesture detection and device positioning
CN113876433A (en) Robot system and control method
CN114523471A (en) Error detection method based on associated identification and robot system
CN116728394A (en) Control method of robot system based on positioning image and robot system
US20230248467A1 (en) Method of medical navigation
CN116725676A (en) Control method of robot system based on pose identification and robot system
CN116725675A (en) Control method of robot system based on composite identification and robot system
CN116459019A (en) Pose identification-based control method for preventing collision of operation arm and surgical robot system
CN116460837A (en) Operation arm anti-collision control method based on association identification and operation robot system
US20230115849A1 (en) Systems and methods for defining object geometry using robotic arms

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination