CN115256384A - Method and device for determining performance value of robot and controller

Info

Publication number
CN115256384A
CN115256384A
Authority
CN
China
Prior art keywords
target instrument
target
instrument
amount
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210878229.XA
Other languages
Chinese (zh)
Inventor
袁帅
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Microport Medbot Group Co Ltd
Original Assignee
Shanghai Microport Medbot Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Microport Medbot Group Co Ltd filed Critical Shanghai Microport Medbot Group Co Ltd
Priority to CN202210878229.XA
Publication of CN115256384A
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 - Programme-controlled manipulators
    • B25J 9/16 - Programme controls
    • B25J 9/1602 - Programme controls characterised by the control system, structure, architecture
    • B25J 9/161 - Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • B25J 9/1656 - Programme controls characterised by programming, planning systems for manipulators
    • B25J 9/1694 - Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J 9/1697 - Vision controlled systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)

Abstract

This specification provides a method, an apparatus, and a controller for determining a performance value of a robot. The method comprises: acquiring an input action amount that drives the tip of a target instrument to move; predicting, from the perspective of the motion control coordinate system, a first action amount by which the input action amount would drive the tip of the target instrument to move if the performance value of the target instrument reached a predetermined threshold; determining, from the perspective of the coordinate system of the image acquired by the image acquisition assembly, a second action amount by which the tip of the target instrument actually moves in response to the input action amount; and determining the instrument performance from the difference between the first and second action amounts. The scheme determines the performance value of the target instrument objectively, without relying on operator experience, and with high detection efficiency; the performance value can be determined in real time before, during, and after an operation.

Description

Method and device for determining performance value of robot and controller
Technical Field
The present disclosure relates to the field of mechanical equipment technologies, and in particular, to a method, an apparatus, and a controller for determining a performance value of a robot.
Background
At present, many operations are performed by mechanical equipment, and the performance of each component of that equipment determines, to a large extent, the accuracy of the operation result. In scenarios with demanding precision requirements, such as minimally invasive surgery, the performance of the robotic device must be watched particularly closely, so performance detection of the equipment is often required.
In the conventional performance detection method, before the operation, an operator controls the mechanical equipment to perform a "trial operation" (for example, controlling the equipment to clamp and shear while it is away from human tissue), observes the result in real time, and estimates the performance of the equipment from what is observed. If the performance reaches the standard, the equipment can be used for the formal operation (for example, clamping and shearing during surgery on human tissue); otherwise, the equipment must be replaced and tested again until equipment whose performance reaches the standard is found. During the formal operation, the operator must also judge from real-time observation whether the performance of the equipment has changed, and stop the operation in time if the performance drops below the standard.
The conventional performance detection method therefore involves cumbersome operator steps and places high demands on operator experience.
Disclosure of Invention
The embodiments of the present application aim to provide a method, an apparatus, and a controller for determining a performance value of a target instrument, so as to solve the problems of the existing performance detection methods: cumbersome operator steps and a high demand on operator experience.
A first aspect of the present specification provides a method for determining a performance value of a robot, comprising: acquiring an input action amount that drives the tip of a target instrument to move, the target instrument being mounted on a mechanical arm of a target robot; predicting a first action amount by which the input action amount would drive the tip of the target instrument to move if the performance value of the target instrument reached a predetermined threshold; determining, from an image formed from signals acquired by an image acquisition assembly, a second action amount by which the tip of the target instrument actually moves in response to the input action amount; and determining the instrument performance from the difference between the first and second action amounts.
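The four steps of the first aspect can be sketched as follows. This is a minimal illustration only: the function names, the callable arguments, and the use of a Euclidean norm for the difference are all assumptions, not anything specified by the patent, which leaves the prediction model, the coordinate mappings, and the difference metric open.

```python
import math

def performance_value(input_action, predict_first_amount, observed_second_amount,
                      map_fk_to_common, map_img_to_common):
    """Sketch of the first-aspect method (all names are illustrative).

    input_action          -- the input action amount driving the instrument tip
    predict_first_amount  -- predicts the tip motion a fully performing instrument
                             would make for input_action (motion-control frame)
    observed_second_amount-- tip motion measured in the endoscope image (image frame)
    map_*_to_common       -- map each action amount into one predetermined frame
    """
    first = map_fk_to_common(predict_first_amount(input_action))
    second = map_img_to_common(observed_second_amount)
    # The performance value is derived from the difference of the two amounts:
    # a small difference means the instrument responds as commanded.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(first, second)))
```

With identity mappings and a prediction that echoes the input, a perfectly performing instrument yields a difference of zero; any degradation shows up as a positive value that can be compared against a threshold.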
A second aspect of the present specification provides an apparatus for determining a performance value of a robot, comprising: a first acquisition unit for acquiring an input action amount that drives the tip of a target instrument to move, the target instrument being mounted on a mechanical arm of a target robot; a first prediction unit for predicting a first action amount by which the input action amount would drive the tip of the target instrument to move if the performance value of the target instrument reached a predetermined threshold; a first determination unit for determining, from an image formed from signals acquired by an image acquisition assembly, a second action amount by which the tip of the target instrument moves in response to the input action amount; and a second determination unit for determining the instrument performance from the difference between the first and second action amounts.
A third aspect of the present specification provides a robot system, comprising: a target robot with a base, a first mechanical arm whose end carries a target instrument, and a second mechanical arm whose end carries an image acquisition assembly; and a controller for controlling the first and second mechanical arms of the target robot so that the target instrument performs an operation on the operated object while the image acquisition assembly acquires real-time images of the process. The controller is further configured to perform the method for determining a performance value of a robot of any one of the first aspect.
A fourth aspect of the present specification provides a controller comprising: a memory and a processor, the processor and the memory being communicatively connected to each other, the memory having stored therein computer instructions, the processor implementing the steps of the method of any one of the first aspect by executing the computer instructions.
A fifth aspect of the present description provides a computer storage medium storing computer program instructions which, when executed, implement the steps of the method of any one of the first aspects.
In the method, apparatus, and controller for determining a performance value of a robot provided in this specification, a first action amount, by which the tip of the target instrument would be driven to move if its performance reached a predetermined threshold, is predicted from the input action amount; this first amount is determined from the motion control coordinate system. A second action amount, by which the tip of the target instrument actually moves in response to the input action amount, is determined from the image formed from the signals acquired by the image acquisition assembly, that is, from the perspective of the image coordinate system. The performance value of the target instrument is then determined from the difference between the first and second action amounts after both are mapped into a predetermined coordinate system. The performance value is thus determined objectively, without relying on operator experience; the method places low demands on the operator and has high detection efficiency. Moreover, it can determine the performance value in real time before, during, and after the operation, so that an instrument whose performance does not reach the standard can be flagged in time, preventing instrument performance from affecting the operation result and improving operation reliability.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 shows a schematic view of a surgical robotic system;
FIG. 2 shows a schematic diagram of a control end device in a surgical robotic system;
FIG. 3 shows a schematic diagram of an image-side device in a surgical robotic system;
FIG. 4 shows a schematic diagram of an execution end device in a surgical robotic system;
FIG. 5 is a flowchart illustrating a method of determining a performance value of a robot provided in the present specification;
FIG. 6 shows a schematic of the general construction of the power cartridge;
FIG. 7 shows a schematic component diagram of a power assembly within a power cartridge;
FIG. 8 shows a schematic of the components of the transmission assembly in the power pack;
FIG. 9 is a schematic diagram showing the positional relationship of the target instrument, the image acquisition assembly and the operated object during the performance of the operation;
FIG. 10 shows a schematic of the structure of the interior of the target instrument tip;
FIG. 11 shows a schematic view of the degrees of freedom of movement of a robotic device;
FIG. 12 is a schematic diagram of various coordinate systems associated with a robot arm holding motion control process;
FIG. 13 is a schematic diagram showing various coordinate systems associated with the image acquisition assembly on the scope holding arm of the robot;
FIG. 14 is a flow chart illustrating another method of determining a performance value for a robot provided herein;
FIG. 15 is a flowchart illustrating yet another method of determining a performance value of a robot provided in the present specification;
FIG. 16 illustrates a display diagram of instrument performance information;
FIG. 17 is a schematic block diagram of an apparatus for determining a performance value of a robot provided in the present specification;
FIG. 18 shows a functional block diagram of a controller provided herein.
Detailed Description
To help those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments are described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only a part of the embodiments of the present application, not all of them. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without inventive work shall fall within the scope of protection of the present application.
The present specification provides a system capable of automatically determining performance values of an operating device before, during, and after operation. In this specification, a "robot" refers to an operation device for performing a certain operation.
For example, the operating device, which may also be referred to as a robotic system, performs minimally invasive surgery, as shown in FIG. 1. The system consists of a control-end device 100, an execution-end device 200, and an image-end device 300. The control-end device 100, generally called a console or doctor console, is located outside the sterile field of the operating room and sends control instructions to the execution-end device 200. The execution-end device 200, i.e., the surgical robot device (in this specification, the surgical robot device is simply called a surgical robot, and a robot device is simply called a robot), controls a surgical instrument mounted at the end of its mechanical arm to perform a specific surgical operation on a patient according to the control instructions; it may also carry an endoscope. The image-end device 300, generally referred to as the image trolley, processes the information collected by the endoscope into a three-dimensional stereoscopic high-definition image and feeds it back to the control-end device 100.
As shown in FIG. 2, the control-end device 100, i.e., the doctor console, carries a main manipulator, an imaging device, and a main controller. The main manipulator detects the hand movements of the surgeon and uses them as control signals for the whole surgical robot system. The imaging device presents the chief surgeon with the stereoscopic image of the patient's body detected by the endoscope, providing reliable image information for the surgical operation. During surgery, the chief surgeon sits at the doctor console, observes the transmitted intracavity three-dimensional image on the imaging device, and controls the surgical robot and the endoscope through the main manipulator: the movements of both hands control the mechanical arm mechanism and the surgical instruments on the surgical robot to complete various operations on the patient. The main controller is the core control element of the surgical robot system and governs its operations and functions.
As shown in FIG. 3, the image-end device 300 mainly includes an endoscope (not shown), an endoscope processor, and a display device. The endoscope comprises a tube body inserted into the patient's body, an observation lens and an illumination lens arranged at the front end of the tube body, an optical fiber, and an eyepiece; it illuminates the cavity interior and acquires a stereoscopic image of it. The endoscope processor processes the acquired intracavity stereoscopic images, and the display device displays the processed images in real time.
As shown in FIG. 4, the execution-end device 200, i.e., the surgical robot device, is located in the sterile area of the operating room. Its main functions are to control the surgical instruments mounted at the ends of its mechanical arms to perform specific surgical operations on the patient according to the control instructions given by the chief surgeon, and to carry an endoscope. An assistant surgeon is usually also present in the sterile field to replace the surgical instruments installed on the surgical robot and to help the chief surgeon complete the operation. To ensure patient safety, the assistant surgeon typically has a higher control priority over the surgical robot.
The term "operation" as used herein includes not only therapeutic operations such as excision and suturing of the patient's body with a medical instrument, but also operations such as incision, grasping, or puncture performed to take lesion tissue out of the body for pathological examination (i.e., biopsy); it denotes any treatment of the patient's body required for diagnosis and therapy.
The method for determining a performance value of a robot provided in this specification is described below, mainly taking the above surgical robot system as an example. As shown in FIG. 5, the method includes the following steps:
s10: an input action amount for driving the distal end of the target instrument to move is acquired.
The target instrument is mounted on a mechanical arm of the target robot; it may be detachably mounted or integrated with the robot.
The term "instrument" in this specification is to be understood broadly as a tool, for example a tool that performs shearing, clamping, or the like. The "tip" (distal end) of the instrument is the end that faces or contacts the target operation object during the operation; the "head end" is the opposite end. For instruments detachably mounted on the robot, the instrument is typically replaced as a unit, so its shape and length are clearly identifiable, and the "head end" is in practice the end mounted to the robot.
On the one hand, since the operation result is the effect of the tool on the operated object, attention during the operation is mainly focused on the interaction between the tip of the tool and the object; on the other hand, the field of view of the image acquisition assembly is limited and is generally adjusted only to keep the tip of the tool within it. For these reasons, S10 acquires the input action amount that drives the movement of the instrument tip.
The input action amount may drive the tip of the instrument to move by driving the mechanical arm; alternatively, the mechanical arm may be kept stationary and only the tip of the instrument driven, for example by decreasing the angle between the two blades of a shearing instrument to perform a cut.
For an instrument detachably mounted on the robot, step S10 ("acquiring the input action amount that drives the tip of the target instrument to move") may acquire, after the target instrument is mounted and while it is moving, the action amount of any power transmission component on the target robot other than the target instrument itself. For example, the power transmission component may be a manipulator that transmits the operator's body movements to the target robot, or an interface on the robot for mounting the target instrument; once the instrument is mounted on the interface, the interface can transmit forces on the robot to the instrument tip, or conduct electrical control signals on the robot to the instrument, thereby controlling the movement of its tip.
The manipulator in this specification may be a joystick having a plurality of degrees of freedom as shown in fig. 2, may be an existing multi-axis parallel master-hand structure, or may be a motion sensing device such as a manual controller, an eye controller, or an electroencephalogram controller.
In some embodiments, S10 may include: s11 and/or S12.
S11: acquiring the pose change amount of a manipulator in the robot system and taking it as the input action amount; the manipulator is used to control the movement of the tip of the target instrument.
A manipulator is a mechanism that converts the operator's body motion into control signals for the robot. As shown in FIG. 2, the console is provided with a main manipulator comprising separately arranged left and right manipulators; the operator's hands control the pose changes of the main manipulator. The main manipulator may include multiple movable joints, giving it multiple degrees of freedom of movement.
A master-slave consistency control rule is set in the robot system: under this rule, the pose of the robot changes correspondingly with the pose change of the main manipulator. That is, the pose changes of the master end and the slave end are kept consistent, where the master end is the main manipulator at the operator's side and the slave end is the robot device.
The pose change amount of the manipulator may be its change in each degree of freedom. While the operator controls the pose of the main manipulator, the first controller records the real-time pose of the main manipulator and the corresponding time, so the pose change amount can be obtained from the recorded data of the controller.
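The per-degree-of-freedom pose change described above can be derived from such records, for example as follows. The data layout (a list of time/pose pairs, one value per degree of freedom) is an illustrative assumption; the patent does not specify how the controller stores its log.

```python
def pose_change(records, t_start, t_end):
    """Pose change amount of the master manipulator between two recorded instants.

    records -- list of (time, pose) pairs as the controller might log them,
               where pose is a tuple with one value per degree of freedom.
    Returns the change in each degree of freedom (illustrative sketch only).
    """
    poses = {t: p for t, p in records}
    start, end = poses[t_start], poses[t_end]
    # Element-wise difference: how far each joint/DOF moved over the interval.
    return tuple(e - s for s, e in zip(start, end))
```

The resulting tuple is the "input action amount" in the sense of S11: one change value per degree of freedom of the manipulator.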
S12: acquiring the pose change amount of a power transmission assembly in the robot system that drives the tip of the target instrument to move, and taking it as the input action amount; the power transmission assembly is mounted on a target mechanical arm of the target robot, is mechanically connected with the target instrument, and drives the tip of the target instrument to move.
The power transmission assembly is mechanically coupled to the target instrument, wherein the mechanical coupling may be in direct contact with the coupling end of the target instrument or may be in contact with the coupling end of the target instrument through other components.
The robot system usually includes many power transmission components, such as gear, belt, and chain transmission assemblies; the power transmission assembly in S12 may be any one of these or a combination of two or more.
For example, the pose variation of a gear transmission assembly may be the rotation angle or number of rotations of the gear; that of a belt transmission assembly may be the distance traveled by any point on the belt relative to a fixed point outside the belt, or the number of revolutions; that of a chain transmission assembly may be defined analogously for a point on the chain.
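The transmission-side quantities above can be reduced to a comparable travel value; the conversions below are illustrative assumptions (real devices have their own transmission ratios and pitch radii, which the patent does not give).

```python
import math

def gear_travel(rotation_deg, pitch_radius_mm):
    """Linear travel delivered at the pitch circle by a gear that turned
    rotation_deg degrees (arc length = angle in radians * radius).
    Illustrative conversion; actual ratios are device-specific."""
    return math.radians(rotation_deg) * pitch_radius_mm

def belt_travel(point_displacement_mm):
    """For a belt (or chain) drive, the displacement of any point on the belt
    relative to a fixed external point already is the transmitted travel."""
    return point_displacement_mm
```

Either value can then serve as the input action amount of S12 when comparing against the tip motion observed in the image.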
Because the transmission path from the manipulator's input action amount to the target instrument is longer than that from the power transmission assembly, the input action amount of the manipulator allows a more comprehensive determination of the overall performance of the target robot or robot system, while the input action amount of the power transmission assembly allows a more accurate determination of the performance of the target instrument itself.
In case it is desired to determine the performance of the target robot or robotic system, S10 may comprise S11.
Where it is desired to determine the performance of a target instrument mounted on the target robot, S10 can include S12. Alternatively, S10 may include both S11 and S12: S11 is executed first to roughly determine the performance of the target instrument from the input action amount of the manipulator, and if this result does not reach the predetermined performance value, S12 is then executed to determine the performance of the target instrument more accurately from the input action amount of the power transmission assembly.
In some embodiments, S11 may be acquiring the amount of change in the pose of the manipulator during the operation of the manipulator by the operator to perform the target operation. The target operation may be a performance detection operation before the formal operation, or may be an operation during the formal operation. The formal operation may be, for example, a surgical operation.
For example, S11 may include the following steps S111 and S112.
S111: when an operator operates the manipulator to move the tip of the target instrument, it is recognized whether a pose change of the manipulator matches a preset action in a preset action set.
The manipulator can be operated by hand or through movements of other body parts.
The preset action set can comprise a plurality of preset actions, and the preset actions can be predetermined actions capable of controlling the tail end of the target instrument to make a pose change. For example, the preset motion may be a motion that controls the cutting instrument to move in a certain direction, may be a motion that controls the scissors of the cutting instrument to open or close, or the like.
In S111, while the operator operates the manipulator to control the movement of the tip of the target instrument, the pose of the manipulator is acquired in real time. A judgment may be performed once every predetermined time interval, or once after a pose change is completed, to determine whether any of the manipulator's pose changes within the latest time interval belongs to a preset action.
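The matching judgment of S111 can be sketched as a comparison against templates; the dictionary layout, the per-degree-of-freedom tolerance, and the action names below are all hypothetical, since the patent leaves the matching rule open.

```python
def match_preset_action(pose_change, preset_actions, tol=0.05):
    """Return the name of the first preset action whose template pose change is
    within tol of the observed change in every degree of freedom, else None.

    pose_change    -- observed pose change of the manipulator (tuple per DOF)
    preset_actions -- {name: template_pose_change} (illustrative layout)
    """
    for name, template in preset_actions.items():
        if all(abs(o - t) <= tol for o, t in zip(pose_change, template)):
            return name
    return None
```

When a match is found, S112 takes the corresponding pose change amount as the input action amount; when nothing matches, no performance determination is triggered for that interval.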
S112: in the case of a match with a preset action, acquiring the pose change amount of the manipulator corresponding to that preset action and taking it as the input action amount.
In steps S111 and S112, while the operator operates the manipulator to move the tip of the target instrument, the pose of the manipulator is acquired in real time and it is identified in real time whether its pose change matches a preset action in the preset action set. If so, the pose change amount corresponding to the matched preset action is used as the input action amount for determining the performance of the robot or of the target instrument mounted on it. In this way the performance can be determined in real time while the operator controls the robot in a formal operation, for example while the surgeon controls the surgical robot during surgery, preventing a sudden performance degradation of the robot or the target instrument from affecting the operation result.
In some embodiments, S11 may be to acquire the pose change amount of the manipulator during the performance detection operation performed automatically by the robot system before the formal operation is performed. The formal operation may be, for example, a surgical operation.
For example, S11 may include the following steps S113, S114, and S115.
S113: acquiring a prestored control instruction sequence, the control instruction sequence being used to control the tip of the target instrument to produce a pose change.
The robot control system may store a control instruction sequence set in advance, where the set may include one or more control instruction sequences, each control instruction sequence includes control instructions to be executed in sequence, and the process of executing the control instructions in sequence is a process of controlling a target instrument mounted on a target robot arm to generate a pose change.
Since the method for determining a performance value of a robot provided in this specification requires the second action amount, i.e., the movement of the target instrument's tip in response to the input action amount as captured by the image acquisition assembly, the input action amount must be able to drive the tip of the target instrument through a pose change.
S114: sending the control instructions in the control instruction sequence to the target robot in order, while feeding them back to the manipulator of the robot system so that the poses of the manipulator and the target robot change synchronously.
Because the robot system usually has a built-in master-slave consistency control rule, when the controller drives the robot according to a pre-stored control instruction to change the pose of the target instrument's tip, the instruction is also fed back to the manipulator so that the poses of the manipulator and the target robot change synchronously: the pose changes of the master end (the main manipulator at the operator's side) and the slave end (the robot device) are kept consistent.
When step S114 is executed, the robot system may behave as follows: with no one at the operating console, the pose of the manipulator changes automatically, the pose of the target robot changes with it, and consequently the pose of the target instrument held on the robot arm of the target robot also changes. This series of changes is performed automatically by the robot system, of course only after a control command is received from the operator (for example, the operator presses an automatic performance detection button).
S115: take the pose variation of the manipulator as the input action amount.
The control instruction sequences in the control instruction sequence set may be executed in sequence, each control instruction sequence being executed according to steps S113, S114, S115. That is, the robot system automatically performs a series of actions, and performs performance detection of the target robot or the target instrument based on the automatically performed series of actions.
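The automatic playback loop of steps S113 to S115 can be sketched as follows. Everything here is an illustrative assumption rather than the patent's implementation: poses are simplified to flat lists of values, each control instruction is modeled as a pose delta, and the `run_sequence` helper is invented for the sketch.

```python
# Hypothetical sketch of S113-S115: play back a pre-stored control
# instruction sequence, mirror each instruction to the manipulator
# (master-slave consistency), and record the manipulator's accumulated
# pose change as the input action amount.

def run_sequence(sequence, robot_pose, manipulator_pose):
    """Apply each instruction (modeled as a pose delta) to both ends."""
    total_delta = [0.0] * len(robot_pose)
    for delta in sequence:  # S114: send instructions in order
        robot_pose = [p + d for p, d in zip(robot_pose, delta)]
        manipulator_pose = [p + d for p, d in zip(manipulator_pose, delta)]
        total_delta = [t + d for t, d in zip(total_delta, delta)]
    # S115: total_delta plays the role of the input action amount
    return robot_pose, manipulator_pose, total_delta

sequence = [[0.1, 0.0, 0.0], [0.0, 0.2, 0.0]]  # S113: pre-stored sequence
robot, manip, input_action = run_sequence(sequence, [0.0] * 3, [1.0] * 3)
```

Note that the manipulator and the robot receive identical deltas, mirroring the synchronous master-slave motion described above.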
In some embodiments, S12 may be to acquire the motion amount of the power box before, during, or after the execution of the formal operation.
For example, S12 may include steps S121 and S122 as follows.
S121: detect the action amount of the power box while the tip of the target instrument moves; the power box is disposed at the end of a target robot arm of the target robot and is detachably connected with the head end of the target instrument.
S122: the motion amount of the power box is used as an input motion amount.
Steps S121 and S122 further define, on the basis of step S12, that the power component is a power cartridge on the target robot.
Fig. 6 shows a schematic overall structure of the power box, where the power box includes a power assembly 61 and transmission assemblies 62 and 63 belonging to target instruments.
The power assembly 61 may include a first connector 611, a plurality of motors 612 and a plurality of first coupling structures 613, with reference to fig. 6 and 8, the plurality of motors 612 being disposed on the first connector 611, and the shaft portion of each motor 612 being fixedly connected to one of the first coupling structures 613 through the first connector 611. Each first coupling structure 613 is disposed on the same side of the first connection member 611. The shape of the first connection member 611 may be any shape, and is only illustrated in fig. 6.
The transmission assembly 62 includes a second connecting member 621, a plurality of wire pulleys 622, and a plurality of second coupling structures 623, and in conjunction with fig. 6 and 7, a shaft portion of each wire pulley 622 passes through the second connecting member 621 and is fixedly connected with one of the second coupling structures 623. Each second coupling structure 623 is disposed on the same side of the second connection 621.
The surface of the second coupling structure 623 on the side far from the second connecting piece 621 is matched with the surface of the first coupling structure 613 on the side far from the first connecting piece 611, and when the two surfaces are spliced together, the first coupling structure 613 and the second coupling structure 623 can rotate synchronously without rotation angle difference.
The power assembly 61 may be fixedly disposed at the end of the robotic arm of the target robot and the transmission assembly 62 may be disposed at the head end of the instrument. When the target instrument is mounted to the end of the robotic arm of the target robot, each first coupling structure 613 on the power assembly 61 mates with each second coupling structure 623 on the transmission assembly 62 such that the first coupling structure 613 and the second coupling structure 623 rotate synchronously without a rotational angle difference, thereby transferring the control force on the robotic arm to the target instrument.
Each wire wheel 622 has one end of a wire secured to it, and the other end of the wire is threaded through an elongated catheter of the target instrument (see fig. 9, where M is the target instrument, T is the image capture assembly, and X is the object being operated on) to connect with a driven wheel at the distal end of the target instrument (see fig. 10). When the robot arm of the target robot controls a motor 612 in the power assembly 61 to rotate via an electrical signal, the motor drives the corresponding wire wheel 622 in the transmission assembly 62 to rotate, which adjusts the length of wire wound on the wire wheel 622 and hence the tension of the wire segment not wound on it, thereby controlling the rotation of the driven wheel connected to that wire wheel 622. The target instrument thus moves in the degree of freedom controlled by that driven wheel, and a pose change is produced in that degree of freedom. For example, the target instrument shown in fig. 10 has the following four degrees of freedom: rotation (Roll), pitch (Pitch), yaw (Yaw), and open-close (Grip).
The "connection of the wire with the driven wheel" may mean that one end of the wire is connected with a first surface of a driven wheel, the first surface being a side surface perpendicular to the axis of the driven wheel. Alternatively, the wire may be wound in a groove on a second surface of the driven wheel, with the first end of the wire fixed on a first wire wheel and the second end fixed on a second wire wheel, so that rotation of the first and second wire wheels moves the wire; the wire carries a pre-tightening force so that its movement drives the driven wheel to rotate.
As can be seen from the above description of the power box, using the action amount of the power box as the input action amount of the target instrument (i.e., as the control amount input to the target instrument) excludes the control error of every part of the robot system other than the target instrument, so the control amount input to the target instrument can be measured more accurately. The performance value determined with the action amount of the power box as the input action amount is therefore more accurate.
On the other hand, in the surgical robot system, the power box is the position where the target instrument is connected with the mechanical arm, and is located outside the body, and the volume of the power box can be set to be slightly larger, so that a sensor can be conveniently arranged on the power box to acquire the action amount of the power box.
The operation amount of the power pack may be, for example, an output torque of each motor in the power pack, a rotation angle of each motor, a tension on each wire, a torque on each wire wheel, or the like.
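How a power-box action amount relates to motion at the instrument tip can be illustrated with a deliberately simplified transmission model. This is not the patent's kinematics: it assumes the wire travel equals the arc length paid out by the wire wheel and that the driven wheel rotates by travel divided by its radius, ignoring wire elasticity and pre-tension; all radii are invented.

```python
# Toy model: motor angle -> wire wheel arc length -> driven wheel rotation.
# A performance degradation (e.g., wire slack) would make the measured tip
# rotation fall short of this ideal prediction.

def driven_wheel_angle(motor_angle_rad, wire_wheel_radius, driven_wheel_radius):
    wire_travel = motor_angle_rad * wire_wheel_radius  # arc length paid out
    return wire_travel / driven_wheel_radius           # resulting rotation

angle = driven_wheel_angle(1.0, wire_wheel_radius=0.01, driven_wheel_radius=0.005)
```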
S20: predict a first action amount by which the input action amount would drive the tip of the target instrument to move if the performance value of the target instrument reached a predetermined threshold.
In the robot system, the input motion amount has a mapping relation with the motion of the tip of the target instrument. Step S20 therefore maps the input motion amount to the pose change amount that the tip of the target instrument would exhibit if its performance value reached the predetermined threshold, and takes this mapped pose change amount as the first motion amount.
In some embodiments, S20 may include the steps of:
S21: acquire a first pose of the tip of the target instrument in the motion coordinate system.
The motion coordinate system is a coordinate system used when the target robot performs motion control, and reference may be made to the description of step S41.
The pose of the tip of the target instrument in the motion coordinate system may be a pose in the direction of the respective degrees of freedom and a coordinate position. The attitude in each degree of freedom direction may be, for example, an angle in each degree of freedom such as rotation (Roll), pitch (Pitch), yaw (Yaw), and Grip (Grip) shown in fig. 10.
The first posture and the second posture in the present specification are only used for distinguishing different postures, and have no other limiting effect.
S22: predict a second pose, in the motion coordinate system, that the tip of the target instrument would reach after being driven by the input action amount, assuming the performance value of the target instrument reaches the predetermined threshold.
S23: determine the first action amount of the tip of the target instrument in the motion coordinate system from the first pose and the second pose.
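Steps S21 to S23 reduce to a per-degree-of-freedom difference between two poses. The sketch below assumes a pose is a mapping from degree-of-freedom names (the Roll/Pitch/Yaw/Grip of fig. 10) to values; the function name and example values are invented.

```python
# Hypothetical sketch of S21-S23: the first action amount is the
# component-wise difference between the predicted second pose and the
# acquired first pose, both in the motion coordinate system.

def action_amount(first_pose, second_pose):
    return {dof: second_pose[dof] - first_pose[dof] for dof in first_pose}

first = {"roll": 0.0, "pitch": 0.1, "yaw": 0.0, "grip": 0.2}   # S21
second = {"roll": 0.3, "pitch": 0.1, "yaw": -0.2, "grip": 0.2}  # S22 (predicted)
first_action = action_amount(first, second)                     # S23
```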
S30: from an image formed from the signals acquired by the image acquisition component, a second amount of motion by which the tip of the target instrument moves in response to the input amount of motion is determined.
During some operations, the real-time operation result is not visible to the naked eye. For example, the result may occur in a microscopic area that cannot be identified without a certain degree of magnification, or the user may not be located where the result can be seen, such as in a different room. For such operations, providing an image acquisition component at the end of the target robot's arm lets the operator follow the real-time operation result through images formed from the signals the component collects.
Some image acquisition components can capture images themselves, for example those with a built-in CMOS image sensor. Others cannot, and can only collect intermediate data from which an image is formed after processing by a processor; for example, an ultrasound probe collects only ultrasound echo signals, which must be processed to obtain an ultrasound image.
The image acquisition component may be, for example, the lens of an endoscope or the probe of an ultrasound device.
An endoscope is a common medical instrument composed of a cold light source, a light-guide beam structure, and a group of lenses. For example, the laparoscopes commonly used clinically transmit images to an eyepiece through a series of cylindrical optical lenses and form images via a separately connected camera. By imaging principle, endoscopes can be classified into optical (lens-type) endoscopes, fiber endoscopes, electronic endoscopes, and the like.
A change in the pose of the image acquisition assembly may cause a change in the field of view of the corresponding image. The image corresponding to the pose is an image formed by signals acquired by the image acquisition assembly when the image acquisition assembly is in the pose. The robot system is usually provided with an automatic pose adjusting method of the image acquisition assembly, so that the pose of the image acquisition assembly is automatically adjusted along with the relative position relation between the operated object and the target instrument, and the operated object and the target instrument are kept in the visual field of the image acquisition assembly.
With reference to fig. 1 and 9, in the surgical robot system, the image capturing component may be an endoscope, which extends into the cavity of the patient through a small hole on the body of the patient, and the pose of the endoscope may be adjusted by a mechanical arm of the surgical robot, or by a small motor built in the head of the endoscope. The image-side device 300 forms a high-magnification stereoscopic image according to the signal collected by the endoscope head, and feeds the stereoscopic image back to the doctor console.
In some embodiments, S30 may include:
s31: an image formed by the signals acquired by the image acquisition assembly is acquired in real time, and a first image before the target instrument end responds to the input action amount and a second image after the target instrument end responds to the input action amount are determined.
The "first image before the tip of the target instrument responds to the input action amount" in this step should be understood as the image formed from signals acquired by the image acquisition assembly at the most recent acquisition time before the tip responds to the input action amount; likewise, the "second image after the tip of the target instrument responds to the input action amount" should be understood as the image formed from signals acquired at the earliest acquisition time after the tip responds to the input action amount.
S32: and determining a third pose of the tail end of the target instrument in the image coordinate system according to the first image, and determining a fourth pose of the tail end of the target instrument in the image coordinate system according to the second image.
The image coordinate system refers to a coordinate system established according to the field of view of the image capturing component, and reference may be made to the related description of step S41.
S33: and determining a second action quantity of the tail end of the target instrument under the image coordinate system according to the third pose and the fourth pose.
The pose of the tip of the target instrument in the image coordinate system may be a pose in the direction of the respective degrees of freedom and a coordinate position. The attitude in each degree of freedom may be, for example, an angle in each degree of freedom such as a rotation (Roll), a Pitch (Pitch), a Yaw (Yaw), and a Grip (Grip) shown in fig. 10.
The third pose and the fourth pose in this specification are likewise only used to distinguish different poses and have no other limiting effect.
It follows that the second motion amount in the image coordinate system has the same dimensions as the first motion amount in the motion coordinate system, which facilitates comparing the two. Note, however, that the first and second motion amounts lie in different coordinate systems, so their difference cannot be computed directly; step S40 addresses this.
S40: the instrument performance is determined from the difference between the first and second motion amounts.
In some embodiments, the motion coordinate system and the image coordinate system are the same coordinate system, then S40 may directly calculate a difference between the first and second motion amounts and determine instrument performance based on the difference.
In some embodiments, the structure of the robot device is complex, the motion coordinate system and the image coordinate system are not usually one coordinate system, and it is possible to establish three, five or more coordinate systems to realize the control of the robot device. In this case, step S40 may include steps S41, S42, and S43 as follows.
S41: and mapping the first action amount to a preset coordinate system to obtain a third action amount, and mapping the second action amount to the preset coordinate system to obtain a fourth action amount.
Fig. 11 shows a schematic view of the degrees of freedom of movement of the robotic device of fig. 4. The robotic device may include a base 210 and a robotic arm mechanism 220. The robot arm mechanism 220 may include a telescopic arm sub-mechanism 221 and an operating arm sub-mechanism 222. A first end of the telescopic arm sub-mechanism 221 is connected to the base 210, and the telescopic arm sub-mechanism 221 can be extended or shortened in a radial direction of the base 210. A first end of the operating arm sub-mechanism 222 is connected to a second end of the telescopic arm sub-mechanism 221, and the operating arm sub-mechanism 222 can be bent to switch between an expanded state and a contracted state.
As shown in fig. 11, telescopic arm sub-mechanism 221 may include a first cantilever arm 2211 and a first torsion member 2212. First torsion member 2212 connects a first end of first boom 2211 with base 210. The first torsion member 2212 can drive the first arm 2211 to rotate on the horizontal plane around the first torsion member 2212, as shown by the double-arrow curve a in fig. 11. This arrangement enables the plurality of robot arm mechanisms 220 to be contracted together or expanded in the horizontal direction.
By "rotate in the horizontal plane" it is meant that the plane of the actual rotational movement is at a non-perpendicular angle to the horizontal plane, such that the actual rotational movement has a rotational component in the horizontal plane.
As shown in fig. 11, the operating arm sub-mechanism 222 may include a second torsion member 2221, a second cantilever 2222, a third torsion member 2223, a third cantilever 2224, a fourth torsion member 2225, a fourth cantilever 2226, a fifth torsion member 2227, and a fifth cantilever 2228. The fifth cantilever 2228 has the instrument M mounted on it.
The second suspension arm 2222 is located below the first suspension arm 2211, and the second torsion member 2221 connects the second end of the first suspension arm 2211 with the first end of the second suspension arm 2222. The second torsion member 2221 can drive the second end of the second cantilever 2222 to move towards or away from the base 210 on the vertical plane, as shown by the double-arrow curve B in fig. 11.
The third cantilever 2224 is located at a side of the second end of the second cantilever 2222 away from the base 210, and intersects the second cantilever 2222 at a fixed included angle (e.g., the fixed included angle is an acute angle in fig. 11). The third torsion member 2223 connects the first end of the third cantilever 2224 with the second end of the second cantilever 2222, and the third torsion member 2223 can drive the third cantilever 2224 to rotate around its own axis, as shown by the double-arrow curve C in fig. 11.
The fourth torsion member 2225 connects the second end of the third cantilever 2224 with the first end of the fourth cantilever 2226. The fourth torsion element 2225 can drive the fourth suspension arm 2226 to move so as to change the angle between the third suspension arm 2224 and the fourth suspension arm 2226, as shown by the double-arrow curve D in fig. 11.
The fifth torsion member 2227 connects the second end of the fourth cantilever 2226 with the first end of the fifth cantilever 2228, and the second end of the fifth torsion member 2227 is provided with mechanical claws for clamping a target instrument to perform a surgical operation such as clamping, cutting, shearing, etc. The fifth torsion element 2227 drives the fifth suspension arm 2228 to move, so as to change the angle between the fourth suspension arm 2226 and the fifth suspension arm 2228.
Torsion motors can be respectively arranged in the second torsion member 2221, the third torsion member 2223, the fourth torsion member 2225 and the fifth torsion member 2227, and the torsion motors are electrically connected with the controller, so that the controller can control the torsion members to drive the cantilever to move by controlling the torsion motors, and further drive the pose of the instrument M to change. The torsion members are joints of the mechanical arm.
Fig. 11 shows a schematic diagram of the structure of only one type of robot apparatus, and in some embodiments, the number of degrees of freedom and the number of robot arms of the robot apparatus may be more or less than those of the robot apparatus shown in fig. 11.
Fig. 12 shows a schematic diagram of the coordinate systems involved in motion control of the instrument-holding arm (i.e., the robot arm with the target instrument mounted at its end) of the robot of fig. 4 or 11. Taking this robot device as an example, the coordinate systems related to motion control of the robot arm are described below.
As shown in fig. 12, x0y0z0 is the first base coordinate system of the robot device as a whole, established with a point on the base as the coordinate origin; x1y1z1 is the second base coordinate system of the target robot arm holding the target instrument, established with a point on the telescopic arm sub-mechanism 221, for example a point on the side of the telescopic arm sub-mechanism 221 away from the base; xmymzm is the robot arm end coordinate system, established with the end of the robot arm (the end farthest from the base along the control-signal transmission path), for example a point on the end of the fifth cantilever in fig. 11; xqyqzq is the instrument tip coordinate system, established with a point on the tip of the instrument (the end away from the robot arm).
During motion control of the robot arm, a sensor in each joint of the arm collects position information of that joint, and the pose and coordinate position of the tip of the target instrument in the instrument tip coordinate system are determined from the collected results. Step S41 may then map this pose and coordinate position step by step through the coordinate systems in the order "instrument tip coordinate system -> robot arm end coordinate system -> second base coordinate system -> first base coordinate system".
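The step-by-step mapping just described can be sketched as a chain of homogeneous transforms. For brevity the sketch is 2-D, and every rotation angle and offset below is invented; a real robot would use 3-D transforms calibrated from the joint sensors.

```python
# Minimal 2-D sketch of the S41 mapping chain:
# instrument tip frame -> arm end frame -> second base frame -> first base frame.
import math

def transform(theta, tx, ty):
    """Homogeneous 2-D transform: rotate by theta, then translate by (tx, ty)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, tx], [s, c, ty], [0.0, 0.0, 1.0]]

def apply(T, p):
    x, y = p
    return (T[0][0] * x + T[0][1] * y + T[0][2],
            T[1][0] * x + T[1][1] * y + T[1][2])

chain = [transform(0.0, 0.1, 0.0),          # tip pose in the arm end frame
         transform(math.pi / 2, 0.0, 0.5),  # arm end pose in the second base frame
         transform(0.0, 1.0, 0.0)]          # second base pose in the first base frame

point = (0.0, 0.0)  # instrument tip in its own frame
for T in chain:
    point = apply(T, point)  # point expressed in the next frame up the chain
```

Mapping both the first and the second motion amounts through such chains into one predetermined frame is what makes their difference meaningful.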
Figure 13 shows a schematic diagram of the coordinate systems associated with the image acquisition assembly on the endoscope-holding arm (i.e., the robot arm with the image acquisition assembly mounted at its end) of the robot of fig. 4 or 11. Taking this robot device as an example, these coordinate systems are described below.
As shown in fig. 13, the coordinate systems x0y0z0, x1y1z1 and xmymzm are the same as the coordinate systems of the same names in fig. 12. xTyTzT is the image acquisition assembly coordinate system, established with a point on the field-of-view axis of the image acquisition assembly, on the side facing the robot arm, as the coordinate origin; xCyCzC is the image acquisition assembly front-end coordinate system, which may be established with the front end of the image acquisition assembly (e.g., the front end of the endoscope lens, or the ultrasound-signal emitting point of an ultrasound transceiver) as the coordinate origin; xayaza is the field-of-view projection plane coordinate system, which may be established on the projection plane, among the projection planes of the image acquisition assembly's field of view, in which the tip of the target instrument lies (a projection plane being a field-of-view section perpendicular to the field-of-view axis), with the coordinate origin at the point where the field-of-view axis meets that projection plane.
During motion control of the endoscope-holding arm, sensors in the joints of the arm collect joint position information, from which the coordinate position of the origin of the image acquisition assembly coordinate system is determined. Step S41 may then map the pose and coordinate position step by step in the order "image acquisition assembly coordinate system -> robot arm end coordinate system -> second base coordinate system -> first base coordinate system".
The image acquisition component may acquire real-time images of the target instrument processing the operated object through binocular vision or infrared vision techniques, and the images may contain depth information. From this rich image information, the pose and coordinate position of the tip of the target instrument in the field-of-view projection plane coordinate system can be determined. These are then mapped into the image acquisition assembly front-end coordinate system using the image depth information, then into the image acquisition assembly coordinate system using the lens structure parameters of the image acquisition assembly, and from there into the other coordinate systems in the manner described in the previous paragraph.
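The first image-side mapping step described above, from projection-plane (pixel) coordinates plus depth into the front-end coordinate system of the image acquisition assembly, can be sketched with a pinhole camera model. This model and the intrinsic parameters fx, fy, cx, cy are assumptions for illustration, not parameters from the patent.

```python
# Back-project a detected instrument tip at pixel (u, v) with depth d into
# the (assumed) image acquisition assembly front-end coordinate system,
# using invented pinhole intrinsics.

def backproject(u, v, depth, fx, fy, cx, cy):
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

tip_camera = backproject(u=420.0, v=240.0, depth=0.05,
                         fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```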
As can be seen from the above description, the coordinate system in S41 may be any one of the first base coordinate system, the second base coordinate system, the robot arm end coordinate system, and the instrument end coordinate system.
It should be noted that, the method shown in fig. 5 is to determine the first motion amount and the second motion amount, then map the first motion amount and the second motion amount to a predetermined coordinate system to obtain a third motion amount and a fourth motion amount, respectively, and then calculate the difference between the mapped third motion amount and the mapped fourth motion amount. In some embodiments, the pose information and the position coordinate used for determining the first action amount may be mapped to a predetermined coordinate system, and then the third action amount may be obtained according to the mapping result. This embodiment is an equivalent embodiment of the method shown in fig. 5 and should be determined to fall within the scope of the present application.
S42: a difference between the third and fourth motion amounts is calculated.
As understood from the method of acquiring the operation amount in the present specification, the difference calculated in S42 may be an angle difference or a coordinate difference.
S43: and determining the performance value of the target instrument according to the difference.
Specifically, S43 may set respective numerical ranges for each dimension in the motion amount, each numerical range corresponding to one performance value. Thus, for one dimension, the performance value of the target instrument may be determined based on the range of values in which the difference is located.
In some cases, the difference values of different dimensions may yield different performance values; the worst such value may then be taken as the performance value of the target instrument, or the per-dimension performance values may be combined by a weighted sum. Of course, other approaches may be adopted; this specification does not enumerate them one by one.
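The binning and aggregation of step S43 can be sketched as follows. The numeric ranges, performance values, and the "worst dimension wins" policy are all invented for illustration; a real system would calibrate them per instrument type.

```python
# Hypothetical sketch of S43: bin each dimension's |difference| into a
# performance value, then aggregate across dimensions.

BINS = [(0.01, 100), (0.05, 80), (0.10, 50)]  # (max |difference|, value), assumed

def dim_value(diff):
    for limit, value in BINS:
        if abs(diff) <= limit:
            return value
    return 0  # outside every range: failing

def instrument_value(diffs, policy="worst"):
    values = [dim_value(d) for d in diffs.values()]
    if policy == "worst":
        return min(values)           # worst dimension determines the result
    return sum(values) / len(values)  # equal-weight average as one alternative

perf = instrument_value({"roll": 0.005, "pitch": 0.07, "yaw": 0.02})
```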
The method for determining a performance value of a robot provided in this specification predicts, from the input action amount that drives the tip of the target instrument, a first action amount by which the tip would move if the performance of the target instrument reached a predetermined threshold; this amount is determined from the perspective of the motion coordinate system. It also determines, from images formed from the signals collected by the image acquisition component, a second action amount by which the tip actually moves in response to the input action amount; this amount is determined from the perspective of the image coordinate system. By determining the performance value of the target instrument from the difference between the first and second action amounts after mapping both into a predetermined coordinate system, the performance value can be determined objectively, without depending on operator experience; the method therefore places low demands on the operator and detects performance efficiently. In addition, the method can determine the performance value of the target instrument in real time before, during, and after an operation, so that an instrument whose performance falls below standard can be flagged promptly, preventing instrument performance from affecting the operation result and improving operation reliability.
In some embodiments, as shown in fig. 14, after S40, the following steps S50 to S70 may be further included.
S50: predict, from the performance value of the target instrument, a continuation value for the service life of the target instrument from the current time.
"continuation of service life value" means a life value that is increased on the basis of the current rated life.
The life of the target instrument may refer to the remaining number of uses or the remaining time of the current target instrument.
S60: acquire the latest service life of the target instrument from a memory device built into the target instrument.
The storage device built in the target device may be a physical device arranged in the target device, or a tag (e.g., two-dimensional code identifier) pointing to a network memory arranged on the surface of the target device.
Step S60 may acquire the latest life span from the memory device built in the target instrument by any one of existing short-range wireless communication technologies.
S70: adjust the service life of the target instrument according to the continuation value, and write the adjusted service life, as the latest service life, into the memory device built into the target instrument.
The target instrument and the robot device are usually provided separately. Target instruments come in many types, each capable of different operations, so a type can be selected according to the actual operation requirement and the selected target instrument mounted on the robot device. After a target instrument is produced, the manufacturer determines its rated service life from pre-delivery test results, and the rated service life may be stored in the memory device built into the target instrument.
Before the target instrument is used, its latest service life may be read from the built-in memory device, and it may be judged whether the current use would exceed that latest service life; the target instrument is used only if it would not. Alternatively, if the latest service life read from the built-in memory device would be exceeded, the latest service life may first be adjusted by the method of steps S50 to S70, after which it is judged again whether the current use exceeds the adjusted latest service life; the target instrument continues to be used only if it does not.
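The life-adjustment flow of steps S50 to S70 can be sketched as below. The storage is stubbed as a dictionary, the life unit (remaining uses) and the rule mapping a performance value to a continuation value are invented assumptions; a real device would read and write its built-in memory over short-range wireless communication.

```python
# Illustrative flow for S50-S70 (storage API and rules are invented).

def continuation_value(performance_value):
    # Assumed rule: good performance earns extra uses, poor performance none.
    return 5 if performance_value >= 80 else 0

def adjust_lifetime(storage, performance_value):
    latest = storage["remaining_uses"]                          # S60: read
    adjusted = latest + continuation_value(performance_value)   # S50/S70
    storage["remaining_uses"] = adjusted                        # S70: write back
    return adjusted

# An instrument at the end of its rated life but performing well gains uses.
storage = {"remaining_uses": 0}
allowed = adjust_lifetime(storage, performance_value=90) > 0
```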
By adjusting the service life of the target instrument in this way, an instrument whose performance is still good when it reaches its rated service life can be fully utilized, while an instrument that has not yet reached its rated service life but performs poorly can be flagged, preventing a poorly performing instrument from affecting the operation result.
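As a non-authoritative illustration, the check-then-adjust flow of steps S50 to S70 described above might be sketched as follows. The function and field names, the proportional continuation rule, and the use of a simple dictionary in place of the instrument's built-in storage are all assumptions made for the sketch; in this document a larger performance value means a larger deviation, i.e. worse performance.

```python
# Hypothetical sketch of the service-life check and adjustment flow
# (steps S50-S70). Names and the continuation rule are illustrative only.

def check_and_adjust_life(storage, performance_value, threshold=0.5):
    """Return True if the instrument may be used for the current procedure."""
    latest_life = storage["latest_service_life"]   # read in S60
    uses_so_far = storage["uses_so_far"]

    if uses_so_far < latest_life:                  # still within latest life
        return True

    # S50: predict a continuation value from the current performance value;
    # a simple proportional rule is assumed here for illustration.
    continuation = 0
    if performance_value < threshold:              # performance still good
        continuation = max(0, int((threshold - performance_value) * 10))

    # S70: adjust the life and write it back as the new latest service life.
    storage["latest_service_life"] = latest_life + continuation
    return uses_so_far < storage["latest_service_life"]
```

With this rule, an instrument at its rated life but with a small deviation (good performance) is granted additional uses, while one with a large deviation is not.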
In some embodiments, as shown in fig. 15, the following steps S100 to S130 may further be performed before S10.
S100: acquire the first type to which the target instrument belongs from the storage device built into the target instrument.
As described above, the storage device built into the target instrument may be a physical memory inside the instrument or a surface tag (e.g., a two-dimensional code identifier) pointing to a network storage. That is, the first type to which the target instrument belongs and the latest service life of the target instrument may be stored in the same storage device; other information about the target instrument may of course also be stored there.
In step S100, the first type to which the target instrument belongs may be acquired from the storage device built into the target instrument by using any existing near-field communication technology.
S110: from the image formed from the signals acquired by the image acquisition assembly, a second category to which the target instrument belongs is determined.
S120: it is determined whether the first category and the second category match.
S130: in the case of matching, acquiring an input action amount for driving the distal end of the target instrument is performed.
By determining whether the type stored in the instrument's built-in storage matches the type identified from the image, and executing the performance detection method shown in fig. 5 only when they match, situations in which the information of the target instrument is mistaken can be avoided.
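The type-matching gate of steps S100 to S130 can be sketched as below; `read_type_from_storage` and `classify_type_from_image` are hypothetical stand-ins for the short-range-communication read and the image classifier, which the specification does not define in code.

```python
# Hypothetical sketch of the type-matching gate (steps S100-S130).
# The two callables stand in for the storage read and the image classifier.

def should_run_detection(read_type_from_storage, classify_type_from_image):
    first_type = read_type_from_storage()      # S100: from built-in storage
    second_type = classify_type_from_image()   # S110: from the acquired image
    return first_type == second_type           # S120/S130: proceed only on match
```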
After the performance value of the target instrument is determined in step S40, the performance value may be presented to the operator by way of visual feedback. For example, instrument performance information, as shown in fig. 16, may be displayed on the display of the console; this information may include the identity of the instrument (e.g., instrument 1, instrument 2), performance parameters, fault type, and so on. Instrument performance information may also be displayed on the display of the image-side device 300. In some embodiments, a sound prompt device may further be disposed on the console and/or the image-side device 300 to present fault information audibly, or a light prompt device may be disposed to present fault information by light.
Fault types can be divided into performance faults and structural faults, where a performance fault may refer to the performance value reaching a predetermined performance value threshold. A structural fault may be determined as follows: acquire an image formed from the signals acquired by the image acquisition assembly and determine the type of the target instrument in the image; identify features of the target instrument tip in the image according to that type, where the features may include color, contour, surface texture, and the like; match the features of the target instrument tip against the fault feature templates for that type of instrument; and, when a fault feature template is matched, determine the fault type of the target instrument to be a structural fault. The fault type corresponding to the matched fault feature template may also be taken as the fault type of the target instrument.
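A minimal sketch of this template-matching step is given below. The feature representation (a flat dictionary of named attributes), the per-type template table, and the exact-match rule are assumptions for illustration; the specification does not prescribe how features or templates are encoded.

```python
# Hypothetical sketch of structural-fault detection by matching distal-end
# features (color, contour, surface texture) against per-type fault templates.

FAULT_TEMPLATES = {
    "scissors": [
        {"fault": "insulation sleeve peeling", "texture": "peeled"},
        {"fault": "material cracking", "contour": "cracked"},
    ],
}

def structural_fault(instrument_type, tip_features):
    """Return the matched fault name, or None if no template matches."""
    for template in FAULT_TEMPLATES.get(instrument_type, []):
        keys = (k for k in template if k != "fault")
        if all(tip_features.get(k) == template[k] for k in keys):
            return template["fault"]   # fault type of the matched template
    return None
```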
The fault feature template of an instrument may target one or more of the following faults: stripping of the drive wire, fraying of the wire end, cracking of the instrument material, scorching of the electrotome instrument material, and peeling of the instrument insulation sleeve.
The present specification provides a performance value determining apparatus for a robot, which can be used to implement the method for determining a performance value of a robot shown in fig. 5. As shown in fig. 17, the apparatus includes a first acquiring unit 10, a first predicting unit 20, a first determining unit 30, and a second determining unit 40.
The first acquiring unit 10 is used for acquiring an input action amount for driving the distal end of the target instrument to move; the target instrument is disposed on a mechanical arm of the target robot. The first predicting unit 20 is configured to predict a first action amount by which the input action amount drives the distal end of the target instrument to move in the case where the performance value of the target instrument reaches a predetermined threshold. The first determining unit 30 is configured to determine, from an image formed by the signals acquired by the image acquisition assembly, a second action amount by which the distal end of the target instrument moves in response to the input action amount. The second determining unit 40 is configured to determine the instrument performance based on the difference between the first action amount and the second action amount.
In some embodiments, the second determining unit 40 includes: the mapping subunit is used for mapping the first action amount to a preset coordinate system to obtain a third action amount, and mapping the second action amount to the preset coordinate system to obtain a fourth action amount; a calculation subunit configured to calculate a difference between the third motion amount and the fourth motion amount; the first determining subunit is used for determining the performance value of the target instrument according to the difference value.
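The mapping-and-difference computation performed by these subunits might be sketched as follows. Treating the two action amounts as vectors, the transforms `T_motion` and `T_image` into the preset coordinate system, and the norm-based scalar performance value are illustrative assumptions rather than the specification's definition.

```python
# Hypothetical sketch of the second determining unit: map both action
# amounts into a common (preset) coordinate system, then compare them.
import numpy as np

def performance_value(first_amount, second_amount, T_motion, T_image):
    """Map both action amounts into the preset frame and take the difference."""
    third = T_motion @ first_amount    # first action amount -> preset frame
    fourth = T_image @ second_amount   # second action amount -> preset frame
    diff = np.linalg.norm(third - fourth)
    return diff                        # larger difference -> worse performance
```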
In some embodiments, the first prediction unit comprises: the first acquisition subunit is used for acquiring a first pose of the tail end of the target instrument under the motion coordinate system; the motion coordinate system is a coordinate system adopted when the target robot carries out motion control; the predicting subunit is used for predicting a second pose of the tail end of the target instrument under a motion coordinate system after the tail end of the target instrument is driven to move by the input action amount under the condition that the performance value of the target instrument reaches a preset threshold value; and the second determining subunit is used for determining the first action quantity of the tail end of the target instrument under the motion coordinate system according to the first pose and the second pose.
In some embodiments, the first determination unit comprises: the second acquisition subunit is used for acquiring an image formed by the signal acquired by the image acquisition component in real time and determining a first image before the tail end of the target instrument responds to the input action amount and a second image after the tail end of the target instrument responds to the input action amount; the third determining subunit is used for determining a third pose of the tail end of the target instrument in the image coordinate system according to the first image and determining a fourth pose of the tail end of the target instrument in the image coordinate system according to the second image; and the fourth determining subunit is used for determining a second action quantity of the tail end of the target instrument in the image coordinate system according to the third pose and the fourth pose.
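The fourth determining subunit's step reduces to a pose difference in the image coordinate system, which can be sketched as below; reducing a pose to a position vector is an assumption made purely for illustration (a full pose would also carry orientation).

```python
# Hypothetical sketch: the second action amount is the displacement of the
# instrument tip between the first image (before the input action) and the
# second image (after it), expressed in the image coordinate system.
import numpy as np

def second_action_amount(third_pose, fourth_pose):
    """Tip displacement between the two images, in image coordinates."""
    return np.asarray(fourth_pose) - np.asarray(third_pose)
```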
In some embodiments, the first obtaining unit includes: a third acquiring subunit, configured to acquire a pose variation amount of a manipulator in the robot system, and use the pose variation amount of the manipulator as an input action amount; the manipulator is used for manipulating the end movement of the target instrument by an operator; or, a fourth acquiring subunit, configured to acquire a pose change amount of a power transmission assembly that drives a distal end of the target instrument to move in the robot system, and use the pose change amount of the power transmission assembly as an input action amount; the power transmission assembly is arranged on a target mechanical arm of the target robot, is mechanically connected with the target instrument and is used for driving the tail end of the target instrument to move.
In some embodiments, the third acquisition subunit comprises: the recognition subunit is used for recognizing whether the pose change of the manipulator is matched with a preset action in a preset action set when the manipulator is operated by an operator to enable the tail end of the target instrument to move; and the fifth acquisition subunit is used for acquiring the pose variation of the manipulator corresponding to a preset action under the condition of matching with the preset action, and taking the pose variation as an input action amount.
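The recognition and fifth acquisition subunits might work as in the sketch below: the manipulator's pose change is taken as the input action amount only when it matches a preset action within a tolerance. The action names, the vector encoding of a pose change, and the distance-based matching rule are all assumptions.

```python
# Hypothetical sketch of preset-action matching for the input action amount.
import numpy as np

PRESET_ACTIONS = {
    "translate_x": np.array([10.0, 0.0, 0.0]),
    "translate_y": np.array([0.0, 10.0, 0.0]),
}

def input_action_amount(pose_change, tol=1.0):
    """Return (matched action name, pose change) or (None, None)."""
    for name, preset in PRESET_ACTIONS.items():
        if np.linalg.norm(np.asarray(pose_change) - preset) <= tol:
            return name, np.asarray(pose_change)  # matched: use as input amount
    return None, None
```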
In some embodiments, the first obtaining unit includes: the sixth acquisition subunit is used for acquiring a prestored control instruction sequence, and the control instruction sequence is used for controlling the tail end of the target instrument to generate pose change; the sending subunit is used for sending the control instructions in the control instruction sequence to the target robot in sequence and feeding back the control instructions to a manipulator of the robot system so as to enable the pose of the manipulator and the target robot to act synchronously; and a fifth determining subunit configured to take the pose change amount of the manipulator as the input action amount.
In some embodiments, the first obtaining unit includes: the detection subunit is used for detecting the action amount of the power box in the process of moving the tail end of the target instrument; the power box is arranged at the tail end of a target mechanical arm of the target robot and is detachably connected with the head end of the target instrument; and a sixth determining subunit, configured to take the motion amount of the power cartridge as the input motion amount.
In some embodiments, the apparatus further comprises: the second acquisition unit is used for acquiring the first type of the target instrument from a storage device arranged in the target instrument; a third determination unit configured to determine a second type to which the target instrument belongs from an image formed by the signals acquired by the image acquisition unit; a fourth determination unit for determining whether the first category and the second category match; in the case of matching, the first acquisition unit performs the acquisition of the input action amount for driving the distal end movement of the target instrument.
In some embodiments, the apparatus further comprises: the second prediction unit is used for predicting the continuation value of the service life of the target appliance from the current moment according to the performance value of the target appliance; the third acquisition unit is used for acquiring the latest service life of the target instrument from a storage device built in the target instrument; and the adjusting unit is used for adjusting the service life of the target instrument according to the continuation value of the service life and writing the adjusted service life into a storage device built in the target instrument as the latest service life.
In some embodiments, the apparatus further comprises: the fourth acquisition unit is used for acquiring an image formed according to the signal acquired by the image acquisition assembly; a fifth determining unit, configured to determine a type of the target instrument in the image; identifying the characteristics of the tail end of the target instrument in the image according to the type of the target instrument; the matching unit is used for matching the characteristics of the tail end of the target instrument with the fault characteristic template of the instrument of the type; and a sixth determining unit, configured to determine the fault type of the target instrument as a structural fault when the fault feature template is matched.
In some embodiments, the apparatus further comprises: a judging unit, used for judging whether the performance value of the target instrument reaches a preset performance value threshold; a seventh determining unit, configured to determine that the target instrument has a performance fault if the threshold is reached; and a presenting unit, for presenting to an operator the fault type of the target instrument, the fault type comprising a performance fault and/or a structural fault.
The details of the performance value determining apparatus for a robot described above can be understood with reference to the description and effects in the corresponding embodiment of fig. 5, and are not described herein again.
The present specification provides a robot system including a target robot and a controller. The controller may be provided in the target robot, in the console, or independently of the target robot and the console.
The target robot comprises a base, a first mechanical arm and a second mechanical arm; the tail end of the first mechanical arm is used for mounting the target instrument, and the tail end of the second mechanical arm is used for mounting the image acquisition assembly. The controller is used for controlling the first mechanical arm and the second mechanical arm of the target robot to act so that the target instrument performs an operation on the operated object while a real-time image of the operation process is acquired through the image acquisition assembly. The controller is also used for executing the above method for determining a performance value of a robot.
The target robot may be the robotic device 200 of fig. 1. For the description of the robot apparatus 200, reference may be made to the description of other parts of the present specification, and the description thereof is omitted.
An embodiment of the present invention further provides a controller, as shown in fig. 18, the controller may include a processor 1801 and a memory 1802, where the processor 1801 and the memory 1802 may be connected by a bus or in another manner, and fig. 18 illustrates an example of connection by a bus.
The processor 1801 may be a central processing unit (CPU). The processor 1801 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory 1802, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the method for determining a performance value of a robot in the embodiment of the present invention (for example, the first acquiring unit 10, the first predicting unit 20, the first determining unit 30, and the second determining unit 40 in fig. 17). The processor 1801 executes various functional applications and data processing by running the non-transitory software programs, instructions and modules stored in the memory 1802, that is, implements the method for determining a performance value of a robot in the above method embodiments.
The memory 1802 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application program required for at least one function, and the data storage area may store data created by the processor 1801, and the like. Further, the memory 1802 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 1802 may optionally include memory located remotely from the processor 1801; such remote memory may be connected to the processor 1801 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more modules are stored in the memory 1802 and, when executed by the processor 1801, perform a method of determining a performance value of a robot as in the embodiment shown in fig. 5.
The details of the controller can be understood with reference to the description and effects of the embodiment shown in fig. 5, and are not described herein again.
The present specification also provides a computer storage medium having computer program instructions stored thereon that, when executed, implement the steps of the corresponding embodiment of fig. 5.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of memories of the above kinds.
All the embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from other embodiments. In particular, for the hardware + program class embodiment, since it is substantially similar to the method embodiment, the description is simple, and reference may be made to part of the description of the method embodiment for relevant points.
The foregoing description of specific embodiments has been presented for purposes of illustration and description. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer readable program code, the same functionality can be implemented by logically programming method steps such that the controller is in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may therefore be considered as a hardware component, and the means included therein for performing the various functions may also be considered as a structure within the hardware component. Or even means for performing the functions may be regarded as being both a software module for performing the method and a structure within a hardware component.
The above description is only an example of the embodiments of the present disclosure, and is not intended to limit the embodiments of the present disclosure. Various modifications and variations to the embodiments described herein will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement or the like made within the spirit and principle of the embodiments of the present invention should be included in the scope of the claims of the embodiments of the present invention.

Claims (18)

1. A method for determining a performance value of a robot, comprising:
acquiring an input action amount for driving the tail end of the target instrument to move; the target instrument is arranged on a mechanical arm of the target robot;
predicting a first action amount of the input action amount driving the end activity of the target instrument under the condition that the performance value of the target instrument reaches a preset threshold value;
determining a second amount of motion that the tip of the target instrument moves in response to the input amount of motion from an image formed of the signals acquired by the image acquisition component;
and determining the instrument performance according to the difference between the first action amount and the second action amount.
2. The method of claim 1, wherein determining the instrument performance from the difference between the first and second motion quantities comprises:
mapping the first action amount to a preset coordinate system to obtain a third action amount, and mapping the second action amount to the preset coordinate system to obtain a fourth action amount;
calculating a difference between the third motion amount and the fourth motion amount;
and determining the performance value of the target instrument according to the difference value.
3. The method of claim 1, wherein predicting a first action amount by which the input action amount drives an end activity of the target instrument in a case where the performance value of the target instrument reaches a predetermined threshold value comprises:
acquiring a first pose of the tail end of the target instrument under a motion coordinate system; the motion coordinate system is a coordinate system adopted when the target robot carries out motion control;
predicting a second pose of the tail end of the target instrument under a motion coordinate system after the input action amount drives the tail end of the target instrument to move under the condition that the performance value of the target instrument reaches a preset threshold value;
and determining a first action amount of the tail end of the target instrument under the motion coordinate system according to the first pose and the second pose.
4. The method of claim 1, wherein determining, from the image formed of the signals acquired by the image acquisition assembly, a second amount of motion that the tip of the target instrument moves in response to the input amount of motion comprises:
acquiring an image formed by the signal acquired by the image acquisition component in real time, and determining a first image before the tail end of the target instrument responds to the input action amount and a second image after the tail end of the target instrument responds to the input action amount;
determining a third pose of the tail end of the target instrument in an image coordinate system according to the first image, and determining a fourth pose of the tail end of the target instrument in the image coordinate system according to the second image;
and determining a second action quantity of the tail end of the target instrument under the image coordinate system according to the third pose and the fourth pose.
5. The method of claim 1, wherein obtaining the input action amount for driving the end movement of the target instrument comprises:
acquiring a pose variation of a manipulator in a robot system, and taking the pose variation of the manipulator as the input action amount, wherein the manipulator is used for an operator to manipulate the end movement of the target instrument; or,
acquiring the pose variation of a power transmission assembly which drives the tail end of a target instrument to move in the robot system, and taking the pose variation of the power transmission assembly as an input action quantity; the power transmission assembly is arranged on a target mechanical arm of the target robot, is mechanically connected with the target instrument and is used for driving the tail end of the target instrument to move.
6. The method according to claim 5, wherein acquiring a pose change amount of a manipulator in the robot system and taking the pose change amount of the manipulator as an input action amount comprises:
identifying whether a pose change of the manipulator matches a preset action in a preset action set when an operator operates the manipulator to move the tip of the target instrument;
and under the condition of matching with a preset action, acquiring the pose variation of the manipulator corresponding to the preset action, and taking the pose variation as the input action amount.
7. The method according to claim 5, wherein acquiring a pose change amount of a manipulator in the robot system and taking the pose change amount of the manipulator as an input action amount comprises:
acquiring a prestored control instruction sequence, wherein the control instruction sequence is used for controlling the tail end of a target instrument to generate pose change;
sequentially sending the control instructions in the control instruction sequence to a target robot, and simultaneously feeding back the control instructions to a manipulator of the robot system so as to enable the pose of the manipulator and the target robot to act synchronously;
and taking the pose variation of the manipulator as an input action amount.
8. The method of claim 5, wherein obtaining a pose change amount of a powered assembly in the robotic system that drives movement of the tip of the target instrument and using the pose change amount of the powered assembly as the input action amount comprises:
detecting the action amount of the power box in the process of moving the tail end of the target instrument; the power box is arranged at the tail end of a target mechanical arm of the target robot and is detachably connected with the head end of the target instrument;
the motion amount of the power box is used as an input motion amount.
9. The method of claim 1, prior to obtaining the input action amount for driving the end movement of the target instrument, comprising:
acquiring a first class to which a target instrument belongs from a storage device built in the target instrument;
determining a second type to which the target instrument belongs from an image formed from the signals acquired by the image acquisition assembly;
determining whether the first category and the second category match;
in case of a match, the obtaining of the input action amount for driving the distal end movement of the target instrument is performed.
10. The method of claim 2, further comprising, after determining the performance value of the target instrument based on the difference:
predicting a service life continuation value of the target appliance from the current moment according to the performance value of the target appliance;
acquiring the latest service life of the target instrument from a storage device built in the target instrument;
and adjusting the service life of the target instrument according to the continuation value of the service life, and writing the adjusted service life as the latest service life into a memory device built in the target instrument.
11. The method of claim 1, further comprising:
acquiring an image formed according to a signal acquired by the image acquisition assembly;
determining the type of a target instrument in the image; identifying the characteristics of the tail end of the target instrument in the image according to the type of the target instrument;
matching the features of the target instrument tip with the fault feature templates of the class of instruments;
when the fault feature template is matched, determining the fault type of the target instrument as a structural fault.
12. The method of claim 11, further comprising:
judging whether the performance value of the target instrument reaches a preset performance value threshold;
in the case where the threshold is reached, determining that the target instrument has a performance fault;
presenting the operator with the fault type of the target instrument, the fault type including a performance fault and/or a structural fault.
13. A performance value determination apparatus for a robot, comprising:
a first acquisition unit for acquiring an input action amount for driving a distal end of a target instrument to move; the target instrument is arranged on a mechanical arm of the target robot;
a first prediction unit configured to predict a first action amount by which the input action amount drives an end activity of the target instrument in a case where a performance value of the target instrument reaches a predetermined threshold;
a first determination unit configured to determine a second motion amount by which the tip of the target instrument moves in response to the input motion amount from an image formed by the signal acquired by the image acquisition assembly;
a second determination unit for determining the instrument performance based on a difference between the first and second motion amounts.
14. A robotic system, comprising:
the target robot comprises a base, a first mechanical arm and a second mechanical arm, wherein the tail end of the first mechanical arm is used for mounting a target instrument, and the tail end of the second mechanical arm is used for mounting an image acquisition assembly;
the controller is used for controlling the first mechanical arm and the second mechanical arm of the target robot to act so as to enable the target instrument to perform operation on the operated object, and meanwhile, the image acquisition assembly is used for acquiring a real-time image of the operation process;
the controller is further configured to perform the method for determining a performance value of a robot according to any one of claims 1 to 12.
15. The system of claim 14, further comprising:
a manipulator, used for controlling the first mechanical arm and the second mechanical arm of the target robot to move so as to control the target instrument to perform an operation on the operated object while a real-time image of the operation process is acquired through the image acquisition assembly.
16. The system of claim 14, further comprising a power cartridge, the power cartridge comprising:
the power assembly is fixedly arranged at the tail end of the first mechanical arm; the power assembly comprises a first connecting piece, a plurality of motors and a plurality of first connecting structures, the motors are arranged on the first connecting piece, and the shaft part of each motor penetrates through the first connecting piece to be fixedly connected with one first connecting structure; each first connecting structure is arranged on the same side of the first connecting piece;
the transmission assembly is fixedly arranged at the head section of the target instrument; the transmission assembly comprises a second connecting piece, a plurality of wire wheels and a plurality of second connecting structures, and the shaft part of each wire wheel penetrates through the second connecting piece to be fixedly connected with one second connecting structure; each second connecting structure is arranged on the same side of the second connecting piece;
when the target instrument is installed at the tail end of the first mechanical arm, the side surface of each first connecting structure on the power assembly facing away from the first connecting piece is engaged with the side surface of a second connecting structure on the transmission assembly facing away from the second connecting piece, so that the first connecting structure and the second connecting structure rotate synchronously, thereby driving the wire wheel and the motor to rotate synchronously and transmitting the control force on the first mechanical arm to the target instrument;
one end of a wire is fixed on each wire wheel, and the other end of the wire passes through the pipe body of the target instrument to be connected with a driven wheel at the tail end of the target instrument; when a motor in the power assembly is controlled to rotate, it drives a wire wheel in the transmission assembly to rotate so as to adjust the tension on the wire and/or control the wire to move, and the wire in turn drives the driven wheel to rotate, so that the target instrument moves in the degree of freedom controlled by the driven wheel.
17. A controller, comprising:
a memory and a processor, the processor and the memory being communicatively connected to each other, the memory having stored therein computer instructions, the processor implementing the steps of the method of any one of claims 1 to 12 by executing the computer instructions.
18. A computer storage medium, characterized in that it stores computer program instructions which, when executed, implement the steps of the method of any one of claims 1 to 12.
CN202210878229.XA 2022-07-25 2022-07-25 Method and device for determining performance value of robot and controller Pending CN115256384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210878229.XA CN115256384A (en) 2022-07-25 2022-07-25 Method and device for determining performance value of robot and controller


Publications (1)

Publication Number Publication Date
CN115256384A true CN115256384A (en) 2022-11-01

Family

ID=83768192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210878229.XA Pending CN115256384A (en) 2022-07-25 2022-07-25 Method and device for determining performance value of robot and controller

Country Status (1)

Country Link
CN (1) CN115256384A (en)

Similar Documents

Publication Publication Date Title
JP6891244B2 (en) Medical devices, systems, and methods that use eye tracking
KR102105142B1 (en) Switching control of an instrument to an input device upon the instrument entering a display area viewable by an operator of the input device
JP6180692B1 (en) Medical manipulator system
JP6091410B2 (en) Endoscope apparatus operating method and endoscope system
KR102171873B1 (en) Haptic glove and Surgical robot system
US11801103B2 (en) Surgical system and method of controlling surgical system
WO2015012143A1 (en) Medical system and medical treatment tool control method
EP2612616A1 (en) Surgical robot and method for controlling the same
JP6091370B2 (en) Medical system and medical instrument control method
JP7127128B2 (en) Surgical robot system and its surgical instrument
JP7334499B2 (en) Surgery support system, control device and control method
CN114760903A (en) Method, apparatus, and system for controlling an image capture device during a surgical procedure
CN110913792A (en) System and method for state-based speech recognition in a remote operating system
CN114311031A (en) Master-slave end delay testing method, system, storage medium and equipment for surgical robot
JP2020031770A (en) Driving part interface
US20220400938A1 (en) Medical observation system, control device, and control method
CN115256384A (en) Method and device for determining performance value of robot and controller
EP3940688B1 (en) Force sense display device and force sense display method of medical robot system
US20220401166A1 (en) Surgical system and controlling method
JP2023507063A (en) Methods, devices, and systems for controlling image capture devices during surgery
CN114845618A (en) Computer-assisted surgery system, surgery control apparatus, and surgery control method
JP2020032161A (en) Driving part interface, adapter, mounting detection method for surgical instrument to driving part interface
WO2022201933A1 (en) Intravital observation system, observation system, intravital observation method, and intravital observation device
CN117372667A (en) Pose adjusting method and device of image acquisition assembly and controller
US20200315740A1 (en) Identification and assignment of instruments in a surgical system using camera recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination