CN115946105A - Control method of operation arm and surgical robot system

Publication number: CN115946105A (application CN202111176599.0A)
Authority: CN (China)
Prior art keywords: pose, determining, identification, identifications, control method
Legal status: Pending
Application number: CN202111176599.0A
Original language: Chinese (zh)
Inventors: 徐凯 (Xu Kai), 吴百波 (Wu Baibo)
Assignee (original and current): Shurui (Shanghai) Technology Co., Ltd.
Application CN202111176599.0A filed by Shurui (Shanghai) Technology Co., Ltd.; publication of CN115946105A is pending.

Classification: Manipulator (AREA)
Abstract

The disclosure relates to the technical field of control, and discloses a control method of an operating arm, a computer device, a computer-readable storage medium, and a surgical robot system. The control method of an operating arm comprises: acquiring a positioning image; identifying, in the positioning image, a plurality of pose identifications located on the operating arm, wherein the plurality of pose identifications comprise different pose identification patterns; determining a current relative pose of the operating arm relative to a reference coordinate system based on the plurality of pose identifications; and determining a drive signal of the operating arm based on the current relative pose and a target pose of the operating arm.

Description

Control method of operation arm and surgical robot system
Technical Field
The disclosure belongs to the technical field of control, and particularly relates to a control method of an operating arm and a surgical robot system.
Background
As technology develops, it has become increasingly common for machine equipment, whether human-controlled or computer-controlled, to perform desired actions that assist or replace human operators. For example, logistics robots are used to sort parcels, and surgical robots are used to assist doctors in performing surgery.
In such applications, control of the machine equipment is realized through control of its operating arm.
Disclosure of Invention
In some embodiments, the present disclosure provides a control method of an operation arm, including: acquiring a positioning image; identifying a plurality of pose identifications positioned on the operating arm in the positioning image, wherein the plurality of pose identifications comprise different pose identification patterns; determining a current relative pose of the operating arm with respect to a reference coordinate system based on the plurality of pose identifications; and determining a driving signal of the operation arm based on the current relative pose and the target pose of the operation arm.
In some embodiments, the present disclosure provides a computer device comprising: a memory for storing at least one instruction; and a processor, coupled to the memory, for executing at least one instruction to perform the control method of any of some embodiments of the present disclosure.
In some embodiments, the present disclosure provides a computer-readable storage medium having at least one instruction stored therein, the at least one instruction being executable by a processor to cause a computer to perform a control method of any one of some embodiments of the present disclosure.
In some embodiments, the present disclosure provides a surgical robotic system comprising: a surgical tool comprising an operating arm, an effector disposed at the distal end of the operating arm, and a plurality of pose identifications disposed at the tip of the operating arm, wherein the pose identifications comprise different pose identification patterns; an image acquisition device for acquiring a positioning image of the operating arm; and a processor connected with the image acquisition device and configured to execute the control method of any one of the embodiments of the present disclosure to determine the drive signal of the operating arm.
Drawings
Fig. 1 illustrates a schematic diagram of an operating arm control system according to some embodiments of the present disclosure;
FIG. 2 illustrates a schematic view of a segment of an operating arm according to some embodiments of the present disclosure;
FIG. 3 illustrates a schematic structural view of an operating arm according to some embodiments of the present disclosure;
FIG. 4 illustrates a schematic diagram of a tag including multiple pose identifications, according to some embodiments of the present disclosure;
FIG. 5 shows a schematic view of a label disposed around the distal end of an arm and formed into a cylindrical shape, according to some embodiments of the present disclosure;
fig. 6 illustrates a flow chart of a control method of an operating arm control system according to some embodiments of the present disclosure;
FIG. 7 illustrates a flow diagram of a method for determining a drive signal according to some embodiments of the present disclosure;
FIG. 8 illustrates a flow diagram of a method of determining three-dimensional coordinates of a plurality of pose identifications relative to a manipulator arm coordinate system, according to some embodiments of the present disclosure;
FIG. 9 illustrates a flow diagram of a method of determining three-dimensional coordinates of a plurality of pose identifications relative to a manipulator arm coordinate system, according to further embodiments of the present disclosure;
FIG. 10 illustrates a flow diagram of a method of identifying pose identifications, according to some embodiments of the present disclosure;
fig. 11 illustrates a schematic diagram of a pose identification pattern, according to some embodiments of the present disclosure;
FIG. 12 illustrates a flow diagram of a method for searching for pose identification, according to some embodiments of the present disclosure;
FIG. 13 shows a schematic diagram of searching for a pose identification, according to some embodiments of the present disclosure;
FIG. 14 illustrates a flow diagram of a method for searching for a second pose identification, in accordance with some embodiments of the present disclosure;
FIG. 15 illustrates a flow diagram of a method for searching for pose identification, according to some embodiments of the present disclosure;
FIG. 16 shows a schematic block diagram of a computer device, in accordance with some embodiments of the present disclosure;
fig. 17 shows a schematic view of a surgical robotic system, according to some embodiments of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the drawings, and those skilled in the art will understand that the scope of the present disclosure is not limited to these embodiments. The present disclosure may be modified and varied in many ways based on the embodiments described below. Such modifications and variations are intended to be included within the scope of the present disclosure. Like reference numerals refer to like parts throughout the various embodiments shown in the figures of the present disclosure.
In the present disclosure, the term "position" refers to the location of an object or a portion of an object in three-dimensional space (e.g., three translational degrees of freedom that can be described using changes in Cartesian X, Y, and Z coordinates, such as along the Cartesian X, Y, and Z axes, respectively). In the present disclosure, the term "attitude" refers to the rotational placement of an object or a portion of an object (e.g., three rotational degrees of freedom that can be described using roll, pitch, and yaw). In the present disclosure, the term "pose" refers to the combination of the position and the attitude of an object or a portion of an object, which may be described, for example, using the six parameters of the six degrees of freedom mentioned above.
In the present disclosure, a reference coordinate system may be understood as a coordinate system capable of describing the pose of an object. According to the actual positioning requirement, the origin of the virtual reference object or the origin of the physical reference object can be selected as the origin of the coordinate system. In some embodiments, the reference coordinate system may be a world coordinate system or a camera coordinate system or the operator's own perceptual coordinate system, or the like. In the present disclosure, an object may be understood as an object or target to be positioned, for example a manipulator arm or a manipulator arm tip or an actuator arranged at a distal end of a manipulator arm tip.
In the present disclosure, the pose of the manipulator arm or a part thereof refers to the pose of the manipulator arm coordinate system defined by the manipulator arm or a part thereof relative to the reference coordinate system.
Fig. 1 illustrates a schematic diagram of a manipulator arm control system 100, according to some embodiments of the present disclosure. As shown in fig. 1, the manipulator arm control system 100 may include an image capture device 110, at least one manipulator arm 140, and a control device 120. The image capture device 110 and the at least one manipulator arm 140 are each communicatively connected to the control device 120. In some embodiments, as shown in fig. 1, the control device 120 may be used to control the movement of the at least one manipulator arm 140, to adjust the pose of the at least one manipulator arm 140, to coordinate multiple arms with one another, and so on. In some embodiments, the at least one manipulator arm 140 may comprise a manipulator arm tip 130 at its distal end. The control device 120 can control the movement of the at least one manipulator arm 140 to move the manipulator arm tip 130 to a desired position and attitude. It will be appreciated by those skilled in the art that the manipulator arm control system 100 may be applied to a surgical robotic system, such as an endoscopic surgical robotic system. For example, a surgical effector 160 may be disposed at the distal end of the manipulator arm tip 130, as shown in fig. 1. It should be understood that the manipulator arm control system 100 may also be applied to special-purpose or general-purpose robotic systems in other fields (e.g., manufacturing or machining).
In the present disclosure, the control device 120 may be communicatively connected with the driving unit 150 (e.g., a motor) of the at least one manipulation arm 140 and send a driving signal to the driving unit 150, so that the driving unit 150 controls the at least one manipulation arm 140 to move to the corresponding target pose based on the driving signal. For example, the driving unit 150 for controlling the movement of the operation arm 140 may be a servo motor, and may receive a command from the control device to control the movement of the operation arm 140. The control device 120 may also be communicatively connected to a sensor coupled to the driving unit 150, for example, through a communication interface, to receive the motion data of the operation arm 140, so as to monitor the motion state of the operation arm 140. In one example of the present disclosure, the communication interface may be a CAN (Controller Area Network) bus communication interface that enables the control device 120 to communicate with the drive unit 150 and the sensor connection through a CAN bus.
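By way of illustration of this communication scheme, the sketch below sends a drive value to a drive unit over a CAN bus using the python-can package. The arbitration ID, payload layout, and unit scaling here are assumptions made for the example, not values taken from this disclosure.

```python
import struct

import can  # python-can package (assumed available)

# Hypothetical payload layout: motor index (1 byte) followed by a signed
# 32-bit drive value; the ID and layout are examples, not from the patent.
MOTOR_CMD_ID = 0x120

def send_drive_value(bus: can.BusABC, motor_index: int, drive_value: int) -> None:
    payload = struct.pack("<Bi", motor_index, drive_value)
    msg = can.Message(arbitration_id=MOTOR_CMD_ID, data=payload,
                      is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    with can.Bus(interface="socketcan", channel="can0") as bus:
        send_drive_value(bus, motor_index=3, drive_value=-1500)
```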
In some embodiments, the manipulator arm 140 may comprise a continuum deformable arm, for example a manipulator arm with multiple degrees of freedom composed of multiple joints, such as a manipulator arm capable of 6 degrees of freedom of motion.
In some embodiments, the image capture device 110 may be used to capture positioning images. The positioning image may include an image of part or all of the manipulator arm 140. In some embodiments, the image capture device 110 may be configured to capture an image of the manipulator arm tip 130, and a plurality of different pose markers may be disposed on the manipulator arm tip 130, where the pose markers comprise different pose marker patterns. For example, a positioning tag 170 may be disposed on the manipulator arm tip 130 (the positioning tag 170 may be, for example, the tag 400 shown in fig. 4). The positioning tag 170 may include a plurality of pose identifications, which comprise different pose identification patterns (described in detail below).
As shown in fig. 1, the manipulator arm tip 130 is within the observation field of view of the image capture device 110, and the captured positioning image may include an image of the manipulator arm tip 130. In some embodiments, the image capture device 110 may include, but is not limited to, a dual-lens or single-lens image capture device, such as a binocular or monocular camera. The image acquisition device 110 may be an industrial camera, an underwater camera, a miniature electronic camera, an endoscopic camera, etc., depending on the application scenario. In some embodiments, the image acquisition device 110 may be fixed in position or variable in position, for example, an industrial camera fixed at a monitoring position or an endoscopic camera with adjustable position or pose. In some embodiments, the image acquisition device 110 may implement at least one of visible-light-band imaging, infrared-band imaging, CT (Computed Tomography) imaging, acoustic-wave imaging, and the like. Depending on the type of image to be captured, one skilled in the art may select a different image capture device as the image acquisition device 110.
In some embodiments, the control device 120 may receive the positioning image from the image acquisition device 110 and process the positioning image. For example, the control device 120 may identify a plurality of pose identifications located on the manipulator arm 140 in the positioning image and determine the current relative pose of the manipulator arm 140 or the effector 160 with respect to a reference coordinate system (e.g., the world coordinate system). The control device 120 may also determine the drive signal of the manipulator arm 140 based on the current relative pose and the target pose of the manipulator arm 140 or the effector 160. The drive signal may be transmitted to the driving unit 150 to perform motion control of the manipulator arm 140.
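The processing chain just described, acquire a positioning image, identify the pose identifications, estimate the current relative pose, and compute a drive signal, repeats every motion control cycle. A minimal Python skeleton of one such cycle is sketched below; the camera and driver objects and all three helper functions are hypothetical placeholders for the steps elaborated in the remainder of this disclosure.

```python
def motion_control_cycle(camera, driver, target_pose):
    """One closed-loop motion control cycle (cf. steps 601-607 below).

    `camera`, `driver`, and the helper functions are hypothetical
    placeholders for the components described in this disclosure.
    """
    image = camera.capture()                        # acquire positioning image
    markers = identify_pose_markers(image)          # pose identifications
    current_pose = estimate_relative_pose(markers)  # pose vs. reference frame
    drive_signal = compute_drive_signal(current_pose, target_pose)
    driver.apply(drive_signal)                      # forward to drive unit 150
```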
Fig. 2 illustrates a schematic view of a segment 200 of an operating arm, according to some embodiments of the present disclosure. The manipulator arm (e.g., the manipulator arm 140) may include at least one deformable segment 200. As shown in fig. 2, the deformable segment 200 includes a fixed disk 210 and a plurality of structural bones 220. The plurality of structural bones 220 are fixedly coupled at a first end to the fixed disk 210 and coupled at a second end to a driving unit (not shown). In some embodiments, the fixed disk 210 may be a ring structure, a disk structure, or the like, but is not limited thereto, and its cross-section may be circular, rectangular, polygonal, etc.
The driving unit deforms the segment 200 by driving the structural bones 220. For example, the driving unit brings the segment 200 into a bent state as shown in fig. 2 by driving the structural bones 220. In some embodiments, the second ends of the plurality of structural bones 220 are connected to the driving unit through the base disk 230. In some embodiments, similar to the fixed disk 210, the base disk 230 may be a ring structure, a disk structure, or the like, but is not limited thereto, and may have a cross-section of various shapes such as circular, rectangular, or polygonal. The driving unit may comprise a linear motion mechanism, a drive segment, or a combination of both. A linear motion mechanism may be coupled to the structural bones 220 to push or pull them and thereby drive the bending of the segment 200. The drive segment may include its own fixed disk and a plurality of structural bones, one end of each structural bone being fixedly connected to that fixed disk; the other ends of the drive segment's structural bones are connected to, or integrally formed with, the structural bones 220, so that bending the drive segment drives the bending of the segment 200.
In some embodiments, a spacer disk 240 is further included between the fixed disk 210 and the base disk 230, and the plurality of structural bones 220 pass through the spacer disk 240. Similarly, the drive segment may also include a spacer disk.
Fig. 3 illustrates a schematic structural view of an operating arm 300 according to some embodiments of the present disclosure. As shown in fig. 3, the operating arm 300 is a deformable operating arm and may include a manipulator arm tip 310 and a manipulator arm body 320. The manipulator arm body 320 may include one or more segments, such as a first segment 3201 and a second segment 3202. In some embodiments, the structure of the first and second segments 3201, 3202 may be similar to the segment 200 shown in fig. 2. In some embodiments, as shown in fig. 3, the manipulator arm body 320 also includes a first straight rod segment 3203 located between the first and second segments 3201, 3202. A first end of the first straight rod segment 3203 is connected to the base disk of the second segment 3202 and a second end is connected to the fixed disk of the first segment 3201. In some embodiments, as shown in fig. 3, the manipulator arm body 320 further includes a second straight rod segment 3204, a first end of which is connected with the base disk of the first segment 3201.
In some embodiments, a plurality of pose identifications are distributed on the operating arm (e.g., the operating arm 140 or the manipulator arm tip 130 shown in fig. 1, or the operating arm body 320 or the manipulator arm tip 310 shown in fig. 3). In some embodiments, the plurality of pose identifications are provided on an outer surface of a columnar portion of the operating arm. For example, a plurality of pose identifications are circumferentially distributed on the manipulator arm tip 310, e.g., provided on the outer surface of the columnar portion of the manipulator arm tip 310. In some embodiments, a positioning tag (e.g., the tag 400 shown in fig. 4 or the tag 500 shown in fig. 5) is disposed on the outer surface of the columnar portion of the manipulator arm; the positioning tag includes a plurality of pose identifications, which comprise a plurality of different pose identification patterns distributed on the tag along the circumference of the columnar portion, together with the pose identification pattern corner points in those patterns.
In some embodiments, the pose identification may include pose identification patterns and pose identification pattern corner points in the pose identification patterns. In some embodiments, the pose identification pattern may be provided on a label on the manipulator arm tip, or may be printed on the manipulator arm tip, or may be a pattern formed by the physical configuration of the manipulator arm tip itself, e.g., may include depressions or protrusions, and combinations thereof. In some embodiments, the pose identification pattern may include patterns formed in brightness, grayscale, color, and the like. In some embodiments, the pose identification pattern may include a pattern that actively (e.g., self-illuminating) or passively (e.g., reflected light) provides information detected by the image capture device. Those skilled in the art will appreciate that in some embodiments, the pose of the pose marker or the pose of the pose marker pattern may be represented by the pose of the pose marker pattern corner point coordinate system. In some embodiments, the pose identification pattern is provided on the distal end of the operation arm in an area suitable for the image acquisition by the image acquisition device, for example, an area that can be covered by the field of view of the image acquisition device during operation or an area that is not easily disturbed or blocked during operation.
Fig. 4 illustrates a schematic diagram of a tag 400 including multiple pose identifications, according to some embodiments. Fig. 5 shows a schematic view of a label 500 which is disposed around the distal end of the operating arm and formed into a cylindrical shape. It will be appreciated that, for simplicity of description, the tag 400 and the tag 500 may include the same pose identification patterns.
Referring to fig. 4, the plurality of pose identifications may include a plurality of different pose identification patterns 410. The plurality of pose identifications may also include a plurality of pose identification pattern corner points P_4 in the plurality of different pose identification patterns 410; in the present disclosure, pose identification pattern corner points are indicated by an "o" symbol. In some embodiments, a pose identification may be determined by identifying the pose identification pattern 410 or the pose identification pattern corner points P_4 therein.
Referring to fig. 5, in the circumferentially disposed state, the label 400 becomes the label 500, spatially configured in a cylindrical shape. In some embodiments, the around-axis angle or roll angle of a pose identification may be represented by the around-axis angle of its pose identification pattern or pose identification pattern corner point. The around-axis angle identified by each pose identification pattern or pose identification pattern corner point is known or predetermined. In some embodiments, the around-axis angle identified by each pose identification may be determined based on the distribution of the plurality of pose identifications (e.g., pose identification patterns or pose identification pattern corner points). In some embodiments, the plurality of pose identifications may be evenly distributed (e.g., the pose identification pattern corner points in the tag 400 are equally spaced, and those in the tag 500 are equally distributed along the circumference). In other embodiments, the plurality of pose identifications may be non-uniformly distributed. In some embodiments, based on the distribution of the plurality of pose identifications, each pose identification pattern may be used to identify a particular around-axis angle, with a one-to-one correspondence between the pose identification patterns and the around-axis angles they identify. In the present disclosure, the around-axis angle or roll angle refers to the angle around the Z axis (e.g., the Z axis of the manipulator arm coordinate system {wm}). In some embodiments, the operating arm is a deformable operating arm and the Z axis is tangential to the operating arm.
As shown in FIG. 5, the plurality of different pose identification patterns 510 in the tag 500 are uniformly distributed along the circumference of the cylindrical structure, and the plurality of pose identification pattern corner points are uniformly distributed on the cross-sectional circle 520 in the XY plane of the manipulator arm coordinate system {wm}, so that the distribution angle (for example, the angle α_0) between any two adjacent pose identification pattern corner points is equal. Taking the pose identification pattern corner point P_5 pointed to by the X axis as the reference corner point identifying the 0-degree around-axis angle (the pose identification pattern containing P_5 serving as the reference pattern), the around-axis angle identified by any pose identification pattern corner point can be determined from its positional relationship to P_5. In some embodiments, the around-axis angle identified by a pose identification pattern corner point may be determined based on the following equation (1):

$$\alpha_m = \alpha_0\,(m-1) \tag{1}$$

where α_m is the around-axis angle identified by the m-th pose identification pattern corner point, counted in the clockwise direction of the cross-sectional circle 520 with P_5 as the first pose identification pattern corner point.
Some embodiments of the present disclosure provide a control method for an operating arm. Fig. 6 illustrates a flow chart of a control method 600 of an operating arm control system according to some embodiments of the present disclosure. Some or all of the steps in the method 600 may be performed by a control device (e.g., the control device 120) of the control system 100. The control device 120 may be configured on a computing device. The method 600 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 600 may be implemented as computer-readable instructions that can be read and executed by a general-purpose or special-purpose processor, such as the processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to FIG. 6, in step 601, a positioning image is acquired. In some embodiments, the positioning image includes a plurality of pose identifications on the manipulator arm. In some embodiments, the positioning image may be received from an image acquisition device such as the image acquisition device 110 shown in FIG. 1. For example, the control device 120 may receive positioning images actively transmitted by the image acquisition device 110. Alternatively, the control device 120 may send an image request instruction to the image acquisition device 110, and the image acquisition device 110 sends the positioning image to the control device 120 in response to the image request instruction.
With continued reference to fig. 6, at step 603, a plurality of pose identifiers located on the manipulator arm are identified in the positioning image, the plurality of pose identifiers comprising different pose identifier patterns. For example, exemplary methods of identifying a plurality of pose identifications located on an operating arm may include methods as shown in fig. 10, 12, 14, and 15. In some embodiments, the control device 120 may identify the pose identification of some or all of the positioning images through image processing algorithms. In some embodiments, the image processing algorithms may include feature recognition algorithms that may extract or recognize features of the pose identification. For example, the image processing algorithm may include a corner detection algorithm for detecting pose marker pattern corners. The corner detection algorithm may be one of, but not limited to, a gray-scale image-based corner detection, a binary image-based corner detection, and a contour curve-based corner detection. For example, the image processing algorithm may be a color feature extraction algorithm for detecting color features in the pose identification pattern. For another example, the image processing algorithm may be a contour detection algorithm for detecting contour features of the pose identification pattern. In some embodiments, the control device may identify the pose identification of some or all of the positioning images by the recognition model.
With continued reference to fig. 6, at step 605, a current relative pose of the manipulator arm with respect to the reference coordinate system is determined based on the plurality of pose identifications. In some embodiments, method 600 further comprises: determining two-dimensional coordinates of a plurality of pose identifications in the positioning image; and determining the current relative pose of the operating arm relative to the reference coordinate system based on the two-dimensional coordinates of the pose identifications in the positioning image and the three-dimensional coordinates of the pose identifications relative to the operating arm coordinate system. In some embodiments, the coordinates of the pose marker may be represented by coordinates of pose marker pattern corner points. For example, the two-dimensional coordinates of the pose markers in the positioning image and the three-dimensional coordinates in the manipulator coordinate system may be represented by coordinates of corner points of the pose marker pattern. In some embodiments, the pose of the manipulator arm coordinate system relative to the reference coordinate system may be determined as the current relative pose of the manipulator arm relative to the reference coordinate system based on the two-dimensional coordinates of the plurality of pose identification pattern corner points in the positioning image and the three-dimensional coordinates in the manipulator arm coordinate system.
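The disclosure does not mandate a particular solver for recovering the pose from these 2D-3D corner correspondences; a common choice for this kind of step is a Perspective-n-Point (PnP) algorithm. The sketch below uses OpenCV's solvePnP as an illustrative stand-in, assuming a calibrated intrinsic matrix K for the image acquisition device; it is one possible realization, not the patent's prescribed method.

```python
import cv2
import numpy as np

def arm_pose_from_corners(corners_2d, corners_3d, K, dist_coeffs=None):
    """Pose of the arm coordinate system {wm} relative to the camera frame.

    corners_2d: (N, 2) corner points detected in the positioning image.
    corners_3d: (N, 3) the same corner points in the arm coordinate system.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(corners_3d, dtype=np.float64),
        np.asarray(corners_2d, dtype=np.float64),
        K, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP solution not found")
    R, _ = cv2.Rodrigues(rvec)  # attitude of {wm} relative to the camera
    return R, tvec              # position of {wm} relative to the camera
```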
With continued reference to fig. 6, in step 607, the drive signal of the manipulator arm is determined based on the current relative pose and the target pose of the manipulator arm. In some embodiments, the method 600 may further include determining the drive signal of the manipulator arm at a predetermined period, so as to achieve real-time control over a plurality of motion control cycles.
In some embodiments, method 600 may further include: determining a pose difference based on the current relative pose of the operating arm and the target pose of the operating arm; and determining a drive signal of the operation arm based on the pose difference and the inverse kinematics model of the operation arm. For example, based on the difference between the target pose and the current pose of the tip of the manipulator in the world coordinate system, the drive values of the plurality of joints included in the manipulator (or the drive values of the corresponding plurality of motors controlling the motion of the manipulator) in the current motion control cycle may be determined by an inverse kinematics numerical iteration algorithm of the manipulator kinematics model. It should be understood that the kinematic model may represent a mathematical model of the kinematic relationship of the joint space and the task space of the manipulator arm. For example, the kinematic model can be established by a DH (Denavit-Hartenberg) parametric method, an exponential product representation method, or the like.
In some embodiments, the target pose of the manipulator is a target pose of the manipulator in a world coordinate system. The method 600 may further include: determining the current pose of the operating arm in the world coordinate system based on the current relative pose; and determining the pose difference based on the target pose and the current pose of the manipulator in the world coordinate system. In some embodiments, the pose difference includes a position difference and an attitude difference.
In the k-th motion control cycle, the pose difference may be determined based on the following equation (2):

$$\begin{cases} \Delta P^{k} = P_t^{k} - P_c^{k} \\ \Delta \theta^{k} = \angle\left(R_t^{k},\, R_c^{k}\right) \end{cases} \tag{2}$$

where ΔP^k is the position difference of the operating arm at the k-th motion control cycle, Δθ^k is the attitude difference of the operating arm at the k-th motion control cycle, P_t^k is the target position of the operating arm at the k-th motion control cycle, R_t^k is the target attitude of the operating arm at the k-th motion control cycle, P_c^k is the current position of the operating arm at the k-th motion control cycle, R_c^k is the current attitude of the operating arm at the k-th motion control cycle, and ∠(R_t^k, R_c^k) denotes the rotation angle between R_t^k and R_c^k.
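A minimal numeric sketch of equation (2) follows, with poses given as position vectors and 3x3 rotation matrices. Representing the attitude difference as an axis-angle vector (rather than a bare angle) is an assumption made here so that the result can feed the proportional-derivative law of equation (3) below.

```python
import numpy as np

def pose_difference(P_t, R_t, P_c, R_c):
    """Equation (2): position difference and attitude difference."""
    dP = np.asarray(P_t) - np.asarray(P_c)
    R_rel = R_c.T @ R_t  # rotation still required to reach the target attitude
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.arccos(cos_angle)
    if angle < 1e-9:
        return dP, np.zeros(3)
    axis = np.array([R_rel[2, 1] - R_rel[1, 2],
                     R_rel[0, 2] - R_rel[2, 0],
                     R_rel[1, 0] - R_rel[0, 1]]) / (2.0 * np.sin(angle))
    return dP, angle * (R_c @ axis)  # axis-angle vector in the reference frame
```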
In some embodiments, the target pose of the manipulator arm is updated at a predetermined period before or during each motion control cycle. In some embodiments, multiple motion control cycles are performed iteratively, and in each motion control cycle, a method according to some embodiments of the present disclosure, such as steps 601-607, may be performed to control the manipulator arm to move toward the target pose. By iteratively executing a plurality of motion control cycles, real-time closed-loop control of the pose of the distal end of the operating arm can be realized, and the pose control precision of the operating arm can be improved. It will be appreciated that controlling the pose of the manipulator arm via the method of the present disclosure can reduce the tracking error of the manipulator arm (e.g., a continuum deformable arm).
In some embodiments, method 600 may further include receiving a control command; and determining the target pose of the operating arm based on the control command. In some embodiments, the target pose of the manipulator arm tip in the world coordinate system may be input by a user via an input device. By comparison calculation, the difference between the target pose and the current pose of the end of the manipulator arm can be determined. In some embodiments, the control commands may be received in a master-slave motion based control scheme. For example, by acquiring the pose of the main operator or joint information of the main operator in each motion control cycle, the target pose of the operation arm is determined. Through a plurality of motion control cycles, real-time master-slave motion control can be performed.
Fig. 7 illustrates a flow diagram of a method 700 for determining a drive signal according to some embodiments of the present disclosure. As shown in fig. 7, some or all of the steps of the method 700 may be performed by a control device (e.g., the control device 120) of the manipulator arm control system 100. The control device 120 may be configured on a computing device. The method 700 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 700 may be implemented as computer-readable instructions that can be read and executed by a general-purpose or special-purpose processor, such as the processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 7, in step 701, a Cartesian space velocity is determined based on the pose difference. In some embodiments, the Cartesian space velocity includes a Cartesian space linear velocity and a Cartesian space angular velocity. The method 700 may further include: determining the Cartesian space linear velocity based on the position difference, and determining the Cartesian space angular velocity based on the attitude difference. In some embodiments, the Cartesian space angular velocity may be determined by a proportional-integral-derivative controller or a proportional-derivative controller based on the attitude difference. In some embodiments, the Cartesian space velocity $\dot{x}^{k}$ of the k-th motion control cycle can be determined based on the following equation (3):

$$\dot{x}^{k} = \begin{bmatrix} v^{k} \\ \omega^{k} \end{bmatrix} = \begin{bmatrix} P_v\,\Delta P^{k} + D_v\left(\Delta P^{k} - \Delta P^{k-1}\right) \\ P_\omega\,\Delta\theta^{k} + D_\omega\left(\Delta\theta^{k} - \Delta\theta^{k-1}\right) \end{bmatrix} \tag{3}$$

where v^k is the Cartesian space linear velocity of the k-th motion control cycle, ω^k is the Cartesian space angular velocity of the k-th motion control cycle, P_v is the linear velocity proportional coefficient, D_v is the linear velocity differential coefficient, P_ω is the angular velocity proportional coefficient, D_ω is the angular velocity differential coefficient, ΔP^{k-1} is the position difference of the operating arm at the (k-1)-th motion control cycle, and Δθ^{k-1} is the attitude difference of the operating arm at the (k-1)-th motion control cycle.
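A direct transcription of equation (3) into Python might look as follows; the gain values are placeholders, and in practice P_v, D_v, P_ω, and D_ω would be tuned for the specific operating arm.

```python
import numpy as np

def cartesian_velocity(dP, dTheta, dP_prev, dTheta_prev,
                       Pv=1.0, Dv=0.1, Pw=1.0, Dw=0.1):
    """Equation (3): PD law from pose differences to Cartesian velocity."""
    v = Pv * np.asarray(dP) + Dv * (np.asarray(dP) - np.asarray(dP_prev))
    w = Pw * np.asarray(dTheta) + Dw * (np.asarray(dTheta) - np.asarray(dTheta_prev))
    return np.concatenate([v, w])  # x_dot^k = [v^k; w^k]
```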
Referring to fig. 7, in step 703, a joint parameter space velocity is determined based on the Cartesian space velocity. The joint parameter space velocity $\dot{\psi}^{k}$ of the k-th motion control cycle may be determined based on the following equation (4):

$$\dot{\psi}^{k} = J^{+}\,\dot{x}^{k} \tag{4}$$

where J⁺ is the Moore-Penrose pseudo-inverse of the velocity Jacobian matrix J of the kinematic model of the manipulator arm, which can be determined based on the structure of the manipulator arm.
Referring to FIG. 7, in step 705, target joint parameters are determined based on the joint parameter space velocity and the current joint parameters. The target joint parameters $\psi_t^{k}$ of the k-th motion control cycle can be determined based on the following equation (5):

$$\psi_t^{k} = \psi_c^{k} + \dot{\psi}^{k}\,\Delta t \tag{5}$$

where ψ_c^k is the current joint parameters of the k-th motion control cycle, and Δt is the period of the motion control cycle.
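Equations (4) and (5) together amount to a pseudo-inverse velocity mapping followed by one explicit integration step, as in this sketch:

```python
import numpy as np

def target_joint_parameters(J, x_dot, psi_current, dt):
    """Equations (4)-(5): pseudo-inverse velocity mapping plus integration."""
    psi_dot = np.linalg.pinv(J) @ x_dot            # equation (4)
    return np.asarray(psi_current) + psi_dot * dt  # equation (5)
```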
It should be understood that when the manipulator arm has a plurality of segments (e.g., the manipulator arm 300 shown in fig. 3), the target joint parameters of the manipulator arm may be the target joint parameters of all of the segments, or of one or some of the plurality of segments.
Referring to fig. 7, in step 707, a drive signal is determined based on the target joint parameters. For example, based on the mapping relationship between the target joint parameter and the driving amount, the driving amount of the plurality of joints included in the operation arm in the current motion control cycle may be determined, and then the driving signal of the driving unit (e.g., motor) may be determined based on the driving amount. In some embodiments, the mapping of the joint parameter of the individual joint to the driving amount may be a mapping based on equation (14).
In some embodiments, the manipulation arm is exemplified as a deformable arm (e.g., a continuum deformable arm). The continuum deformable arm may be the operating arm 300 shown in fig. 3. As shown in fig. 3, each segment (the first segment 3201 and the second segment 3202) may include a base disk, a fixed disk, and a plurality of structural bones extending through the base disk and the fixed disk, the structural bones being fixedly connected to the fixed disk and slidably connected to the base disk. The continuum deformable arm and the segments it contains can be described by a kinematic model. In some embodiments, the structure of each segment may be embodied as the segment 200 shown in FIG. 2. As shown in fig. 2, the base disk coordinate system {tb} is attached to the base disk of the t-th (t = 1, 2, 3, …) continuum segment, with its origin at the center of the base disk, its XY plane coinciding with the base disk plane, and its X axis x̂_tb pointing from the center of the base disk toward the first structural bone (the first structural bone being any one of the plurality of structural bones designated as the reference). Bending plane coordinate system 1, {t1}, has its origin coinciding with the origin of the base disk coordinate system, its XY plane coinciding with the bending plane, and its X axis x̂_t1 coinciding with ẑ_tb. The fixed disk coordinate system {te} is attached to the fixed disk of the t-th continuum segment, with its origin at the center of the fixed disk, its XY plane coinciding with the fixed disk plane, and its X axis x̂_te pointing from the center of the fixed disk toward the first structural bone. Bending plane coordinate system 2, {t2}, has its origin at the center of the fixed disk, its XY plane coinciding with the bending plane, and its X axis x̂_t2 coinciding with ẑ_te.
A single segment 200 as shown in FIG. 2 may be represented by a kinematic model. The position ${}^{tb}P_{te}$ and attitude ${}^{tb}R_{te}$ of the distal end of the t-th segment (the fixed disk coordinate system {te}) relative to the base disk coordinate system {tb} can be determined based on the following equations (6) and (7):

$${}^{tb}P_{te} = \frac{L_t}{\theta_t}\begin{bmatrix} \cos\delta_t\,(1-\cos\theta_t) \\ \sin\delta_t\,(1-\cos\theta_t) \\ \sin\theta_t \end{bmatrix} \tag{6}$$

$${}^{tb}R_{te} = {}^{tb}R_{t1}\;{}^{t1}R_{t2}\;{}^{t2}R_{te} \tag{7}$$

where L_t is the length of the virtual structural bone of the t-th segment (e.g., the virtual structural bone 221 shown in FIG. 2); θ_t is the bending angle of the t-th segment, i.e., the angle through which ẑ_tb is rotated, about the normal of the bending plane, to reach ẑ_te; ${}^{tb}R_{t1}$ is the attitude of bending plane coordinate system 1, {t1}, of the t-th segment relative to the base disk coordinate system {tb}; ${}^{t1}R_{t2}$ is the attitude of bending plane coordinate system 2, {t2}, of the t-th segment relative to bending plane coordinate system 1, {t1}; and ${}^{t2}R_{te}$ is the attitude of the fixed disk coordinate system {te} of the t-th segment relative to bending plane coordinate system 2, {t2}.
${}^{tb}R_{t1}$, ${}^{t1}R_{t2}$, and ${}^{t2}R_{te}$ can be expressed based on the following equations (8), (9), and (10):

$${}^{tb}R_{t1} = \begin{bmatrix} 0 & \cos\delta_t & -\sin\delta_t \\ 0 & \sin\delta_t & \cos\delta_t \\ 1 & 0 & 0 \end{bmatrix} \tag{8}$$

$${}^{t1}R_{t2} = \begin{bmatrix} \cos\theta_t & -\sin\theta_t & 0 \\ \sin\theta_t & \cos\theta_t & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{9}$$

$${}^{t2}R_{te} = \begin{bmatrix} 0 & 0 & 1 \\ \cos\delta_t & \sin\delta_t & 0 \\ -\sin\delta_t & \cos\delta_t & 0 \end{bmatrix} \tag{10}$$

where δ_t is the included angle between the bending plane of the t-th segment and x̂_tb.
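Under the frame conventions reconstructed above, equations (6)-(10) can be collected into a single forward-kinematics routine for one segment. The sign and frame conventions below are one self-consistent reading of the disclosure and may differ from the original (untranslated) patent's conventions:

```python
import numpy as np

def segment_forward_kinematics(L, theta, delta):
    """Equations (6)-(10): pose of {te} relative to {tb} for one segment.

    Conventions here are assumptions: delta is measured from x_tb to the
    bending plane, theta is the in-plane bending angle of the virtual bone.
    """
    if abs(theta) < 1e-9:                        # straight-segment limit
        p = np.array([0.0, 0.0, L])
    else:
        rho = L / theta                          # bending radius of virtual bone
        p = np.array([np.cos(delta) * rho * (1.0 - np.cos(theta)),
                      np.sin(delta) * rho * (1.0 - np.cos(theta)),
                      rho * np.sin(theta)])      # equation (6)
    c, s = np.cos(delta), np.sin(delta)
    R_tb_t1 = np.array([[0.0, c, -s],
                        [0.0, s,  c],
                        [1.0, 0.0, 0.0]])        # equation (8)
    ct, st = np.cos(theta), np.sin(theta)
    R_t1_t2 = np.array([[ct, -st, 0.0],
                        [st,  ct, 0.0],
                        [0.0, 0.0, 1.0]])        # equation (9)
    R_t2_te = R_tb_t1.T                          # equation (10)
    return p, R_tb_t1 @ R_t1_t2 @ R_t2_te        # equation (7)
```

As a sanity check on these conventions, at theta = 0 the routine returns the identity attitude and a straight offset of length L along ẑ_tb.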
The joint parameters ψ_t of a single segment 200 as shown in FIG. 2 can be determined based on the following equation (11):

$$\psi_t = \begin{bmatrix} \theta_t & \delta_t \end{bmatrix}^{T} \tag{11}$$

The joint parameter space velocity $\dot{\psi}_t$ corresponding to the joint parameters ψ_t may be determined based on the following equation (12):

$$\dot{\psi}_t = \begin{bmatrix} \dot{\theta}_t & \dot{\delta}_t \end{bmatrix}^{T} \tag{12}$$

where $\dot{\theta}_t$ is the first derivative of θ_t and $\dot{\delta}_t$ is the first derivative of δ_t.
The distal Cartesian space velocity $\dot{x}_t$ of a single segment 200 as shown in FIG. 2 may be determined based on the following equation (13):

$$\dot{x}_t = \begin{bmatrix} v_t \\ \omega_t \end{bmatrix} = J_t\,\dot{\psi}_t = \begin{bmatrix} J_{tv} \\ J_{t\omega} \end{bmatrix}\dot{\psi}_t \tag{13}$$

where J_t is the velocity Jacobian matrix of the kinematic model of the single segment, J_tv is the linear velocity Jacobian matrix of the kinematic model of the single segment, J_tω is the angular velocity Jacobian matrix of the kinematic model of the single segment, v_t is the distal linear velocity of the single segment, and ω_t is the distal angular velocity of the single segment.
In some embodiments, the driving amounts of the plurality of structural bones have a known mapping relationship to the joint parameters. Based on the target joint parameters of the segment and this mapping relationship, the driving amounts of the plurality of structural bones can be determined. The driving amount of a structural bone may be understood as the length by which the structural bone is pushed or pulled when the single segment bends from an initial state (e.g., θ_t = 0) to the target bending angle. In some embodiments, the mapping of the driving amounts of the plurality of structural bones to the joint parameters may be determined based on the following equation (14):

$$q_i = -r_{ti}\,\theta_t \cos\left(\delta_t + \beta_{ti}\right) \tag{14}$$

where r_ti is the distance from the i-th structural bone to the virtual structural bone in the t-th segment, β_ti is the angle between the i-th structural bone and the first structural bone in the t-th segment, and q_i is the driving amount of the i-th structural bone. The driving signal of the driving unit may be determined based on the driving amount of the i-th structural bone.
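Equation (14) is a per-bone cosine law; a vectorized sketch, assuming the bone offsets r_ti and angles β_ti are known from the segment's construction:

```python
import numpy as np

def structural_bone_drive_amounts(theta_t, delta_t, r_ti, beta_ti):
    """Equation (14): push/pull length q_i of each structural bone."""
    r_ti = np.asarray(r_ti)        # distance of each bone to the virtual bone
    beta_ti = np.asarray(beta_ti)  # angle of each bone to the first bone
    return -r_ti * theta_t * np.cos(delta_t + beta_ti)
```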
In some embodiments, the terminal cartesian spatial velocity of the individual segment may be determined based on the formula (2) and the formula (3), the joint parameter spatial velocity of the individual segment may be determined based on the formula (4), the target joint parameter of the individual segment may be determined based on the formula (5), the driving amount of each structural bone may be determined based on the formula (14), and then the driving signal of the driving unit (e.g., motor) may be determined based on the driving amount.
In some embodiments, the entire deformable arm may be described by a kinematic model. As shown in fig. 3, transformations may be performed between coordinate systems at multiple positions of the deformable arm. For example, the pose of the end effector of the continuum deformable arm in the world coordinate system {w} may be determined based on the following equation (15):

$${}^{W}T_{tip} = {}^{W}T_{1b}\;{}^{1b}T_{1e}\;{}^{1e}T_{2b}\;{}^{2b}T_{2e}\;{}^{2e}T_{tip} \tag{15}$$

where ${}^{W}T_{tip}$ is the homogeneous transformation matrix of the end effector of the continuum deformable arm relative to the world coordinate system; ${}^{W}T_{1b}$ is the homogeneous transformation matrix of the base disk of the first continuum segment relative to the world coordinate system; ${}^{1b}T_{1e}$ is the homogeneous transformation matrix of the fixed disk of the first continuum segment relative to the base disk of the first continuum segment; ${}^{1e}T_{2b}$ is the homogeneous transformation matrix of the base disk of the second continuum segment relative to the fixed disk of the first continuum segment; ${}^{2b}T_{2e}$ is the homogeneous transformation matrix of the fixed disk of the second continuum segment relative to the base disk of the second continuum segment; and ${}^{2e}T_{tip}$ is the homogeneous transformation matrix of the end effector relative to the fixed disk of the second continuum segment. In some embodiments, the end effector is fixedly disposed on the fixed disk, and ${}^{2e}T_{tip}$ is therefore known or predetermined.
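Equation (15) is a plain chain of 4x4 homogeneous transforms; a small helper, with names chosen here for illustration:

```python
import numpy as np

def make_T(R, p):
    """4x4 homogeneous transform from a rotation matrix and a position."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.ravel(p)
    return T

def end_effector_in_world(T_w_1b, T_1b_1e, T_1e_2b, T_2b_2e, T_2e_tip):
    """Equation (15): W_T_tip as a product of per-segment transforms."""
    return T_w_1b @ T_1b_1e @ T_1e_2b @ T_2b_2e @ T_2e_tip
```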
It will be appreciated that the deformable arm has different joint parameters in different operating conditions. For example, the manipulator arm 300 shown in fig. 3 includes at least four operational states. The four operating states of the operating arm 300 are as follows:
the first working state: only the second link 3202 participates in the pose control of the actuator (e.g., only the second link 3202 enters the workspace), at which time the joint parameters of the manipulator arm 300 may be determined based on equation (16) below:
Figure BDA0003295374520000101
wherein psi c1 Is firstThe joint parameters of the operating arm 300 in the working state,
Figure BDA0003295374520000102
to operate the pivoting angle, L, of the arm 300 2 、θ 2 、δ 2 And L in the structure 200 as shown in FIG. 2 t 、θ t And delta t Have the same physical meaning.
The second working state: the second segment 3202 and the first straight rod segment 3203 participate in the pose control of the effector (e.g., the second segment 3202 has entirely entered the workspace, and the first straight rod segment 3203 has partially entered the workspace). The joint parameters of the manipulator arm 300 may then be determined based on the following equation (17):

$$\psi_{c2} = \begin{bmatrix} \varphi & L_r & \theta_2 & \delta_2 \end{bmatrix}^{T} \tag{17}$$

where ψ_c2 is the joint parameters of the operating arm 300 in the second working state, and L_r is the feed amount of the first straight rod segment 3203.
The third working state: the second segment 3202, the first straight rod segment 3203, and the first segment 3201 participate in the pose control of the effector (e.g., the second segment 3202 has entirely entered the workspace, the first straight rod segment 3203 has entirely entered the workspace, and the first segment 3201 has partially entered the workspace). The joint parameters of the manipulator arm 300 may then be determined based on the following equation (18):

$$\psi_{c3} = \begin{bmatrix} \varphi & \theta_1 & \delta_1 & L_r & \theta_2 & \delta_2 \end{bmatrix}^{T} \tag{18}$$

where ψ_c3 is the joint parameters of the operating arm 300 in the third working state, and θ_1 and δ_1 have the same physical meanings as θ_t and δ_t in the segment 200 shown in FIG. 2.
The fourth working state: the second segment 3202, the first straight rod segment 3203, the first segment 3201, and the second straight rod segment 3204 participate in the pose control of the effector (e.g., the second segment 3202, the first straight rod segment 3203, and the first segment 3201 have entirely entered the workspace, and the second straight rod segment 3204 has partially entered the workspace). The joint parameters of the manipulator arm 300 may then be determined based on the following equation (19):

$$\psi_{c4} = \begin{bmatrix} \varphi & L_s & \theta_1 & \delta_1 & L_r & \theta_2 & \delta_2 \end{bmatrix}^{T} \tag{19}$$

where ψ_c4 is the joint parameters of the operating arm 300 in the fourth working state, and L_s is the feed amount of the second straight rod segment 3204.
In some embodiments, similar to the single-segment case, the Cartesian space velocity of the distal end of the deformable arm may be determined based on equations (2) and (3), the joint parameter space velocity of the deformable arm may be determined based on equation (4), and the target joint parameters of the deformable arm may be determined based on equation (5), where the specific parameters contained in the joint parameter space velocity of equation (4) and in the target joint parameters of equation (5) may be determined based on equation (16), (17), (18), or (19). The driving amount of each structural bone of each segment may then be determined based on equation (14), and the driving signal of the driving unit (e.g., a motor) may be determined based on the driving amounts.
In some embodiments, method 600 may further include: and determining the pose of the operating arm coordinate system relative to the reference coordinate system based on the two-dimensional coordinates of the pose identification pattern corner points in the positioning image, the three-dimensional coordinates of the pose identification pattern corner points in the operating arm coordinate system and the transformation relation of the camera coordinate system relative to the reference coordinate system. In some embodiments, the transformation relationship of the camera coordinate system relative to the reference coordinate system may be known. For example, the reference coordinate system is a world coordinate system, and the transformation relationship of the camera coordinate system relative to the world coordinate system can be determined according to the pose of the camera. In other embodiments, the reference coordinate system may be the camera coordinate system itself according to actual requirements. In some embodiments, based on the camera imaging principle and the projection model, the pose of the manipulator coordinate system relative to the camera coordinate system is determined based on the two-dimensional coordinates of the pose identification pattern corner points in the positioning image and the three-dimensional coordinates of the pose identification pattern corner points in the manipulator coordinate system. Based on the transformation relationship between the pose of the operating arm coordinate system relative to the camera coordinate system and the camera coordinate system relative to the reference coordinate system, the pose of the operating arm coordinate system relative to the reference coordinate system can be obtained. In some embodiments, the internal parameters of the camera may also be considered. For example, the internal reference of the camera may be the internal reference of the image capture device 110 as shown in fig. 1. The internal parameters of the camera may be known or calibrated. In some embodiments, the camera coordinate system may be understood as a coordinate system established with the origin of the camera. For example, a coordinate system established with the optical center of the camera as the origin or a coordinate system established with the lens center of the camera as the origin. When the camera is a binocular camera, the origin of the camera coordinate system may be the center of the left lens of the camera, or the center of the right lens, or any point on the line connecting the centers of the left and right lenses (e.g., the midpoint of the line).
In some embodiments, the pose of the manipulator arm coordinate system {wm} relative to a reference coordinate system (e.g., the world coordinate system) may be determined based on the following equation (20):

$$\begin{cases} {}^{w}R_{wm} = {}^{w}R_{lens}\;{}^{lens}R_{wm} \\ {}^{w}P_{wm} = {}^{w}R_{lens}\;{}^{lens}P_{wm} + {}^{w}P_{lens} \end{cases} \tag{20}$$

where ${}^{w}R_{wm}$ is the attitude of the manipulator arm coordinate system relative to the world coordinate system, ${}^{w}P_{wm}$ is the position of the manipulator arm coordinate system relative to the world coordinate system, ${}^{w}R_{lens}$ is the attitude of the camera coordinate system relative to the world coordinate system, ${}^{w}P_{lens}$ is the position of the camera coordinate system relative to the world coordinate system, ${}^{lens}R_{wm}$ is the attitude of the manipulator arm coordinate system relative to the camera coordinate system, and ${}^{lens}P_{wm}$ is the position of the manipulator arm coordinate system relative to the camera coordinate system.
Some embodiments of the present disclosure provide methods of determining three-dimensional coordinates of a plurality of pose identifications relative to a manipulator arm coordinate system. In some embodiments, the three-dimensional coordinates of the plurality of pose identifications relative to the manipulator arm coordinate system are determined based on the distribution of the plurality of pose identifications. For example, the three-dimensional coordinates of the plurality of pose identification pattern corner points in the manipulator arm coordinate system are determined based on the distribution of the plurality of pose identification pattern corner points.
Fig. 8 illustrates a flow diagram of a method 800 of determining three-dimensional coordinates of a plurality of pose identifications relative to a manipulator arm coordinate system according to some embodiments of the present disclosure. Some or all of the steps of the method 800 may be performed by a control device (e.g., the control device 120) of the control system 100. The control device 120 may be configured on a computing device. The method 800 may be implemented by software, firmware, and/or hardware. In some embodiments, the method 800 may be implemented as computer-readable instructions that can be read and executed by a general-purpose or special-purpose processor, such as the processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to FIG. 8, at step 801, the around-axis angles of the plurality of pose identifications with respect to the Z axis of the manipulator arm coordinate system are determined based on the distribution of the plurality of pose identifications. In some embodiments, the around-axis angles of the plurality of pose identifications relative to the Z axis of the manipulator arm coordinate system may be determined based on the plurality of pose identification patterns. For example, each pose identification pattern may identify a particular around-axis angle, with different pose identification patterns corresponding one-to-one with the around-axis angles they identify. Based on an identified pose identification pattern and the correspondence between pose identification patterns and around-axis angles, the around-axis angle identified by that pattern can be determined. It should be understood that the distribution of each pose identification pattern is known or predetermined. In some embodiments, the distribution of the plurality of pose identification patterns or pose identification pattern corner points may be a distribution as shown in fig. 4 or fig. 5. In some embodiments, the around-axis angle identified by each pose identification pattern corner point may also be determined based on equation (1).
Referring to FIG. 8, at step 803, three-dimensional coordinates of a plurality of pose markers relative to an manipulator coordinate system are determined based on the around-axis angles of the plurality of pose markers. In some embodiments, as shown in fig. 5, each pose identification pattern corner point is located on the circumference of a cross-sectional circle 520, and the center and radius r of the cross-sectional circle 520 are known. Marking pattern corner point P with pose 5 As a reference corner point, a pose mark pattern corner point P 5 At the operating armThe three-dimensional coordinates of the coordinate system { wm } are (r, 0). In some embodiments, the three-dimensional coordinates of each pose identification pattern corner point in the manipulator arm coordinate system { wm } may be determined based on the following equation (21):
$$C_m = \begin{bmatrix} r\cos\alpha_m & r\sin\alpha_m & 0 \end{bmatrix}^T \tag{21}$$
where C_m is the three-dimensional coordinate, in the manipulator arm coordinate system {wm}, of the mth pose identification pattern corner point counted along the clockwise direction of the cross-sectional circle 520 with the pose identification pattern corner point P_5 as the first corner point, and α_m is the around-axis angle identified by the mth pose identification pattern corner point.
In some embodiments, the around-axis angle α_m identified by the mth pose identification pattern corner point is determined based on equation (1), and the three-dimensional coordinate C_m is then determined based on that around-axis angle α_m and equation (21).
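As an illustration of equation (21), the following Python sketch maps identified around-axis angles to corner coordinates in the manipulator arm coordinate system {wm}. The function name, radius value, and example angles are illustrative assumptions rather than values from the present disclosure.

```python
import numpy as np

def corner_coords_in_arm_frame(r, around_axis_angles_rad):
    """Sketch of equation (21): each identified around-axis angle alpha_m maps
    to a corner coordinate C_m = [r*cos(alpha_m), r*sin(alpha_m), 0]^T on the
    cross-sectional circle of radius r, in the manipulator arm frame {wm}."""
    return [np.array([r * np.cos(a), r * np.sin(a), 0.0])
            for a in around_axis_angles_rad]

# Illustrative only: a 15 mm radius and four recognized around-axis angles.
corners = corner_coords_in_arm_frame(0.015, np.deg2rad([0.0, 45.0, 90.0, 135.0]))
```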
Fig. 9 illustrates a flow diagram of a method 900 of determining three-dimensional coordinates of a plurality of pose identifications relative to a manipulator arm coordinate system according to further embodiments of the present disclosure. Method 900 may be an alternative embodiment of method 800. Some or all of the steps of method 900 may be performed by a control device (e.g., control device 120) of control system 100. The control device 120 may be configured on a computing device. Method 900 may be implemented by software, firmware, and/or hardware. In some embodiments, method 900 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 9, in step 901, an arrangement order of the plurality of pose identifications is determined based on at least two of the plurality of pose identifications. In some embodiments, the arrangement order of the plurality of pose identifications may be represented by the arrangement order of the plurality of pose identification patterns. In some embodiments, the arrangement order is determined by recognizing any two pose identification patterns. It should be understood that the plurality of pose identifications include different pose identification patterns, and in the case where any two pose identification patterns are recognized, the arrangement order of the plurality of pose identifications in the positioning image, e.g., clockwise or counterclockwise, may be determined based on the known distribution of the plurality of pose identification patterns (e.g., the distribution of the different pose identification patterns in the tag 400 shown in fig. 4, or in the tag 500 shown in fig. 5).
Referring to fig. 9, at step 903, the three-dimensional coordinates of the plurality of pose identifications are determined based on the arrangement order of the plurality of pose identifications. In some embodiments, the three-dimensional coordinates of each pose identification in the manipulator arm coordinate system may be determined based on the known distribution of the plurality of pose identifications; the three-dimensional coordinates of each pose identification may be represented by the three-dimensional coordinates of its pose identification pattern corner point, each pose identification pattern corresponding to a coordinate point in the manipulator arm coordinate system. After the arrangement order of the plurality of pose identification patterns is determined, the remaining pose identification patterns can be inferred from the recognized ones, and the three-dimensional coordinates of each pose identification pattern corner point in the manipulator arm coordinate system can be determined. In some embodiments, a plurality of pose identification corner points in the positioning image are identified, and the pose identification patterns corresponding to any two of them are determined. The arrangement order of the plurality of pose identification pattern corner points is determined based on the two recognized pose identification patterns, and the three-dimensional coordinates of each corner point in the manipulator arm coordinate system are determined accordingly, as sketched below. In addition, once the arrangement order is known, the distribution of all pose identification patterns can be determined, so that the pattern at each corresponding position in the positioning image can be matched with its specific pose pattern matching template, increasing data processing speed. In some embodiments, matching a pose pattern matching template to the pattern at a pose identification pattern corner point may be implemented similarly to step 1003 in method 1000.
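A minimal sketch of steps 901 and 903, assuming for simplicity a known distribution of one pattern every 45 degrees around the cross-sectional circle and two recognized patterns that are adjacent on it; the pattern identifiers and helper name are hypothetical.

```python
import numpy as np

# Hypothetical known distribution: pattern IDs listed in clockwise order.
CLOCKWISE_PATTERNS = ["A", "B", "C", "D", "E", "F", "G", "H"]

def infer_order_and_coords(first_id, second_id, r):
    """From two recognized (here: adjacent) patterns, decide whether the image
    order is clockwise, then assign every pattern its 3D corner coordinate."""
    i = CLOCKWISE_PATTERNS.index(first_id)
    j = CLOCKWISE_PATTERNS.index(second_id)
    clockwise = (j - i) % len(CLOCKWISE_PATTERNS) == 1
    order = CLOCKWISE_PATTERNS if clockwise else CLOCKWISE_PATTERNS[::-1]
    step = 2.0 * np.pi / len(order)
    return order, {pid: np.array([r * np.cos(m * step), r * np.sin(m * step), 0.0])
                   for m, pid in enumerate(order)}
```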
In some embodiments, method 600 further comprises: determining a current relative pose of the tip instrument of the manipulator arm relative to the reference coordinate system based on the current relative pose of the manipulator arm relative to the reference coordinate system; and determining a drive signal for the manipulator arm based on the current relative pose of the tip instrument and the target pose. In some embodiments, the target pose of the tip instrument may be determined based on the target pose of the manipulator arm, and the drive signal of the manipulator arm may be determined based on the current relative pose of the tip instrument with respect to the reference coordinate system and the target pose of the tip instrument. Alternatively, user-input control commands may be received to determine the target pose of the tip instrument. For example, in each motion control cycle, the pose of the main operator or the joint information of the main operator is acquired, and the target pose of the tip instrument is determined. In some embodiments, the tip instrument (e.g., effector 160 shown in fig. 1) is disposed at the end of the manipulator arm, so the position of the tip instrument on the arm is known or determinable. The pose transformation relationship of the tip instrument with respect to the manipulator arm coordinate system is likewise known or predetermined. In some embodiments, taking the world coordinate system as the reference coordinate system as an example, the pose of the tip instrument of the manipulator arm relative to the reference coordinate system may be determined based on the following equation (22):
$${}^{w}R_{tip} = {}^{w}R_{wm}\,{}^{wm}R_{tip}, \qquad {}^{w}P_{tip} = {}^{w}R_{wm}\,{}^{wm}P_{tip} + {}^{w}P_{wm} \tag{22}$$
where ${}^{w}R_{tip}$ is the attitude of the tip instrument relative to the world coordinate system, ${}^{w}P_{tip}$ is the position of the tip instrument relative to the world coordinate system, ${}^{wm}R_{tip}$ is the attitude of the tip instrument relative to the manipulator arm coordinate system, and ${}^{wm}P_{tip}$ is the position of the tip instrument relative to the manipulator arm coordinate system.
In some embodiments, the attitude ${}^{w}R_{wm}$ and position ${}^{w}P_{wm}$ of the manipulator arm coordinate system relative to the world coordinate system are determined based on equation (20), and the attitude ${}^{w}R_{tip}$ and position ${}^{w}P_{tip}$ of the tip instrument relative to the world coordinate system are then determined based on those quantities and equation (22).
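Equation (22) composes two rigid transforms; a minimal sketch, assuming rotation matrices and position vectors supplied as NumPy arrays, with illustrative argument names:

```python
import numpy as np

def tip_pose_in_world(R_w_wm, p_w_wm, R_wm_tip, p_wm_tip):
    """Sketch of equation (22): compose the arm-to-world pose with the known
    tip-to-arm pose to get the tip instrument's pose in the world frame."""
    R_w_tip = R_w_wm @ R_wm_tip
    p_w_tip = R_w_wm @ p_wm_tip + p_w_wm
    return R_w_tip, p_w_tip
```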
Some embodiments of the present disclosure provide methods of identifying pose identifications. Fig. 10 illustrates a flow diagram of a method 1000 of identifying pose identifications, according to some embodiments of the present disclosure. Some or all of the steps of method 1000 may be performed by a control device (e.g., control device 120) of control system 100. The control device 120 may be configured on a computing device. Method 1000 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1000 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 10, at step 1001, a plurality of candidate pose identifications are determined from the positioning image. In some embodiments, a pose identification may include a pose identification pattern corner point in the pose identification pattern. The coordinates of a candidate pose identification, or the origin of its coordinate system, can be represented by the candidate pose identification pattern corner point. In some embodiments, candidate pose identification pattern corner points may refer to possible pose identification pattern corner points obtained through preliminary processing or preliminary recognition of the positioning image.
In some embodiments, method 1000 may further include determining a Region of Interest (ROI) in the positioning image. For example, an ROI may first be cropped from the positioning image, and the plurality of candidate pose identifications may be determined from the ROI. The ROI may be the full positioning image or a partial region of it. For example, the ROI of the current frame may be cropped based on a region within a certain range of the pose identification pattern corner points determined from the previous frame image (e.g., the positioning image of the previous image processing cycle). For a positioning image other than the first frame, the ROI may be the region within a certain distance range centered on the virtual point formed by the coordinates of the plurality of pose identification pattern corner points of the previous image processing cycle. The distance range may be a fixed multiple, for example twice, the average spacing distance of those corner points. It should be understood that the multiple may also be a variable multiple of the average spacing distance of the candidate pose identification pattern corner points in the previous image processing cycle.
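The ROI selection described above might be sketched as follows, where the spacing estimate from consecutive corner points and all names are assumptions for illustration:

```python
import numpy as np

def crop_roi(image, prev_corners, multiple=2.0):
    """Sketch: ROI centered on the virtual point (centroid) of the previous
    cycle's corner points, extending `multiple` times their mean spacing."""
    pts = np.asarray(prev_corners, dtype=float)          # (N, 2) as (x, y)
    cx, cy = pts.mean(axis=0)
    spacing = np.linalg.norm(np.diff(pts, axis=0), axis=1).mean()
    half = multiple * spacing
    x0, y0 = max(int(cx - half), 0), max(int(cy - half), 0)
    x1 = min(int(cx + half), image.shape[1])
    y1 = min(int(cy + half), image.shape[0])
    return image[y0:y1, x0:x1], (x0, y0)                 # patch and its offset
```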
In some embodiments, the method 1000 may further include determining Corner Likelihood values (CL) of pixels in the positioning image. In some embodiments, the corner likelihood value of a pixel point may be a numerical value characterizing the likelihood of the pixel point as a feature point (e.g., a corner). In some embodiments, the positioning image may be preprocessed before calculating the corner likelihood value of each pixel point, and then the corner likelihood value of each pixel point in the preprocessed image is determined. The pre-processing of the image may include, for example: at least one of image graying, image denoising and image enhancement. For example, image pre-processing may include: and intercepting the ROI from the positioning image, and converting the ROI into a corresponding gray image.
In some embodiments, determining the corner likelihood value of each pixel point in the ROI may include, for example, performing convolution operations on each pixel point within the ROI to obtain its first- and/or second-order derivatives, and then computing the corner likelihood value of each pixel point from those derivatives. Illustratively, the corner likelihood value of each pixel may be determined based on the following equation (23):
$$CL = \max(c_{xy},\, c_{45}), \quad c_{xy} = \tau^2\,|I_{xy}| - 1.5\,\tau\,(|I_{45}| + |I_{n45}|), \quad c_{45} = \tau^2\,|I_{45\_45}| - 1.5\,\tau\,(|I_x| + |I_y|) \tag{23}$$
where τ is a set constant, for example 2; I_x, I_45, I_y and I_n45 are the first derivatives of the pixel point in the four directions 0, π/4, π/2 and -π/4, respectively; I_xy and I_45_45 are the second derivatives of the pixel point in the direction pairs (0, π/2) and (π/4, -π/4), respectively.
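A sketch of this computation is given below, under the same caveat as the reconstruction of equation (23) itself: the original formula is preserved only as an image, so the exact pairing of derivative and penalty terms here is an assumption modeled on common checkerboard corner metrics.

```python
import numpy as np
from scipy.ndimage import convolve

def corner_likelihood(gray, tau=2.0):
    """Sketch of equation (23): derivative-based corner likelihood per pixel.
    The pairing of terms is assumed, not confirmed by the original text."""
    gray = np.asarray(gray, dtype=float)
    k = np.array([[-1.0, 0.0, 1.0]]) / 2.0        # central-difference kernel
    Ix = convolve(gray, k)                        # first derivative, 0
    Iy = convolve(gray, k.T)                      # first derivative, pi/2
    I45 = (Ix + Iy) / np.sqrt(2.0)                # first derivative, pi/4
    In45 = (Ix - Iy) / np.sqrt(2.0)               # first derivative, -pi/4
    Ixy = convolve(Ix, k.T)                       # second derivative, (0, pi/2)
    Ixx = convolve(Ix, k)
    Iyy = convolve(Iy, k.T)
    I45_45 = (Ixx - Iyy) / 2.0                    # second derivative, (pi/4, -pi/4)
    cxy = tau**2 * np.abs(Ixy) - 1.5 * tau * (np.abs(I45) + np.abs(In45))
    c45 = tau**2 * np.abs(I45_45) - 1.5 * tau * (np.abs(Ix) + np.abs(Iy))
    return np.maximum(np.maximum(cxy, c45), 0.0)
```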
In some embodiments, the method 1000 may further include dividing the ROI into a plurality of sub-regions. For example, non-maximum suppression may be applied by equally dividing the ROI into multiple sub-images. In some embodiments, the ROI may be equally divided into sub-images of 5 × 5 pixels. The above embodiments are exemplary and not limiting; it should be understood that the positioning image or ROI may also be divided into sub-images of other sizes, for example 9 × 9 pixels.
In some embodiments, the method 1000 may further include determining the pixel with the largest corner likelihood value in each sub-region to form a set of pixels. In some embodiments, this set of pixels serves as the plurality of candidate pose identifications determined from the positioning image. For example, the pixel point with the largest CL value in each sub-image may be determined, that pixel point may be compared with a first threshold, and the set of pixels with CL values larger than the first threshold may be determined. In some embodiments, the first threshold may be set to 0.06. It should be understood that the first threshold may also be set to other values.
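For example, the sub-region maximum and thresholding step could be sketched as follows; the 5 × 5 window and the 0.06 threshold follow the values mentioned above, while everything else is illustrative:

```python
import numpy as np

def candidate_pixels(cl, win=5, threshold=0.06):
    """Sketch: keep one pixel per win x win sub-image, namely the one with the
    largest corner likelihood value, and retain it only above the threshold."""
    candidates = []
    h, w = cl.shape
    for y0 in range(0, h, win):
        for x0 in range(0, w, win):
            block = cl[y0:y0 + win, x0:x0 + win]
            dy, dx = np.unravel_index(np.argmax(block), block.shape)
            if block[dy, dx] > threshold:
                candidates.append((x0 + dx, y0 + dy, float(block[dy, dx])))
    return candidates  # (x, y, CL) triples forming the pixel set
```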
Referring to fig. 10, at step 1003, a first pose identification is identified from the candidate pose identifications based on a plurality of different pose pattern matching templates. In some embodiments, the plurality of different pose pattern matching templates are respectively matched with the patterns at the candidate pose identification pattern corner points to identify the first pose identification. For example, a candidate pose identification pattern corner point that meets a preset pose pattern matching degree standard is determined as the first pose identification pattern corner point. In some embodiments, the pose pattern matching template has the same or similar features as the pattern of the region near a pose identification pattern corner point. If the matching degree between the pose pattern matching template and the pattern of the region near a candidate pose identification pattern corner point meets the preset pose pattern matching degree standard (for example, the matching degree is higher than a threshold), the two can be considered to have the same or similar features, and the current candidate pose identification pattern corner point can be considered a pose identification pattern corner point.
In some embodiments, the pixel point with the maximum CL value in the pixel set is determined as the candidate pose identification pattern corner point. For example, all pixels in the pixel set may be sorted in order of CL values from large to small, and the pixel with the largest CL value may be used as a candidate pose identification pattern corner point. In some embodiments, after the candidate pose identification pattern corner points are determined, matching is performed between the pose pattern matching template and the patterns at the candidate pose identification pattern corner points, and if a preset pose pattern matching degree standard is reached, the candidate pose identification pattern corner points are determined to be the first identified pose identification pattern corner points.
In some embodiments, the method 1000 may further include determining, in response to a matching failure, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set as the candidate pose identification pattern corner point. For example, if the candidate pose identification pattern corner point does not meet the preset matching degree standard, the pixel point with the second largest CL value is selected as the candidate pose identification pattern corner point, the pose pattern matching template is matched with the pattern at that corner point, and so on until the first pose identification pattern corner point is identified.
In some embodiments, the pose identification pattern may be a black-and-white alternating pattern (e.g., a checkerboard pattern), so the pose pattern matching template may be the same kind of pattern, and the matching degree may be measured by the Correlation Coefficient (CC) between the gray-scale distribution G_M of the pose pattern matching template and the pixel-neighborhood gray-scale distribution G_image of the pixel point corresponding to a candidate pose identification pattern corner point. The pixel-neighborhood gray-scale distribution G_image of a pixel point is the gray-scale distribution of the pixels within a certain range (for example, 10 × 10 pixels) centered on that pixel point. The correlation coefficient may be determined based on the following equation (24):
$$CC = \frac{\operatorname{Cov}(G_M,\, G_{image})}{\sqrt{\operatorname{Var}(G_M)\operatorname{Var}(G_{image})}} \tag{24}$$
where Var () is a variance function and Cov () is a covariance function. In some embodiments, when the correlation coefficient is less than 0.8, and the correlation between the gray scale distribution in the pixel domain and the pose pattern matching template is low, it is determined that the candidate pose identification pattern corner with the largest corner likelihood value is not the pose identification pattern corner, otherwise, it is determined that the candidate pose identification pattern corner with the largest corner likelihood value is the pose identification pattern corner.
In some embodiments, the method 1000 may further include determining the edge directions of candidate pose identification pattern corner points. For example, as shown in fig. 11, the candidate pose identification pattern corner point is the corner point P_11 in the pose identification pattern 1100, and the edge direction of the corner point P_11 may refer to the directions of the edges that form the corner point P_11, as indicated by the dashed arrows in fig. 11.
In some embodiments, the edge direction may be determined by calculating the first derivatives (I_x and I_y), in the X and Y directions of the planar coordinate system, of each pixel in a neighborhood of a certain range (e.g., 10 × 10 pixels) centered on the candidate pose identification pattern corner point. For example, the edge direction may be calculated based on the following equation (25):
$$I_{angle} = \arctan\!\left(\frac{I_y}{I_x}\right), \qquad I_{weight} = \sqrt{I_x^2 + I_y^2} \tag{25}$$
where the first derivatives (I_x and I_y) can be obtained by performing convolution operations on each pixel point in the neighborhood. In some embodiments, the edge direction of the corner point is obtained by clustering the edge directions I_angle of the pixels in the neighborhood together with their corresponding weights I_weight, and selecting the I_angle corresponding to the class with the largest weight I_weight as the edge direction. It should be noted that if there are multiple edge directions, the I_angle values corresponding to the several classes with the largest weight proportions are selected as the edge directions.
In some embodiments, the method used for the clustering calculation may be any one of the K-means method, the BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies) method, the DBSCAN (Density-Based Spatial Clustering of Applications with Noise) method, and the GMM (Gaussian Mixture Model) method.
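A sketch of the edge-direction computation follows; for brevity a weighted angle histogram stands in for the clustering methods named above, and that substitution, like all names here, is illustrative:

```python
import numpy as np

def dominant_edge_directions(Ix, Iy, n_dirs=2, bins=36):
    """Sketch of equation (25) plus the clustering step: per-pixel angles
    I_angle weighted by magnitudes I_weight; a weighted histogram replaces
    k-means and returns the strongest direction(s)."""
    angle = np.arctan2(Iy, Ix).ravel()            # I_angle per pixel
    weight = np.hypot(Ix, Iy).ravel()             # I_weight per pixel
    hist, edges = np.histogram(angle, bins=bins, range=(-np.pi, np.pi),
                               weights=weight)
    top = np.argsort(hist)[::-1][:n_dirs]
    return [0.5 * (edges[i] + edges[i + 1]) for i in top]
```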
In some embodiments, the method 1000 may further include rotating the pose pattern matching template based on the edge direction, so as to align the template with the image at the candidate pose identification pattern corner point. The edge direction of the candidate pose identification pattern corner point can be used to determine the orientation, in the positioning image, of the image at that corner point. In some embodiments, rotating the pose pattern matching template according to the edge direction can adjust the template to be the same or nearly the same as the orientation of the image at the candidate pose identification pattern corner point, for image matching.
Referring to fig. 10, at step 1005, pose identifications are searched for with the first pose identification as a starting point. For example, fig. 12 illustrates a flow diagram of a method 1200 for searching for pose identifications, according to some embodiments of the present disclosure. As shown in fig. 12, some or all of the steps in method 1200 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1720 shown in fig. 17). Some or all of the steps in method 1200 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1200 may be performed by a robotic system (e.g., surgical robotic system 1700 shown in fig. 17). In some embodiments, method 1200 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium.
Referring to fig. 12, in step 1201, a second pose identification is searched for with the first pose identification as a starting point. In some embodiments, the second pose identification pattern corner point is searched for in a set search direction with the first pose identification pattern corner point as the starting point. In some embodiments, the set search direction may include at least one of: directly ahead of the first pose identification pattern corner point (corresponding to an angle direction of 0 degrees), directly behind (corresponding to an angle direction of 180 degrees), directly above (an angle direction of 90 degrees), directly below (an angle direction of -90 degrees), and oblique (for example, angle directions of ±45 degrees).
In some embodiments, n search directions are set; for example, the search is performed in 8 directions, and each search direction v_sn may be determined based on the following equation (26):
$$v_{sn} = [\cos(n\pi/4)\;\; \sin(n\pi/4)], \quad n = 1, 2, \ldots, 8 \tag{26}$$
in some embodiments, the search direction set in the current step may be determined according to a deviation angle between adjacent pose identification pattern corner points in the plurality of pose identification pattern corner points determined in the previous frame. Illustratively, the predetermined search direction is determined based on the following formula (27):
(Equation (27) is preserved only as an image in the original text; it derives the first and second set search directions v_s1 and v_s2 from the deviation angles between adjacent corner points (x_j, y_j) among the n_last pose identification pattern corner points determined in the previous frame.)
where (x_j, y_j) are the two-dimensional coordinates of the plurality of pose identification pattern corner points determined in the previous frame (or previous image processing cycle); n_last is the number of those corner points; v_s1 is the first set search direction; and v_s2 is the second set search direction.
In some embodiments, as shown in fig. 13, the coordinate position of the first pose identification pattern corner point P1301 is used as the search starting point, and the coordinate position of the second pose identification pattern corner point P1302 is searched for in the set search direction V1301. This may specifically include: with the coordinate position of P1301 as the starting point, searching for pose identification pattern corner points with a search box (e.g., the dashed box in fig. 13) moving along the set search direction V1301 at a certain search step. If at least one candidate pose identification pattern corner point exists in the search box, the candidate corner point with the largest corner likelihood value in the search box is preferentially selected as the second pose identification pattern corner point P1302. With the search box constrained to a suitable size, when searching from P1301 for P1302, the candidate corner point with the largest corner likelihood value among those appearing in the search box has a high probability of being a pose identification pattern corner point; it can therefore be taken directly as P1302 to increase data processing speed. In other embodiments, to improve recognition accuracy, when at least one candidate pose identification pattern corner point exists in the search box, the candidate corner point with the largest corner likelihood value among them is further verified to determine whether it is a pose identification pattern corner point. For example, the pose pattern matching template is matched with the image within a certain range around that candidate corner point, and a candidate corner point that meets the preset pose pattern matching degree standard is regarded as the searched second pose identification pattern corner point P1302.
In some embodiments, with continued reference to FIG. 13, the size of the search box may be increased in steps, thereby increasing the search range in steps. The search step size may vary synchronously with the side length of the search box. In other embodiments, the size of the search box may be fixed.
In some embodiments, the pose identification pattern may be a black-and-white checkerboard pattern, and pattern matching may be performed based on the correlation coefficient in equation (24). If the correlation coefficient is larger than the threshold, the candidate pose identification pattern corner point with the largest corner likelihood value is regarded as a pose identification pattern corner point and recorded as the second pose identification pattern corner point.
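The boxed search of fig. 13 might be sketched as follows; the stopping rule and box size are illustrative assumptions, and the optional template verification described above is omitted:

```python
import numpy as np

def search_next_corner(cl, start, direction, step, half_box=5, max_dist=100.0):
    """Sketch of the fig. 13 search: slide a box from `start` along
    `direction` and return the first in-box candidate with the largest
    corner likelihood value."""
    pos = np.asarray(start, dtype=float)
    d = np.asarray(direction, dtype=float)
    d = d / np.linalg.norm(d)
    travelled = 0.0
    while travelled < max_dist:
        pos = pos + step * d
        travelled += step
        x, y = int(round(pos[0])), int(round(pos[1]))
        x0, y0 = max(x - half_box, 0), max(y - half_box, 0)
        block = cl[y0:y + half_box + 1, x0:x + half_box + 1]
        if block.size and block.max() > 0.0:
            dy, dx = np.unravel_index(np.argmax(block), block.shape)
            return (x0 + dx, y0 + dy)
    return None  # caller falls back to re-seeding a new first identification
```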
Fig. 14 illustrates a flow diagram of a method 1400 for searching for a second pose identification, in accordance with some embodiments of the present disclosure. As shown in fig. 14, some or all of the steps of method 1400 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1720 shown in fig. 17). Some or all of the steps in method 1400 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1400 may be performed by a robotic system (e.g., the surgical robotic system 1700 shown in fig. 17). In some embodiments, method 1400 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium. In some embodiments, step 1201 of method 1200 may be implemented similarly to method 1400.
Referring to fig. 14, in step 1401, candidate pose identification pattern corner points of a second pose identification are searched for with the first pose identification as the starting point. In some embodiments, searching for the candidate pose identification pattern corner points of the second pose identification may be implemented similarly to the search for the second pose identification pattern corner point P1302 shown in fig. 13.
At step 1403, based on the distribution of the plurality of pose identifications, a first pose pattern matching template and a second pose pattern matching template are determined, the first pose pattern matching template and the second pose pattern matching template corresponding to pose identifications adjacent to the first pose identification. In some embodiments, step 1403 may be performed before or after step 1401, and step 1403 may also be performed in synchronization with step 1401. In some embodiments, pose identification patterns included by pose identifications adjacent to the first pose identification may be determined based on a pose identification pattern included by the first pose identification and a distribution of a plurality of pose identification patterns, thereby determining a first pose pattern matching template and a second pose pattern matching template.
In step 1405, the first pose pattern matching template and/or the second pose pattern matching template are matched with the patterns at the corner positions of the candidate pose identification pattern of the second pose identification to identify the second pose identification. In some embodiments, the first pose pattern matching template and/or the second pose pattern matching template may be matched with the patterns at the corner positions of the candidate pose identification pattern of the second pose identification based on the correlation coefficients in equation (24). And if the correlation coefficient is larger than the threshold, determining candidate pose identification pattern corner points of the second pose identification as pose identification pattern corner points of the second pose identification, and determining a pattern corresponding to a pose pattern matching template (a first pose pattern matching template or a second pose pattern matching template) with the correlation coefficient larger than the threshold as a pose identification pattern of the second pose identification.
Referring to fig. 12, in step 1203, a search direction is determined based on the first pose identification and the second pose identification. In some embodiments, the search direction includes a first search direction and a second search direction. The first search direction may be the direction starting from the coordinate position of the first pose identification pattern corner point and pointing away from the second pose identification pattern corner point. The second search direction may be the direction starting from the coordinate position of the second pose identification pattern corner point and pointing away from the first pose identification pattern corner point, for example the search direction V1302 shown in fig. 13.
Referring to fig. 12, in step 1205, the pose identification is searched for in the search direction with the first pose identification or the second pose identification as a starting point. In some embodiments, if the first pose identification pattern corner point is taken as a new starting point, the first search direction in the above embodiments may be taken as a search direction to search for the pose identification pattern corner point. If the second pose identification pattern corner point is used as a new search starting point, the second search direction in the above embodiment may be used as a search direction to search for the pose identification pattern corner point. In some embodiments, a new pose identification pattern corner point (e.g., third pose identification pattern corner point P in fig. 13) is searched for 1303 ) May be performed similarly to step 1201 in method 1200 or method 1500. In some embodiments, the search step size may be the first pose identification pattern corner point P 1301 And a second position sign pattern corner point P 1302 Distance L between 1
Fig. 15 illustrates a flow diagram of a method 1500 for searching for pose identifications, according to some embodiments of the present disclosure. As shown in fig. 15, some or all of the steps of method 1500 may be performed by a data processing device (e.g., the control device 120 shown in fig. 1, or the processor 1720 shown in fig. 17). Some or all of the steps of method 1500 may be implemented by software, firmware, and/or hardware. In some embodiments, method 1500 may be performed by a robotic system (e.g., surgical robotic system 1700 shown in fig. 17). In some embodiments, method 1500 may be implemented as computer-readable instructions, which may be read and executed by a general-purpose or special-purpose processor, such as processor 1720 shown in fig. 17. In some embodiments, these instructions may be stored on a computer-readable medium. In some embodiments, step 1205 in method 1200 may be implemented similarly to method 1500.
Referring to fig. 15, in step 1501, candidate pose identification pattern corner points of a third pose identification are searched for with the first pose identification or the second pose identification as the starting point. In some embodiments, searching for the candidate pose identification pattern corner points of the third pose identification may be implemented similarly to the search for the third pose identification pattern corner point P1303 shown in fig. 13.
At step 1503, a third pose pattern matching template is determined based on the distribution of the plurality of pose identifications, the third pose pattern matching template corresponding to a pose identification adjacent to the first pose identification or adjacent to the second pose identification. In some embodiments, a pose identification pattern included by a pose identification adjacent to the first pose identification or the second pose identification may be determined based on a pose identification pattern included by the first pose identification or the second pose identification and a distribution of the plurality of pose identification patterns, thereby determining a third pose pattern matching template.
At step 1505, the third pose pattern matching template is matched with the patterns at the corner positions of the candidate pose identification pattern of the third pose identification to identify the third pose identification. In some embodiments, step 1505 may be implemented similarly to step 1405.
In some embodiments, in response to the search distance being greater than a search distance threshold, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a candidate pose identification pattern corner point, and the plurality of different pose pattern matching templates are respectively matched with the pattern at that candidate corner point to identify a new first pose identification. In some embodiments, after determining that pixel as the new candidate pose identification pattern corner point, a new first pose identification may be identified based on a method similar to step 1003. In some embodiments, the search distance being greater than the search distance threshold may be understood as the search distance being greater than the threshold in some or all of the search directions. In some embodiments, the search distance threshold may be a set multiple of the distance between the (N-1)th and (N-2)th pose identification pattern corner points, where N ≥ 3.
For example, the search distance threshold may be twice the distance between the first two pose identification pattern corner points. Thus, the maximum search distance for the third pose identification pattern corner point is twice the distance between the first and second pose identification pattern corner points. If no pose identification pattern corner point has been found when this distance is reached in a search direction, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a new candidate pose identification pattern corner point, a new first pose identification is identified, and the current search process is stopped accordingly. In some embodiments, similarly to method 1000, a new first pose identification pattern corner point may be re-determined, and similarly to method 1200, the remaining pose identification pattern corner points may be searched for with the new corner point as the search starting point.
In some embodiments, in response to the number of recognized pose identification pattern corner points being greater than or equal to a pose identification number threshold, the current relative pose of the manipulator arm with respect to the reference coordinate system may be determined based on the searched pose identifications, and the search for pose identification pattern corner points may be stopped accordingly. For example, when four pose identification pattern corner points are recognized, the search for pose identification pattern corner points is stopped.
In some embodiments, in response to the number of recognized pose identifications being less than the pose identification number threshold, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a candidate pose identification pattern corner point, and the plurality of different pose pattern matching templates are respectively matched with the pattern at that candidate corner point to identify a new first pose identification. In some embodiments, if the total number of recognized pose identifications (e.g., pose identification pattern corner points) is less than the set pose identification number threshold, the search based on the first pose identification in the above steps is considered to have failed. In some embodiments, in case of a search failure, the pixel with the largest corner likelihood value among the remaining pixels in the pixel set is determined as a new candidate pose identification pattern corner point, after which a new first pose identification may be identified based on a method similar to step 1003. In some embodiments, similarly to method 1000, a new first pose identification pattern corner point may be re-determined, and similarly to method 1200, the remaining pose identification pattern corner points may be searched for with the new corner point as the search starting point.
In some embodiments, after the pose identification pattern corner point is searched or identified, the determined pose identification pattern corner point may be sub-pixel positioned to improve the position accuracy of the pose identification pattern corner point.
In some embodiments, model-based fitting may be performed on the CL values of the pixel points to determine the coordinates of the sub-pixel located pose identification pattern corner points. For example, the fitting function of the CL value of each pixel point in the ROI may be a quadratic function, and the extreme point of the function is a sub-pixel point. The fitting function may be determined based on the following equations (28) and (29):
$$S(x, y) = ax^2 + by^2 + cx + dy + exy + f \tag{28}$$
$$x_c = \frac{2bc - de}{e^2 - 4ab}, \qquad y_c = \frac{2ad - ce}{e^2 - 4ab} \tag{29}$$
where S(x, y) is the fitting function of the CL values of the pixel points in the ROI, and a, b, c, d, e and f are its coefficients; x_c is the x coordinate and y_c the y coordinate of the sub-pixel located pose identification.
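A sketch of the sub-pixel step, fitting the quadratic of equation (28) to a patch of CL values by least squares and returning the extremum per equation (29); the patch conventions are illustrative:

```python
import numpy as np

def subpixel_corner(cl_patch):
    """Sketch of equations (28)-(29): least-squares fit of
    S(x, y) = a x^2 + b y^2 + c x + d y + e x y + f to the CL values of a
    patch, then return the extreme point (x_c, y_c) of the fitted surface."""
    h, w = cl_patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = xs.ravel().astype(float)
    y = ys.ravel().astype(float)
    s = cl_patch.ravel().astype(float)
    A = np.column_stack([x**2, y**2, x, y, x * y, np.ones_like(x)])
    a, b, c, d, e, f = np.linalg.lstsq(A, s, rcond=None)[0]
    den = e**2 - 4.0 * a * b
    x_c = (2.0 * b * c - d * e) / den   # equation (29), x coordinate
    y_c = (2.0 * a * d - c * e) / den   # equation (29), y coordinate
    return x_c, y_c                     # patch-local pixel coordinates
```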
In some embodiments of the present disclosure, the present disclosure also provides a computer device comprising a memory and a processor. The memory may be configured to store at least one instruction, and the processor is coupled to the memory and configured to execute the at least one instruction to perform some or all of the steps of the methods disclosed herein, such as some or all of the steps of the methods disclosed in fig. 6, 7, 8, 9, 10, 12, 14, and 15.
Fig. 16 shows a schematic block diagram of a computer device 1600 according to some embodiments of the present disclosure. Referring to FIG. 16, the computer device 1600 may include a Central Processing Unit (CPU) 1601, a system memory 1604 including a Random Access Memory (RAM) 1602 and a Read Only Memory (ROM) 1603, and a system bus 1605 that couples these components. Computer device 1600 may also include input/output device(s) 1606 and a mass storage device 1607 for storing an operating system 1613, application programs 1614, and other program modules 1615. The input/output device 1606 mainly includes a display 1608 and an input device 1609, which are connected through an input/output controller 1610.
The mass storage device 1607 is connected to the central processing unit 1601 through a mass storage controller (not shown) connected to the system bus 1605. The mass storage device 1607 or computer-readable medium provides non-volatile storage for the computer device. The mass storage device 1607 may include a computer-readable medium (not shown) such as a hard disk or a Compact Disc Read-Only Memory (CD-ROM) drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, flash memory or other solid state storage technology, CD-ROM, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory and mass storage devices described above may be collectively referred to as memory.
The computer device 1600 may connect to the network 1612 through a network interface unit 1611 connected to the system bus 1605.
The system memory 1604 or mass storage device 1607 is also used to store one or more instructions. The central processing unit 1601 implements all or part of the steps of the method in some embodiments of the present disclosure by executing the one or more instructions.
In some embodiments of the present disclosure, the present disclosure also provides a computer-readable storage medium having stored therein at least one instruction, which is executable by a processor to cause a computer to perform some or all of the steps of the methods of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 6, 7, 8, 9, 10, 12, 14 and 15. Examples of computer readable storage media include memories of computer programs (instructions), such as Read-Only memories (ROMs), random Access Memories (RAMs), compact Disc Read-Only memories (CD-ROMs), magnetic tapes, floppy disks, and optical data storage devices, among others.
Fig. 17 shows a schematic diagram of a surgical robotic system 1700, according to some embodiments of the present disclosure. In some embodiments of the present disclosure, referring to fig. 17, a surgical robotic system 1700 may include: surgical instrument 1750, image collector 1710, and processor 1720. The surgical instrument 1750 may include a manipulator 1740, an effector 1730 disposed at a distal end of the manipulator 1740, and a plurality of pose identifications disposed on the end of the manipulator 1740, the plurality of pose identifications including different pose identification patterns. The image collector 1710 can be used to collect positioning images of the manipulator arm 1740. Processor 1720 is coupled to image collector 1710 for performing some or all of the steps of the methods of some embodiments of the present disclosure, such as some or all of the steps of the methods disclosed in fig. 6, 7, 8, 9, 10, 12, 14, and 15.
While particular embodiments of the present disclosure have been illustrated and described, it would be obvious to those skilled in the art that various other changes and modifications can be made without departing from the spirit and scope of the disclosure. Accordingly, it is intended to cover in the appended claims all such changes and modifications that are within the scope of this disclosure.

Claims (25)

1. A control method of an operation arm, comprising:
acquiring a positioning image;
identifying, in the positioning image, a plurality of pose identifications located on the manipulator arm, the plurality of pose identifications including different pose identification patterns;
determining a current relative pose of the manipulator arm with respect to a reference coordinate system based on the plurality of pose identifications; and
determining a drive signal of the manipulator arm based on the current relative pose and the target pose of the manipulator arm.
2. The control method according to claim 1, further comprising:
determining a pose difference based on the current relative pose and a target pose of the manipulator arm; and
determining a drive signal of the manipulator arm based on the pose difference and an inverse kinematics model of the manipulator arm.
3. The control method according to claim 2, further comprising:
determining a current pose of the manipulator in a world coordinate system based on the current relative pose; and
determining the pose difference based on a target pose of the operating arm and a current pose of the operating arm in a world coordinate system, wherein the target pose of the operating arm is the target pose of the operating arm in the world coordinate system.
4. The control method according to claim 3, further comprising:
determining a Cartesian space velocity based on the pose difference;
determining joint parameter spatial velocities based on the cartesian spatial velocities;
determining a target joint parameter based on the joint parameter space velocity and the current joint parameter; and
determining the drive signal based on the target joint parameter.
5. The control method according to claim 4, the pose difference comprising a position difference and an attitude difference, the Cartesian space velocity comprising a Cartesian space linear velocity and a Cartesian space angular velocity, the control method further comprising:
determining the Cartesian space linear velocity based on the position difference; and
determining the Cartesian space angular velocity based on the attitude difference.
6. The control method according to claim 4, the operation arm comprising:
at least one structural segment and a driving unit, wherein the structural segment comprises a fixed disc and a plurality of structural bones, first ends of the structural bones are fixedly connected with the fixed disc, and second ends of the structural bones are connected with the driving unit;
the control method further comprises the following steps:
determining a driving amount of the plurality of structural bones based on the target joint parameter; and
determining a drive signal for the drive unit based on the drive amounts of the plurality of structural bones.
7. The control method according to any one of claims 1 to 6, further comprising:
receiving a control command; and
determining the target pose of the operating arm based on the control command.
8. The control method according to any one of claims 1 to 6, further comprising:
determining the driving signal of the operating arm at a predetermined period to achieve real-time control through a plurality of motion control cycles.
9. The control method according to claim 1, further comprising:
determining around-axis angles of the plurality of pose identifications relative to a Z axis of an operating arm coordinate system based on the distribution of the plurality of pose identifications; and
determining three-dimensional coordinates of the plurality of pose identifications relative to the operating arm coordinate system based on the around-axis angles of the plurality of pose identifications.
10. The control method according to claim 9, further comprising:
determining two-dimensional coordinates of the plurality of pose identifications in the positioning image; and
determining a current relative pose of the manipulator with respect to the reference coordinate system based on the two-dimensional coordinates of the plurality of pose identifications in the positioning image and the three-dimensional coordinates of the plurality of pose identifications with respect to the manipulator coordinate system.
11. The control method according to claim 1, further comprising:
determining a plurality of candidate pose identifications from the positioning image;
identifying a first pose identification from the plurality of candidate pose identifications based on a plurality of different pose pattern matching templates; and
searching for pose identifications by taking the first pose identification as a starting point.
12. The control method according to claim 11, the pose identification including pose identification pattern corner points in the pose identification pattern, the method further comprising:
determining a region of interest in the positioning image;
dividing the region of interest into a plurality of sub-regions;
determining the pixel with the maximum corner likelihood value in each sub-region to form a pixel set;
determining a pixel with the maximum corner likelihood value in the pixel set as a candidate pose identification pattern corner; and
matching the plurality of different pose pattern matching templates respectively with the pattern at the candidate pose identification pattern corner point to identify the first pose identification.
13. The control method according to claim 12, further comprising:
in response to the matching failure, determining the pixel with the largest corner likelihood value among the remaining pixels in the pixel set as the candidate pose identification pattern corner point.
14. The control method according to any one of claims 11 to 13, further comprising:
searching for a second position identifier by taking the first position identifier as a starting point;
determining a search direction based on the first position and orientation identifier and the second position and orientation identifier; and
searching for the pose identification in the search direction by taking the first pose identification or the second pose identification as a starting point.
15. The control method of claim 14, searching for a second gesture identifier with the first gesture identifier as a starting point comprises:
searching candidate pose identification pattern corner points of the second pose identification by taking the first pose identification as a starting point;
determining a first pose pattern matching template and a second pose pattern matching template based on the distribution of the plurality of pose identifications, the first and second pose pattern matching templates corresponding to pose identifications adjacent to the first pose identification; and
matching the first pose pattern matching template and/or the second pose pattern matching template with the pattern at the candidate pose identification pattern corner point of the second pose identification to identify the second pose identification.
16. The control method according to claim 14, searching for a pose identification in the search direction with the first pose identification or the second pose identification as a starting point comprises:
searching candidate pose identification pattern corner points of a third pose identification by taking the first pose identification or the second pose identification as a starting point;
determining a third pose pattern matching template based on the distribution of the plurality of pose identifications, the third pose pattern matching template corresponding to a pose identification adjacent to the first pose identification or adjacent to the second pose identification; and
matching the third pose pattern matching template with the pattern at the candidate pose identification pattern corner point of the third pose identification to identify the third pose identification.
17. The control method according to claim 14, further comprising:
responding to the fact that the search distance is larger than a search distance threshold value, and determining a pixel with the largest corner likelihood value of the rest pixels in the pixel set as a candidate pose identification pattern corner; and
matching the plurality of different pose pattern matching templates respectively with the pattern at the candidate pose identification pattern corner point to identify the first pose identification.
18. The control method according to claim 14, further comprising:
in response to the fact that the number of the recognized pose identifications is smaller than a pose identification number threshold value, determining a pixel with the largest corner likelihood value of the rest pixels in the pixel set as a candidate pose identification pattern corner; and
matching the plurality of different pose pattern matching templates respectively with the pattern at the candidate pose identification pattern corner point to identify the first pose identification.
19. The control method according to any one of claims 11 to 13, further comprising:
determining an arrangement order of the plurality of pose identifications based on at least two of the plurality of pose identifications; and
determining three-dimensional coordinates of the plurality of pose identifications relative to the operating arm coordinate system based on the arrangement order of the plurality of pose identifications.
20. The control method according to any one of claims 1-6, 9-13, comprising:
determining a current relative pose of the tip instrument of the manipulator arm with respect to the reference coordinate system based on the current relative pose of the manipulator arm with respect to the reference coordinate system; and
determining drive signals for the manipulator arm based on the current relative pose of the end instrument of the manipulator arm and the target pose.
21. The control method according to any one of claims 1 to 6 and 9 to 13, wherein the plurality of pose identifications are provided on an outer surface of a columnar portion of the operation arm.
22. The control method according to any one of claims 1 to 6 and 9 to 13, wherein a positioning tag including the plurality of pose identifications is provided on an outer surface of the columnar portion of the operation arm, and the plurality of pose identifications include a plurality of different pose identification patterns distributed on the positioning tag along a circumferential direction of the columnar portion and pose identification pattern corner points in the pose identification patterns.
23. A computer device, the computer device comprising:
a memory for storing at least one instruction; and
a processor, coupled with the memory, to execute the at least one instruction to perform the control method of any of claims 1-22.
24. A computer-readable storage medium having stored therein at least one instruction, the at least one instruction being executable by a processor to cause a computer to perform the control method of any one of claims 1-22.
25. A surgical robotic system comprising:
a surgical tool comprising an manipulator arm, an effector disposed at a distal end of the manipulator arm, and a plurality of pose identifiers disposed on a distal end of the manipulator arm, the plurality of pose identifiers comprising different pose identifier patterns;
the image collector is used for collecting a positioning image of the operating arm; and
a processor connected with the image collector and used for executing the control method of any one of claims 1-22 to determine the driving signal of the operating arm.
CN202111176599.0A 2021-10-09 2021-10-09 Control method of operation arm and surgical robot system Pending CN115946105A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111176599.0A CN115946105A (en) 2021-10-09 2021-10-09 Control method of operation arm and surgical robot system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111176599.0A CN115946105A (en) 2021-10-09 2021-10-09 Control method of operation arm and surgical robot system

Publications (1)

Publication Number Publication Date
CN115946105A true CN115946105A (en) 2023-04-11

Family

ID=87295463

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111176599.0A Pending CN115946105A (en) 2021-10-09 2021-10-09 Control method of operation arm and surgical robot system

Country Status (1)

Country Link
CN (1) CN115946105A (en)

Similar Documents

Publication Publication Date Title
CN111775146B (en) Visual alignment method under industrial mechanical arm multi-station operation
CN114536292A (en) Error detection method based on composite identification and robot system
CN114536399B (en) Error detection method based on multiple pose identifications and robot system
CN114523471B (en) Error detection method based on association identification and robot system
CN114347037B (en) Robot system fault detection processing method based on composite identification and robot system
CN116766194A (en) Binocular vision-based disc workpiece positioning and grabbing system and method
WO2022012337A1 (en) Moving arm system and control method
Gratal et al. Scene representation and object grasping using active vision
CN114536402B (en) Robot system fault detection processing method based on association identification and robot system
CN114536331B (en) Method for determining external stress of deformable mechanical arm based on association identification and robot system
CN115946105A (en) Control method of operation arm and surgical robot system
CN115957005A (en) Method for controlling an operating arm and surgical robotic system
CN115708128A (en) Control method of operation arm and surgical robot system
CN115731289A (en) Method for determining pose of object and surgical robot system
CN116468646A (en) Execution arm detection method based on composite identification and robot system
CN115731290A (en) Method for determining pose of object and surgical robot system
CN116468647A (en) Execution arm detection method based on multiple pose identifiers and robot system
CN116468648A (en) Execution arm detection method based on association identification and robot system
CN114536330A (en) Method for determining external stress of deformable mechanical arm based on multiple pose identifications and robot system
CN115700768A (en) Positioning method and surgical robot system
CN114536329A (en) Method for determining external stress of deformable mechanical arm based on composite identification and robot system
CN114536401A (en) Robot system fault detection processing method based on multiple pose identifications and robot system
CN116459019A (en) Pose identification-based control method for preventing collision of operation arm and surgical robot system
CN116459018A (en) Operation arm anti-collision control method based on composite identification and operation robot system
CN116460837A (en) Operation arm anti-collision control method based on association identification and operation robot system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination