CN113211440B - Continuous robot shape sensing method based on multi-attitude calculation - Google Patents

Continuous robot shape sensing method based on multi-attitude calculation

Info

Publication number
CN113211440B
Authority
CN
China
Prior art keywords
segment
robot
time
ith
curvature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110518040.5A
Other languages
Chinese (zh)
Other versions
CN113211440A (en)
Inventor
梁斌 (Liang Bin)
刘厚德 (Liu Houde)
程淏 (Cheng Hao)
王学谦 (Wang Xueqian)
兰斌 (Lan Bin)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen International Graduate School of Tsinghua University
Original Assignee
Shenzhen International Graduate School of Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen International Graduate School of Tsinghua University filed Critical Shenzhen International Graduate School of Tsinghua University
Priority to CN202110518040.5A
Publication of CN113211440A
Application granted
Publication of CN113211440B
Legal status: Active (granted)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00: Programme-controlled manipulators
    • B25J9/16: Programme controls
    • B25J9/1628: Programme controls characterised by the control loop
    • B25J9/1679: Programme controls characterised by the tasks executed
    • B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a continuum robot shape sensing method based on multi-attitude calculation, which comprises the following steps: dividing the continuum robot into a plurality of segments based on a piecewise polynomial curvature kinematics model, wherein the curvature of each segment is fitted by an m_i-order polynomial and m_i+1 positions in each segment are selected for mounting attitude sensors, m_i denoting the order of the i-th segment; solving the pose transformation of the corresponding interval from the data acquired by each attitude sensor to obtain the attitudes of the continuum robot at multiple positions; solving the curvature modal parameters from the attitudes of the continuum robot at the multiple positions; and obtaining the pose of the continuum robot at any point along the arm length at any moment from the attitudes at the multiple positions and the curvature modal parameters. The continuum robot shape sensing method based on multi-attitude calculation provided by the invention is applicable to unknown complex environments, is simple, practical, real-time and accurate, and has a long service life.

Description

Continuous robot shape sensing method based on multi-attitude calculation
Technical Field
The invention relates to the technical field of continuous robots, in particular to a shape sensing method of a continuous robot based on multi-attitude calculation.
Background
In disaster rescue, the overhaul of nuclear and radiation equipment, toxic-waste sampling, pipeline monitoring, minimally invasive surgery and similar applications, the working space is narrow and the risk is high, so it is unsuitable for people or large equipment to enter and carry out the work; slender and dexterous continuum robots (Continuum Robots) therefore become an important choice. Continuum robots have good bending characteristics and obstacle-avoidance capability, can adapt to the environment by changing their own shape and posture, can overcome the limitations imposed by various obstacles, and are widely applied in special occasions requiring autonomous operation in unstructured environments, such as medical treatment, military affairs, disaster relief and ocean exploration. However, because of the flexibility of continuum robots, real-time shape perception remains challenging, which severely limits effective control and hence their ability to work flexibly in unstructured environments such as narrow spaces.
The goal of shape perception for a continuum robot is to estimate its curve configuration in real time by means of sensors; it is the basis for effective planning and control of the continuum robot in complex unknown environments, enabling it to avoid obstacles and complete its tasks.
Existing continuum robot perception technology mainly relies on cameras, optical fibers, electromagnetic tracking and similar schemes. Existing camera-based shape perception mainly uses an external camera to acquire a global image of the continuum robot and then extracts its shape with computer vision techniques; however, this requires an external camera to be arranged in advance, and in an unknown complex environment a global image cannot be obtained, so the technique cannot be used, i.e. existing camera-based shape sensing has clear limitations. Existing optical-fiber-based shape sensing mainly places fibers with fiber Bragg gratings (FBG) on the body of the continuum robot; the bending curvature can be measured wherever an FBG is present, and the shape of the continuum robot is then estimated through a certain configuration strategy. However, inscribing gratings in FBG fibers is expensive, and the grating length is limited (usually less than 2 cm), so FBG fibers cannot be applied to shape perception of long continuum robots; in addition, this approach places high requirements on the precision of the fiber configuration and is not easy to apply; furthermore, the fragility of optical fibers limits the durability of the device.
The above background disclosure is only for the purpose of assisting understanding of the concept and technical solution of the present invention and does not necessarily belong to the prior art of the present patent application, and should not be used for evaluating the novelty and inventive step of the present application in the case that there is no clear evidence that the above content is disclosed at the filing date of the present patent application.
Disclosure of Invention
In order to solve the technical problems, the invention provides a continuous robot shape sensing method based on multi-attitude calculation, which can be suitable for unknown complex environments, is simple, practical, real-time and accurate, and has long service life.
In order to achieve the purpose, the invention adopts the following technical scheme:
the invention discloses a continuous robot shape sensing method based on multi-attitude calculation, which comprises the following steps of:
s1: dividing the continuum robot into a plurality of segments based on a piecewise polynomial curvature kinematics model, wherein each segment passes through miFitting the curvature of the order polynomial, and selecting m in each segmenti+1 position with attitude sensor, miRepresents the order of the ith segment;
s2: solving pose transformation information of corresponding intervals according to the data acquired by each pose sensor to obtain poses of the continuous robot at multiple positions;
s3: solving curvature modal parameters according to postures of the continuum robot at a plurality of positions;
s4: and obtaining the pose of the continuous robot at any point along the arm length direction at any moment according to the poses and curvature modal parameters of the continuous robot at a plurality of positions.
Preferably, the attitude sensor installed in step S1 is an inertial sensor and/or a fiber grating sensor.
Preferably, step S2 specifically includes: obtaining the pose transformation of each inertial sensor relative to the base coordinate system of the continuum robot from the data acquired by the inertial sensors, converting the pose transformation of each position relative to the base coordinate system into the pose transformation relative to the root coordinate system {S_{i-1}} of the i-th segment, and carrying out the multi-pose solution.
Preferably, the obtaining of the pose transformation of each inertial sensor relative to the base coordinate system of the continuum robot from the data collected by the inertial sensors specifically includes: obtaining, from the data acquired by the inertial sensors, the pose transformation of each inertial sensor relative to the continuum robot base coordinate system in real time through an extended Kalman filter algorithm or a complementary filter algorithm.
Preferably, performing the multi-pose solution specifically includes: performing the multi-attitude solution using the following formula:

\alpha_i(s_j,t) = 2\,\mathrm{sgn}\big(y_i(s_j,t)\big)\,\arccos\big(w_i(s_j,t)\big), \qquad \phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s_{m_i},t)}{y_i(s_{m_i},t)}\Big)

wherein α_i(s_j,t) denotes the deflection angle in the deflection plane at the sensor position s_j of the i-th segment at time t, sgn(·) is the sign-taking operation, y_i(s_j,t) and w_i(s_j,t) denote the quaternion y coordinate and the quaternion constant term w at that position, φ_i(t) denotes the deflection direction of the i-th segment at time t, x_i(s_{m_i},t) denotes the quaternion x coordinate at the end sensor position s_{m_i}, and m_i denotes the order of the i-th segment.
Preferably, step S2 specifically includes: obtaining, from the data collected by the fiber grating sensor, the deflection angle

\Delta\alpha_{j,i}(t) = \alpha_i(\bar{s}_{j,i},t) - \alpha_i(\underline{s}_{j,i},t)

between the two ends of the j-th grating region distributed on the i-th segment, wherein \underline{s}_{j,i} and \bar{s}_{j,i} respectively denote the two ends of the grating region of the j-th fiber grating sensor arranged on the i-th segment.
Preferably, step S3 specifically includes: solving the curvature modal parameters from the attitudes of the continuum robot at the plurality of positions by using one of the following three systems of linear equations:

L_i \sum_{k=0}^{m_i} \dfrac{s_j^{\,k+1}}{k+1}\,\theta_{k,i}(t) = \alpha_i(s_j,t) \quad (all sensors are inertial sensors)

L_i \sum_{k=0}^{m_i} \dfrac{\bar{s}_{j,i}^{\,k+1} - \underline{s}_{j,i}^{\,k+1}}{k+1}\,\theta_{k,i}(t) = \Delta\alpha_{j,i}(t) \quad (all sensors are fiber grating sensors)

or the mixed system containing equations of both of the above forms (some sensors are inertial sensors and some are fiber grating sensors),

wherein α_i(s_j,t) is the attitude at position s_j of any segment i at time t obtained by an inertial sensor, Δα_{j,i}(t) is the attitude change between \underline{s}_{j,i} and \bar{s}_{j,i} at time t obtained by a fiber grating sensor, and θ_{k,i}(t) denotes the k-th order curvature modal parameter of the i-th segment.
Preferably, step S4 specifically includes: according to the attitudes and curvature modal parameters of the continuum robot at the multiple positions, obtaining the position and attitude of the continuum robot at any position s of any i-th segment in the root coordinate system {S_{i-1}} of the corresponding segment.
Preferably, the position of the continuum robot at any position s of any i-th segment in the root coordinate system {S_{i-1}} of the corresponding segment is solved by the following formula:

x_i(s,t) = r_i(s,t)\cos\phi_i(t), \quad y_i(s,t) = r_i(s,t)\sin\phi_i(t), \quad r_i(s,t) = L_i\int_0^s \sin\alpha_i(v,t)\,dv, \quad z_i(s,t) = L_i\int_0^s \cos\alpha_i(v,t)\,dv

wherein r_i(s,t), x_i(s,t), y_i(s,t), z_i(s,t) respectively denote the projections of the position s of the i-th segment at time t onto the x-o-z plane and onto the x-axis, y-axis and z-axis, φ_i(t) denotes the deflection direction of the i-th segment, L_i is the length of the i-th segment, and α_i(v,t) denotes the deflection angle in the deflection plane at position v of the i-th segment at time t.
Preferably, the attitude of the continuum robot at any position s of any i-th segment in the root coordinate system {S_{i-1}} of the corresponding segment is solved by the following formula:

\alpha_i(s,t) = L_i\int_0^s q_i(v,t)\,dv = 2\,\mathrm{sgn}\big(y_i(s,t)\big)\arccos\big(w_i(s,t)\big), \quad \phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s,t)}{y_i(s,t)}\Big), \quad q_i(v,t) = \sum_{k=0}^{m_i}\theta_{k,i}(t)\,v^{k}, \quad n_{i-1} = [\,-\sin\phi_i(t),\ \cos\phi_i(t),\ 0\,]^{T}

wherein α_i(s,t) is the attitude (deflection angle) at position s of the i-th segment at time t, q_i(v,t) denotes the curvature at position v of the i-th segment at time t, v^k is the k-th order curvature mode and θ_{k,i}(t) the corresponding modal parameter, φ_i(t) denotes the deflection direction of the i-th segment at time t, sgn(·) is the sign-taking operation, w_i(s,t) denotes the quaternion constant term w at position s of the i-th segment at time t, x_i(s,t), y_i(s,t) denote the quaternion x, y coordinates at position s of the i-th segment at time t, and n_{i-1} denotes the rotation axis perpendicular to the z_{i-1} axis of the root coordinate system {S_{i-1}} of the i-th segment.
Compared with the prior art, the invention has the beneficial effects that: the continuous robot shape sensing method based on multi-attitude calculation is suitable for unknown complex environments, simple, practical, real-time and accurate, and long in service life; the method can be further applied to shape perception of various types of continuum robots and provides perception information for further planning control, so that the application potential of the continuum robots in complex environments is exerted.
Drawings
FIG. 1 is a flow chart of a continuous robot shape sensing method based on multi-pose solution according to a preferred embodiment of the invention;
FIG. 2a is a schematic diagram of the overall structure of a continuum robot based on a piecewise polynomial curvature kinematics model;
FIG. 2b is a schematic structural view of a segment of FIG. 2 a;
FIG. 3 is a flowchart of a continuous robot shape sensing method based on multi-pose solution according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below. It should be emphasized that the following description is merely exemplary in nature and is not intended to limit the scope of the invention or its application.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. In addition, the connection may be for either a fixed or circuit/signal communication role.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on those shown in the drawings, are used only for convenience in describing the embodiments of the present invention and to simplify the description, and are not intended to indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation; they are therefore not to be construed as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
As shown in fig. 1, the preferred embodiment of the present invention discloses a continuous robot shape sensing method based on multi-pose solution, comprising the following steps:
s1: dividing the continuum robot into a plurality of segments based on a piecewise polynomial curvature kinematics model, wherein each segment passes through miFitting the curvature of the order polynomial, and selecting m in each segmenti+1 position with attitude sensor, miRepresents the order of the ith segment;
s2: solving pose transformation information of corresponding intervals according to the data acquired by each pose sensor to obtain poses of the continuous robot at multiple positions;
s3: solving curvature modal parameters according to postures of the continuous robot at a plurality of positions;
s4: and obtaining the pose of the continuum robot at any point along the arm length direction at any moment according to the poses of the continuum robot at multiple positions and the curvature modal parameters.
According to the invention, a plurality of attitude sensors (such as inertial sensors IMU) are configured on the body of the continuum robot, multi-point attitude calculation is carried out, and then the overall shape of the continuum robot is further obtained. The technology can flexibly configure the number of the sensors according to actual requirements, is suitable for various continuous robots, and is not limited by external environmental factors.
The technology is based on a piecewise polynomial curvature kinematics model of the continuum robot; according to this model, the multi-point attitudes can be mapped to the overall shape through a functional relation, and the invention therefore provides a method for realizing real-time shape perception of a continuum robot by means of multi-attitude calculation.
Piecewise polynomial curvature kinematics regards a continuum robot as a fixed number of segments in sequence, as shown in FIG. 2a. Each segment deforms continuously in space, its curvature distribution can be expressed as a polynomial of the normalized distance along the arc, and the polynomial coefficients, i.e. the modal parameters, may vary with time. In FIG. 2a, {S_0}, {S_1}, {S_2}, {S_3} denote the coordinate systems at the two ends of each segment, x_i, y_i, z_i denote the three coordinate axes of {S_i}, and T_0^1, T_1^2, T_2^3 denote the homogeneous pose transformation matrices from {S_0} to {S_1}, from {S_1} to {S_2}, and from {S_2} to {S_3}, respectively. FIG. 2b shows one segment of a piecewise polynomial curvature continuum robot, with the normalized distance s ∈ [0,1] running from the root to the tip of the segment. In FIG. 2b, {S_{i-1}} denotes the end coordinate system of the (i-1)-th segment (which is also the root coordinate system of the i-th segment), {S_i(s,t)} denotes the coordinate system at position s of the i-th segment at time t, {S_i} denotes the end coordinate system of the segment, x_{i-1}, y_{i-1}, z_{i-1} denote the three coordinate axes of {S_{i-1}}, x_i, y_i, z_i denote the three coordinate axes of {S_i}, L_i denotes the length of the i-th segment, sL_i denotes the arc length from the root of the i-th segment to position s, α_i(s,t) denotes the deflection angle of the i-th segment at position s in the deflection plane at time t, and α_i(1,t) denotes the deflection angle of the segment end in the deflection plane at time t.
The kinematics model is based on the following assumptions: (1) at any given moment, each deflection arc lies in a plane, i.e. each segment does not twist; (2) each segment of the continuum robot is inextensible, i.e. the arc length remains constant during bending.
For a given segment of the continuum robot, the normalized distance s ∈ [0, 1], where s = 0 corresponds to the root of the segment and s = 1 to its end; the curve in the deflection plane can then be described by a polynomial curvature.
The curvature function q_i(s,t) of an arbitrary i-th segment can be expressed as an infinite polynomial series in s. In practical application, this embodiment truncates the small high-order modal terms with an m_i-order modal truncation approximation operator to obtain a finite-dimensional approximation of the polynomial curvature:

q_i(s,t) \approx \sum_{k=0}^{m_i} \theta_{k,i}(t)\, s^{k}    (1)

where m_i is the truncation order of the i-th segment (when m_i = 0 the curvature function degenerates to a constant curvature), s^k is the k-th order curvature mode of the i-th segment, θ_{k,i}(t) is the corresponding modal parameter, and q_i(s,t) denotes the curvature at position s of the i-th segment at time t. The curvature polynomial describes the shape of the continuum robot; introducing the modal parameters θ = {θ_0, …, θ_j, …, θ_m} ∈ Θ, where Θ is the coefficient space onto which the curvature function space is mapped, the q_i(s,t) curve is obtained by integration and the kinematics model of the continuum robot can be built on Θ.
Integrating the curvature polynomial along s gives the deflection angle of the continuum robot at an arbitrary position in the deflection plane:

\alpha_i(s,t) = L_i \int_0^s q_i(v,t)\,dv = L_i \sum_{k=0}^{m_i} \frac{\theta_{k,i}(t)}{k+1}\, s^{k+1}    (2)

where α_i(s,t) denotes the deflection angle in the deflection plane at position s of the i-th segment at time t.
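For concreteness, the following minimal Python sketch evaluates equations (1) and (2) for one segment; the function names and the monomial curvature modes s^k are illustrative assumptions for this sketch, not text taken from the patent.

```python
# Minimal sketch of equations (1)-(2): evaluate the polynomial curvature q_i(s, t) and the
# deflection angle alpha_i(s, t) of one segment from its modal parameters theta_{k,i}(t).
import numpy as np

def curvature(theta, s):
    """q_i(s, t) = sum_k theta_k * s**k, equation (1); theta = [theta_0, ..., theta_m]."""
    k = np.arange(len(theta))
    return float(np.sum(np.asarray(theta) * s ** k))

def deflection_angle(theta, s, L):
    """alpha_i(s, t) = L_i * sum_k theta_k * s**(k+1) / (k + 1), equation (2)."""
    k = np.arange(len(theta))
    return float(L * np.sum(np.asarray(theta) * s ** (k + 1) / (k + 1)))

# Example: a first-order segment (m_i = 1) of length 0.4 with theta = [2.0, -1.5];
# the tip deflection angle is deflection_angle([2.0, -1.5], 1.0, 0.4).
```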
Further, projecting the deflection angle into a Cartesian coordinate system and integrating along the curve gives the Cartesian coordinates of an arbitrary position in the deflection plane:

r_i(s,t) = L_i \int_0^s \sin\alpha_i(v,t)\,dv, \qquad z_i(s,t) = L_i \int_0^s \cos\alpha_i(v,t)\,dv    (3)

where L_i is the length of the i-th segment, r_i(s,t) and z_i(s,t) denote the projections of the position s of the i-th segment at time t onto the x-o-z plane and onto the z-axis, respectively, and α_i(v,t) denotes the deflection angle in the deflection plane at position v of the i-th segment at time t.
Further, since each deflection arc lies in a plane at any given moment, i.e. each segment does not twist, the coordinates of an arbitrary position s of the i-th segment in the coordinate system {S_{i-1}} are:

x_i(s,t) = r_i(s,t)\cos\phi_i(t), \qquad y_i(s,t) = r_i(s,t)\sin\phi_i(t), \qquad z_i(s,t) = L_i \int_0^s \cos\alpha_i(v,t)\,dv    (4)

As can be seen from equations (2) and (4), for the continuum robot at any time t, the attitude (φ_i, α_i)|_{s,t} and the position (x_i, y_i, z_i)|_{s,t} at any position s of any segment i are coupled, and the coupling is characterized by the curvature modes: both are determined by the same modal parameters θ_{0,i}(t), …, θ_{m_i,i}(t) (equation (5)).
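As a quick illustrative special case of equations (2)-(4) (using the integral forms given above), take m_i = 0 with constant curvature q_i(s,t) = θ_{0,i}(t); then

\alpha_i(s,t) = L_i\,\theta_{0,i}(t)\,s, \qquad r_i(s,t) = \frac{1 - \cos\alpha_i(s,t)}{\theta_{0,i}(t)}, \qquad z_i(s,t) = \frac{\sin\alpha_i(s,t)}{\theta_{0,i}(t)},

which is the familiar constant-curvature circular arc of radius 1/θ_{0,i}(t), placed in the deflection plane selected by φ_i(t) through equation (4).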
As shown in fig. 3, the method for sensing the shape of the continuous robot based on multi-pose solution disclosed by the present invention includes two parts: (1) a sensor configuration; (2) a real-time shape perception algorithm.
(1) Sensor arrangement
This part configures the sensors needed by the subsequent shape perception algorithm; the sensor configuration only has to be completed before shape sensing is performed for the first time, and as long as the structure of the continuum robot does not change, subsequent shape sensing can directly use the real-time shape perception algorithm.
i. Attitude sensor selection
When arranging the sensors, it needs to be considered in connection with the mechanical structure of the actual continuum robot. In terms of sensor selection, IMU sensors provide a validated solution and there are numerous mature products to choose from.
The part determines the selection of the sensor according to the actual continuous robot structure; for example, the inertial sensor IMU may be selected directly for larger continuum robots, and the tiny fiber optic sensor may be selected for tiny continuum robots (e.g., minimally invasive continuum surgical robots).
ii. Segmentation and approximation order selection
The robot is segmented according to the parts of the actual continuum robot that can deflect in different directions. The approximation order of each segment should be chosen from low to high according to the complexity of the actual curve of that segment (a higher order gives higher precision, but requires more sensors, is harder to install, and may introduce more noise that degrades precision); a higher order requires more attitude sensors: order 0 needs only 1 sensor, order 1 needs 2 sensors, …, and order m needs m+1 sensors.
A corresponding number of sensors are installed at different positions of the continuum robot according to the approximation order:
s = [s_0, …, s_m]^T, where 0 < s_0 < s_1 < … < s_m ≤ 1.
In order to achieve a better shape perception, the mounting positions need to be distributed as evenly as possible over the segment.
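For example (one possible placement rule, not specified by the patent), evenly spaced normalized positions satisfying the constraint above can be generated as follows:

```python
# One even-placement choice for the m_i + 1 sensor positions, satisfying 0 < s_0 < ... < s_m <= 1.
def even_sensor_positions(m):
    return [(j + 1) / (m + 1) for j in range(m + 1)]

# e.g. even_sensor_positions(2) -> [0.333..., 0.666..., 1.0] for a second-order segment
```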
After the sensors are selected, the number and positions of the segments are chosen according to the actual structure and motion of the continuum robot, and the approximation order of each segment is selected. The method is universal and suitable for various continuum robots, and the approximation order can be chosen according to the actual curve shape so as to meet the shape perception requirement.
(2) Real-time shape perception algorithm
For the continuous robot with the configured sensor, the real-time shape perception can be realized through a shape perception algorithm.
i. Multi-pose solution
The goal of the multi-pose solution is to solve, in real time, the pose transformation information of the corresponding interval from the data collected by the sensors. Different sensors call for different attitude solution methods and yield pose transformations over different intervals. This section first introduces the quaternion expression of the continuum robot attitude and then discusses the multi-pose solution for the commonly used attitude sensors.
According to the piecewise polynomial curvature kinematics model, the attitude of the i-th segment at any position s at time t is determined by the deflection direction φ_i(t) and the deflection angle α_i(s,t); in fact, this attitude consists of a rotation by the angle α_i(s,t) about an axis n_{i-1} perpendicular to the z_{i-1} axis, and can be expressed succinctly by a quaternion. In axis-angle form, the attitude at any position s of the i-th segment at time t is the rotation of angle α_i(s,t) about the axis

n_{i-1} = [\,n_{x,i-1},\ n_{y,i-1},\ n_{z,i-1}\,]^{T} = [\,-\sin\phi_i(t),\ \cos\phi_i(t),\ 0\,]^{T}    (6)

where n_{z,i-1} = 0 expresses that the rotation axis has no component along z_{i-1}, i.e. the segment does not twist.

The corresponding quaternion is defined as

Q_i(s,t) = [\,w_i(s,t),\ x_i(s,t),\ y_i(s,t),\ z_i(s,t)\,] = \Big[\cos\dfrac{\alpha_i(s,t)}{2},\ \sin\dfrac{\alpha_i(s,t)}{2}\,n_{i-1}^{T}\Big]    (7)

where w_i^2 + x_i^2 + y_i^2 + z_i^2 = 1.

Combining formula (6) and formula (7) gives

\phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s,t)}{y_i(s,t)}\Big), \qquad \alpha_i(s,t) = 2\,\mathrm{sgn}\big(y_i(s,t)\big)\,\arccos\big(w_i(s,t)\big)    (8)

where sgn(·) is the sign-taking operation.
As equation (8) shows, the deflection direction φ_i of the i-th segment can be calculated from the attitude at any position s. The deflection angle is usually larger near the end of the segment, where the computed deflection direction is less affected by noise, so the deflection direction of the i-th segment is preferably calculated from the attitude at s_{m_i}, where s_{m_i} denotes the sensor position at the end of the i-th segment and m_i denotes the order of the i-th segment.
Expressed with quaternions, the multi-attitude solution task is to solve, from the quaternions {Q_i(s_j,t)} collected by the sensors (i ∈ {1, …, n}, with j indexing the sensor positions of segment i), the deflection direction φ_i(t) of each segment and the deflection angle α_i(s_j,t) at each sensor position of each segment:

\alpha_i(s_j,t) = 2\,\mathrm{sgn}\big(y_i(s_j,t)\big)\,\arccos\big(w_i(s_j,t)\big), \qquad \phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s_{m_i},t)}{y_i(s_{m_i},t)}\Big)    (9)

wherein i ∈ {1, …, n}; α_i(s_j,t) denotes the deflection angle in the deflection plane at the sensor position s_j of the i-th segment at time t (i.e. at each position where an attitude sensor is mounted); sgn(·) is the sign-taking operation; y_i(s_j,t) and w_i(s_j,t) denote the quaternion y coordinate and the quaternion constant term w at that position; φ_i(t) denotes the deflection direction of the i-th segment at time t; x_i(s_{m_i},t) denotes the quaternion x coordinate at the end sensor position s_{m_i}; and m_i denotes the order of the i-th segment.
For inertial sensors, the pose transformation of each inertial sensor relative to the base coordinate system of the continuum robot can be obtained in real time through an extended Kalman filter, a complementary filter or a similar algorithm; according to formula (6), the pose transformation of each position relative to the base coordinate system then needs to be converted into the pose transformation relative to the root coordinate system {S_{i-1}} of the i-th segment, which is substituted into formula (9) to carry out the multi-attitude solution.
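A rough sketch of this step is given below; it assumes Hamilton-convention quaternions in [w, x, y, z] order and uses illustrative function names, so it is a sketch of the idea rather than the patent's reference implementation.

```python
# Sketch of the multi-attitude solution for IMUs: re-express each sensor quaternion
# relative to the segment root frame {S_{i-1}}, then extract the deflection angle alpha
# and deflection direction phi as in equations (8)-(9) above.
import numpy as np

def quat_mul(a, b):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

def relative_quat(q_base_to_root, q_base_to_sensor):
    """Quaternion of the sensor frame expressed in the segment root frame {S_{i-1}}."""
    return quat_mul(quat_conj(q_base_to_root), q_base_to_sensor)

def deflection_from_quat(q_rel):
    """Deflection angle alpha and direction phi from a root-relative quaternion."""
    w, x, y, _ = q_rel
    alpha = 2.0 * np.sign(y) * np.arccos(np.clip(w, -1.0, 1.0))
    phi = np.arctan2(-x, y)          # arctan(-x / y), quadrant-aware
    return alpha, phi

# Usage: alpha_j, _ = deflection_from_quat(relative_quat(q_root, q_sensor_j)) for each
# sensor; phi is taken from the sensor nearest the segment end, as noted above.
```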
For a fiber grating sensor, the working principle is that when temperature, stress or bending acts on the grating region of the fiber, the grating region is strained, which shifts the center wavelength of the reflected portion of the injected laser light that satisfies the Bragg condition; when interference from other factors such as temperature is controlled, the offset produced at the grating region, i.e. the deflection angle between the two ends of the grating region, can be obtained by calculating this wavelength shift. For the j-th grating region of the i-th segment, distributed between \underline{s}_{j,i} and \bar{s}_{j,i}, the measured deflection angle is \Delta\alpha_{j,i}(t) = \alpha_i(\bar{s}_{j,i},t) - \alpha_i(\underline{s}_{j,i},t).
However, since a fiber grating sensor can only measure the deflection angle and not the deflection direction, for spatial motion it must either be combined with other sensors or several fiber grating sensors must be measured simultaneously.
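As a purely illustrative aside, the following sketch converts an FBG center-wavelength shift into the deflection angle of its grating region; the strain-optic coefficient p_e, the core offset d from the neutral axis, and the strain relation used here are textbook FBG assumptions, not values or formulas given in the patent.

```python
# Illustrative only: standard FBG relation delta_lambda / lambda_B = (1 - p_e) * strain,
# with bend strain = curvature * d for a core offset d from the neutral axis; the bend
# angle over the grating region is then average curvature times its arc length.
def fbg_deflection_angle(delta_lambda, lambda_B, d, grating_arc_length, p_e=0.22):
    """Return the bend angle (rad) accumulated over one grating region."""
    strain = (delta_lambda / lambda_B) / (1.0 - p_e)   # axial strain of the fiber core
    curvature = strain / d                              # bend curvature (1/m)
    return curvature * grating_arc_length               # angle = average curvature x arc length
```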
In practical application, the sensor can be selected according to the characteristics of the prototype of the continuous robot, and one or more sensors can be mixed for use.
ii. Modal parameter solution
From the attitudes of the continuum robot at the multiple positions obtained in the previous step: an inertial sensor yields the attitude transformation from the root of the segment to the sensor position, i.e. the attitude α_i(s_j,t) at position s_j of any segment i; a fiber grating sensor yields the attitude change between the two ends of its grating detection section, i.e. the attitude change Δα_{j,i}(t) between \underline{s}_{j,i} and \bar{s}_{j,i}.
When all the sensors are inertial sensors, the attitudes α_i(s_j,t) at the positions s_j of any segment i are substituted into the following system to solve the modal parameters:

L_i \sum_{k=0}^{m_i} \dfrac{s_j^{\,k+1}}{k+1}\,\theta_{k,i}(t) = \alpha_i(s_j,t), \qquad j = 0, 1, \dots, m_i    (10)
when the sensors all adopt fiber grating sensors, the obtained result is
Figure GDA0003657977450000122
And
Figure GDA0003657977450000123
change of posture of room
Figure GDA0003657977450000124
Substituting the following formula to solve the modal parameters:
Figure GDA0003657977450000125
when the sensor part adopts a conventional sensor and the part adopts a fiber grating sensor, the arbitrary i section s obtained in the previous stepjIn an attitude of alphai(sjT), or
Figure GDA0003657977450000126
And
Figure GDA0003657977450000127
change of posture of room
Figure GDA0003657977450000128
For any i segment, according to a piecewise polynomial curvature kinematic model miContaining m in the curvature of the approximate polynomiali+1 polynomial curvatures, the following system of linear equations can be obtained:
Figure GDA0003657977450000129
there is a unique solution to the above linear equation set (10), (11) or (12), and by solving the equation (10), (11) or (12), m can be obtained from the attitude of the sensoriOrder-approximated curvature mode
Figure GDA00036579774500001210
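A rough numerical sketch of this step is given below (function names are assumptions; a least-squares solve is used so that redundant sensors are also handled, which the patent does not require):

```python
# Sketch of the modal-parameter solution of system (10): build the (m_i+1) x (m_i+1)
# coefficient matrix from the sensor positions and solve for theta_{k,i}(t).
import numpy as np

def solve_modal_parameters(sensor_positions, alphas, L_i):
    """sensor_positions: normalized s_j in (0, 1]; alphas: measured deflection angles (rad)."""
    s = np.asarray(sensor_positions, dtype=float)
    m = len(s) - 1                                                 # polynomial order m_i
    k = np.arange(m + 1)
    A = L_i * s[:, None] ** (k[None, :] + 1) / (k[None, :] + 1)    # rows: L_i * s_j**(k+1)/(k+1)
    theta, *_ = np.linalg.lstsq(A, np.asarray(alphas, float), rcond=None)
    return theta                                                   # [theta_0, ..., theta_m]

# e.g. a first-order segment (m_i = 1) with two IMUs at s = 0.5 and 1.0:
# theta = solve_modal_parameters([0.5, 1.0], [0.3, 0.8], L_i=0.4)
```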
iii. Shape solution (overall pose solution)
Once the deflection direction φ_i and the modal parameters θ_{0,i}(t), …, θ_{m_i,i}(t) of each segment have been obtained according to the piecewise polynomial curvature kinematics model, the position of an arbitrary position s of an arbitrary i-th segment in the coordinate system {S_{i-1}} is obtained from equations (3) and (4), and the attitude at that position in {S_{i-1}} is obtained from equations (2), (6) and (8). In this way the pose of the continuum robot at any point along the arm length at time t, i.e. the overall shape, is obtained.
Formula (3) involves an integral of a trigonometric function of a polynomial argument; this integral has no closed-form solution, but it can be computed numerically to any desired precision.
Each segment of the whole continuum robot is solved in this way, yielding the overall shape sensing result in real time. Specifically, the continuum robot may consist of several polynomial curvature segments, which are divided in the "sensor configuration" part; steps i to iii of the real-time shape perception algorithm constitute the solution process for one segment and give its shape relative to the end coordinate system of the previous segment; for the overall shape, the single-segment solution is run in a loop to obtain the shape of each segment, and the segments are then connected in order. The pose transformation between the two ends of a segment is known once the shape of that single segment has been solved. The obtained overall shape can provide shape feedback information for further accurate feedback motion control of the continuum robot.
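The following sketch shows one way to carry out the per-segment numerical integration of equations (2)-(4) and the chaining of segments into the overall shape; the trapezoidal discretization, the function names and the Rodrigues-formula end transform are choices made for this example, not prescribed by the patent.

```python
# Illustrative shape solution: numerically integrate one segment in its root frame,
# then chain segment end frames to assemble the whole shape in the base frame {S_0}.
import numpy as np

def segment_shape(theta, phi, L, n_pts=50):
    """Points of one segment in its root frame {S_{i-1}}, given modal parameters theta."""
    s = np.linspace(0.0, 1.0, n_pts)
    k = np.arange(len(theta))
    q = (np.asarray(theta)[None, :] * s[:, None] ** k[None, :]).sum(axis=1)   # eq. (1)
    alpha = np.concatenate(([0.0], np.cumsum(0.5 * (q[1:] + q[:-1]) * np.diff(s)))) * L      # eq. (2)
    r = np.concatenate(([0.0], np.cumsum(0.5 * (np.sin(alpha[1:]) + np.sin(alpha[:-1])) * np.diff(s)))) * L
    z = np.concatenate(([0.0], np.cumsum(0.5 * (np.cos(alpha[1:]) + np.cos(alpha[:-1])) * np.diff(s)))) * L
    pts = np.stack([r * np.cos(phi), r * np.sin(phi), z], axis=1)             # eq. (3)-(4)
    return pts, alpha[-1]

def end_transform(pts, alpha_end, phi):
    """Homogeneous transform from {S_{i-1}} to the segment end frame {S_i}."""
    n = np.array([-np.sin(phi), np.cos(phi), 0.0])                 # rotation axis, eq. (6)
    K = np.array([[0, -n[2], n[1]], [n[2], 0, -n[0]], [-n[1], n[0], 0]])
    R = np.eye(3) + np.sin(alpha_end) * K + (1 - np.cos(alpha_end)) * (K @ K)  # Rodrigues formula
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, pts[-1]
    return T

def whole_shape(segments):
    """segments: list of (theta, phi, L); returns all sampled points in the base frame."""
    T, shape = np.eye(4), []
    for theta, phi, L in segments:
        pts, a_end = segment_shape(theta, phi, L)
        shape.append((T[:3, :3] @ pts.T).T + T[:3, 3])             # express points in base frame
        T = T @ end_transform(pts, a_end, phi)                     # chain to the next segment root
    return np.vstack(shape)
```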
The invention provides a universal, simple and practical real-time shape sensing method for continuum robots; it uses existing, mature attitude sensors that are easy to obtain and have a long service life; it is not affected by the external environment and is therefore applicable to unknown complex environments; and it provides the perception information necessary for further planning and accurate motion control of the continuum robot, helping to address the difficulty of controlling continuum robots accurately.
The background section of the present invention may contain background information related to the problem or environment of the present invention rather than the description of the prior art by others. Accordingly, the inclusion in the background section is not an admission of prior art by the applicant.
The foregoing is a more detailed description of the invention in connection with specific/preferred embodiments and is not intended to limit the practice of the invention to those descriptions. It will be apparent to those skilled in the art that various substitutions and modifications can be made to the described embodiments without departing from the spirit of the invention, and these substitutions and modifications should be considered to fall within the scope of the invention. In the description herein, references to the description of the term "one embodiment," "some embodiments," "preferred embodiments," "an example," "a specific example," or "some examples" or the like are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction. Although embodiments of the present invention and their advantages have been described in detail, it should be understood that various changes, substitutions and alterations can be made herein without departing from the scope of the invention as defined by the appended claims.

Claims (7)

1. A continuous robot shape sensing method based on multi-attitude calculation is characterized by comprising the following steps:
s1: dividing the continuum robot into a plurality of segments based on a piecewise polynomial curvature kinematics model, wherein each segment passes through miFitting the curvature of the order polynomial, and selecting m in each segmenti+1 position with attitude sensor, miRepresents the order of the ith segment;
s2: solving pose transformation information of corresponding intervals according to the data acquired by each pose sensor to obtain poses of the continuous robot at multiple positions;
s3: solving curvature modal parameters according to postures of the continuous robot at a plurality of positions;
s4: according to the postures and curvature modal parameters of the continuous robot at the multiple positions, a root coordinate system { S) of the continuous robot at any ith section and any S position of the continuous robot in the corresponding section is obtainedi-1Position and attitude below;
wherein, a root coordinate system { S at the position of any ith section and any S of the continuous robot in the corresponding section is obtainedi-1The position below is solved by the following formula:
Figure FDA0003657977440000011
in the formula (I), the compound is shown in the specification,
Figure FDA0003657977440000012
ri(s,t)、xi(s,t)、yi(s,t)、zi(s, t) respectively represent the projection of the ith segment s position at the time t onto the x-o-z plane, the x-axis, the y-axis and the z-axis in the planeiDenotes the deflection direction of the i-th segment, LiIs the length of the i-th segment, αi(v, t) represents the deflection angle at the deflection plane at the ith segment v position at time t;
wherein the attitude of the continuum robot at any position s of any i-th segment in the root coordinate system {S_{i-1}} of the corresponding segment is solved by the following formula:

\alpha_i(s,t) = L_i\int_0^s q_i(v,t)\,dv = 2\,\mathrm{sgn}\big(y_i(s,t)\big)\arccos\big(w_i(s,t)\big), \quad \phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s,t)}{y_i(s,t)}\Big), \quad q_i(v,t) = \sum_{k=0}^{m_i}\theta_{k,i}(t)\,v^{k}, \quad n_{i-1} = [\,-\sin\phi_i(t),\ \cos\phi_i(t),\ 0\,]^{T}

in the formula, α_i(s,t) is the attitude (deflection angle) at position s of the i-th segment at time t, q_i(v,t) represents the curvature at position v of the i-th segment at time t, v^k is the k-th order curvature mode and θ_{k,i}(t) the corresponding modal parameter, φ_i(t) represents the deflection direction of the i-th segment at time t, sgn(·) is the sign operation, w_i(s,t) represents the quaternion constant term w at position s of the i-th segment at time t, x_i(s,t), y_i(s,t) represent the quaternion x, y coordinates at position s of the i-th segment at time t, and n_{i-1} represents the rotation axis perpendicular to the z_{i-1} axis of the root coordinate system {S_{i-1}} of the i-th segment.
2. The continuum robot shape perception method according to claim 1, wherein the attitude sensor installed in step S1 is an inertial sensor and/or a fiber grating sensor.
3. The shape perception method of the continuum robot as recited in claim 1, wherein the step S2 specifically comprises: obtaining the pose transformation of each inertial sensor relative to the base coordinate system of the continuum robot from the data acquired by the inertial sensors, converting the pose transformation of each position relative to the base coordinate system into the pose transformation relative to the root coordinate system {S_{i-1}} of the i-th segment, and carrying out the multi-pose solution.
4. The shape sensing method of the continuum robot according to claim 3, wherein the obtaining the pose transformation of each inertial sensor with respect to the coordinate system of the base of the continuum robot based on the data collected by the inertial sensors comprises: and according to the data acquired by the inertial sensors, obtaining the pose transformation of each inertial sensor relative to the continuous robot base coordinate system in real time through an extended Kalman filtering algorithm or a complementary filtering algorithm.
5. The continuum robot shape perception method of claim 3, wherein performing the multi-pose solution specifically comprises: performing the multi-attitude solution using the following formula:

\alpha_i(s_j,t) = 2\,\mathrm{sgn}\big(y_i(s_j,t)\big)\,\arccos\big(w_i(s_j,t)\big), \qquad \phi_i(t) = \arctan\!\Big(\dfrac{-x_i(s_{m_i},t)}{y_i(s_{m_i},t)}\Big)

wherein α_i(s_j,t) denotes the deflection angle in the deflection plane at the sensor position s_j of the i-th segment at time t, sgn(·) is the sign operation, y_i(s_j,t) and w_i(s_j,t) denote the quaternion y coordinate and the quaternion constant term w at that position, φ_i(t) denotes the deflection direction of the i-th segment at time t, x_i(s_{m_i},t) denotes the quaternion x coordinate at the end sensor position s_{m_i}, and m_i denotes the order of the i-th segment.
6. The shape perception method of the continuum robot as recited in claim 1, wherein the step S2 specifically comprises: obtaining, from the data collected by the fiber grating sensor, the deflection angle

\Delta\alpha_{j,i}(t) = \alpha_i(\bar{s}_{j,i},t) - \alpha_i(\underline{s}_{j,i},t)

between the two ends of the j-th grating region distributed on the i-th segment, wherein \underline{s}_{j,i} and \bar{s}_{j,i} respectively denote the two ends of the grating region of the j-th fiber grating sensor arranged on the i-th segment.
7. The shape perception method of the continuum robot as recited in claim 1, wherein the step S3 specifically comprises: solving the curvature modal parameters from the attitudes of the continuum robot at the plurality of positions by using one of the following three systems of linear equations:

L_i \sum_{k=0}^{m_i} \dfrac{s_j^{\,k+1}}{k+1}\,\theta_{k,i}(t) = \alpha_i(s_j,t) \quad (all sensors are inertial sensors)

L_i \sum_{k=0}^{m_i} \dfrac{\bar{s}_{j,i}^{\,k+1} - \underline{s}_{j,i}^{\,k+1}}{k+1}\,\theta_{k,i}(t) = \Delta\alpha_{j,i}(t) \quad (all sensors are fiber grating sensors)

or the mixed system containing equations of both of the above forms (some sensors are inertial sensors and some are fiber grating sensors),

wherein α_i(s_j,t) is the attitude at position s_j of any segment i at time t obtained by an inertial sensor, Δα_{j,i}(t) is the attitude change between \underline{s}_{j,i} and \bar{s}_{j,i} at time t obtained by a fiber grating sensor, and θ_{k,i}(t) denotes the k-th order curvature modal parameter of the i-th segment.
CN202110518040.5A 2021-05-12 2021-05-12 Continuous robot shape sensing method based on multi-attitude calculation Active CN113211440B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110518040.5A CN113211440B (en) 2021-05-12 2021-05-12 Continuous robot shape sensing method based on multi-attitude calculation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110518040.5A CN113211440B (en) 2021-05-12 2021-05-12 Continuous robot shape sensing method based on multi-attitude calculation

Publications (2)

Publication Number Publication Date
CN113211440A (en) 2021-08-06
CN113211440B (en) 2022-07-01

Family

ID=77095082

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110518040.5A Active CN113211440B (en) 2021-05-12 2021-05-12 Continuous robot shape sensing method based on multi-attitude calculation

Country Status (1)

Country Link
CN (1) CN113211440B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114099227B (en) * 2021-10-25 2023-10-10 清华大学深圳国际研究生院 Spinal rehabilitation robot and system, shape sensing and motion control method thereof

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09103945A (en) * 1995-10-09 1997-04-22 Seiko Seiki Co Ltd Timepiece outer case polishing device and universal grinding device
WO2018037931A1 (en) * 2016-08-22 2018-03-01 Canon Kabushiki Kaisha Continuum robot, modification method of kinematic model of continuum robot, and control method of continuum robot
CN106695803A (en) * 2017-03-24 2017-05-24 中国民航大学 Continuous robot posture control system
CN109676603A (en) * 2018-10-12 2019-04-26 沈阳新松机器人自动化股份有限公司 A kind of flexibility hyper-redundant robot control method
CN110834330B (en) * 2019-10-25 2020-11-13 清华大学深圳国际研究生院 Flexible mechanical arm teleoperation man-machine interaction terminal and method
CN110744547A (en) * 2019-11-08 2020-02-04 山东大学 Continuous body mechanical arm inverse kinematics modeling method based on segmented constant curvature
CN111597657A (en) * 2020-05-22 2020-08-28 南京航空航天大学 Method for calculating modal parameters and vibration response of rotary joint type industrial robot
CN111618824A (en) * 2020-05-25 2020-09-04 清华大学深圳国际研究生院 Arm type self-estimation method for continuous robot
CN111695213A (en) * 2020-05-25 2020-09-22 清华大学深圳国际研究生院 Continuous robot kinematics equivalent method and application

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
用于微创外科的线驱动连续型手术机器人设计 (Design of a wire-driven continuum surgical robot for minimally invasive surgery); 赵亮 (Zhao Liang); China Excellent Master's Theses Full-text Database; 2020-08-15 (No. 8); full text *

Also Published As

Publication number Publication date
CN113211440A (en) 2021-08-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant