CN107322601A - Attitude change detection device and method for an object gripped by a manipulator - Google Patents

Attitude change detection device and method for an object gripped by a manipulator

Info

Publication number
CN107322601A
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710691102.6A
Other languages
Chinese (zh)
Other versions
CN107322601B (en)
Inventor
李学勇
赵凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201710691102.6A priority Critical patent/CN107322601B/en
Publication of CN107322601A publication Critical patent/CN107322601A/en
Application granted granted Critical
Publication of CN107322601B publication Critical patent/CN107322601B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/16 Programme controls
    • B25J 9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion

Abstract

The invention discloses an attitude change detection device and method for an object gripped by a manipulator. The detection device includes a manipulator gripping system consisting of a manipulator and two mechanical fingers; the manipulator is a cross bar, and the two mechanical fingers are vertical bars perpendicular to the cross bar, mounted at its two ends; the two mechanical fingers clamp the object. A first slip sensor sensor1 and a second slip sensor sensor2 are embedded in one mechanical finger, and a third slip sensor sensor3 is embedded in the other mechanical finger; each slip sensor detects the slip displacement and the slip direction. Each slip sensor is connected to a memory through a corresponding A/D converter, the memory is connected to a DSP, and the DSP is connected to a control computer through a USB interface. Quantitative detection of the attitude change of the object is thereby achieved, and the detection method can further be used to monitor and track the continuous trajectory of the gripped object.

Description

Attitude change detection device and method for an object gripped by a manipulator
Technical field
The present invention relates to an attitude change detection device and method for an object gripped by a manipulator, and belongs to the field of robot manipulation and control.
Background art
Achieving stable and dexterous gripping of objects by robots in unstructured environments is one of the most challenging research areas in the development of robotics. In actual gripping operations, the gripped object inevitably undergoes attitude changes such as tilting, sliding, and rotation, that is, translations along the x, y, and z axes and angular changes around the three axes. If the attitude change of the gripped object can be quantitatively detected in real time and fed back to the manipulator, the manipulator controller can take corresponding actions based on these real-time data, thereby greatly improving the stability of gripping and the dexterity of operation.
At present, there are mainly two classes of methods for detecting the attitude change of a gripped object. The first is the vision method, in which a camera placed outside the manipulator monitors the attitude change of the gripped object; its drawback is that the camera's line of sight is easily blocked by the workpiece or the manipulator, which limits its applicability, and the detection result is also sensitive to ambient lighting. The second is the tactile sensor method, in which the attitude change of the gripped object is judged from the tactile sensing information between the mechanical fingers and the object; although this method overcomes the shortcomings of the vision method, it is only qualitative: it can determine whether the attitude change of the gripped object is a translation, a rotation, or a roll, but cannot measure the attitude change quantitatively.
Summary of the invention
The purpose of the present invention is to overcome the deficiencies of the prior art by proposing an attitude change detection device and method for an object gripped by a manipulator. The method is realized with three slip sensors embedded in the mechanical fingers. Its principle is to first measure, with the slip sensors, the magnitude and direction of the slip displacement on the object surface before and after the attitude change, and then to obtain, through a geometric spatial transformation, the translation distances of the gripped object along the x, y, and z directions and the rotation angles around the x, y, and z axes, thereby achieving quantitative detection of the attitude change of the object. The detection method can further be used to monitor and track the continuous trajectory of the gripped object.
To achieve the above purpose, the present invention adopts the following technical scheme:
An attitude change detection device for an object gripped by a manipulator, including a manipulator gripping system, the manipulator gripping system including a manipulator and two mechanical fingers; the manipulator is a cross bar, the two mechanical fingers are vertical bars perpendicular to the cross bar and are respectively mounted at the two ends of the cross bar; the two mechanical fingers are used to clamp the object.
A first slip sensor sensor1 and a second slip sensor sensor2 are embedded in one of the mechanical fingers, and a third slip sensor sensor3 is embedded in the other mechanical finger; the three slip sensors are omnidirectional slip sensors, and each slip sensor is used to detect the slip displacement and the slip direction.
Each slip sensor is connected to a memory through a corresponding A/D converter, the memory is connected to a DSP, and the DSP is connected to a control computer through a USB interface.
An attitude change detection method for an object gripped by a manipulator, comprising the following steps:
Step (1): Establish the manipulator coordinate system O-XYZ; establish the gripped-object coordinate system o-xyz; mount the three slip sensors in the mechanical fingers.
Step (2): The mechanical fingers on the manipulator grip the object, and each slip sensor acquires coordinate values of the gripped object.
In the current state state(n-1), the coordinate value output by the i-th slip sensor is p_i^{state(n-1)}(x_i, y_i, z_i), where i = 1, 2, 3, corresponding to the three slip sensors sensor1, sensor2, and sensor3.
In the state state(n) after the gripped object has slipped on the mechanical fingers, the coordinate value output by the i-th slip sensor is p_i^{state(n)}(x_i, y_i, z_i).
Step (3): The coordinate values output by the three slip sensors sensor1, sensor2, and sensor3 are expressed as a point set P = {p1, p2, p3}.
Corresponding to state state(n-1), the point set P^{state(n-1)} output by the three slip sensors sensor1, sensor2, and sensor3 is expressed as:
P^{state(n-1)} = {p_1^{state(n-1)}(x_1, y_1, z_1), p_2^{state(n-1)}(x_2, y_2, z_2), p_3^{state(n-1)}(x_3, y_3, z_3)};
where p_1^{state(n-1)}(x_1, y_1, z_1) denotes the coordinate value output by the first slip sensor sensor1 in state state(n-1);
p_2^{state(n-1)}(x_2, y_2, z_2) denotes the coordinate value output by the second slip sensor sensor2 in state state(n-1);
p_3^{state(n-1)}(x_3, y_3, z_3) denotes the coordinate value output by the third slip sensor sensor3 in state state(n-1).
Corresponding to state state(n), the point set P^{state(n)} output by the three sensors sensor1, sensor2, and sensor3 is expressed as:
P^{state(n)} = {p_1^{state(n)}(x_1, y_1, z_1), p_2^{state(n)}(x_2, y_2, z_2), p_3^{state(n)}(x_3, y_3, z_3)};
where p_1^{state(n)}(x_1, y_1, z_1), p_2^{state(n)}(x_2, y_2, z_2), and p_3^{state(n)}(x_3, y_3, z_3) denote the coordinate values output by sensor1, sensor2, and sensor3, respectively, in state state(n).
Then the relationship between the two point sets P^{state(n-1)} and P^{state(n)} is expressed as:
P^{state(n)} = R P^{state(n-1)} + T, (1);
where R = [r11 r12 r13; r21 r22 r23; r31 r32 r33] is the rotation matrix between the two point sets and T = [t11 t12 t13]^T is the translation matrix between the two point sets; the rotation matrix contains the rotation information of the measured object from state state(n-1) to state state(n), and the translation matrix contains the translation information of the measured object from state state(n-1) to state state(n).
Step (4): Solve the rotation matrix R and the translation matrix T based on the singular value decomposition of the cross-covariance matrix, and from R and T respectively calculate the linear displacements Δx, Δy, Δz of the gripped object along the x, y, z axes and the rotation angles θx, θy, θz around the three coordinate axes, thereby achieving detection of the attitude change of the object:
Δx = t11, (2);
Δy = t12, (3);
Δz = t13, (4);
θx = atan2(r32, r33), (5);
θy = atan2(−r31, √(r32² + r33²)), (6);
θz = atan2(r21, r11), (7);
where atan2(·) is the two-argument arctangent function.
The steps of solving the rotation matrix R and the translation matrix T in step (4) based on the singular value decomposition of the cross-covariance matrix are:
Step (401):
Define the rotation vector q_R = [q0 q1 q2 q3]^T, where q0 > 0 and q0² + q1² + q2² + q3² = 1;
define the translation vector q_T = [q4 q5 q6]^T;
build the complete state vector q = [q_R^T | q_T^T]^T from q_R and q_T.
Step (402): Build the objective function f(q) = (1/m) Σ_{i=1}^{m} ||p_i^{state(n)} − R(q_R) p_i^{state(n-1)} − q_T||², and make f(q) tend to 0 by iteration.
Step (403): Calculate the centroid μ_{state(n-1)} of the point set P^{state(n-1)}, μ_{state(n-1)} = (1/m) Σ_{i=1}^{m} p_i^{state(n-1)}, and the centroid μ_{state(n)} of the point set P^{state(n)}, μ_{state(n)} = (1/m) Σ_{i=1}^{m} p_i^{state(n)}, with m = 3.
Then the cross-covariance matrix Σp of the point sets P^{state(n-1)} and P^{state(n)} is expressed as:
Σp = (1/m) Σ_{i=1}^{m} [p_i^{state(n-1)} (p_i^{state(n)})^T] − μ_{state(n-1)} μ_{state(n)}^T.
The optimal rotation matrix is calculated from the cross-covariance matrix Σp, and the optimal translation vector is then calculated, as follows:
Build the antisymmetric matrix A = Σp − Σp^T, and form the column vector Δ = [A23 A31 A12]^T from its cyclic components; with the column vector Δ = [A23 A31 A12]^T, further construct the 4*4 symmetric matrix Q(Σp):
Q(Σp) = [ tr(Σp)  Δ^T ; Δ  Σp + Σp^T − tr(Σp) I3 ], (12);
where I3 is the 3*3 identity matrix. The eigenvector of the matrix Q(Σp) corresponding to its maximum eigenvalue is the rotation vector q_R = [q0 q1 q2 q3]^T; the rotation matrix R is obtained from the components of the rotation vector:
R = [ q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)
      2(q1q2+q0q3)      q0²+q2²−q1²−q3²   2(q2q3−q0q1)
      2(q1q3−q0q2)      2(q2q3+q0q1)      q0²+q3²−q1²−q2² ], (13);
Calculate the translation vector q_T = μ_{state(n)} − R μ_{state(n-1)};
calculate the translation matrix T = q_T = [t11 t12 t13]^T.
Step (5): To detect the continuous multi-pose change of the gripped object, apply formula (1) with n = 1, 2, ..., to obtain the continuous trajectory change of the gripped object, thereby realizing monitoring and tracking of the gripped object.
Preferably, in step (1), the manipulator coordinate system O-XYZ is established as follows:
the direction parallel to the vertical center lines of the vertical bars is taken as the Y-axis; the direction perpendicular to both the cross bar and the Y-axis is taken as the X-axis; and the direction perpendicular to both the X-axis and the Y-axis is taken as the Z-axis.
Preferably, in step (1), the gripped-object coordinate system o-xyz is established as follows:
the central axis of the vertical section of the gripped object is taken as the x-axis; the line on the cross section of the gripped object parallel to the Z-axis is taken as the z-axis; and the line on the vertical section of the gripped object perpendicular to both the x-axis and the z-axis is taken as the y-axis.
Preferably, in step (1), the three slip sensors are mounted in the mechanical fingers at the following positions:
in the Y-axis direction, the triangle formed by the three slip sensors should be parallel to the O-XZ plane;
the third slip sensor sensor3 is located at the center of its mechanical finger in the X-axis direction;
the first slip sensor sensor1 and the second slip sensor sensor2 are symmetric about the vertical center line of their mechanical finger in the X direction.
Beneficial effects of the present invention:
1. The present invention can quantitatively detect the attitude change of the gripped object, including translation distances and rotation angles; with these detected quantities, monitoring and tracking of the continuous trajectory of the gripped object can easily be realized.
2. The present invention is realized with slip sensors embedded inside the mechanical fingers; it is simple in structure, unaffected by the external environment, and has strong anti-interference capability.
Brief description of the drawings
Fig. 1 is the composition of the detection system of the present invention;
Fig. 2(a) is a schematic diagram of the detection principle of slip sensor sensor1;
Fig. 2(b) is a schematic diagram of the detection principle of slip sensor sensor2;
Fig. 2(c) is a schematic diagram of the detection principle of slip sensor sensor3;
Fig. 3 is the electrical connection diagram of the three slip sensors and the manipulator controller;
wherein: 1, manipulator; 2, mechanical finger; 3, slip sensor; 4, gripped object.
Embodiment
The invention is further described below with reference to the accompanying drawings and embodiments.
As shown in Fig. 1, the detection system designed by the present invention consists of a manipulator 1, mechanical fingers 2, three slip sensors 3, and a gripped object 4. The manipulator 1 and the mechanical fingers 2 form the manipulator gripping system; the three slip sensors 3 are omnidirectional slip sensors, which can detect both the slip displacement magnitude and the slip direction. Fig. 2(a), Fig. 2(b), and Fig. 2(c) are schematic diagrams of the detection principle of slip sensors sensor1, sensor2, and sensor3, respectively, where state(n-1) denotes the coordinates detected by the three sensors in the current state, and state(n) denotes the coordinate changes detected by each sensor after the gripped object has slipped on the surfaces of the mechanical fingers. In the present invention, the three slip sensors are mounted (see Fig. 1) according to the following principles:
1. in the Y-axis direction, the triangle formed by the three sensors should be parallel to the O-XZ plane;
2. the third slip sensor sensor3 is located at the center of its mechanical finger in the X-axis direction;
3. the first slip sensor sensor1 and the second slip sensor sensor2 are symmetric about the vertical center line of their mechanical finger in the X direction.
As shown in Fig. 1, in the present invention the attitude change of the gripped object from state state(n-1) (gripped object 4 in Fig. 1) to state state(n) (shown by the dashed line in Fig. 1) is described by the linear displacements Δx, Δy, Δz of the gripped object along the x, y, z axes and the rotation angles θx, θy, θz around the three coordinate axes; the main task of the present invention is to detect these parameters.
To detect each of the above parameters, the principle of the present invention is as follows:
1. As shown in Fig. 2(a)-Fig. 2(c), suppose that in state state(n-1) the current coordinate values output by the slip sensors are p_i^{state(n-1)}(x_i, y_i, z_i); after the gripped object has slipped on the mechanical fingers (corresponding to state state(n)), the current coordinate values output by the slip sensors are expressed as p_i^{state(n)}(x_i, y_i, z_i), where i = 1, 2, ..., m, and here m = 3, corresponding to the three slip sensors sensor1, sensor2, and sensor3.
2. The current coordinate values output by the three sensors are expressed as a point set P = {p1, p2, p3}; then, corresponding to the two states state(n-1) and state(n), the point sets output by the three sensors are expressed as:
P^{state(n-1)} = {p_1^{state(n-1)}(x_1, y_1, z_1), p_2^{state(n-1)}(x_2, y_2, z_2), p_3^{state(n-1)}(x_3, y_3, z_3)};
and
P^{state(n)} = {p_1^{state(n)}(x_1, y_1, z_1), p_2^{state(n)}(x_2, y_2, z_2), p_3^{state(n)}(x_3, y_3, z_3)}.
Then the relationship between the two point sets is expressed as:
P^{state(n)} = R P^{state(n-1)} + T, (1);
where R = [r11 r12 r13; r21 r22 r23; r31 r32 r33] and T = [t11 t12 t13]^T are respectively the rotation matrix and the translation matrix between the two point sets; these two matrices contain the rotation information and the translation information of the measured object from state state(n-1) to state state(n).
3. From the matrices T and R, the parameters Δx, Δy, Δz and θx, θy, θz describing the attitude change of the object are calculated respectively, namely:
Δx = t11, (2);
Δy = t12, (3);
Δz = t13, (4);
θx = atan2(r32, r33), (5);
θy = atan2(−r31, √(r32² + r33²)), (6);
θz = atan2(r21, r11), (7);
where atan2(·) is the two-argument arctangent function.
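As an illustration of formulas (2)-(7), the following Python sketch (added here for clarity; it is not part of the original patent text, and the function name and array layout are assumptions) extracts the translations and rotation angles from a given rotation matrix R and translation matrix T.

```python
import numpy as np

def pose_change_from_RT(R, T):
    """Extract translations and rotation angles per formulas (2)-(7).

    R : 3x3 rotation matrix [[r11, r12, r13], [r21, r22, r23], [r31, r32, r33]]
    T : length-3 translation matrix [t11, t12, t13]
    Returns (dx, dy, dz, theta_x, theta_y, theta_z); angles in radians.
    """
    dx, dy, dz = T[0], T[1], T[2]                     # formulas (2)-(4)
    theta_x = np.arctan2(R[2, 1], R[2, 2])            # (5): atan2(r32, r33)
    theta_y = np.arctan2(-R[2, 0],
                         np.hypot(R[2, 1], R[2, 2]))  # (6): atan2(-r31, sqrt(r32^2 + r33^2))
    theta_z = np.arctan2(R[1, 0], R[0, 0])            # (7): atan2(r21, r11)
    return dx, dy, dz, theta_x, theta_y, theta_z
```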
Therefore, in order to solve for the parameters describing the attitude change, the rotation matrix R and the translation matrix T must be solved from the point sets P^{state(n-1)} and P^{state(n)}. In the present invention, these two matrices are calculated based on the singular value decomposition (SVD) of the cross-covariance matrix. The specific solution steps are as follows:
1. Define the rotation vector q_R = [q0 q1 q2 q3]^T, where q0 > 0 and q0² + q1² + q2² + q3² = 1; define the translation vector q_T = [q4 q5 q6]^T; build the complete state vector q = [q_R^T | q_T^T]^T from q_R and q_T.
2. Build the objective function f(q) = (1/m) Σ_{i=1}^{m} ||p_i^{state(n)} − R(q_R) p_i^{state(n-1)} − q_T||², and make it tend to 0 by iteration.
3. Calculate the centroid of the point set P^{state(n-1)}, μ_{state(n-1)} = (1/m) Σ_{i=1}^{m} p_i^{state(n-1)}, and the centroid of the point set P^{state(n)}, μ_{state(n)} = (1/m) Σ_{i=1}^{m} p_i^{state(n)}, where m = 3.
Then the cross-covariance matrix of the point sets P^{state(n-1)} and P^{state(n)} is expressed as:
Σp = (1/m) Σ_{i=1}^{m} [p_i^{state(n-1)} (p_i^{state(n)})^T] − μ_{state(n-1)} μ_{state(n)}^T.
(4) The optimal rotation matrix is calculated from the cross-covariance matrix Σp, and the optimal translation vector is then calculated, as follows:
Build the antisymmetric matrix A = Σp − Σp^T and form the column vector Δ = [A23 A31 A12]^T from its cyclic components; with this column vector, further construct the 4*4 symmetric matrix:
Q(Σp) = [ tr(Σp)  Δ^T ; Δ  Σp + Σp^T − tr(Σp) I3 ], (12);
where I3 is the 3*3 identity matrix. The eigenvector of the matrix Q(Σp) corresponding to its maximum eigenvalue is the rotation vector q_R = [q0 q1 q2 q3]^T; substituting its components into the following formula gives the rotation matrix R:
R = [ q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)
      2(q1q2+q0q3)      q0²+q2²−q1²−q3²   2(q2q3−q0q1)
      2(q1q3−q0q2)      2(q2q3+q0q1)      q0²+q3²−q1²−q2² ], (13);
The translation vector is then calculated as q_T = μ_{state(n)} − R μ_{state(n-1)}, and the translation matrix as T = q_T = [t11 t12 t13]^T.
The translation matrix T and the rotation matrix R obtained by the above steps are then substituted into formulas (2)-(7) to calculate the linear displacements Δx, Δy, Δz of the gripped object along the x, y, z axes and the rotation angles θx, θy, θz around the three coordinate axes, thereby achieving detection of the attitude change of the object.
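The following Python sketch illustrates the above solution procedure; it is an added illustration rather than the patent's own implementation, NumPy's symmetric eigendecomposition is used to find the maximum-eigenvalue eigenvector of Q(Σp), and all function and variable names are assumptions. With the three sensor correspondences known, the sketch obtains R and T in a single closed-form step rather than by iteration.

```python
import numpy as np

def solve_RT(P_prev, P_curr):
    """Estimate R and T such that P_curr ≈ R @ P_prev + T.

    P_prev, P_curr : (3, m) arrays whose columns are the sensor coordinates
    at state(n-1) and state(n); here m = 3.
    """
    m = P_prev.shape[1]
    mu_prev = P_prev.mean(axis=1)                  # centroid of P^{state(n-1)}
    mu_curr = P_curr.mean(axis=1)                  # centroid of P^{state(n)}
    # cross-covariance matrix Sigma_p
    Sigma = (P_prev @ P_curr.T) / m - np.outer(mu_prev, mu_curr)
    A = Sigma - Sigma.T                            # antisymmetric matrix
    Delta = np.array([A[1, 2], A[2, 0], A[0, 1]])  # cyclic components [A23 A31 A12]
    Q = np.zeros((4, 4))                           # 4*4 symmetric matrix Q(Sigma_p)
    Q[0, 0] = np.trace(Sigma)
    Q[0, 1:] = Delta
    Q[1:, 0] = Delta
    Q[1:, 1:] = Sigma + Sigma.T - np.trace(Sigma) * np.eye(3)
    # rotation vector = eigenvector of Q for the largest eigenvalue
    w, V = np.linalg.eigh(Q)
    q0, q1, q2, q3 = V[:, np.argmax(w)]
    if q0 < 0:                                     # enforce q0 > 0
        q0, q1, q2, q3 = -q0, -q1, -q2, -q3
    R = np.array([                                 # formula (13)
        [q0**2 + q1**2 - q2**2 - q3**2, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0**2 + q2**2 - q1**2 - q3**2, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0**2 + q3**2 - q1**2 - q2**2],
    ])
    T = mu_curr - R @ mu_prev                      # optimal translation vector
    return R, T
```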
The above steps give the method for detecting the attitude change of the gripped object between two adjacent states, from state(n-1) to state(n). If the continuous multi-pose change of the gripped object needs to be detected, formula (1) is applied with n = 1, 2, ..., which yields the continuous trajectory change of the gripped object and realizes monitoring and tracking of the gripped object.
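A minimal sketch of this continuous monitoring, assuming the sensor point sets of successive states are collected in a list and reusing the hypothetical solve_RT and pose_change_from_RT helpers sketched earlier:

```python
def track_trajectory(point_sets):
    """point_sets[n] is the (3, 3) sensor point set at state(n), n = 0, 1, 2, ...

    Returns the per-step pose changes (dx, dy, dz, theta_x, theta_y, theta_z).
    """
    trajectory = []
    for prev, curr in zip(point_sets[:-1], point_sets[1:]):
        R, T = solve_RT(prev, curr)                 # formula (1): P(n) = R P(n-1) + T
        trajectory.append(pose_change_from_RT(R, T))
    return trajectory
```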
The electrical connection of the three slip sensors sensor1, sensor2, and sensor3 with the manipulator controller is shown in Fig. 3. The electrical signal output by each of the three slip sensors is converted into a digital signal by an A/D converter and stored in the memory; the DSP processor reads the input signals of the three slip sensors from the memory and generates their current coordinate values. The current coordinate values are generated as follows:
1. When the manipulator grips the object for the first time, the coordinate values corresponding to the input signals of the three sensors are the initial coordinate values, denoted respectively for sensor1, sensor2, and sensor3, where C is the distance between the installation centers of slip sensors sensor1 and sensor2 along the x direction and D is the distance between the two mechanical fingers (vertical bars) along the z direction; these initial coordinate values are stored in the memory.
2. Starting from the initial coordinate values, the current coordinate values of the three sensors in state state(n) are obtained by coordinate calculation from the slip displacement magnitude and the slip direction output by the omnidirectional slip sensors.
The current coordinate values are sent to the control computer through the USB communication module, and the control computer then completes the attitude change detection of the gripped object by calculation.
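The following sketch shows one possible bookkeeping of the current coordinate values from the slip readings. The initial triangle layout (sensor1 and sensor2 at ±C/2 along X on one finger, sensor3 at the center of the opposite finger offset by D along Z, all at a common height y0) and the interpretation of the slip direction as an in-plane angle on the finger surface are assumptions added for illustration; the patent does not state these values explicitly.

```python
import numpy as np

def initial_point_set(C, D, y0=0.0):
    """Assumed initial coordinates when the object is first gripped.

    Columns: sensor1, sensor2, sensor3. sensor1 and sensor2 sit on one finger,
    symmetric about its center line and separated by C along X; sensor3 sits
    at the center of the other finger, offset by D along Z.
    """
    return np.array([[-C / 2.0,  C / 2.0, 0.0],
                     [      y0,       y0,  y0],
                     [     0.0,      0.0,   D]])

def update_point_set(points, slips):
    """Apply each sensor's slip reading to its tracked contact point.

    points : (3, 3) current coordinates (columns are sensor1..sensor3).
    slips  : list of (displacement, direction) pairs, one per sensor; the
             direction is taken as the in-plane slip angle on the finger
             surface (x-y plane), in radians.
    """
    updated = points.copy()
    for i, (d, phi) in enumerate(slips):
        updated[0, i] += d * np.cos(phi)   # slip component along x
        updated[1, i] += d * np.sin(phi)   # slip component along y
    return updated                          # z stays on the finger surface
```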
Although embodiments of the present invention have been described above with reference to the accompanying drawings, this does not limit the scope of protection of the present invention. Those of ordinary skill in the art should understand that various modifications or variations that a person skilled in the art can make on the basis of the technical solution of the present invention without creative work still fall within the scope of protection of the present invention.

Claims (9)

1. An attitude change detection device for an object gripped by a manipulator, characterized by including: a manipulator gripping system, the manipulator gripping system including a manipulator and two mechanical fingers; the manipulator is a cross bar, the two mechanical fingers are vertical bars perpendicular to the cross bar and are respectively mounted at the two ends of the cross bar; the two mechanical fingers are used to clamp the object;
a first slip sensor sensor1 and a second slip sensor sensor2 are embedded in one of the mechanical fingers, and a third slip sensor sensor3 is embedded in the other mechanical finger; the three slip sensors are omnidirectional slip sensors, and each slip sensor is used to detect the slip displacement and the slip direction;
each slip sensor is connected to a memory through a corresponding A/D converter, the memory is connected to a DSP, and the DSP is connected to a control computer through a USB interface.
2. An attitude change detection method for an object gripped by a manipulator, characterized by comprising the following steps:
step (1): establish the manipulator coordinate system O-XYZ; establish the gripped-object coordinate system o-xyz; mount the three slip sensors in the mechanical fingers;
step (2): the mechanical fingers on the manipulator grip the object, and each slip sensor acquires coordinate values of the gripped object;
in the current state state(n-1), the coordinate value output by the i-th slip sensor is p_i^{state(n-1)}(x_i, y_i, z_i), where i = 1, 2, 3, corresponding to the three slip sensors sensor1, sensor2, and sensor3;
in the state state(n) after the gripped object has slipped on the mechanical fingers, the coordinate value output by the i-th slip sensor is p_i^{state(n)}(x_i, y_i, z_i);
step (3): the coordinate values output by the three slip sensors sensor1, sensor2, and sensor3 in states state(n-1) and state(n) are expressed as point sets P = {p1, p2, p3}; then the relationship between the two point sets P^{state(n-1)} and P^{state(n)} is expressed as:
P^{state(n)} = R P^{state(n-1)} + T, (1);
where R = [r11 r12 r13; r21 r22 r23; r31 r32 r33] is the rotation matrix between the two point sets and T = [t11 t12 t13]^T is the translation matrix between the two point sets; the rotation matrix contains the rotation information of the measured object from state state(n-1) to state state(n), and the translation matrix contains the translation information of the measured object from state state(n-1) to state state(n);
step (4): solve the rotation matrix R and the translation matrix T based on the singular value decomposition of the cross-covariance matrix, and from R and T respectively calculate the linear displacements Δx, Δy, Δz of the gripped object along the x, y, z axes and the rotation angles θx, θy, θz around the three coordinate axes, thereby achieving detection of the attitude change of the object.
3. The method as claimed in claim 2, characterized in that in step (4):
Δx = t11, (2);
Δy = t12, (3);
Δz = t13, (4);
θx = atan2(r32, r33), (5);
θy = atan2(−r31, √(r32² + r33²)), (6);
θz = atan2(r21, r11), (7);
where atan2(·) is the two-argument arctangent function.
4. The method as claimed in claim 2, characterized in that in step (3):
corresponding to state state(n-1), the point set P^{state(n-1)} output by the three slip sensors sensor1, sensor2, and sensor3 is expressed as:
P^{state(n-1)} = {p_1^{state(n-1)}(x_1, y_1, z_1), p_2^{state(n-1)}(x_2, y_2, z_2), p_3^{state(n-1)}(x_3, y_3, z_3)};
where p_1^{state(n-1)}(x_1, y_1, z_1) denotes the coordinate value output by the first slip sensor sensor1 in state state(n-1);
p_2^{state(n-1)}(x_2, y_2, z_2) denotes the coordinate value output by the second slip sensor sensor2 in state state(n-1);
p_3^{state(n-1)}(x_3, y_3, z_3) denotes the coordinate value output by the third slip sensor sensor3 in state state(n-1);
corresponding to state state(n), the point set P^{state(n)} output by the three sensors sensor1, sensor2, and sensor3 is expressed as:
P^{state(n)} = {p_1^{state(n)}(x_1, y_1, z_1), p_2^{state(n)}(x_2, y_2, z_2), p_3^{state(n)}(x_3, y_3, z_3)};
where p_1^{state(n)}(x_1, y_1, z_1) denotes the coordinate value output by the first slip sensor sensor1 in state state(n);
p_2^{state(n)}(x_2, y_2, z_2) denotes the coordinate value output by the second slip sensor sensor2 in state state(n);
p_3^{state(n)}(x_3, y_3, z_3) denotes the coordinate value output by the third slip sensor sensor3 in state state(n).
5. The method as claimed in claim 2, characterized in that the steps of solving the rotation matrix R and the translation matrix T in step (4) based on the singular value decomposition of the cross-covariance matrix are:
step (401):
define the rotation vector q_R = [q0 q1 q2 q3]^T, where q0 > 0 and q0² + q1² + q2² + q3² = 1;
define the translation vector q_T = [q4 q5 q6]^T;
build the complete state vector q = [q_R^T | q_T^T]^T from q_R and q_T;
step (402): build the objective function f(q) = (1/m) Σ_{i=1}^{m} ||p_i^{state(n)} − R(q_R) p_i^{state(n-1)} − q_T||², and make f(q) tend to 0 by iteration;
step (403): calculate the centroid μ_{state(n-1)} of the point set P^{state(n-1)} and the centroid μ_{state(n)} of the point set P^{state(n)};
then the cross-covariance matrix Σp of the point sets P^{state(n-1)} and P^{state(n)} is expressed as:
Σp = (1/m) Σ_{i=1}^{m} [p_i^{state(n-1)} (p_i^{state(n)})^T] − μ_{state(n-1)} μ_{state(n)}^T;
the optimal rotation matrix is calculated from the cross-covariance matrix Σp, and the optimal translation vector is then calculated, as follows:
build the antisymmetric matrix A = Σp − Σp^T and form the column vector Δ = [A23 A31 A12]^T from its cyclic components; with the column vector Δ = [A23 A31 A12]^T, further construct the 4*4 symmetric matrix Q(Σp):
Q(Σp) = [ tr(Σp)  Δ^T ; Δ  Σp + Σp^T − tr(Σp) I3 ], (12);
where I3 is the 3*3 identity matrix; the eigenvector of the matrix Q(Σp) corresponding to its maximum eigenvalue is the rotation vector q_R = [q0 q1 q2 q3]^T; the rotation matrix R is obtained from the components of the rotation vector:
R = [ q0²+q1²−q2²−q3²   2(q1q2−q0q3)      2(q1q3+q0q2)
      2(q1q2+q0q3)      q0²+q2²−q1²−q3²   2(q2q3−q0q1)
      2(q1q3−q0q2)      2(q2q3+q0q1)      q0²+q3²−q1²−q2² ], (13);
calculate the translation vector q_T = μ_{state(n)} − R μ_{state(n-1)};
calculate the translation matrix T = q_T = [t11 t12 t13]^T.
6. The method as claimed in claim 2, characterized by further comprising:
step (5): to detect the continuous multi-pose change of the gripped object, apply formula (1) with n = 1, 2, ..., to obtain the continuous trajectory change of the gripped object, thereby realizing monitoring and tracking of the gripped object.
7. The method as claimed in claim 2, characterized in that in step (1) the manipulator coordinate system O-XYZ is established as follows:
the direction parallel to the vertical center lines of the vertical bars is taken as the Y-axis; the direction perpendicular to both the cross bar and the Y-axis is taken as the X-axis; and the direction perpendicular to both the X-axis and the Y-axis is taken as the Z-axis.
8. The method as claimed in claim 2, characterized in that in step (1) the gripped-object coordinate system o-xyz is established as follows:
the central axis of the vertical section of the gripped object is taken as the x-axis; the line on the cross section of the gripped object parallel to the Z-axis is taken as the z-axis; and the line on the vertical section of the gripped object perpendicular to both the x-axis and the z-axis is taken as the y-axis.
9. The method as claimed in claim 2, characterized in that in step (1) the three slip sensors are mounted in the mechanical fingers at the following positions:
in the Y-axis direction, the triangle formed by the three slip sensors should be parallel to the O-XZ plane;
the third slip sensor sensor3 is located at the center of its mechanical finger in the X-axis direction;
the first slip sensor sensor1 and the second slip sensor sensor2 are symmetric about the vertical center line of their mechanical finger in the X direction.
CN201710691102.6A 2017-08-14 2017-08-14 Attitude change detection device and method for an object gripped by a manipulator Expired - Fee Related CN107322601B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710691102.6A CN107322601B (en) 2017-08-14 2017-08-14 Attitude change detection device and method for an object gripped by a manipulator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710691102.6A CN107322601B (en) 2017-08-14 2017-08-14 Attitude change detection device and method for an object gripped by a manipulator

Publications (2)

Publication Number Publication Date
CN107322601A true CN107322601A (en) 2017-11-07
CN107322601B CN107322601B (en) 2019-06-28

Family

ID=60225790

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710691102.6A Expired - Fee Related CN107322601B (en) 2017-08-14 2017-08-14 Attitude change detection device and method for an object gripped by a manipulator

Country Status (1)

Country Link
CN (1) CN107322601B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109239388A (en) * 2018-09-10 2019-01-18 清华大学深圳研究生院 A kind of tactile dynamic sensing method of electronic skin
CN112672861A (en) * 2019-03-22 2021-04-16 欧姆龙株式会社 Robot, robot control method, and program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61284391A (en) * 1985-06-11 1986-12-15 三菱電機株式会社 Tactile sensor
CN102120326A (en) * 2011-01-14 2011-07-13 常州大学 Manipulator arm grabbing and sliding detection method and device based on image processing technology
CN102501257A (en) * 2011-10-12 2012-06-20 上海应用技术学院 Multi-directional slip sensor
CN202716271U (en) * 2012-08-04 2013-02-06 合肥泰禾光电科技股份有限公司 Joint-type medium-duty intelligent control palletizing robot
CN206029943U (en) * 2016-09-23 2017-03-22 安徽工程大学 Machinery indicates dexterous hand more

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61284391A (en) * 1985-06-11 1986-12-15 三菱電機株式会社 Tactile sensor
CN102120326A (en) * 2011-01-14 2011-07-13 常州大学 Manipulator arm grabbing and sliding detection method and device based on image processing technology
CN102501257A (en) * 2011-10-12 2012-06-20 上海应用技术学院 Multi-directional slip sensor
CN202716271U (en) * 2012-08-04 2013-02-06 合肥泰禾光电科技股份有限公司 Joint-type medium-duty intelligent control palletizing robot
CN206029943U (en) * 2016-09-23 2017-03-22 安徽工程大学 Machinery indicates dexterous hand more

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109239388A (en) * 2018-09-10 2019-01-18 清华大学深圳研究生院 A kind of tactile dynamic sensing method of electronic skin
CN109239388B (en) * 2018-09-10 2020-09-25 清华大学深圳研究生院 Electronic skin touch dynamic sensing method
CN112672861A (en) * 2019-03-22 2021-04-16 欧姆龙株式会社 Robot, robot control method, and program

Also Published As

Publication number Publication date
CN107322601B (en) 2019-06-28

Similar Documents

Publication Publication Date Title
CN103226398B (en) Based on the data glove of micro-inertia sensor network technology
CN100491083C (en) Flexible double-wheel self-balancing robot posture detecting method
CN102426477A (en) Gesture detecting method and detecting device
CN103487011B (en) A kind of attitude angle detection method of data glove
CN1947960A (en) Environment-identification and proceeding work type real-man like robot
CN105487689A (en) Ring mouse and method for operating mobile terminal through same
CN104081322A (en) Electronic apparatus
JP2004288188A (en) Pen type input system using magnetic sensor, and its trajectory restoration method
CN107322601A (en) The attitudes vibration detection means and method of a kind of object gripped by manipulator
CN107894854A (en) Stylus is modeled as to the touch-control electronic system, touch-control processing unit and method of rocking bar
CN103294226A (en) Virtual input device and virtual input method
CN105278382A (en) Method of intelligent wearable equipment for controlling automobile central control system in wireless remote manner
CN211479075U (en) AR/VR Interactive handle
Cannan et al. A Multi-sensor armband based on muscle and motion measurements
CN106482709B (en) A kind of method, apparatus and system of distance survey
CN203966058U (en) With the hand motion acquisition equipment that utilizes physical construction to realize of force feedback
Frigola et al. Human-robot interaction based on a sensitive bumper skin
CN205808422U (en) A kind of attitude sensing device
CN105302307B (en) The method that directional information progress behavior matching is obtained by acceleration transducer
Lam et al. Motion sensing for robot hands using MIDS
CN105243720B (en) The method that pulse data carries out gate inhibition&#39;s safe identification is obtained by Intelligent bracelet
CN105425660A (en) Analysis matching work method realizing human body behavior determination through inertia sensor
CN110877335A (en) Self-adaptive unmarked mechanical arm track tracking method based on hybrid filter
Wang et al. Intuitive Maneuver of Autonomous Vehicles without Physical Control Interfaces using Wearable Sensing Devices
CN217238120U (en) Attitude sensor module circuit

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190628

Termination date: 20200814