CN103112007A - Human-machine interaction method based on mixing sensor - Google Patents

Human-machine interaction method based on mixing sensor

Info

Publication number: CN103112007A
Application number: CN201310047285.XA
Authority: CN (China)
Inventors: 张平, 杜广龙, 卢晓敏
Assignee: South China University of Technology (SCUT)
Application filed by South China University of Technology (SCUT); priority to CN201310047285.XA
Published as CN103112007A; granted as CN103112007B (Chinese (zh))
Legal status: Active (granted)

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides a human-machine interaction method based on a hybrid sensor, comprising the following steps: acquiring and processing, in real time, red-green-blue (RGB) images and depth images of the operator's hand motion during a robot operation task, thereby obtaining the position of the operator's hand; measuring the attitude of the operator's hand with an inertial measurement unit and estimating the attitude data with the factored quaternion algorithm (FQA) and Kalman filters (KFs) to obtain a stable attitude of the operator's hand; and driving the robot according to the hand pose obtained in the preceding steps. The method allows an operator to control a robot through three-dimensional hand motion, using the actions of the hand and arm without the constraints of a physical device, so that the operator can simply concentrate on the task at hand instead of decomposing it into a limited set of commands.

Description

Man-machine interaction method based on hybrid sensor
Technical field
The invention belongs to the field of robot motion, and in particular relates to a human-machine interaction method based on a hybrid sensor.
Background technology
When it is unsuitable for humans to be present at a robot's work site, teleoperation of the robot by a human operator in an unstructured, dynamic environment becomes essential. To accomplish a given task, the mechanical and other contact interfaces used in teleoperation demand unnatural human actions, and they tend to obstruct the operator's movement. Existing vision-based human-machine interfaces either exploit only a few degrees of freedom of the hand or require the operator to make large-amplitude, strenuous motions. This invention proposes a non-contact, multi-sensor robot teleoperation method that allows the operator to complete an operation task by controlling a six-degree-of-freedom virtual robot with natural three-dimensional hand motion. A multi-sensor human-machine interface helps the operator use natural hand actions for control and to understand the robot's motion and its surroundings.
The devices commonly used in teleoperation interfaces are worn by the operator, such as exoskeleton-type mechanisms, instrumented gloves, electromagnetic or inertial motion-tracking sensors, or electromyographic sensors. However, these worn devices, sensors, and their cabling can hinder the operator's free movement. Other contact interfaces, such as replica robot arms, dials, joysticks, computer mice, and computer graphical interfaces, require either unnatural operator actions or operator training. Vision-based interaction techniques have used control schemes combining gesture recognition and speech recognition, allowing the operator to issue commands more naturally and intuitively without the restriction of a physical device. Under such control, however, the motion of the controlled robot is composed of simple commands, such as determining the position and orientation of the end effector or grasping, together with a series of small subtasks such as move up, move down, rotate, pause, continue, and mode changes. When the robot manipulation required by the task is complex, this technique becomes difficult to operate. Single-camera hand-motion tracking has been used in robot teleoperation and its simulation, yet the motion commands are confined to a subset of the real end effector's degrees of freedom. Moreover, gestures unrelated to the specific operation task are used to switch the robot's operating mode, all of which makes the communication of the task unnatural. Ideally, what is needed is a method that is natural, does not decompose a complex task into limited commands, and is free of the restrictions of contact with physical devices.
Summary of the invention
The invention proposes a human-machine interaction method based on a hybrid sensor that allows the operator to control a robot through three-dimensional hand motion. The method uses a non-contact human-machine interface based on hybrid sensors: the robot is controlled by the operator's actions, and in addition the robot's motion and its interaction with objects in the surrounding environment are fed back to the operator. The specific technical scheme is as follows.
The human-machine interaction method based on a hybrid sensor comprises the following steps:
S1. Acquire and process, in real time, the RGB images and depth images of the operator's hand motion during the robot operation task, and obtain the position of the operator's hand;
S2. Measure the attitude of the operator's hand with an inertial measurement unit, and estimate the attitude data in combination with the factored quaternion algorithm (FQA) and Kalman filters (KFs) to obtain a stable attitude of the operator's hand;
S3. Drive the robot according to the operator's hand pose obtained in the above steps.
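The three steps above can be sketched as one iteration of a control loop. The following Python sketch is illustrative only: the function names and the stub return values are hypothetical and are not taken from the patent.

```python
# Hypothetical sketch of the S1-S3 loop; all names and stub values are
# illustrative, not from the patent.

def get_hand_position(rgb_frame, depth_frame):
    """S1: locate the operator's hand from RGB-D data (stubbed here)."""
    return (0.1, 0.2, 0.3)

def get_hand_attitude(imu_sample):
    """S2: filtered hand attitude as a unit quaternion (stubbed here)."""
    return (1.0, 0.0, 0.0, 0.0)

def drive_robot(position, attitude):
    """S3: command the robot with the combined hand pose."""
    return {"position": position, "attitude": attitude}

def control_step(rgb_frame, depth_frame, imu_sample):
    pos = get_hand_position(rgb_frame, depth_frame)   # S1
    att = get_hand_attitude(imu_sample)               # S2
    return drive_robot(pos, att)                      # S3
```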
Further, step S1 comprises the following steps:
While the operator performs the robot operation task by hand motion, RGB images and depth images are captured by a Kinect fixed in front of the operator, and continuous processing of the RGB and depth images in this process realizes hand tracking and localization. The Kinect has three auto-focus cameras: two infrared cameras optimized for depth detection and one standard visible-spectrum camera for visual recognition;
Human skeleton joint points are extracted from the depth image. The 15 extracted joint points, shown in Fig. 1 and ordered from top to bottom and from left to right, are: (1) head; (2) shoulder center; (3) right shoulder; (4) right elbow; (5) right wrist; (6) left shoulder; (7) left elbow; (8) left wrist; (9) hip center; (10) right hip; (11) right knee; (12) right foot; (13) left hip; (14) left knee; (15) left foot. The coordinates of these 15 joint points are called the Kinect coordinates. As shown in Fig. 2, the position of the operator's right hand can be defined as the position of the right wrist $P_5$, but directly using the position of $P_5$ for control would leave the robot's position subject to movement or trembling of the operator's body. To solve this problem, the shoulder joint $P_3$ is taken as a reference point and a right-hand reference frame is built at it. Relative to the reference point, the position $P'_5$ of $P_5$ is:

$$P'_5 = P_5 - P_3 \quad (1)$$
Define the moving range of the right hand relative to the right shoulder as $W_{RH1} < x < W_{RH2}$, $H_{RH1} < y < H_{RH2}$, $L_{RH1} < z < L_{RH2}$, and the moving range of the robot as $W_{RR1} < x < W_{RR2}$, $H_{RR1} < y < H_{RR2}$, $L_{RR1} < z < L_{RR2}$. The mapped point in the robot's moving range is:

$$
\begin{cases}
R_x^R = \dfrac{P'_{8\_x} - W_{RH1}}{W_{RH2} - W_{RH1}} \cdot (W_{RR2} - W_{RR1}) + W_{RR1} \\[4pt]
R_y^R = \dfrac{P'_{8\_y} - H_{RH1}}{H_{RH2} - H_{RH1}} \cdot (H_{RR2} - H_{RR1}) + H_{RR1} \\[4pt]
R_z^R = \dfrac{P'_{8\_z} - L_{RH1}}{L_{RH2} - L_{RH1}} \cdot (L_{RR2} - L_{RR1}) + L_{RR1}
\end{cases} \quad (2)
$$
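Equation (2) is a per-axis linear range mapping from the hand's workspace to the robot's workspace. A short Python sketch of that mapping (illustrative only; the function and variable names are hypothetical, not from the patent):

```python
def map_range(v, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly map v from [src_lo, src_hi] into [dst_lo, dst_hi],
    as in equation (2) for a single axis."""
    return (v - src_lo) / (src_hi - src_lo) * (dst_hi - dst_lo) + dst_lo

def hand_to_robot(p, hand_box, robot_box):
    """Map a shoulder-relative hand position p = (x, y, z) into the robot
    workspace. hand_box and robot_box are per-axis (lo, hi) pairs:
    ((x_lo, x_hi), (y_lo, y_hi), (z_lo, z_hi))."""
    return tuple(map_range(v, *h, *r)
                 for v, h, r in zip(p, hand_box, robot_box))
```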
Step S2 comprises the following steps:
While the operator performs the robot operation task by hand motion, the right hand holds an inertial measurement unit (IMU) that can determine the roll, pitch, and yaw attitude of the operator's right hand. The IMU consists of one tri-axial accelerometer, two dual-axis gyroscopes, and one tri-axial magnetometer;
Define three coordinate systems: the operator's body frame $x_b y_b z_b$, the IMU frame $x_s y_s z_s$, and the earth frame $x_e y_e z_e$. The IMU frame $x_s y_s z_s$ corresponds to the axes of the three orthogonally mounted accelerometers and magnetometers. Because the IMU is rigidly mounted, the operator's body frame $x_b y_b z_b$ is assumed to coincide with the IMU frame $x_s y_s z_s$. The earth-fixed frame $x_e y_e z_e$ follows the north-east-down (NED) convention, where $x_e$ points north, $y_e$ points east, and $z_e$ points down. The IMU can measure its own roll, pitch, and yaw attitude. Define the rotation $\phi$ about $x_e$ as roll, the rotation $\theta$ about $y_e$ as pitch, and the rotation $\psi$ about $z_e$ as yaw. According to Euler's rotation theorem, the Euler angles $\phi$, $\theta$, and $\psi$ are converted to a quaternion $q$:
$$
q = \begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix} =
\begin{bmatrix}
\cos(\phi/2)\cos(\theta/2)\cos(\psi/2) + \sin(\phi/2)\sin(\theta/2)\sin(\psi/2) \\
\sin(\phi/2)\cos(\theta/2)\cos(\psi/2) - \cos(\phi/2)\sin(\theta/2)\sin(\psi/2) \\
\cos(\phi/2)\sin(\theta/2)\cos(\psi/2) + \sin(\phi/2)\cos(\theta/2)\sin(\psi/2) \\
\cos(\phi/2)\cos(\theta/2)\sin(\psi/2) - \sin(\phi/2)\sin(\theta/2)\cos(\psi/2)
\end{bmatrix} \quad (3)
$$

with the constraint on the four Euler parameters:

$$q_0^2 + q_1^2 + q_2^2 + q_3^2 = 1 \quad (4)$$
Here $q_0$ is the scalar part and $(q_1, q_2, q_3)$ is the vector part. The attitude cosine matrix $M_s^e$ from the IMU frame to the earth-fixed frame can then be expressed as:

$$
M_s^e = \begin{bmatrix}
q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_0 q_2 + q_1 q_3) \\
2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\
2(q_1 q_3 - q_0 q_2) & 2(q_0 q_1 + q_2 q_3) & q_0^2 - q_1^2 - q_2^2 + q_3^2
\end{bmatrix} \quad (5)
$$
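Equations (3) to (5) can be checked numerically. The following Python sketch (illustrative; the patent prescribes no implementation language) converts Euler angles to a quaternion per equation (3) and builds the direction cosine matrix of equation (5); the quaternion satisfies constraint (4) by construction.

```python
import math

def euler_to_quaternion(phi, theta, psi):
    """Equation (3): roll phi, pitch theta, yaw psi -> quaternion (q0, q1, q2, q3)."""
    cph, sph = math.cos(phi / 2), math.sin(phi / 2)
    cth, sth = math.cos(theta / 2), math.sin(theta / 2)
    cps, sps = math.cos(psi / 2), math.sin(psi / 2)
    return (
        cph * cth * cps + sph * sth * sps,
        sph * cth * cps - cph * sth * sps,
        cph * sth * cps + sph * cth * sps,
        cph * cth * sps - sph * sth * cps,
    )

def quaternion_to_dcm(q):
    """Equation (5): direction cosine matrix from the sensor frame to the earth frame."""
    q0, q1, q2, q3 = q
    return [
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3), 2*(q0*q2 + q1*q3)],
        [2*(q1*q2 + q0*q3), q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2), 2*(q0*q1 + q2*q3), q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ]
```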
Because the gyroscopes and magnetometer suffer from white noise and random walk, Kalman filtering is used to estimate the state of the IMU from a set of noisy and incomplete measurements so as to obtain stable data. The transfer equations that estimate the IMU state $x_k$ at time $k$ from the state $x_{k-1}$ at time $k-1$ are:

$$x_k = A_k \cdot x_{k-1} + B \cdot u_{k-1} + w_{k-1} \quad (6)$$
$$z_k = H \cdot x_k + v_k$$

Here $x_k$ is the $n \times 1$ state vector representing the IMU state at time $k$; $A$ is the $n \times n$ state transition matrix; $B$ is the $n \times p$ system input matrix; $u_{k-1}$ is the $p \times 1$ deterministic input vector at time $k-1$, where $p$ is the number of deterministic inputs; $w_{k-1}$ is the $n \times 1$ process noise vector at time $k-1$; $z_k$ is the $m \times 1$ measurement vector at time $k$; $H$ is the $m \times n$ observation matrix; and $v_k$ is the $m \times 1$ measurement noise vector. Here $n = 7$ and $m = 6$. The time-dependent quaternion differential equation is:
$$
\begin{bmatrix} \partial q_0/\partial t \\ \partial q_1/\partial t \\ \partial q_2/\partial t \\ \partial q_3/\partial t \end{bmatrix}
=
\begin{bmatrix}
q_0 & -q_1 & -q_2 & -q_3 \\
q_1 & q_0 & -q_3 & q_2 \\
q_2 & q_3 & q_0 & -q_1 \\
q_3 & -q_2 & q_1 & q_0
\end{bmatrix}
\cdot
\begin{bmatrix} 0 \\ v_x/2 \\ v_y/2 \\ v_z/2 \end{bmatrix} \quad (7)
$$

where $v_x$, $v_y$, $v_z$ are the angular rates of the IMU about the axes $x_s$, $y_s$, $z_s$. Since the state $x_k$ comprises the quaternion state and the angular rates, $x_k$ has the form:

$$x_k = [q_{0,k} \; q_{1,k} \; q_{2,k} \; q_{3,k} \; v_{x,k} \; v_{y,k} \; v_{z,k}]^T \quad (8)$$

Here $q_{0,k}$, $q_{1,k}$, $q_{2,k}$, $q_{3,k}$, $v_{x,k}$, $v_{y,k}$, $v_{z,k}$ are the quaternion state and angular rates at time $k$. From equation (7), the state transition matrix is:
$$
A_k = \begin{bmatrix}
1 & 0 & 0 & 0 & -q_{1,k}\,\Delta t/2 & -q_{2,k}\,\Delta t/2 & -q_{3,k}\,\Delta t/2 \\
0 & 1 & 0 & 0 & q_{0,k}\,\Delta t/2 & -q_{3,k}\,\Delta t/2 & q_{2,k}\,\Delta t/2 \\
0 & 0 & 1 & 0 & q_{3,k}\,\Delta t/2 & q_{0,k}\,\Delta t/2 & -q_{1,k}\,\Delta t/2 \\
0 & 0 & 0 & 1 & -q_{2,k}\,\Delta t/2 & q_{1,k}\,\Delta t/2 & q_{0,k}\,\Delta t/2 \\
0 & 0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix} \quad (9)
$$
Here $\Delta t$ is the sampling period. Since there is no control input, let $B = 0_{n \times p}$ and $B u_{k-1} = 0_{n \times 1}$. Estimating the quaternion state and the angular rates, the process noise vector is

$$w_k = [0 \; 0 \; 0 \; 0 \; w_x \; w_y \; w_z]^T \quad (10)$$

where $w_x$, $w_y$, $w_z$ are the process noise of the angular rates. Since the angular rates are measured with the gyroscopes, the observation matrix $H$ is

$$H = [\,0_{3 \times 4} \;\; I_{3 \times 3}\,] \quad (11)$$
To satisfy (4), define $D = \sqrt{q_{0,k}^2 + q_{1,k}^2 + q_{2,k}^2 + q_{3,k}^2}$. The deterministic quaternion $q_k$ at time $k$ can then be normalized as:

$$q_k = [\,q_{0,k}/D \;\; q_{1,k}/D \;\; q_{2,k}/D \;\; q_{3,k}/D\,] \quad (12)$$

Combining formulas (6) to (12), the IMU state $x_k$ at time $k$ can be estimated from the IMU state $x_{k-1}$ at time $k-1$.
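A minimal Python sketch of the prediction-and-normalization portion of the filter described by equations (6) to (12). It is an assumption-laden illustration, not the patent's implementation: only the state propagation of equation (6) with $A_k$ from equation (9) and the renormalization of equation (12) are shown; covariance propagation and the measurement update with $H$ and $z_k$ are omitted.

```python
import math

def predict(x, dt):
    """One prediction step for the state x = [q0, q1, q2, q3, vx, vy, vz],
    applying the transition matrix A_k of equation (9); B = 0, so there is
    no control term. Covariance propagation is deliberately omitted."""
    q0, q1, q2, q3, vx, vy, vz = x
    h = dt / 2.0
    # Quaternion rows of A_k applied to x (identity part plus rate columns),
    # matching the kinematics of equation (7).
    q_pred = [
        q0 + (-q1 * vx - q2 * vy - q3 * vz) * h,
        q1 + ( q0 * vx - q3 * vy + q2 * vz) * h,
        q2 + ( q3 * vx + q0 * vy - q1 * vz) * h,
        q3 + (-q2 * vx + q1 * vy + q0 * vz) * h,
    ]
    # Equation (12): renormalize the quaternion so constraint (4) holds.
    d = math.sqrt(sum(c * c for c in q_pred))
    return [c / d for c in q_pred] + [vx, vy, vz]
```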
Step S3 comprises the following steps:
The inverse solution of the robot's end pose is computed, and each joint then moves by its inversely solved angle; when all joint motions are complete, the gripper of the virtual robot arrives at the specified position. This applies to the case where there are no obstacles in the surroundings, so that the motion of the virtual robot arm cannot collide with obstacles in the environment.
In the Denavit-Hartenberg representation, $A_i$ denotes the homogeneous coordinate transformation matrix from frame $i-1$ to frame $i$:

$$
A_i = \begin{bmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & l_i \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & l_i \sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & r_i \\
0 & 0 & 0 & 1
\end{bmatrix} \quad (13)
$$

For a robot with six joints, define the base frame as frame 1 and the last frame as frame 6. The homogeneous transformation matrix $T_6$ from the base frame to the last frame is defined as:

$$
T_6 = A_1 A_2 \cdots A_6 = \begin{bmatrix} n_6^0 & s_6^0 & a_6^0 & p_6^0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (14)
$$
where $n_6^0$ is the normal vector of the gripper, $s_6^0$ the sliding vector, $a_6^0$ the approach vector, and $p_6^0$ the position vector.
Using (14) and (5), we have:

$$T_6 = M \quad (15)$$

The six joint motion angles $(\theta_1, \theta_2, \ldots, \theta_6)$ are obtained by solving (15).
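Equations (13) and (14) can be sketched in Python: one function builds the Denavit-Hartenberg transform $A_i$ and another chains the per-joint transforms into $T_6$. The parameter names (theta, alpha, l, r) follow equation (13); the example joint values are hypothetical, not from the patent.

```python
import math

def dh_matrix(theta, alpha, l, r):
    """Homogeneous transform A_i of equation (13) for one joint
    (link length l, link offset r)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, l * ct],
        [st,  ct * ca, -ct * sa, l * st],
        [0.0,      sa,       ca,      r],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(a, b):
    """4x4 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(joints):
    """Equation (14): T6 = A1 A2 ... A6 for a list of (theta, alpha, l, r)."""
    t = [[float(i == j) for j in range(4)] for i in range(4)]
    for params in joints:
        t = mat_mul(t, dh_matrix(*params))
    return t
```

Solving equation (15) for the joint angles is the inverse problem and is not shown; the sketch above covers only the forward direction used to verify a candidate solution.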
Compared with the prior art, the present invention has the following advantages and effects:
The present invention proposes a human-machine interaction method based on a hybrid sensor that allows the operator to control the robot through three-dimensional hand motion. The hybrid-sensor interface lets the operator control the robot with the actions of the hand and arm, free of the restrictions of a physical device; the operator need only concentrate on the task at hand and need not decompose the task into a limited set of commands.
Description of drawings
Fig. 1 is the skeleton joint point diagram;
Fig. 2 is the right-hand joint point diagram;
Fig. 3 is the flow chart of human-machine interaction based on the hybrid sensor.
The specific embodiment
The present invention is described in further detail below in conjunction with an embodiment, but the embodiments of the present invention are not limited thereto.
Embodiment:
The human-machine interaction method based on the hybrid sensor of the present invention comprises the following steps:
S1. Acquire and process, in real time, the RGB images and depth images of the operator's hand motion during the robot operation task, and obtain the position of the operator's hand;
S2. Measure the attitude of the operator's hand with an inertial measurement unit, and estimate the attitude data in combination with the factored quaternion algorithm (FQA) and Kalman filters (KFs) to obtain a stable attitude of the operator's hand;
S3. Drive the robot according to the operator's hand pose obtained in the above steps.
Step S1 comprises the following steps:
While the operator performs the robot operation task by hand motion, RGB images and depth images are captured by a Kinect fixed in front of the operator, and continuous processing of the RGB and depth images in this process realizes hand tracking and localization. The Kinect has three auto-focus cameras: two infrared cameras optimized for depth detection and one standard visible-spectrum camera for visual recognition;
Human skeleton joint points are extracted from the depth image. The 15 extracted joint points, shown in Fig. 1 and ordered from top to bottom and from left to right, are: (1) head; (2) shoulder center; (3) right shoulder; (4) right elbow; (5) right wrist; (6) left shoulder; (7) left elbow; (8) left wrist; (9) hip center; (10) right hip; (11) right knee; (12) right foot; (13) left hip; (14) left knee; (15) left foot. The coordinates of these 15 joint points are called the Kinect coordinates. As shown in Fig. 2, the position of the operator's right hand can be defined as the position of the right wrist $P_5$, but directly using the position of $P_5$ for control would leave the robot's position subject to movement or trembling of the operator's body. To solve this problem, the shoulder joint $P_3$ is taken as a reference point and a right-hand reference frame is built at it. Relative to the reference point, the position $P'_5$ of $P_5$ is:

$$P'_5 = P_5 - P_3 \quad (1)$$

Define the moving range of the right hand relative to the right shoulder as $W_{RH1} < x < W_{RH2}$, $H_{RH1} < y < H_{RH2}$, $L_{RH1} < z < L_{RH2}$, and the moving range of the robot as $W_{RR1} < x < W_{RR2}$, $H_{RR1} < y < H_{RR2}$, $L_{RR1} < z < L_{RR2}$. The mapped point in the robot's moving range is:

$$
\begin{cases}
R_x^R = \dfrac{P'_{8\_x} - W_{RH1}}{W_{RH2} - W_{RH1}} \cdot (W_{RR2} - W_{RR1}) + W_{RR1} \\[4pt]
R_y^R = \dfrac{P'_{8\_y} - H_{RH1}}{H_{RH2} - H_{RH1}} \cdot (H_{RR2} - H_{RR1}) + H_{RR1} \\[4pt]
R_z^R = \dfrac{P'_{8\_z} - L_{RH1}}{L_{RH2} - L_{RH1}} \cdot (L_{RR2} - L_{RR1}) + L_{RR1}
\end{cases} \quad (2)
$$
Step S2 comprises the following steps:
While the operator performs the robot operation task by hand motion, the right hand holds an inertial measurement unit (IMU) that can determine the roll, pitch, and yaw attitude of the operator's right hand. The IMU consists of one tri-axial accelerometer, two dual-axis gyroscopes, and one tri-axial magnetometer;
Define three coordinate systems: the operator's body frame $x_b y_b z_b$, the IMU frame $x_s y_s z_s$, and the earth frame $x_e y_e z_e$. The IMU frame $x_s y_s z_s$ corresponds to the axes of the three orthogonally mounted accelerometers and magnetometers. Because the IMU is rigidly mounted, the operator's body frame $x_b y_b z_b$ is assumed to coincide with the IMU frame $x_s y_s z_s$. The earth-fixed frame $x_e y_e z_e$ follows the north-east-down (NED) convention, where $x_e$ points north, $y_e$ points east, and $z_e$ points down. The IMU can measure its own roll, pitch, and yaw attitude. Define the rotation $\phi$ about $x_e$ as roll, the rotation $\theta$ about $y_e$ as pitch, and the rotation $\psi$ about $z_e$ as yaw. According to Euler's rotation theorem, the Euler angles $\phi$, $\theta$, and $\psi$ are converted to a quaternion $q$:

$$
q = \begin{bmatrix} q_0 \\ q_1 \\ q_2 \\ q_3 \end{bmatrix} =
\begin{bmatrix}
\cos(\phi/2)\cos(\theta/2)\cos(\psi/2) + \sin(\phi/2)\sin(\theta/2)\sin(\psi/2) \\
\sin(\phi/2)\cos(\theta/2)\cos(\psi/2) - \cos(\phi/2)\sin(\theta/2)\sin(\psi/2) \\
\cos(\phi/2)\sin(\theta/2)\cos(\psi/2) + \sin(\phi/2)\cos(\theta/2)\sin(\psi/2) \\
\cos(\phi/2)\cos(\theta/2)\sin(\psi/2) - \sin(\phi/2)\sin(\theta/2)\cos(\psi/2)
\end{bmatrix} \quad (3)
$$

with the constraint on the four Euler parameters:

$$q_0^2 + q_1^2 + q_2^2 + q_3^2 = 1 \quad (4)$$

Here $q_0$ is the scalar part and $(q_1, q_2, q_3)$ is the vector part. The attitude cosine matrix $M_s^e$ from the IMU frame to the earth-fixed frame can then be expressed as:

$$
M_s^e = \begin{bmatrix}
q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_0 q_2 + q_1 q_3) \\
2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\
2(q_1 q_3 - q_0 q_2) & 2(q_0 q_1 + q_2 q_3) & q_0^2 - q_1^2 - q_2^2 + q_3^2
\end{bmatrix} \quad (5)
$$

Because the gyroscopes and magnetometer suffer from white noise and random walk, Kalman filtering is used to estimate the state of the IMU from a set of noisy and incomplete measurements so as to obtain stable data. The transfer equations that estimate the IMU state $x_k$ at time $k$ from the state $x_{k-1}$ at time $k-1$ are:

$$x_k = A_k \cdot x_{k-1} + B \cdot u_{k-1} + w_{k-1} \quad (6)$$
$$z_k = H \cdot x_k + v_k$$

Here $x_k$ is the $n \times 1$ state vector representing the IMU state at time $k$; $A$ is the $n \times n$ state transition matrix; $B$ is the $n \times p$ system input matrix; $u_{k-1}$ is the $p \times 1$ deterministic input vector at time $k-1$, where $p$ is the number of deterministic inputs; $w_{k-1}$ is the $n \times 1$ process noise vector at time $k-1$; $z_k$ is the $m \times 1$ measurement vector at time $k$; $H$ is the $m \times n$ observation matrix; and $v_k$ is the $m \times 1$ measurement noise vector. Here $n = 7$ and $m = 6$. The time-dependent quaternion differential equation is:

$$
\begin{bmatrix} \partial q_0/\partial t \\ \partial q_1/\partial t \\ \partial q_2/\partial t \\ \partial q_3/\partial t \end{bmatrix}
=
\begin{bmatrix}
q_0 & -q_1 & -q_2 & -q_3 \\
q_1 & q_0 & -q_3 & q_2 \\
q_2 & q_3 & q_0 & -q_1 \\
q_3 & -q_2 & q_1 & q_0
\end{bmatrix}
\cdot
\begin{bmatrix} 0 \\ v_x/2 \\ v_y/2 \\ v_z/2 \end{bmatrix} \quad (7)
$$

where $v_x$, $v_y$, $v_z$ are the angular rates of the IMU about the axes $x_s$, $y_s$, $z_s$. Since the state $x_k$ comprises the quaternion state and the angular rates, $x_k$ has the form:

$$x_k = [q_{0,k} \; q_{1,k} \; q_{2,k} \; q_{3,k} \; v_{x,k} \; v_{y,k} \; v_{z,k}]^T \quad (8)$$

Here $q_{0,k}$, $q_{1,k}$, $q_{2,k}$, $q_{3,k}$, $v_{x,k}$, $v_{y,k}$, $v_{z,k}$ are the quaternion state and angular rates at time $k$. From equation (7), the state transition matrix is:

$$
A_k = \begin{bmatrix}
1 & 0 & 0 & 0 & -q_{1,k}\,\Delta t/2 & -q_{2,k}\,\Delta t/2 & -q_{3,k}\,\Delta t/2 \\
0 & 1 & 0 & 0 & q_{0,k}\,\Delta t/2 & -q_{3,k}\,\Delta t/2 & q_{2,k}\,\Delta t/2 \\
0 & 0 & 1 & 0 & q_{3,k}\,\Delta t/2 & q_{0,k}\,\Delta t/2 & -q_{1,k}\,\Delta t/2 \\
0 & 0 & 0 & 1 & -q_{2,k}\,\Delta t/2 & q_{1,k}\,\Delta t/2 & q_{0,k}\,\Delta t/2 \\
0 & 0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix} \quad (9)
$$

Here $\Delta t$ is the sampling period. Since there is no control input, let $B = 0_{n \times p}$ and $B u_{k-1} = 0_{n \times 1}$. Estimating the quaternion state and the angular rates, the process noise vector is

$$w_k = [0 \; 0 \; 0 \; 0 \; w_x \; w_y \; w_z]^T \quad (10)$$

where $w_x$, $w_y$, $w_z$ are the process noise of the angular rates. Since the angular rates are measured with the gyroscopes, the observation matrix $H$ is

$$H = [\,0_{3 \times 4} \;\; I_{3 \times 3}\,] \quad (11)$$

To satisfy (4), define $D = \sqrt{q_{0,k}^2 + q_{1,k}^2 + q_{2,k}^2 + q_{3,k}^2}$. The deterministic quaternion $q_k$ at time $k$ can then be normalized as:

$$q_k = [\,q_{0,k}/D \;\; q_{1,k}/D \;\; q_{2,k}/D \;\; q_{3,k}/D\,] \quad (12)$$

Combining formulas (6) to (12), the IMU state $x_k$ at time $k$ can be estimated from the IMU state $x_{k-1}$ at time $k-1$.
Step S3 comprises the following steps:
The inverse solution of the robot's end pose is computed, and each joint then moves by its inversely solved angle; when all joint motions are complete, the gripper of the virtual robot arrives at the specified position. This applies to the case where there are no obstacles in the surroundings, so that the motion of the virtual robot arm cannot collide with obstacles in the environment.
In the Denavit-Hartenberg representation, $A_i$ denotes the homogeneous coordinate transformation matrix from frame $i-1$ to frame $i$:

$$
A_i = \begin{bmatrix}
\cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & l_i \cos\theta_i \\
\sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & l_i \sin\theta_i \\
0 & \sin\alpha_i & \cos\alpha_i & r_i \\
0 & 0 & 0 & 1
\end{bmatrix} \quad (13)
$$

For a robot with six joints, define the base frame as frame 1 and the last frame as frame 6. The homogeneous transformation matrix $T_6$ from the base frame to the last frame is defined as:

$$
T_6 = A_1 A_2 \cdots A_6 = \begin{bmatrix} n_6^0 & s_6^0 & a_6^0 & p_6^0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (14)
$$

where $n_6^0$ is the normal vector of the gripper, $s_6^0$ the sliding vector, $a_6^0$ the approach vector, and $p_6^0$ the position vector.
Using (14) and (5), we have:

$$T_6 = M \quad (15)$$

The six joint motion angles $(\theta_1, \theta_2, \ldots, \theta_6)$ are obtained by solving (15).
The above embodiment is a preferred embodiment of the present invention, but the embodiments of the present invention are not limited thereto; any change, modification, substitution, combination, or simplification made without departing from the spirit and principle of the present invention shall be an equivalent replacement and shall be included within the protection scope of the present invention.

Claims (4)

1. A human-machine interaction method based on a hybrid sensor, characterized in that it comprises the following steps:
S1. Acquire and process, in real time, the RGB images and depth images of the operator's hand motion during the robot operation task, and obtain the position of the operator's hand;
S2. Measure the attitude of the operator's hand with an inertial measurement unit, and estimate the attitude data in combination with a quaternion algorithm and Kalman filtering to obtain a stable attitude of the operator's hand;
S3. Drive the robot according to the operator's hand pose obtained in the above steps.
2. The human-machine interaction method based on a hybrid sensor according to claim 1, characterized in that step S1 comprises:
While the operator performs the robot operation task by hand motion, RGB images and depth images are captured by a Kinect fixed in front of the operator, and continuous processing of the RGB and depth images in this process realizes hand tracking and localization; the Kinect has three auto-focus cameras: two infrared cameras optimized for depth detection and one standard visible-spectrum camera for visual recognition;
Human skeleton joint points are extracted from the depth image. The 15 extracted joint points, ordered from top to bottom and from left to right, are: (1) head; (2) shoulder center; (3) right shoulder; (4) right elbow; (5) right wrist; (6) left shoulder; (7) left elbow; (8) left wrist; (9) hip center; (10) right hip; (11) right knee; (12) right foot; (13) left hip; (14) left knee; (15) left foot. The coordinates of these 15 joint points are called the Kinect coordinates. The position of the operator's right hand is defined as the position of the right wrist $P_5$; with the shoulder joint $P_3$ as a reference point, a right-hand reference frame is built at it. Relative to the reference point, the position $P'_5$ of $P_5$ is:

$$P'_5 = P_5 - P_3 \quad (1)$$

Set the moving range of the right hand relative to the right shoulder as $W_{RH1} < x < W_{RH2}$, $H_{RH1} < y < H_{RH2}$, $L_{RH1} < z < L_{RH2}$, and the moving range of the robot as $W_{RR1} < x < W_{RR2}$, $H_{RR1} < y < H_{RR2}$, $L_{RR1} < z < L_{RR2}$. The mapped point in the robot's moving range is:

$$
\begin{cases}
R_x^R = \dfrac{P'_{8\_x} - W_{RH1}}{W_{RH2} - W_{RH1}} \cdot (W_{RR2} - W_{RR1}) + W_{RR1} \\[4pt]
R_y^R = \dfrac{P'_{8\_y} - H_{RH1}}{H_{RH2} - H_{RH1}} \cdot (H_{RR2} - H_{RR1}) + H_{RR1} \\[4pt]
R_z^R = \dfrac{P'_{8\_z} - L_{RH1}}{L_{RH2} - L_{RH1}} \cdot (L_{RR2} - L_{RR1}) + L_{RR1}
\end{cases} \quad (2)
$$
3. the man-machine interaction method based on hybrid sensor according to claim 1, is characterized in that, described step S2 comprises the following steps:
When the operator carried out robot manipulation's task by hand exercise, the right hand was held an inertia measurement instrument, and this inertia measurement instrument can be determined rolling, pitching and the deflection attitude of operator's right hand; The inertia measurement instrument is by a triaxial accelerometer, and two two axis gyroscope instrument and three axle magnetometers form;
Define three coordinate systems: operator's body coordinate system x by bz b, inertia measurement instrument coordinate system x sy sz sWith terrestrial coordinate system x ey ez eInertia measurement instrument coordinate system x sy sz sThe accelerometer that corresponding three quadratures are installed and the axle of magnetometer; Because the inertia measurement instrument is arranged on the robot end rigidly, suppose operator's body coordinate system x by bz bWith inertia measurement instrument coordinate system x sy sz sIn conjunction with; Earth fixed coordinate system x ey ez eFollow north-Dong-under (NED) pact, wherein x eEnergized north, y ePoint to east and z eUnder sensing; The inertia measurement instrument can be measured rolling, pitching and the deflection attitude of himself; Definition is around x eRotation φ represent to roll, around y eThe rotation θ of axle represents pitching and around z eThe rotation ψ of axle represents deflection; According to Euler's rotation theorem, from Eulerian angles φ, θ and ψ being converted to hypercomplex number q:
q = q 0 q 1 q 2 q 3 = cos ( &phi; / 2 ) cos ( &theta; / 2 ) cos ( &psi; / 2 ) + sin ( &phi; / 2 ) sin ( &theta; / 2 ) sin ( &psi; / 2 ) sin ( &phi; / 2 ) cos ( &theta; / 2 ) cos ( &psi; / 2 ) - cos ( &phi; / 2 ) sin ( &theta; / 2 ) sin ( &psi; / 2 ) cos ( &phi; / 2 ) sin ( &theta; / 2 ) cos ( &psi; / 2 ) + sin ( &phi; / 2 ) cso ( &theta; / 2 ) sin ( &psi; / 2 ) cos ( &phi; / 2 ) cos ( &theta; / 2 ) sin ( &psi; / 2 ) - sin ( &phi; / 2 ) sin ( &theta; / 2 ) cos ( &psi; / 2 ) - - - ( 3 )
The four Euler parameters satisfy the constraint:
$$ q_0^2 + q_1^2 + q_2^2 + q_3^2 = 1 \quad (4) $$
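As an illustrative sketch (not part of the claimed method; the function name is chosen here for clarity), the Euler-angle-to-quaternion conversion of equation (3), which automatically satisfies the unit-norm constraint of (4), can be written as:

```python
import math

def euler_to_quaternion(phi, theta, psi):
    """Convert roll phi, pitch theta, yaw psi to a scalar-first
    unit quaternion (q0, q1, q2, q3) per equation (3)."""
    cr, sr = math.cos(phi / 2), math.sin(phi / 2)
    cp, sp = math.cos(theta / 2), math.sin(theta / 2)
    cy, sy = math.cos(psi / 2), math.sin(psi / 2)
    q0 = cr * cp * cy + sr * sp * sy
    q1 = sr * cp * cy - cr * sp * sy
    q2 = cr * sp * cy + sr * cp * sy
    q3 = cr * cp * sy - sr * sp * cy
    return (q0, q1, q2, q3)

# zero Euler angles correspond to the identity rotation (1, 0, 0, 0)
q = euler_to_quaternion(0.0, 0.0, 0.0)
```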
Here q_0 is the scalar part and (q_1, q_2, q_3) is the vector part. The attitude cosine matrix $M_s^e$ from the inertial measurement unit frame to the earth-fixed frame can then be represented as:
$$ M_s^e = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_0 q_2 + q_1 q_3) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_0 q_1 + q_2 q_3) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix} \quad (5) $$
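A minimal sketch of the attitude cosine matrix of equation (5), for illustration only (the function name is an assumption, not from the patent):

```python
def quaternion_to_dcm(q):
    """Direction cosine matrix of equation (5) for a scalar-first
    unit quaternion q = (q0, q1, q2, q3)."""
    q0, q1, q2, q3 = q
    return [
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3), 2*(q0*q2 + q1*q3)],
        [2*(q1*q2 + q0*q3), q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2), 2*(q0*q1 + q2*q3), q0*q0 - q1*q1 - q2*q2 + q3*q3],
    ]
```

For the identity quaternion (1, 0, 0, 0) this yields the 3x3 identity matrix, as expected for zero rotation.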
Because the gyroscopes and the magnetometer exhibit white noise and random walk, Kalman filtering is used to estimate the state of the inertial measurement unit from a set of noisy and incomplete measurements, so as to obtain stable data. The transfer equations estimating the inertial measurement unit state x_k at time k from the state x_{k-1} at time k-1 are:
$$ x_k = A_k \cdot x_{k-1} + B \cdot u_{k-1} + w_{k-1} \quad (6) $$
$$ z_k = H \cdot x_k + v_k $$
Here x_k is the n×1 state vector representing the inertial measurement unit state at time k; A_k is the n×n state-transition matrix; B is the n×p system input matrix; u_{k-1} is the p×1 deterministic input vector at time k-1, where p is the number of deterministic inputs; w_{k-1} is the n×1 process-noise vector at time k-1; z_k is the m×1 measurement vector at time k; H is the m×n observation matrix; and v_k is the m×1 measurement-noise vector. Here n = 7 and m = 6. The time-dependent quaternion differential equation is:
$$ \begin{bmatrix} \partial q_0 / \partial t \\ \partial q_1 / \partial t \\ \partial q_2 / \partial t \\ \partial q_3 / \partial t \end{bmatrix} = \begin{bmatrix} q_0 & -q_1 & -q_2 & -q_3 \\ q_1 & q_0 & -q_3 & q_2 \\ q_2 & q_3 & q_0 & -q_1 \\ q_3 & -q_2 & q_1 & q_0 \end{bmatrix} \cdot \begin{bmatrix} 0 \\ v_x / 2 \\ v_y / 2 \\ v_z / 2 \end{bmatrix} \quad (7) $$
where v_x, v_y, v_z are the angular velocities of the inertial measurement unit about the axes x_s, y_s, z_s. Since the state x_k comprises the quaternion state and the angular velocities, x_k has the form:
$$ x_k = \begin{bmatrix} q_{0,k} & q_{1,k} & q_{2,k} & q_{3,k} & v_{x,k} & v_{y,k} & v_{z,k} \end{bmatrix}^T \quad (8) $$
where q_{0,k}, q_{1,k}, q_{2,k}, q_{3,k}, v_{x,k}, v_{y,k}, v_{z,k} are the quaternion state and the angular velocities at time k. From equation (7), the state-transition matrix is:
$$ A_k = \begin{bmatrix} 1 & 0 & 0 & 0 & -q_{1,k} \Delta t / 2 & -q_{2,k} \Delta t / 2 & -q_{3,k} \Delta t / 2 \\ 0 & 1 & 0 & 0 & q_{0,k} \Delta t / 2 & -q_{3,k} \Delta t / 2 & q_{2,k} \Delta t / 2 \\ 0 & 0 & 1 & 0 & q_{3,k} \Delta t / 2 & q_{0,k} \Delta t / 2 & -q_{1,k} \Delta t / 2 \\ 0 & 0 & 0 & 1 & -q_{2,k} \Delta t / 2 & q_{1,k} \Delta t / 2 & q_{0,k} \Delta t / 2 \\ 0 & 0 & 0 & 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0 & 0 & 1 \end{bmatrix} \quad (9) $$
where Δt is the sample time. Since there is no control input, B = 0_{n×p} and B·u_{k-1} = 0_{n×1}. The quaternion state is estimated together with the angular velocities, and the process-noise vector is
$$ w_k = \begin{bmatrix} 0 & 0 & 0 & 0 & w_x & w_y & w_z \end{bmatrix}^T \quad (10) $$
where w_x, w_y, w_z are the process noises of the angular velocities. Since the angular velocities are measured with the gyroscopes, the observation matrix H is
$$ H = \begin{bmatrix} 0_{3 \times 4} & I_{3 \times 3} \end{bmatrix} \quad (11) $$
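The rate-dependent columns of A_k in equation (9) follow from the quaternion differential equation (7). As an illustrative sketch (not part of the claims; the function name is an assumption), the derivative of (7) reads:

```python
def quaternion_derivative(q, omega):
    """Quaternion time derivative of equation (7) for a scalar-first
    quaternion q and body angular rates omega = (vx, vy, vz)."""
    q0, q1, q2, q3 = q
    vx, vy, vz = omega
    dq0 = 0.5 * (-q1 * vx - q2 * vy - q3 * vz)
    dq1 = 0.5 * ( q0 * vx - q3 * vy + q2 * vz)
    dq2 = 0.5 * ( q3 * vx + q0 * vy - q1 * vz)
    dq3 = 0.5 * (-q2 * vx + q1 * vy + q0 * vz)
    return (dq0, dq1, dq2, dq3)
```

A useful property of this kinematic equation is that the derivative is always orthogonal to q, so the unit norm of (4) is preserved in continuous time.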
To satisfy (4), define $D = \sqrt{q_{0,k}^2 + q_{1,k}^2 + q_{2,k}^2 + q_{3,k}^2}$. The deterministic quaternion q_k at time k can then be normalized as:
$$ q_k = \begin{bmatrix} q_{0,k}/D & q_{1,k}/D & q_{2,k}/D & q_{3,k}/D \end{bmatrix} \quad (12) $$
Combining formulas (6) through (12), the inertial measurement unit state x_k at time k can be estimated from the state x_{k-1} at time k-1.
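The prediction step described above, with B·u = 0 and the normalization of (12), can be sketched as follows. This is an illustrative sketch under the stated n = 7 state layout; a complete filter would also propagate the covariance and apply the measurement update with H of (11), which are omitted here, and the function names are assumptions:

```python
import numpy as np

def build_A(q, dt):
    """State-transition matrix A_k of equation (9): the quaternion part is
    propagated by the angular rates held in the last three state slots,
    and the rates themselves are modeled as constant."""
    q0, q1, q2, q3 = q
    h = dt / 2.0
    A = np.eye(7)
    A[0, 4:7] = [-q1 * h, -q2 * h, -q3 * h]
    A[1, 4:7] = [ q0 * h, -q3 * h,  q2 * h]
    A[2, 4:7] = [ q3 * h,  q0 * h, -q1 * h]
    A[3, 4:7] = [-q2 * h,  q1 * h,  q0 * h]
    return A

def predict(x, dt):
    """One Kalman prediction step per equation (6) with B*u = 0,
    followed by the quaternion renormalization of equation (12)."""
    A = build_A(x[0:4], dt)
    x_new = A @ x
    x_new[0:4] /= np.linalg.norm(x_new[0:4])  # enforce constraint (4)
    return x_new
```

With zero angular rates the quaternion part is unchanged; with nonzero rates the renormalization keeps the predicted quaternion on the unit sphere despite the first-order discretization.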
4. The human-machine interaction method based on a hybrid sensor according to claim 1, characterized in that said step S3 comprises the following steps:
The end pose of the robot is solved inversely, and each joint then moves through the angle given by the inverse solution; when all joint motions are complete, the gripper end of the virtual robot arrives exactly at the specified position. This applies to the situation where there are no obstacles in the surroundings, so that the motion of the virtual robot arm cannot collide with obstacles in the environment.
In the Denavit-Hartenberg representation, A_i denotes the homogeneous coordinate-transformation matrix from coordinate system i-1 to coordinate system i:
$$ A_i = \begin{bmatrix} \cos\theta_i & -\sin\theta_i \cos\alpha_i & \sin\theta_i \sin\alpha_i & l_i \cos\theta_i \\ \sin\theta_i & \cos\theta_i \cos\alpha_i & -\cos\theta_i \sin\alpha_i & l_i \sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & r_i \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (13) $$
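A minimal sketch of the link transform of equation (13), for illustration only (the function name and argument order are assumptions):

```python
import math

def dh_matrix(theta, alpha, l, r):
    """Homogeneous transform A_i of equation (13) in Denavit-Hartenberg
    parameters: joint angle theta_i, twist alpha_i, link length l_i,
    link offset r_i."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, l * ct],
        [st,  ct * ca, -ct * sa, l * st],
        [0.0,      sa,       ca,      r],
        [0.0,     0.0,      0.0,    1.0],
    ]
```

All parameters zero yields the 4x4 identity, i.e. coincident consecutive frames.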
For a robot with six joints, the base frame is defined as coordinate system 1 and the last frame as coordinate system 6. The homogeneous transformation matrix T_6 from the base frame to the last frame is defined as:
$$ T_6 = A_1 A_2 \cdots A_6 = \begin{bmatrix} n_6^0 & s_6^0 & a_6^0 & p_6^0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \quad (14) $$
where $n_6^0$ is the normal vector of the gripper, $s_6^0$ is the sliding vector, $a_6^0$ is the approach vector, and $p_6^0$ is the position vector.
Combining (14) and (5) gives:
$$ T_6 = M \quad (15) $$
The six joint-motion angle values (θ_1, θ_2, ..., θ_6) are obtained by solving (15).
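The forward-kinematics chain of equation (14), whose inversion yields the joint angles of (15), can be sketched as follows (illustrative only; the function name and the (theta, alpha, l, r) tuple layout are assumptions, and the inverse solution itself depends on the specific six-joint geometry):

```python
import numpy as np

def forward_kinematics(dh_params):
    """Chain the six link transforms of equation (13) into T6 = A1*A2*...*A6
    per equation (14). dh_params is a list of (theta, alpha, l, r) tuples."""
    T = np.eye(4)
    for theta, alpha, l, r in dh_params:
        ct, st = np.cos(theta), np.sin(theta)
        ca, sa = np.cos(alpha), np.sin(alpha)
        A = np.array([
            [ct, -st * ca,  st * sa, l * ct],
            [st,  ct * ca, -ct * sa, l * st],
            [0.0,      sa,       ca,      r],
            [0.0,     0.0,      0.0,    1.0],
        ])
        T = T @ A  # accumulate transforms from base frame to frame 6
    return T
```

Equating the resulting T_6 with the measured hand attitude matrix M, as in (15), and solving for the six angles is the inverse-kinematics step of claim 4.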
CN201310047285.XA 2013-02-06 2013-02-06 Human-machine interaction method based on hybrid sensor Active CN103112007B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310047285.XA CN103112007B (en) 2013-02-06 2013-02-06 Human-machine interaction method based on hybrid sensor

Publications (2)

Publication Number Publication Date
CN103112007A true CN103112007A (en) 2013-05-22
CN103112007B CN103112007B (en) 2015-10-28

Family

ID=48410425

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310047285.XA Active CN103112007B (en) 2013-02-06 2013-02-06 Based on the man-machine interaction method of hybrid sensor

Country Status (1)

Country Link
CN (1) CN103112007B (en)

Cited By (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103398702A (en) * 2013-08-05 2013-11-20 青岛海通机器人系统有限公司 Mobile-robot remote control apparatus and control technology
CN103399637A (en) * 2013-07-31 2013-11-20 西北师范大学 Man-computer interaction method for intelligent human skeleton tracking control robot on basis of kinect
CN103412649A (en) * 2013-08-20 2013-11-27 苏州跨界软件科技有限公司 Control system based on non-contact type hand motion capture
CN103544713A (en) * 2013-10-17 2014-01-29 芜湖金诺数字多媒体有限公司 Human-body projection interaction method on basis of rigid-body physical simulation system
CN104252228A (en) * 2013-06-28 2014-12-31 三星电子株式会社 Display apparatus and method for controlling display apparatus thereof
CN104484574A (en) * 2014-12-25 2015-04-01 东华大学 Real-time human body gesture supervised training correction system based on quaternion
CN104772756A (en) * 2015-01-26 2015-07-15 杭州师范大学 Mechanical arm based on inertial measurement units and control method thereof
CN104959984A (en) * 2015-07-15 2015-10-07 深圳市优必选科技有限公司 Control system of intelligent robot
CN105252538A (en) * 2015-11-06 2016-01-20 邹海英 Novel industrial robot demonstrator
CN105487660A (en) * 2015-11-25 2016-04-13 北京理工大学 Immersion type stage performance interaction method and system based on virtual reality technology
CN106127788A (en) * 2016-07-04 2016-11-16 触景无限科技(北京)有限公司 A kind of vision barrier-avoiding method and device
CN106182003A (en) * 2016-08-01 2016-12-07 清华大学 A kind of mechanical arm teaching method, Apparatus and system
CN106293103A (en) * 2016-10-21 2017-01-04 北京工商大学 Four-axle aircraft gesture control device based on inertial sensor and control method
TWI574800B (en) * 2016-07-19 2017-03-21 Robot control system
CN106647248A (en) * 2017-01-13 2017-05-10 中国科学院深圳先进技术研究院 Method and apparatus for determining the inverse solution result of robots in series
CN108284444A (en) * 2018-01-25 2018-07-17 南京工业大学 Multi-mode human action prediction technique based on Tc-ProMps algorithms under man-machine collaboration
CN108762495A (en) * 2018-05-18 2018-11-06 深圳大学 The virtual reality driving method and virtual reality system captured based on arm action
CN108748144A (en) * 2018-05-28 2018-11-06 上海优尼斯工业服务有限公司 A kind of collision recognition method of man-machine collaboration mechanical arm
CN108777081A (en) * 2018-05-31 2018-11-09 华中师范大学 A kind of virtual Dancing Teaching method and system
CN110480634A (en) * 2019-08-08 2019-11-22 北京科技大学 A kind of arm guided-moving control method for manipulator motion control
CN111923043A (en) * 2020-07-30 2020-11-13 苏州富鑫林光电科技有限公司 Multi-manipulator positioning method based on multi-sensor fusion
CN113829357A (en) * 2021-10-25 2021-12-24 香港中文大学(深圳) Teleoperation method, device, system and medium for robot arm
CN115641647A (en) * 2022-12-23 2023-01-24 海马云(天津)信息技术有限公司 Digital human wrist driving method and device, storage medium and electronic equipment
CN116330305A (en) * 2023-05-30 2023-06-27 常州旭泰克系统科技有限公司 Multi-mode man-machine interaction assembly method, system, equipment and medium thereof
WO2023168849A1 (en) * 2022-03-08 2023-09-14 江南大学 Mechanical arm motion capture method, medium, electronic device, and system
US11766784B1 (en) 2022-03-08 2023-09-26 Jiangnan University Motion capture method and system of robotic arm, medium, and electronic device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102814814A (en) * 2012-07-31 2012-12-12 华南理工大学 Kinect-based man-machine interaction method for two-arm robot
CN102818524A (en) * 2012-07-31 2012-12-12 华南理工大学 On-line robot parameter calibration method based on visual measurement

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FLETCHER DUNN, IAN PARBERRY: "3D Math Primer for Graphics and Game Development", 31 December 2005, Tsinghua University Press, section "Converting Euler Angles to Quaternions", pages 170-171 *
SEONG-HOON PETER WON et al.: "A fastening tool tracking system using an IMU and a position sensor with Kalman filters and a fuzzy expert system", IEEE Transactions on Industrial Electronics, vol. 56, no. 5, 31 May 2009 (2009-05-31), XP 011249133 *
CAO CHUQING, LI RUIFENG, ZHAO LIJUN: "Gesture recognition method based on depth image technology", Computer Engineering, vol. 38, no. 8, 30 April 2012 (2012-04-30) *
WANG MINGDONG: "Gesture control of a PC based on the Kinect skeleton-tracking function", Journal of Zhangzhou Institute of Technology, vol. 14, no. 2, 30 June 2012 (2012-06-30), pages 11-16 *

Also Published As

Publication number Publication date
CN103112007B (en) 2015-10-28

Similar Documents

Publication Publication Date Title
CN103112007B (en) Human-machine interaction method based on hybrid sensor
US20210205986A1 (en) Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose
Wolf et al. Gesture-based robot control with variable autonomy from the JPL BioSleeve
Frank et al. Realizing mixed-reality environments with tablets for intuitive human-robot collaboration for object manipulation tasks
CN102814814B (en) Kinect-based man-machine interaction method for two-arm robot
Fang et al. A robotic hand-arm teleoperation system using human arm/hand with a novel data glove
WO2011065035A1 (en) Method of creating teaching data for robot, and teaching system for robot
CN104570731A (en) Uncalibrated human-computer interaction control system and method based on Kinect
Fang et al. A novel data glove using inertial and magnetic sensors for motion capture and robotic arm-hand teleoperation
Bhuyan et al. Gyro-accelerometer based control of a robotic arm using AVR microcontroller
CN109243575B (en) Virtual acupuncture method and system based on mobile interaction and augmented reality
US20130202212A1 (en) Information processing apparatus, information processing method, and computer program
Pierce et al. A data-driven method for determining natural human-robot motion mappings in teleoperation
Dwivedi et al. Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system
US11422625B2 (en) Proxy controller suit with optional dual range kinematics
Odesanmi et al. Skill learning framework for human–robot interaction and manipulation tasks
Kumar et al. Computer vision based object grasping 6DoF robotic arm using picamera
Zhang et al. A handheld master controller for robot-assisted microsurgery
Kryuchkov et al. Simulation of the «cosmonaut-robot» system interaction on the lunar surface based on methods of machine vision and computer graphics
Li et al. A dexterous hand-arm teleoperation system based on hand pose estimation and active vision
Tanzini et al. New interaction metaphors to control a hydraulic working machine's arm
Tsitos et al. Handling vision noise through robot motion control in a real-time teleoperation system
Bai et al. Kinect-based hand tracking for first-person-perspective robotic arm teleoperation
Mohammad et al. Tele-operation of robot using gestures
Shchekoldin et al. Adaptive head movements tracking algorithms for AR interface controlled telepresence robot

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant