CN103386683B: Kinect-based motion-sensing control method for a manipulator
Abstract
The invention provides a Kinect-based motion-sensing control method for a manipulator, comprising the following steps: obtaining the three-dimensional coordinates of five joints of the right upper limb of a human body with a Kinect sensor; smoothing the five obtained joint coordinate sequences with a double exponential filtering algorithm; constructing vectors from the five smoothed right upper limb joint coordinates in the Kinect's three-dimensional coordinate system, and obtaining the right upper limb angles (the right shoulder joint angle, right elbow joint angle and right wrist joint angle) by computing the included angles between the vectors; and fusing the angle information into a data packet, adding a packet header and checksum, and sending it to the robot over a wireless serial port to control the manipulator. By accurately recognizing human motion, the method achieves flexible and precise control of the manipulator and of a mobile robot, making human-robot interaction friendlier and improving the robot's intelligence.
Description
Technical field
The present invention relates to a robot control method, and in particular to a motion-sensing robot control method.
Background technology
Kinect is a 3D motion-sensing camera that integrates real-time motion capture, image recognition, microphone input, speech recognition and community interaction functions. Microsoft released the Kinect for Windows SDK Beta in June 2011. Kinect is a novel human-computer interaction system and a new kind of body-detecting sensor with very wide applications, such as virtual mirrors, 3D modeling, virtual musical instruments, virtual entertainment and mechanical control. Research on Kinect-based robot control methods is still limited, and using Kinect to achieve flexible robot control has very broad application prospects.
In Japan, with its advanced robotics, scientists have applied Kinect sensors to real-time robot control experiments with some success, demonstrating that controlling a robot with a Kinect sensor is feasible. In the United States, Kinect has been applied to military patrol robots: the sensor detects the three-dimensional environment in front of the robot in real time to support the robot's decision making, and can simultaneously reconstruct a three-dimensional map along the robot's route. In China, few companies or universities have applied Kinect to robot control research, and fewer still have applied it in actual production. Through in-depth study of Kinect and control experiments on a small robot platform, the present invention is expected to apply this technology to the control of patrol and security robots, medical assistance robots, mechanical arms and the like.
Existing robot control methods are all program-based and cannot work synchronously with a person.
Summary of the invention
The object of the present invention is to provide an intelligent, Kinect-based motion-sensing manipulator control method that works synchronously with a person.
This object is achieved as follows:
A Kinect-based motion-sensing manipulator control method according to the present invention is characterized by the following steps:
(1) obtaining the three-dimensional coordinates of 5 joints of the right upper limb of a human body with a Kinect sensor, namely the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates;
(2) smoothing the 5 obtained right upper limb joint coordinate sequences with a double exponential filtering algorithm;
(3) constructing vectors from the 5 smoothed right upper limb joint coordinates in the Kinect's three-dimensional coordinate system, and obtaining the right upper limb angles, namely the right shoulder joint angle, right elbow joint angle and right wrist joint angle, by computing the included angles between the vectors;
(4) fusing the angle information into a data packet, adding a packet header and checksum, and sending it to the robot over a wireless serial port to control the manipulator.
The present invention may further comprise:
1. The double exponential filtering algorithm is as follows. Let t denote time, {x_t} the original data sequence, s_t the double exponential smoothing result at time t, b_t the optimal estimate of the trend of the data sequence at time t, and F_(t+m) the optimal estimate of x at time t+m, where m > 0 is the prediction factor. The double exponential smoothing filter formulas are:

s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1 - α)·(s_(t-1) + b_(t-1)), t > 1
b_t = β·(s_t - s_(t-1)) + (1 - β)·b_(t-1), t > 1
F_(t+m) = s_t + m·b_t

where α is the data smoothing factor (0 < α < 1) and β is the trend smoothing factor (0 < β < 1). Defining F_1 = s_0 + b_0, the value of x at every time step can then be estimated from the above formulas.
2. The smoothing process for the right elbow joint coordinate is:
(1) initializing the parameters: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right elbow joint coordinate output s_n, the optimal trend estimate b_n of the right elbow joint coordinate, the optimal final-result estimate F_(n+1), the current right elbow joint coordinate v_n obtained from the Kinect, and the counting variable n are each given an initial value; n is an integer variable and is incremented by 1 each time a right elbow joint coordinate is obtained;
(2) obtaining the right elbow joint coordinate v_0 from the Kinect and entering the first iteration (n = 0): the trend estimate b_0 is assigned 0, the smoothed output s_0 is assigned the current right elbow joint coordinate v_0, and the final right elbow joint coordinate output is F_1 = s_0 + b_0;
(3) obtaining the right elbow joint coordinate v_1 from the Kinect and entering the second iteration (n = 1): the smoothed output s_1 is assigned the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = (s_1 - s_0)·β, and the final output is F_2 = s_1 + m·b_1;
(4) obtaining the right elbow joint coordinate v_2 from the Kinect and entering the third iteration (n = 2): the smoothed output s_2 is computed from the formula s_n = α·v_n + (1 - α)·(s_(n-1) + b_(n-1)), the trend estimate b_2 from b_n = β·(s_n - s_(n-1)) + (1 - β)·b_(n-1), and the final output from F_(n+1) = s_n + m·b_n;
(5) continuing to iterate as in step (4), increasing the iteration count and computing the smoothed right elbow joint coordinate output; when n overflows it is reset and iteration restarts from step (2); thus, as the person's right arm moves, the right elbow joint coordinate is continuously smoothed.
The remaining 4 right upper limb joint coordinates are smoothed in the same way as the right elbow joint coordinate.
3. The right upper limb joint angle computation method is:
(1) establishing the Kinect coordinate system XYZ and obtaining, under this coordinate system, the three-dimensional coordinates of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand joint T: H(x_h, y_h, z_h), S(x_s, y_s, z_s), E(x_e, y_e, z_e), W(x_w, y_w, z_w) and T(x_t, y_t, z_t); defining global variables A, B and C representing the right shoulder joint angle, right elbow joint angle and right wrist joint angle respectively, all initialized to 0;
(2) computing the vectors SH, SE, ES, EW, WT and WE;
(3) computing ∠A, ∠B and ∠C respectively, and saving the results in the global variables A, B and C.
4. The master controller's basic modules, including the clock, PWM, serial communication and interrupts, are initialized. The controller waits for an arm-angle data packet from the host computer; when a packet arrives, the master controller is notified by an interrupt. After reading the whole packet, the controller computes its checksum; if it matches the checksum sent by the host, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angle values remain unchanged. Each manipulator joint angle value read is converted into the corresponding PWM signal, thereby controlling the manipulator motion.
5. The control methods for the manipulators corresponding to the left upper limb, left lower limb and right lower limb of the human body are identical to the control method for the manipulator corresponding to the right upper limb.
The advantages of the present invention are:
1. By accurately recognizing human motion, the method achieves flexible and precise control of the manipulator and of a mobile robot, making human-robot interaction friendlier and improving the robot's intelligence.
2. Modular programming facilitates porting the program to different robot platforms.
3. The technology can be used for teleoperation of robots, for example long-distance control of explosive-disposal robots, thereby reducing unnecessary casualties.
4. The whole robot control interface is programmed in XAML, separating interface from behavior, which facilitates later expansion and integration; developers can apply their respective strengths and develop in parallel to complete more complex robot control.
Brief description of the drawings
Fig. 1 is the flow chart of the double exponential filtering algorithm of embodiment 1;
Fig. 2 is a schematic diagram of the Kinect coordinate system of embodiment 1;
Fig. 3 is a geometric representation of the right arm of embodiment 1;
Fig. 4 is a geometric representation of the manipulator of embodiment 1;
Fig. 5a and Fig. 5b are manipulator control block diagrams a and b of embodiment 1.
Detailed description of the invention
The present invention is described in more detail below with reference to the accompanying drawings.
Embodiment 1:
With reference to Figs. 1 to 5, the general steps of this embodiment are:
(1) Obtain the three-dimensional coordinates of 20 skeleton points of the human body with a Kinect sensor; the joint coordinates mainly used in the experiments are the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates.
(2) Smooth the obtained right arm joint coordinate data with the double exponential filtering algorithm to reduce jitter during right arm joint motion.
(3) Build vectors from the filtered right arm joint coordinates in the Kinect's three-dimensional coordinate system; all the right arm joint angles, namely the right shoulder joint angle, right elbow joint angle and right wrist joint angle, are then obtained by computing the included angles between the vectors, and the results are saved.
(4) Fuse the angle information into a data packet, add a packet header and checksum, and send it to the robot over a wireless serial port for manipulator control.
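Step (4) can be sketched on the host side as follows. This is a minimal sketch with an assumed packet layout (a two-byte header, one byte per joint angle, and a one-byte additive checksum); the patent specifies only that a header and checksum are added, not their exact format or the serial parameters.

```python
# Hypothetical packet layout: 2 header bytes, 3 angle bytes, 1 checksum byte.
# The patent does not specify the actual format; these values are assumptions.
HEADER = b"\xAA\x55"  # assumed synchronization header

def build_packet(shoulder: int, elbow: int, wrist: int) -> bytes:
    """Fuse the three joint angles (0-180 deg) into a packet with a
    header and additive checksum, ready for the wireless serial port."""
    payload = bytes([shoulder & 0xFF, elbow & 0xFF, wrist & 0xFF])
    checksum = sum(payload) & 0xFF  # assumed: low byte of the payload sum
    return HEADER + payload + bytes([checksum])

pkt = build_packet(90, 45, 120)
# pkt is 6 bytes; it would then be written to the wireless serial port,
# e.g. serial.Serial("/dev/ttyUSB0", 9600).write(pkt) with pyserial.
```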
The specific flow is as follows:
(1) Principle and flow of the double exponential filtering of joint coordinates
The present invention adopts the double exponential smoothing filtering algorithm. Let t denote time, {x_t} the original data sequence, s_t the double exponential smoothing result at time t, b_t the optimal estimate of the trend of the data sequence at time t, and F_(t+m) the optimal estimate of x at time t+m, where m > 0 is the prediction factor. The concrete formulas of the double exponential smoothing filter are:

s_1 = x_0 (1)
b_1 = x_1 - x_0 (2)
s_t = α·x_t + (1 - α)·(s_(t-1) + b_(t-1)), t > 1 (3)
b_t = β·(s_t - s_(t-1)) + (1 - β)·b_(t-1), t > 1 (4)
F_(t+m) = s_t + m·b_t (5)

where α is the data smoothing factor (0 < α < 1) and β is the trend smoothing factor (0 < β < 1). Defining F_1 = s_0 + b_0, the value of x at every time step can be estimated from these formulas. With reference to Fig. 1, the smoothing of the right elbow joint coordinate is taken as an example (the other joint coordinates are smoothed similarly); the concrete smoothing steps are as follows:
Step 1: Initialize the parameters.
The data smoothing factor α is initialized to 0.5, the trend smoothing factor β to 0.25, and the prediction factor m to 0.5. s_n denotes the smoothed right elbow joint coordinate output, initialized to (0, 0, 0); b_n denotes the optimal estimate of the right elbow joint coordinate trend, initialized to (0, 0, 0); F_(n+1) denotes the optimal estimate of the final right elbow joint coordinate, initialized to (0, 0, 0); v_n denotes the current right elbow joint coordinate obtained from the Kinect, initialized to (0, 0, 0); n is the counting variable, defined as an integer, incremented by 1 each time a right elbow joint coordinate is obtained, and initialized to n = 0.
Step 2: Obtain the right elbow joint coordinate v_0 from the Kinect and enter the first iteration (n = 0). The trend estimate b_0 is assigned 0, the smoothed output s_0 is assigned the current right elbow joint coordinate v_0, and the final output is F_1 = s_0 + b_0; n is incremented by 1.
Step 3: Obtain the right elbow joint coordinate v_1 from the Kinect and enter the second iteration (n = 1). The smoothed output s_1 is assigned the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = (s_1 - s_0)·β, and the final output is F_2 = s_1 + m·b_1; n is incremented by 1.
Step 4: Obtain the right elbow joint coordinate v_2 from the Kinect and enter the third iteration (n = 2). The smoothed output s_2 is computed from the formula s_n = α·v_n + (1 - α)·(s_(n-1) + b_(n-1)), the trend estimate b_2 from b_n = β·(s_n - s_(n-1)) + (1 - β)·b_(n-1), and the final output from F_(n+1) = s_n + m·b_n; n is incremented by 1.
Step 5: Obtain the next right elbow joint coordinate v_n from the Kinect and repeat Step 4. The iteration counter n is incremented continually; since n is an integer variable it eventually overflows and is reset, and iteration resumes from Step 2. Thus, as the person's right arm moves in front of the Kinect, the right elbow joint coordinate is continuously smoothed.
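The iteration in Steps 1 to 5 can be condensed into a short routine. The sketch below smooths a scalar coordinate stream with the patent's recursion (in practice one such filter would run per axis for the x, y, z components of each joint); overflow handling of the counter is omitted.

```python
def double_exp_smooth(samples, alpha=0.5, beta=0.25, m=0.5):
    """Double exponential smoothing per the patent's iteration:
    n = 0: s_0 = v_0, b_0 = 0
    n = 1: s_1 = (v_0 + v_1) / 2, b_1 = (s_1 - s_0) * beta
    n > 1: s_n = alpha*v_n + (1 - alpha)*(s_(n-1) + b_(n-1)),
           b_n = beta*(s_n - s_(n-1)) + (1 - beta)*b_(n-1)
    The output at each step is F_(n+1) = s_n + m*b_n
    (equal to s_0 + b_0 at n = 0, since b_0 = 0).
    """
    outputs = []
    s_prev = b_prev = 0.0
    for n, v in enumerate(samples):
        if n == 0:
            s, b = float(v), 0.0
        elif n == 1:
            s = (samples[0] + v) / 2.0
            b = (s - s_prev) * beta
        else:
            s = alpha * v + (1 - alpha) * (s_prev + b_prev)
            b = beta * (s - s_prev) + (1 - beta) * b_prev
        outputs.append(s + m * b)
        s_prev, b_prev = s, b
    return outputs

# A constant input passes through unchanged; a jittery input is damped.
double_exp_smooth([1.0, 1.0, 1.0])  # -> [1.0, 1.0, 1.0]
```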
(2) Arm joint angle computation method
The Kinect coordinate system is shown in Fig. 2 and the geometry of the right arm in Fig. 3. Under the Kinect coordinate system, the three-dimensional coordinates (x, y, z) of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand T are obtained, first smoothed by double exponential smoothing, and then used to build vectors. According to the three-dimensional vector angle formula, for two vectors (x_1, y_1, z_1) and (x_2, y_2, z_2), the cosine of the included angle is their dot product divided by the product of their norms, so the angle is

$\theta=\arccos\frac{x_1x_2+y_1y_2+z_1z_2}{\sqrt{x_1^2+y_1^2+z_1^2}\cdot\sqrt{x_2^2+y_2^2+z_2^2}}$

and each arm joint angle is finally computed from it.
a. Right shoulder joint angle calculation
The right shoulder joint angle is angle A in Fig. 2. Centered at the right shoulder joint S, with vectors SH and SE, the right shoulder joint angle is
$\angle A=\arccos\frac{(x_h-x_s)(x_e-x_s)+(y_h-y_s)(y_e-y_s)+(z_h-z_s)(z_e-z_s)}{\sqrt{(x_h-x_s)^2+(y_h-y_s)^2+(z_h-z_s)^2}\cdot\sqrt{(x_e-x_s)^2+(y_e-y_s)^2+(z_e-z_s)^2}}$
b. Right elbow joint angle calculation
The right elbow joint angle is angle B in Fig. 2. Centered at the right elbow joint E, with vectors ES and EW, the right elbow joint angle is
$\angle B=\arccos\frac{(x_s-x_e)(x_w-x_e)+(y_s-y_e)(y_w-y_e)+(z_s-z_e)(z_w-z_e)}{\sqrt{(x_s-x_e)^2+(y_s-y_e)^2+(z_s-z_e)^2}\cdot\sqrt{(x_w-x_e)^2+(y_w-y_e)^2+(z_w-z_e)^2}}$
c. Right wrist joint angle calculation
The right wrist joint angle is angle C in Fig. 2. Centered at the right wrist joint W, with vectors WT and WE, the right wrist joint angle is
$\angle C=\arccos\frac{(x_t-x_w)(x_e-x_w)+(y_t-y_w)(y_e-y_w)+(z_t-z_w)(z_e-z_w)}{\sqrt{(x_t-x_w)^2+(y_t-y_w)^2+(z_t-z_w)^2}\cdot\sqrt{(x_e-x_w)^2+(y_e-y_w)^2+(z_e-z_w)^2}}$
The concrete steps are:
Step 1: Define global variables A, B and C representing the right shoulder joint angle, right elbow joint angle and right wrist joint angle respectively, all initialized to 0.
Step 2: Obtain the smoothed right hip joint coordinate H(x_h, y_h, z_h), right shoulder joint coordinate S(x_s, y_s, z_s), right elbow joint coordinate E(x_e, y_e, z_e), right wrist joint coordinate W(x_w, y_w, z_w) and right hand joint coordinate T(x_t, y_t, z_t). Compute the vectors SH, SE, ES, EW, WT and WE according to the vector formulas above.
Step 3: Compute ∠A according to the right shoulder joint angle formula above, and likewise compute the right elbow joint angle ∠B and the right wrist joint angle ∠C; save the results in the global variables A, B and C respectively. Fuse the angle information into a data packet, add a packet header and checksum, send the packet to the robot over the wireless serial port, and return to Step 2.
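Steps 1 to 3 amount to three applications of the vector-angle formula. A minimal sketch, using illustrative coordinates rather than real Kinect output:

```python
import math

def vector_angle(center, p1, p2):
    """Angle (degrees) at `center` between vectors center->p1 and center->p2,
    computed with the arccos dot-product formula from the description."""
    u = [a - c for a, c in zip(p1, center)]
    v = [a - c for a, c in zip(p2, center)]
    dot = sum(x * y for x, y in zip(u, v))
    norm_u = math.sqrt(sum(x * x for x in u))
    norm_v = math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(dot / (norm_u * norm_v)))

# Illustrative smoothed joint coordinates (Kinect frame) -- not real data.
H = (0.0, -0.4, 2.0)  # right hip
S = (0.0, 0.0, 2.0)   # right shoulder
E = (0.3, 0.0, 2.0)   # right elbow
W = (0.3, 0.3, 2.0)   # right wrist
T = (0.3, 0.5, 2.0)   # right hand

A = vector_angle(S, H, E)  # shoulder angle between SH and SE (90 deg here)
B = vector_angle(E, S, W)  # elbow angle between ES and EW (90 deg here)
C = vector_angle(W, T, E)  # wrist angle between WT and WE (~180 deg: T, W, E collinear)
```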
(3) Manipulator motion control
The geometry of the manipulator is shown in Fig. 4. Its master controller is an MC9S12XS128 microcontroller. Each manipulator joint is driven by a servo: servo No. 1 corresponds to the right shoulder joint, servo No. 2 to the right elbow joint and servo No. 3 to the right wrist joint. When the arm rests against the body, the manipulator coincides with the 0° reference line in the figure. The concrete manipulator motion control steps are as follows:
Step 1: Initialize the basic modules of the MC9S12XS128, including the clock, PWM, serial communication and interrupts.
Step 2: Wait for an arm-angle data packet from the host computer. When a packet arrives, the master controller is notified by an interrupt. After reading the whole packet, the controller computes its checksum; if it matches the checksum sent by the host, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angle values remain unchanged.
Step 3: Convert each manipulator joint angle value into the corresponding PWM signal to control the manipulator motion, then return to Step 2.
By repeating the above steps, the manipulator tracks the motion of the arm; the overall control block diagram is shown in Fig. 5. In experiments the manipulator tracked the arm motion smoothly and without jitter, verifying that the above filtering algorithm and arm angle computation method are correct and effective.
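The receive-validate-convert loop of Steps 2 and 3 can be sketched as follows. The packet layout (2 header bytes, 3 angle bytes, 1 additive-checksum byte) and the 500-2500 microsecond servo pulse range are assumptions for illustration; the patent only states that a header and checksum are used and that angles are converted to PWM signals.

```python
def parse_packet(pkt: bytes, header: bytes = b"\xAA\x55"):
    """Validate an arm-angle packet; return the three joint angles,
    or None if the packet is invalid (previous angles are then kept)."""
    if len(pkt) != 6 or not pkt.startswith(header):
        return None
    payload, checksum = pkt[2:5], pkt[5]
    if sum(payload) & 0xFF != checksum:  # recompute and compare checksum
        return None
    return list(payload)

def angle_to_pulse_us(angle: int) -> int:
    """Map a joint angle (0-180 deg) to a servo pulse width in microseconds,
    assuming the common 500-2500 us range; the MC9S12XS128 PWM module would
    be loaded with the duty value corresponding to this pulse width."""
    return int(500 + (angle / 180.0) * 2000)

angles = parse_packet(b"\xAA\x55\x5A\x2D\x78\xFF")  # 90, 45, 120 degrees
pulses = [angle_to_pulse_us(a) for a in angles]     # -> [1500, 1000, 1833]
```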
Claims (5)
1. A Kinect-based motion-sensing manipulator control method, characterized by:
(1) obtaining the three-dimensional coordinates of 5 joints of the right upper limb of a human body with a Kinect sensor, namely the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates;
(2) smoothing the 5 obtained right upper limb joint coordinate sequences with a double exponential filtering algorithm;
(3) constructing vectors from the 5 smoothed right upper limb joint coordinates in the Kinect's three-dimensional coordinate system, and obtaining the right upper limb angles, namely the right shoulder joint angle, right elbow joint angle and right wrist joint angle, by computing the included angles between the vectors;
(4) fusing the angle information into a data packet, adding a packet header and checksum, and sending it to the robot over a wireless serial port to control the manipulator;
the double exponential filtering algorithm being as follows: let t denote time, {x_t} the original data sequence, s_t the double exponential smoothing result at time t, b_t the optimal estimate of the trend of the data sequence at time t, and F_(t+m) the optimal estimate of x at time t+m, where m > 0 is the prediction factor; the double exponential smoothing filter formulas are:

s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1 - α)·(s_(t-1) + b_(t-1)), t > 1
b_t = β·(s_t - s_(t-1)) + (1 - β)·b_(t-1), t > 1
F_(t+m) = s_t + m·b_t

where α is the data smoothing factor (0 < α < 1) and β is the trend smoothing factor (0 < β < 1); defining F_1 = s_0 + b_0, the value of x at every time step can then be estimated from the above formulas.
2. The Kinect-based motion-sensing manipulator control method according to claim 1, characterized in that the smoothing process for the right elbow joint coordinate is:
(1) initializing the parameters: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right elbow joint coordinate output s_n, the optimal trend estimate b_n of the right elbow joint coordinate, the optimal final-result estimate F_(n+1), the current right elbow joint coordinate v_n obtained from the Kinect, and the counting variable n are each given an initial value; n is an integer variable and is incremented by 1 each time a right elbow joint coordinate is obtained;
(2) obtaining the right elbow joint coordinate v_0 from the Kinect and entering the first iteration (n = 0): the trend estimate b_0 is assigned 0, the smoothed output s_0 is assigned the current right elbow joint coordinate v_0, and the final right elbow joint coordinate output is F_1 = s_0 + b_0;
(3) obtaining the right elbow joint coordinate v_1 from the Kinect and entering the second iteration (n = 1): the smoothed output s_1 is assigned the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = (s_1 - s_0)·β, and the final output is F_2 = s_1 + m·b_1;
(4) obtaining the right elbow joint coordinate v_2 from the Kinect and entering the third iteration (n = 2): the smoothed output s_2 is computed from the formula s_n = α·v_n + (1 - α)·(s_(n-1) + b_(n-1)), the trend estimate b_2 from b_n = β·(s_n - s_(n-1)) + (1 - β)·b_(n-1), and the final output from F_(n+1) = s_n + m·b_n;
(5) continuing to iterate as in step (4), increasing the iteration count and computing the smoothed right elbow joint coordinate output; when n overflows it is reset and iteration restarts from step (2); thus, as the person's right arm moves, the right elbow joint coordinate is continuously smoothed;
the remaining 4 right upper limb joint coordinates being smoothed in the same way as the right elbow joint coordinate.
3. The Kinect-based motion-sensing manipulator control method according to claim 2, characterized in that the right upper limb joint angle computation method is:
(1) establishing the Kinect coordinate system XYZ and obtaining, under this coordinate system, the three-dimensional coordinates of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand joint T: H(x_h, y_h, z_h), S(x_s, y_s, z_s), E(x_e, y_e, z_e), W(x_w, y_w, z_w) and T(x_t, y_t, z_t); defining global variables A, B and C representing the right shoulder joint angle, right elbow joint angle and right wrist joint angle respectively, all initialized to 0;
(2) computing the vectors SH, SE, ES, EW, WT and WE;
(3) computing ∠A, ∠B and ∠C respectively, and saving the results in the global variables A, B and C.
4. The Kinect-based motion-sensing manipulator control method according to claim 3, characterized in that: the master controller's basic modules, including the clock, PWM, serial communication and interrupts, are initialized; the controller waits for an arm-angle data packet from the host computer and is notified by an interrupt when a packet arrives; after reading the whole packet, the controller computes its checksum, and if it matches the checksum sent by the host the packet is valid and is parsed to read the angle value of each manipulator joint, otherwise the packet is invalid and the previous joint angle values remain unchanged; each manipulator joint angle value read is converted into the corresponding PWM signal, thereby controlling the manipulator motion.
5. The Kinect-based motion-sensing manipulator control method according to any one of claims 1 to 4, characterized in that the control methods for the manipulators corresponding to the left upper limb, left lower limb and right lower limb of the human body are identical to the control method for the manipulator corresponding to the right upper limb.
Priority Applications (1)
Application Number: CN201310328791.6A | Priority Date: 2013-07-31 | Filing Date: 2013-07-31 | Title: Kinect-based motion sensing control method for manipulator
Publications (2)
CN103386683A, published 2013-11-13
CN103386683B, published 2015-04-08
Family
ID=49531201
Family Applications (1)
Application Number  Title  Priority Date  Filing Date 

CN201310328791.6A Active CN103386683B (en)  20130731  20130731  Kinectbased motion sensingcontrol method for manipulator 
Country Status (1)
Country  Link 

CN (1)  CN103386683B (en) 
Families Citing this family (23)
Publication number  Priority date  Publication date  Assignee  Title 

CN104460972A (en) *  20131125  20150325  安徽寰智信息科技股份有限公司  Humancomputer interaction system based on Kinect 
CN103760976B (en) *  2014-01-09  2016-10-05  华南理工大学  Kinect-based gesture recognition intelligent home control method and system 
JP5986125B2 (en) *  2014-02-28  2016-09-06  ファナック株式会社  Mechanical system with wireless sensor 
CN103921266A (en) *  2014-04-15  2014-07-16  哈尔滨工程大学  Method for somatosensory control of a snow robot based on Kinect 
CN103995478B (en) *  2014-05-30  2016-05-18  山东建筑大学  Modular press machine tool arm experiment platform and method based on virtual reality interaction 
CN104227724B (en) *  2014-08-28  2017-01-18  北京易拓智谱科技有限公司  Visual-identity-based manipulation method for the end position of a universal robot 
CN104808788B (en) *  2015-03-18  2017-09-01  北京工业大学  Method for manipulating a user interface with non-contact gestures 
CN106607910B (en) *  2015-10-22  2019-03-22  中国科学院深圳先进技术研究院  Real-time robot imitation method 
CN105904457B (en) *  2016-05-16  2018-03-06  西北工业大学  Heterogeneous redundant mechanical arm control method based on a position tracker and data glove 
CN106095083A (en) *  2016-06-02  2016-11-09  深圳奥比中光科技有限公司  Somatosensory instruction determination method and somatosensory interaction device 
CN106095087A (en) *  2016-06-02  2016-11-09  深圳奥比中光科技有限公司  Somatosensory interaction system and method 
CN106095082A (en) *  2016-06-02  2016-11-09  深圳奥比中光科技有限公司  Somatosensory interaction method, system and device 
CN106313072A (en) *  2016-10-12  2017-01-11  南昌大学  Kinect-based humanoid robot that imitates human motion 
CN106384115B (en) *  2016-10-26  2019-10-22  武汉工程大学  Mechanical arm joint angle detection method 
CN107309872B (en) *  2017-05-08  2021-06-15  南京航空航天大学  Flying robot with a mechanical arm and control method thereof 
CN107272593A (en) *  2017-05-23  2017-10-20  陕西科技大学  Kinect-based robot somatosensory programming method 
CN108127667B (en) *  2018-01-18  2021-01-05  西北工业大学  Mechanical arm somatosensory interaction control method based on joint angle increments 
CN109108970A (en) *  2018-08-22  2019-01-01  南通大学  Reciprocating mechanical arm control method based on skeleton node information 
CN109176512A (en) *  2018-08-31  2019-01-11  南昌与德通讯技术有限公司  Method for motion-sensing control of a robot, robot, and control device 
WO2020133628A1 (en) *  2018-12-29  2020-07-02  深圳市工匠社科技有限公司  Humanoid robotic arm somatosensory control system and related product 
CN110238853A (en) *  2019-06-18  2019-09-17  广州市威控机器人有限公司  Serial-joint mobile robot control system, remote control system, and method 
CN110216676A (en) *  2019-06-21  2019-09-10  深圳盈天下视觉科技有限公司  Mechanical arm control method, manipulator control device, and terminal device 
CN110978064A (en) *  2019-12-11  2020-04-10  山东大学  Human body safety assessment method and system for human-machine cooperation 
Citations (4)
Publication number  Priority date  Publication date  Assignee  Title 

CN102727362A (en) *  2012-07-20  2012-10-17  上海海事大学  NUI (Natural User Interface)-based peripheral arm motion tracking rehabilitation training system and training method 
CN102814814A (en) *  2012-07-31  2012-12-12  华南理工大学  Kinect-based man-machine interaction method for a two-arm robot 
CN102830798A (en) *  2012-07-31  2012-12-19  华南理工大学  Marker-free hand tracking method for a single-arm robot based on Kinect 
CN103170973A (en) *  2013-03-28  2013-06-26  上海理工大学  Man-machine cooperation device and method based on a Kinect camera 
Family Cites Families (1)
Publication number  Priority date  Publication date  Assignee  Title 

JP2013013969A (en) *  2011-07-04  2013-01-24  Hirotaka Niitsuma  Robot control by Microsoft Kinect(R) and application thereof 

2013
 2013-07-31: Application CN201310328791.6A filed in China; granted as CN103386683B (status: Active)
NonPatent Citations (1)
Title 

Research on real-time 3D reconstruction and filtering algorithms based on Kinect depth information; Chen Xiaoming; Application Research of Computers; 2013-04-30; Vol. 30, No. 4; pp. 12-16 * 
Also Published As
Publication number  Publication date 

CN103386683A (en)  2013-11-13 
Similar Documents
Publication  Publication Date  Title 

CN103386683B (en)  Kinect-based motion-sensing control method for a manipulator  
Qian et al.  Developing a gesture-based remote human-robot interaction system using Kinect  
CN107239728A (en)  Unmanned aerial vehicle interaction device and method based on deep-learning attitude estimation  
CN105241461A (en)  Map creation and positioning method for a robot, and robot system  
CN105807926A (en)  Unmanned aerial vehicle man-machine interaction method based on three-dimensional continuous gesture recognition  
CN103921266A (en)  Method for somatosensory control of a snow robot based on Kinect  
CN102236414A (en)  Picture operation method and system in a three-dimensional display space  
CN103440037A (en)  Real-time interactive virtual human motion control method based on limited input information  
CN110834329B (en)  Exoskeleton control method and device  
CN103942829A (en)  Single-image human body three-dimensional posture reconstruction method  
Guo et al.  Two-stream convolutional neural network for accurate RGB-D fingertip detection using depth and edge information  
CN110415322A (en)  Method and device for generating action commands for a virtual object model  
Li  Human-robot interaction based on gesture and movement recognition  
CN105068536A (en)  Moving-substrate trajectory planner based on a nonlinear optimization method  
CN105005381A (en)  Shake elimination method for virtual robot arm interaction  
CN107336238A (en)  Control system for an omnidirectional mobile robot  
CN103729879A (en)  Stable virtual-hand grasping method based on force computation  
CN106406875A (en)  Virtual digital sculpture method based on natural gestures  
CN108664126B (en)  Deformable hand-grabbing interaction method in a virtual reality environment  
Yano et al.  A facial expression parameterization by elastic surface model  
CN205247208U (en)  Robotic system  
CN109543703A (en)  Method and device for processing sensor data  
CN103699214A (en)  Three-dimensional tracking and interaction method based on three-dimensional natural gestures  
Berti et al.  Kalman filter for tracking robotic arms using low-cost 3D vision systems  
CN107168556A (en)  Air mouse multi-data fusion method, air mouse and air mouse control system 
Legal Events
Date  Code  Title  Description 

C06  Publication  
PB01  Publication  
C10  Entry into substantive examination  
SE01  Entry into force of request for substantive examination  
C14  Grant of patent or utility model  
GR01  Patent grant  
TR01  Transfer of patent right  
Effective date of registration: 2020-12-31 
Address after: Area A129, 4th Floor, Building 4, Baitai Industrial Park, Yazhouwan Science and Technology City, Yazhou District, Sanya City, Hainan Province, 572024 
Patentee after: Nanhai Innovation and Development Base of Sanya Harbin Engineering University 
Address before: Intellectual Property Office, Science and Technology Office, Harbin Engineering University, 145 Nantong Avenue, Nangang District, Harbin, Heilongjiang, 150001 
Patentee before: HARBIN ENGINEERING UNIVERSITY 