CN103386683A - Kinect-based motion sensing-control method for manipulator - Google Patents

Kinect-based motion sensing-control method for manipulator

Info

Publication number
CN103386683A
CN103386683A (application CN201310328791.6A; granted as CN103386683B)
Authority
CN
China
Prior art keywords
kinect
coordinate
elbow joint
joint
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013103287916A
Other languages
Chinese (zh)
Other versions
CN103386683B (en)
Inventor
莫宏伟
孟龙龙
徐立芳
董会云
蒋兴洲
雍升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanhai Innovation And Development Base Of Sanya Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN201310328791.6A priority Critical patent/CN103386683B/en
Publication of CN103386683A publication Critical patent/CN103386683A/en
Application granted granted Critical
Publication of CN103386683B publication Critical patent/CN103386683B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Abstract

The invention provides a Kinect-based motion-sensing control method for a manipulator, comprising the following steps: obtaining the three-dimensional coordinates of five joints of the right upper limb of a human body with a Kinect sensor; smoothing the acquired five joint-coordinate streams with a double exponential filtering algorithm; constructing vectors in the Kinect's three-dimensional coordinate system from the smoothed joint coordinates and obtaining the right-upper-limb angles, namely the right shoulder, right elbow and right wrist joint angles, by computing the angles between the vectors; and fusing the angle information into a data packet, adding a packet header and a checksum, and sending the packet to the robot over a wireless serial port to control the manipulator. By accurately recognizing human actions, the method achieves flexible and precise control of the manipulator and of a mobile robot, making human-robot interaction friendlier and improving the robot's intelligence.

Description

A Kinect-based motion-sensing control method for a manipulator
Technical field
The present invention relates to a robot control method, specifically a motion-sensing robot control method.
Background technology
Kinect is a 3D motion-sensing camera that integrates real-time motion capture, image recognition, microphone input, speech recognition and community-interaction functions. Microsoft released the Kinect for Windows SDK Beta in June 2011. Kinect is a novel human-computer interaction system and a new type of body-detection sensor with very wide applications, such as virtual mirrors, 3D modeling, virtual musical instruments, virtual entertainment and machine control. Research on Kinect-based robot control methods is still scarce, and using Kinect to achieve flexible robot control has a very broad application prospect.
In Japan, drawing on its advanced robotics, scientists have applied the Kinect sensor to real-time robot control experiments with good results, proving that Kinect-based robot control is feasible. In the United States, Kinect has been applied to military patrol robots: it detects the three-dimensional environment in front of the robot in real time for motion decision-making, and can simultaneously reconstruct a three-dimensional map of the area the robot has traversed. Few domestic companies or universities have yet applied Kinect to robot control research, and fewer still have applied it in actual production. Through in-depth study of Kinect and control experiments on a small robot platform, the present invention is expected to be applied to the control of patrol and security robots, medical assistance robots, manipulators and the like.
Existing robot control methods are all program-driven and cannot work in synchrony with a human operator.
Summary of the invention
The object of the present invention is to provide an intelligent Kinect-based motion-sensing control method that lets a manipulator work in synchrony with a person.
This object is achieved as follows:
A Kinect-based motion-sensing control method for a manipulator according to the present invention is characterized in that:
(1) the three-dimensional coordinates of five joints of the right upper limb of a human body are obtained with a Kinect sensor, namely the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates;
(2) the acquired five joint-coordinate streams of the right upper limb are smoothed with a double exponential filtering algorithm;
(3) vectors are constructed in the Kinect's three-dimensional coordinate system from the smoothed joint coordinates, and the right-upper-limb angles, namely the right shoulder, right elbow and right wrist joint angles, are obtained by computing the angles between these vectors;
(4) the angle information is fused into a data packet, a packet header and a checksum are added, and the packet is sent to the robot over a wireless serial port to control the manipulator.
The present invention may further comprise:
1. The double exponential filtering algorithm is:
Let t denote time, {x_t} the original data sequence, {s_t} the double-exponentially smoothed result at time t, {b_t} the optimal estimate of the data-sequence trend at time t, and F_{t+m} the optimal estimate of x at time t+m, where m > 0 is the prediction factor. The double exponential smoothing formulas are:
s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1 - α)(s_{t-1} + b_{t-1}), t > 1
b_t = β(s_t - s_{t-1}) + (1 - β)·b_{t-1}, t > 1
F_{t+m} = s_t + m·b_t
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1. Defining F_1 = s_0 + b_0, the value of x at every time instant can be estimated from the above formulas.
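As a minimal sketch, the recurrence above can be written as a batch filter. The function name `double_exp_smooth`, the list interface and the default parameter values (taken from the embodiment below) are illustrative, not prescribed by the patent:

```python
# Minimal sketch of the double exponential smoothing recurrence:
# s_1 = x_0, b_1 = x_1 - x_0, then the s_t / b_t updates and the
# prediction F_{t+m} = s_t + m*b_t for each subsequent sample.
def double_exp_smooth(xs, alpha=0.5, beta=0.25, m=0.5):
    """Return the smoothed estimates F for a scalar sequence xs."""
    if len(xs) < 2:
        return list(xs)
    s = xs[0]              # s_1 = x_0
    b = xs[1] - xs[0]      # b_1 = x_1 - x_0
    out = [s + b]          # F_1 = s_0 + b_0
    for x in xs[2:]:
        s_prev = s
        s = alpha * x + (1 - alpha) * (s + b)     # s_t
        b = beta * (s - s_prev) + (1 - beta) * b  # b_t
        out.append(s + m * b)                     # F_{t+m} = s_t + m*b_t
    return out
```

For the three-dimensional joint coordinates the same filter would be applied per axis.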
2. The process for smoothing the right elbow joint coordinates is:
(1) the parameters are initialized: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right-elbow-joint output s_n, the optimal trend estimate b_n, the optimal final estimate F_{n+1}, the current right-elbow-joint coordinate v_n obtained from Kinect, and the counting variable n are each given initial values; n is an integer variable that is incremented each time a right-elbow-joint coordinate is acquired;
(2) the right-elbow-joint coordinate v_0 is obtained from Kinect and the first iteration, n = 0, is entered: the trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current coordinate v_0, and the final output is F_1 = s_0 + b_0;
(3) the right-elbow-joint coordinate v_1 is obtained from Kinect and the second iteration, n = 1, is entered: the smoothed output s_1 is set to the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = β(s_1 - s_0), and the final output is F_2 = s_1 + m·b_1;
(4) the right-elbow-joint coordinate v_2 is obtained from Kinect and the third iteration, n = 2, is entered: the smoothed output s_2 is computed from s_n = α·v_n + (1 - α)(s_{n-1} + b_{n-1}), the trend estimate b_2 from b_n = β(s_n - s_{n-1}) + (1 - β)·b_{n-1}, and the final output from F_{n+1} = s_n + m·b_n;
(5) iteration continues as in step (4), computing the smoothed right-elbow-joint output, until n overflows and is cleared to zero, at which point the procedure returns to step (2) and restarts; as the person's right arm moves, the right-elbow-joint coordinates are thus smoothed continuously.
The remaining four joint coordinates of the right upper limb are smoothed in the same way as the right elbow joint.
3. The right-upper-limb joint angles are computed as follows:
(1) the Kinect coordinate system XYZ is established, and the three-dimensional coordinates in that system of the right hip joint H(x_h, y_h, z_h), right shoulder joint S(x_s, y_s, z_s), right elbow joint E(x_e, y_e, z_e), right wrist joint W(x_w, y_w, z_w) and right hand joint T(x_t, y_t, z_t) are obtained;
global variables A, B and C, representing the right shoulder, right elbow and right wrist joint angles respectively, are defined and initialized to 0;
(2) the vectors SH, SE, ES, EW, WT and WE are computed;
(3) ∠A, ∠B and ∠C are computed, and the results are stored in the global variables A, B and C.
4. The main controller's basic modules, including the clock, PWM, serial communication and interrupts, are initialized. The controller waits for the arm-angle data packet sent by the host computer; on arrival, the packet notifies the main controller via an interrupt. After reading the whole packet, the main controller computes its checksum; if it matches the checksum sent by the host, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angles remain unchanged. Each joint angle read is converted into the corresponding PWM signal, thereby controlling the manipulator's motion.
5. The control methods for the manipulators corresponding to the left upper limb, the left lower limb and the right lower limb of the human body are the same as the control method for the manipulator corresponding to the right upper limb.
The advantages of the present invention are:
1. By accurately recognizing human actions, the method achieves flexible and precise control of manipulators and mobile robots, making human-robot interaction friendlier and improving the robot's intelligence.
2. Modular programming facilitates porting the program to different robot platforms.
3. The technique can be used for teleoperation, for example remotely controlling an explosive-ordnance-disposal robot, thereby reducing unnecessary casualties.
4. The entire robot control interface is programmed in XAML, separating interface from behavior; this eases later extension and integration, lets developers exercise their individual specialties and develop in parallel, and ultimately enables more complex robot control.
Description of drawings
Fig. 1 is the flow chart of the double exponential filtering algorithm of embodiment 1;
Fig. 2 is the schematic diagram of the Kinect coordinate system of embodiment 1;
Fig. 3 is the geometric representation of the right arm of embodiment 1;
Fig. 4 is the geometric representation of the manipulator of embodiment 1;
Fig. 5a is manipulator control block diagram a of embodiment 1, and Fig. 5b is manipulator control block diagram b of embodiment 1.
The specific embodiment
The present invention is described in more detail below with reference to the accompanying drawings:
Embodiment 1:
With reference to Figs. 1 to 5, the general steps of this embodiment are:
(1) obtaining the three-dimensional coordinates of 20 skeleton points of the human body with the Kinect sensor; the joint coordinates mainly used in the experiment are the right hip, right shoulder, right elbow, right wrist and right hand joint coordinates.
(2) smoothing the acquired right-arm joint coordinate data with the double exponential filtering algorithm to reduce jitter during right-arm joint motion.
(3) constructing vectors in the Kinect's three-dimensional coordinate system from the filtered right-arm joint coordinates, and obtaining the right-arm joint angles (the right shoulder, right elbow and right wrist joint angles) by computing the angles between the vectors; the results are stored.
(4) fusing the angle information into a data packet, adding a packet header and a checksum, and sending the packet to the robot over the wireless serial port for manipulator control.
The detailed procedure is as follows:
(1) Principle and flow of the double exponential joint-coordinate filtering algorithm
The present invention adopts a double exponential smoothing filter. Let t denote time, {x_t} the original data sequence, {s_t} the double-exponentially smoothed result at time t, {b_t} the optimal estimate of the data-sequence trend at time t, and F_{t+m} the optimal estimate of x at time t+m, where m > 0 is the prediction factor. The concrete formulas are:
s_1 = x_0  (1)
b_1 = x_1 - x_0  (2)
s_t = α·x_t + (1 - α)(s_{t-1} + b_{t-1}), t > 1  (3)
b_t = β(s_t - s_{t-1}) + (1 - β)·b_{t-1}, t > 1  (4)
F_{t+m} = s_t + m·b_t  (5)
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1. Defining F_1 = s_0 + b_0, the value of x at every time instant can then be estimated from these formulas. With reference to Fig. 1, smoothing of the right-elbow-joint coordinates is taken as an example; the other joint coordinates are smoothed similarly. The concrete smoothing steps are as follows:
Step 1: initialize the parameters.
The data smoothing factor α is initialized to 0.5, the trend smoothing factor β to 0.25, and the prediction factor m to 0.5. s_n, the smoothed right-elbow-joint output, is initialized to (0, 0, 0); b_n, the optimal estimate of the right-elbow-joint trend, to (0, 0, 0); F_{n+1}, the optimal estimate of the final right-elbow-joint result, to (0, 0, 0); and v_n, the current right-elbow-joint coordinate obtained from Kinect, to (0, 0, 0). The counting variable n, an integer that is incremented each time a right-elbow-joint coordinate is acquired, is initialized to n = 0.
Step 2:
Obtain the right-elbow-joint coordinate v_0 from Kinect and enter the first iteration, n = 0: the trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current coordinate v_0, the final output is F_1 = s_0 + b_0, and n is incremented.
Step 3:
Obtain the right-elbow-joint coordinate v_1 from Kinect and enter the second iteration, n = 1: the smoothed output s_1 is set to the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = β(s_1 - s_0); the final output is F_2 = s_1 + m·b_1; n is incremented.
Step 4:
Obtain the right-elbow-joint coordinate v_2 from Kinect and enter the third iteration, n = 2: compute the smoothed output s_2 from s_n = α·v_n + (1 - α)(s_{n-1} + b_{n-1}), the trend estimate b_2 from b_n = β(s_n - s_{n-1}) + (1 - β)·b_{n-1}, and the final output from F_{n+1} = s_n + m·b_n; n is incremented.
Step 5: obtain the next right-elbow-joint coordinate v_n from Kinect and repeat Step 4, the iteration counter n incrementing each time. Because n is an integer variable, the repeated increments eventually make it overflow and clear to zero, whereupon the procedure returns to Step 2 and iterates again. Thus, as the person's right arm moves in front of the Kinect, the right-elbow-joint coordinates are smoothed continuously.
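Steps 1 to 5 can be sketched as a streaming per-axis filter. The class name `JointSmoother` and the 256-count wrap used to imitate the "n overflows and clears to zero" behaviour are assumptions for illustration:

```python
# Streaming sketch of the per-joint smoothing loop: special-cased
# first and second iterations, then the general recurrence, with a
# wrapping counter that restarts the iteration as in Step 5.
class JointSmoother:
    def __init__(self, alpha=0.5, beta=0.25, m=0.5, n_max=256):
        self.alpha, self.beta, self.m = alpha, beta, m
        self.n_max = n_max   # counter wraps here, restarting at Step 2
        self.n = 0
        self.s = self.b = 0.0

    def update(self, v):
        """Feed one raw coordinate component v, return the smoothed output F."""
        if self.n == 0:      # first iteration: b_0 = 0, s_0 = v_0, F_1 = s_0 + b_0
            self.s, self.b = v, 0.0
            out = self.s + self.b
        elif self.n == 1:    # second iteration: s_1 = (v_0 + v_1)/2, b_1 = beta*(s_1 - s_0)
            s_prev = self.s
            self.s = (s_prev + v) / 2
            self.b = self.beta * (self.s - s_prev)
            out = self.s + self.m * self.b
        else:                # general recurrence of Step 4
            s_prev = self.s
            self.s = self.alpha * v + (1 - self.alpha) * (self.s + self.b)
            self.b = self.beta * (self.s - s_prev) + (1 - self.beta) * self.b
            out = self.s + self.m * self.b
        self.n = (self.n + 1) % self.n_max   # overflow clears n, restarting the loop
        return out
```

One such smoother would be kept per joint and per coordinate axis.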
(2) arm joint angle computation method
The Kinect coordinate system is shown in Fig. 2 and the right-arm geometry in Fig. 3. The three-dimensional coordinates (x, y, z) of the right hip joint H, the right shoulder joint S, the right elbow joint E, the right wrist joint W and the right hand joint T in the Kinect coordinate system are obtained, first smoothed by double exponential filtering, and then used to construct vectors. By the three-dimensional vector-angle formula, for vectors a = (x_1, y_1, z_1) and b = (x_2, y_2, z_2), the cosine of their included angle is
cos θ = (x_1·x_2 + y_1·y_2 + z_1·z_2) / (√(x_1² + y_1² + z_1²) · √(x_2² + y_2² + z_2²)),
i.e. θ = arccos[(x_1·x_2 + y_1·y_2 + z_1·z_2) / (√(x_1² + y_1² + z_1²) · √(x_2² + y_2² + z_2²))]. Each arm joint angle is then computed as follows.
a. Right shoulder joint angle calculation
The angle at the right shoulder joint is the angle A in Fig. 2. Centered at the right shoulder joint S, the vectors are SH = (x_h - x_s, y_h - y_s, z_h - z_s) and SE = (x_e - x_s, y_e - y_s, z_e - z_s), so the right shoulder joint angle is
∠A = arccos[((x_h - x_s)·(x_e - x_s) + (y_h - y_s)·(y_e - y_s) + (z_h - z_s)·(z_e - z_s)) / (√((x_h - x_s)² + (y_h - y_s)² + (z_h - z_s)²) · √((x_e - x_s)² + (y_e - y_s)² + (z_e - z_s)²))]
b. Right elbow joint angle calculation
The angle at the right elbow joint is the angle B in Fig. 2. Centered at the right elbow joint E, the vectors are ES = (x_s - x_e, y_s - y_e, z_s - z_e) and EW = (x_w - x_e, y_w - y_e, z_w - z_e), so the right elbow joint angle is
∠B = arccos[((x_s - x_e)·(x_w - x_e) + (y_s - y_e)·(y_w - y_e) + (z_s - z_e)·(z_w - z_e)) / (√((x_s - x_e)² + (y_s - y_e)² + (z_s - z_e)²) · √((x_w - x_e)² + (y_w - y_e)² + (z_w - z_e)²))]
c. Right wrist joint angle calculation
The angle at the right wrist joint is the angle C in Fig. 2. Centered at the right wrist joint W, the vectors are WT = (x_t - x_w, y_t - y_w, z_t - z_w) and WE = (x_e - x_w, y_e - y_w, z_e - z_w), so the right wrist joint angle is
∠C = arccos[((x_t - x_w)·(x_e - x_w) + (y_t - y_w)·(y_e - y_w) + (z_t - z_w)·(z_e - z_w)) / (√((x_t - x_w)² + (y_t - y_w)² + (z_t - z_w)²) · √((x_e - x_w)² + (y_e - y_w)² + (z_e - z_w)²))]
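The three angle formulas above reduce to one arccos of a normalized dot product applied at S, E and W. A minimal sketch follows; the helper names `vec`, `angle` and `arm_angles` are illustrative, and the choice of degrees as the output unit is an assumption:

```python
# Sketch of the right-arm angle computation: each joint angle is the
# angle between two vectors rooted at that joint, built from the five
# smoothed joint positions H, S, E, W, T.
import math

def vec(p, q):
    """Vector from point p to point q."""
    return (q[0] - p[0], q[1] - p[1], q[2] - p[2])

def angle(u, v):
    """Angle in degrees between 3-D vectors u and v (arccos of the
    normalized dot product, as in the formulas above)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    cos_t = max(-1.0, min(1.0, dot / (nu * nv)))  # guard against rounding
    return math.degrees(math.acos(cos_t))

def arm_angles(H, S, E, W, T):
    A = angle(vec(S, H), vec(S, E))   # shoulder: between SH and SE
    B = angle(vec(E, S), vec(E, W))   # elbow:    between ES and EW
    C = angle(vec(W, T), vec(W, E))   # wrist:    between WT and WE
    return A, B, C
```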
Concrete steps:
Step 1:
Define global variables A, B and C, representing the right shoulder, right elbow and right wrist joint angles respectively, all initialized to 0.
Step 2:
Obtain the smoothed right hip joint coordinate H(x_h, y_h, z_h), right shoulder joint coordinate S(x_s, y_s, z_s), right elbow joint coordinate E(x_e, y_e, z_e), right wrist joint coordinate W(x_w, y_w, z_w) and right hand joint coordinate T(x_t, y_t, z_t). Compute the vectors SH, SE, ES, EW, WT and WE from the formulas above.
Step 3:
Compute ∠A from the right-shoulder-angle formula above, and likewise ∠B at the right elbow and ∠C at the right wrist; store the results in the global variables A, B and C. Form a data packet from the angle information, add a packet header and a checksum, send the packet to the robot over the wireless serial port, and return to Step 2.
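A hypothetical sketch of the "header + checksum" packet of Step 3: one header byte, one byte per angle, one checksum byte. The 0xFF header, the one-byte-per-angle encoding and the mod-256 additive checksum are all assumptions, since the patent does not specify the exact layout:

```python
# Illustrative wire format for the angle packet sent over the
# wireless serial link, plus the receiver-side validation that keeps
# the previous angles when the checksum does not match.
HEADER = 0xFF  # assumed header byte

def build_packet(a, b, c):
    """Pack three joint angles (0-180 degrees) into a 5-byte packet."""
    body = [int(a), int(b), int(c)]
    checksum = sum(body) % 256           # simple additive checksum
    return bytes([HEADER] + body + [checksum])

def parse_packet(pkt):
    """Validate and unpack; return None if the header or checksum is wrong."""
    if len(pkt) != 5 or pkt[0] != HEADER:
        return None
    if sum(pkt[1:4]) % 256 != pkt[4]:
        return None                      # invalid: caller keeps previous angles
    return pkt[1], pkt[2], pkt[3]
```

On an invalid packet the caller would simply reuse the last valid joint angles, matching the behaviour described for the main controller below.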
(3) Manipulator motion control
The geometry of the manipulator is shown in Fig. 4; its main controller is an MC9S12XS128 microcontroller. Each manipulator joint is driven by a servo: servo No. 1 corresponds to the right shoulder joint, servo No. 2 to the right elbow joint and servo No. 3 to the right wrist joint. When the arm hangs against the body, the manipulator coincides with the 0° reference line in the figure. The concrete manipulator-control steps are as follows:
Step 1:
Initialize the basic modules of the MC9S12XS128, including the clock, PWM, serial communication and interrupts.
Step 2:
Wait for the arm-angle data packet sent by the host computer. On arrival, the packet notifies the main controller via an interrupt. After reading the whole packet, the main controller computes its checksum; if it matches the checksum sent by the host, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angles remain unchanged.
Step 3:
Convert each joint angle read into the corresponding PWM signal to drive the manipulator, then return to Step 2.
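The angle-to-PWM conversion of Step 3 might look like the following. The 500-2500 µs pulse range for 0-180° is a common hobby-servo convention, not a value given in the patent, and the function name is illustrative:

```python
# Illustrative linear mapping from a joint angle to a servo pulse
# width; the controller would load this value into its PWM module.
def angle_to_pulse_us(angle_deg, min_us=500, max_us=2500):
    """Convert a joint angle in [0, 180] degrees to a pulse width in microseconds."""
    angle_deg = max(0.0, min(180.0, angle_deg))   # clamp to the servo's range
    return min_us + (max_us - min_us) * angle_deg / 180.0
```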
By repeating the above steps, the manipulator tracks the motion of the arm; the overall control block diagram is shown in Figs. 5a and 5b. In experiments the manipulator tracked the arm's motion smoothly and without jitter, demonstrating that the filtering algorithm and the arm-angle computation method are correct and effective.

Claims (6)

1. A Kinect-based motion-sensing control method for a manipulator, characterized in that:
(1) the three-dimensional coordinates of five joints of the right upper limb of a human body are obtained with a Kinect sensor, namely the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates;
(2) the acquired five joint-coordinate streams of the right upper limb are smoothed with a double exponential filtering algorithm;
(3) vectors are constructed in the Kinect's three-dimensional coordinate system from the smoothed joint coordinates, and the right-upper-limb angles, namely the right shoulder, right elbow and right wrist joint angles, are obtained by computing the angles between these vectors;
(4) the angle information is fused into a data packet, a packet header and a checksum are added, and the packet is sent to the robot over a wireless serial port to control the manipulator.
2. The Kinect-based motion-sensing control method for a manipulator according to claim 1, characterized in that the double exponential filtering algorithm is:
let t denote time, {x_t} the original data sequence, {s_t} the double-exponentially smoothed result at time t, {b_t} the optimal estimate of the data-sequence trend at time t, and F_{t+m} the optimal estimate of x at time t+m, where m > 0 is the prediction factor; the double exponential smoothing formulas are:
s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1 - α)(s_{t-1} + b_{t-1}), t > 1
b_t = β(s_t - s_{t-1}) + (1 - β)·b_{t-1}, t > 1
F_{t+m} = s_t + m·b_t
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1; defining F_1 = s_0 + b_0, the value of x at every time instant can be estimated from the above formulas.
3. The Kinect-based motion-sensing control method for a manipulator according to claim 2, characterized in that the process for smoothing the right elbow joint coordinates is:
(1) the parameters are initialized: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right-elbow-joint output s_n, the optimal trend estimate b_n, the optimal final estimate F_{n+1}, the current right-elbow-joint coordinate v_n obtained from Kinect, and the counting variable n are each given initial values; n is an integer variable that is incremented each time a right-elbow-joint coordinate is acquired;
(2) the right-elbow-joint coordinate v_0 is obtained from Kinect and the first iteration, n = 0, is entered: the trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current coordinate v_0, and the final output is F_1 = s_0 + b_0;
(3) the right-elbow-joint coordinate v_1 is obtained from Kinect and the second iteration, n = 1, is entered: the smoothed output s_1 is set to the mean of the current coordinate v_1 and the previous coordinate v_0, i.e. s_1 = (v_0 + v_1)/2; the trend estimate is b_1 = β(s_1 - s_0), and the final output is F_2 = s_1 + m·b_1;
(4) the right-elbow-joint coordinate v_2 is obtained from Kinect and the third iteration, n = 2, is entered: the smoothed output s_2 is computed from s_n = α·v_n + (1 - α)(s_{n-1} + b_{n-1}), the trend estimate b_2 from b_n = β(s_n - s_{n-1}) + (1 - β)·b_{n-1}, and the final output from F_{n+1} = s_n + m·b_n;
(5) iteration continues as in step (4), computing the smoothed right-elbow-joint output, until n overflows and is cleared to zero, at which point the procedure returns to step (2) and restarts; as the person's right arm moves, the right-elbow-joint coordinates are thus smoothed continuously;
the remaining four joint coordinates of the right upper limb are smoothed in the same way as the right elbow joint.
4. The Kinect-based motion-sensing control method for a manipulator according to claim 3, characterized in that the right-upper-limb joint angles are computed as follows:
(1) the Kinect coordinate system XYZ is established, and the three-dimensional coordinates in that system of the right hip joint H(x_h, y_h, z_h), right shoulder joint S(x_s, y_s, z_s), right elbow joint E(x_e, y_e, z_e), right wrist joint W(x_w, y_w, z_w) and right hand joint T(x_t, y_t, z_t) are obtained;
global variables A, B and C, representing the right shoulder, right elbow and right wrist joint angles respectively, are defined and initialized to 0;
(2) the vectors SH, SE, ES, EW, WT and WE are computed;
(3) ∠A, ∠B and ∠C are computed, and the results are stored in the global variables A, B and C.
5. The Kinect-based motion-sensing control method for a manipulator according to claim 4, characterized in that: the main controller's basic modules, including the clock, PWM, serial communication and interrupts, are initialized; the controller waits for the arm-angle data packet sent by the host computer; on arrival, the packet notifies the main controller via an interrupt; after reading the whole packet, the main controller computes its checksum; if it matches the checksum sent by the host, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angles remain unchanged; each joint angle read is converted into the corresponding PWM signal, thereby controlling the manipulator's motion.
6. The Kinect-based motion-sensing control method for a manipulator according to any one of claims 1 to 5, characterized in that the control methods for the manipulators corresponding to the left upper limb, the left lower limb and the right lower limb of the human body are the same as the control method for the manipulator corresponding to the right upper limb.
CN201310328791.6A 2013-07-31 2013-07-31 Kinect-based motion sensing-control method for manipulator Expired - Fee Related CN103386683B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310328791.6A CN103386683B (en) 2013-07-31 2013-07-31 Kinect-based motion sensing-control method for manipulator


Publications (2)

Publication Number Publication Date
CN103386683A (en) 2013-11-13
CN103386683B (en) 2015-04-08

Family

ID=49531201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310328791.6A Expired - Fee Related CN103386683B (en) 2013-07-31 2013-07-31 Kinect-based motion sensing-control method for manipulator

Country Status (1)

Country Link
CN (1) CN103386683B (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102727362A (en) * 2012-07-20 2012-10-17 上海海事大学 NUI (Natural User Interface)-based peripheral arm motion tracking rehabilitation training system and training method
CN102814814A (en) * 2012-07-31 2012-12-12 华南理工大学 Kinect-based man-machine interaction method for two-arm robot
CN102830798A (en) * 2012-07-31 2012-12-19 华南理工大学 Mark-free hand tracking method of single-arm robot based on Kinect
JP2013013969A (en) * 2011-07-04 2013-01-24 Hirotaka Niitsuma Robot control by Microsoft Kinect (R), and application thereof
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Chen Xiaoming (陈晓明): "Research on Real-time 3D Reconstruction and Filtering Algorithms Based on Kinect Depth Information", Application Research of Computers (《计算机应用研究》) *

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104460972A (en) * 2013-11-25 2015-03-25 安徽寰智信息科技股份有限公司 Human-computer interaction system based on Kinect
CN103760976A (en) * 2014-01-09 2014-04-30 华南理工大学 Kinect-based gesture recognition smart home control method and system
CN103760976B (en) * 2014-01-09 2016-10-05 华南理工大学 Kinect-based gesture recognition smart home control method and system
CN104875209A (en) * 2014-02-28 2015-09-02 发那科株式会社 Machine system including wireless sensor
CN104875209B (en) * 2014-02-28 2017-11-03 发那科株式会社 Machine system including wireless sensor
CN103921266A (en) * 2014-04-15 2014-07-16 哈尔滨工程大学 Kinect-based method for somatosensory control of a snow robot
CN103995478B (en) * 2014-05-30 2016-05-18 山东建筑大学 Modularized hydraulic mechanical arm experimental platform and method based on virtual-reality interaction
CN103995478A (en) * 2014-05-30 2014-08-20 山东建筑大学 Modularized hydraulic mechanical arm experimental platform and method based on interaction of virtual and reality
CN104227724A (en) * 2014-08-28 2014-12-24 北京易拓智谱科技有限公司 Visual identity-based manipulation method for end position of universal robot
CN104227724B (en) * 2014-08-28 2017-01-18 北京易拓智谱科技有限公司 Visual identity-based manipulation method for end position of universal robot
CN104808788A (en) * 2015-03-18 2015-07-29 北京工业大学 Method for controlling user interfaces through non-contact gestures
CN104808788B (en) * 2015-03-18 2017-09-01 北京工业大学 Method for controlling user interfaces through non-contact gestures
CN106607910A (en) * 2015-10-22 2017-05-03 中国科学院深圳先进技术研究院 Robot real-time imitation method
CN106607910B (en) * 2015-10-22 2019-03-22 中国科学院深圳先进技术研究院 Robot real-time imitation method
CN105904457A (en) * 2016-05-16 2016-08-31 西北工业大学 Heterogeneous redundant mechanical arm control method based on position tracker and data glove
CN106095083A (en) * 2016-06-02 2016-11-09 深圳奥比中光科技有限公司 Method for determining somatosensory instructions and somatosensory interaction device
CN106095082A (en) * 2016-06-02 2016-11-09 深圳奥比中光科技有限公司 Somatosensory interaction method, system and device
CN106095087A (en) * 2016-06-02 2016-11-09 深圳奥比中光科技有限公司 Somatosensory interaction system and method
CN106313072A (en) * 2016-10-12 2017-01-11 南昌大学 Humanoid robot based on leap motion of Kinect
CN106384115A (en) * 2016-10-26 2017-02-08 武汉工程大学 Mechanical arm joint angle detection method
CN106384115B (en) * 2016-10-26 2019-10-22 武汉工程大学 Mechanical arm joint angle detection method
CN107309872A (en) * 2017-05-08 2017-11-03 南京航空航天大学 Flying robot with mechanical arm and control method thereof
CN107309872B (en) * 2017-05-08 2021-06-15 南京航空航天大学 Flying robot with mechanical arm and control method thereof
CN107272593A (en) * 2017-05-23 2017-10-20 陕西科技大学 Kinect-based robot somatosensory programming method
CN108127667A (en) * 2018-01-18 2018-06-08 西北工业大学 Mechanical arm somatosensory interaction control method based on joint angle increment
CN108127667B (en) * 2018-01-18 2021-01-05 西北工业大学 Mechanical arm somatosensory interaction control method based on joint angle increment
CN108814894A (en) * 2018-04-12 2018-11-16 山东大学 Upper limb rehabilitation robot system based on visual human body pose detection and method of use
CN109108970A (en) * 2018-08-22 2019-01-01 南通大学 Interactive mechanical arm control method based on skeleton node information
CN109108970B (en) * 2018-08-22 2021-11-09 南通大学 Interactive mechanical arm control method based on skeleton node information
CN109176512A (en) * 2018-08-31 2019-01-11 南昌与德通讯技术有限公司 Method for somatosensory control of a robot, robot and control device
CN109223441A (en) * 2018-09-13 2019-01-18 华南理工大学 Human upper limb rehabilitation training and movement assistance system based on Kinect sensor
WO2020133628A1 (en) * 2018-12-29 2020-07-02 深圳市工匠社科技有限公司 Humanoid robotic arm somatosensory control system and related product
CN110238853A (en) * 2019-06-18 2019-09-17 广州市威控机器人有限公司 Serial-joint mobile robot control system, remote control system and method
CN110216676A (en) * 2019-06-21 2019-09-10 深圳盈天下视觉科技有限公司 Mechanical arm control method, mechanical arm control device and terminal device
CN110216676B (en) * 2019-06-21 2022-04-26 深圳盈天下视觉科技有限公司 Mechanical arm control method, mechanical arm control device and terminal equipment
WO2021114666A1 (en) * 2019-12-11 2021-06-17 山东大学 Human body safety evaluation method and system in human-machine collaboration
CN111228792A (en) * 2020-01-14 2020-06-05 深圳十米网络科技有限公司 Motion sensing game action recognition method and device, computer equipment and storage medium
CN117340914A (en) * 2023-10-24 2024-01-05 哈尔滨工程大学 Somatosensory control method and control system for humanoid robot

Also Published As

Publication number Publication date
CN103386683B (en) 2015-04-08

Similar Documents

Publication Publication Date Title
CN103386683B (en) Kinect-based motion sensing-control method for manipulator
CN107984472B (en) Design method of variable parameter neural solver for redundant manipulator motion planning
CN111402290B (en) Action restoration method and device based on skeleton key points
CN105241461A (en) Map creating and positioning method of robot and robot system
CN109543703A (en) The method and device of sensing data processing
CN104406598A (en) Non-cooperative spacecraft attitude estimation method based on virtual sliding mode control
CN105807926A (en) Unmanned aerial vehicle man-machine interaction method based on three-dimensional continuous gesture recognition
CN104133375B (en) Multi-AUV synchronous controller structure and design method
CN103065037B (en) Target tracking method for nonlinear systems based on distributed cubature information filtering
CN102999696B (en) Bearings-only tracking method for noise-correlated systems based on cubature information filtering
CN103921266A (en) Kinect-based method for somatosensory control of a snow robot
CN113221726A (en) Hand posture estimation method and system based on visual and inertial information fusion
CN103729564A (en) Pressure field calculating method and device based on particle image velocimetry technology
CN105068536A (en) Moving substrate track planner achieved based on nonlinear optimization method
CN105069826A (en) Modeling method of deformation movement of elastic object
Chen et al. Rnin-vio: Robust neural inertial navigation aided visual-inertial odometry in challenging scenes
CN114237041B (en) Space-ground cooperative fixed time fault tolerance control method based on preset performance
CN104240217B (en) Binocular camera image depth information acquisition method and device
Rezende et al. Constructive time-varying vector fields for robot navigation
CN108898669A (en) Data processing method, device, medium and calculating equipment
CN105404744A (en) Space manipulator full-state dynamic semi-physical simulation system
CN205247208U (en) Robotic system
CN108509024B (en) Data processing method and device based on virtual reality equipment
CN103729879A (en) Virtual hand stable grabbing method based on force sense calculation
CN115686193A (en) Virtual model three-dimensional gesture control method and system in augmented reality environment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201231

Address after: 572024 area A129, 4th floor, building 4, Baitai Industrial Park, yazhouwan science and Technology City, Yazhou District, Sanya City, Hainan Province

Patentee after: Nanhai innovation and development base of Sanya Harbin Engineering University

Address before: 150001 Intellectual Property Office, Harbin Engineering University science and technology office, 145 Nantong Avenue, Nangang District, Harbin, Heilongjiang

Patentee before: HARBIN ENGINEERING University

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150408

Termination date: 20210731

CF01 Termination of patent right due to non-payment of annual fee