CN103386683B - Kinect-based motion-sensing control method for manipulator

Publication number: CN103386683B (granted 2015-04-08); also published as CN103386683A (2013-11-13)
Application number: CN201310328791.6A, filed 2013-07-31 (priority date 2013-07-31)
Authority: CN (China); original language: Chinese (zh)
Legal status: Active (granted)
Inventors: 莫宏伟, 孟龙龙, 徐立芳, 董会云, 蒋兴洲, 雍升
Applicant / original assignee: Harbin Engineering University
Current assignee: Nanhai innovation and development base of Sanya Harbin Engineering University


Abstract

The invention provides a Kinect-based motion-sensing control method for a manipulator, comprising the following steps: obtaining the three-dimensional coordinates of five joints of the right upper limb of a human body with a Kinect sensor; smoothing the obtained coordinate data of the five right-upper-limb joints with a double exponential smoothing filter; constructing vectors in the three-dimensional Kinect coordinate system from the smoothed coordinates of the five right-upper-limb joints and obtaining the right-upper-limb angles by calculating the included angles between the vectors, the angles comprising the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle; and fusing the angle information into a data packet, adding a packet header and a checksum, and sending the packet to the robot over a wireless serial port to control the manipulator. By accurately recognizing human motion, the method achieves flexible and precise control of a manipulator or mobile robot, making the interaction between human and robot friendlier and improving the intelligence of the robot.

Description

A Kinect-based motion-sensing control method for a manipulator
Technical field
The present invention relates to a robot control method, and in particular to a motion-sensing robot control method.
Background technology
Kinect is a 3D motion-sensing camera that integrates real-time motion capture, image recognition, microphone input, speech recognition and community interaction. Microsoft released the Kinect for Windows SDK Beta in June 2011. As a novel human-computer interaction system and a new type of body-detection sensor, Kinect has a wide range of applications, such as virtual mirrors, 3D modeling, virtual musical instruments, virtual entertainment and mechanical control. Research on Kinect-based robot control is still limited, and using Kinect to achieve flexible robot control has broad application prospects.
Drawing on its advanced robotics, scientists in Japan have applied the Kinect sensor to real-time robot control experiments with promising results, demonstrating that controlling a robot with a Kinect sensor is feasible. In the United States, Kinect has been applied to military patrol robots: it detects the three-dimensional environment in front of the robot in real time to guide the robot's motion and can simultaneously reconstruct a three-dimensional map of the area the robot operates in. Few domestic companies or universities have applied Kinect to robot control research, let alone to actual production. Through in-depth study of Kinect and control experiments on a small robot platform, the present invention is expected to be applicable to the control of patrol and security robots, medical assistant robots, manipulators and the like.
Existing robot control methods all rely on pre-programmed control and cannot work in synchrony with a human operator.
Summary of the invention
The object of the present invention is to provide an intelligent, Kinect-based motion-sensing control method for a manipulator that works in synchrony with a human operator.
The object of the present invention is achieved as follows.
The Kinect-based motion-sensing control method for a manipulator of the present invention is characterized in that it comprises the following steps:
(1) obtaining the three-dimensional coordinates of five joints of the right upper limb of a human body with a Kinect sensor, the five joints being the right hip joint, the right shoulder joint, the right elbow joint, the right wrist joint and the right hand joint;
(2) smoothing the obtained coordinate data of the five right-upper-limb joints with a double exponential smoothing filter;
(3) constructing vectors in the three-dimensional Kinect coordinate system from the smoothed coordinates of the five right-upper-limb joints and obtaining the right-upper-limb angles by calculating the included angles between the vectors, the angles comprising the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle;
(4) fusing the angle information into a data packet, adding a packet header and a checksum, and sending the packet to the robot over a wireless serial port to control the manipulator.
The present invention can also comprise:
1. The double exponential smoothing filter is defined as follows:
t denotes time, {x_t} denotes the original data sequence, {s_t} denotes the double exponential smoothing result at time t, {b_t} denotes the best estimate of the data-sequence trend at time t, F_{t+m} denotes the best estimate of x at time t+m, and m is the prediction factor, m > 0. The double exponential smoothing formulas are:
s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1-α)·(s_{t-1} + b_{t-1}), t > 1
b_t = β·(s_t - s_{t-1}) + (1-β)·b_{t-1}, t > 1
F_{t+m} = s_t + m·b_t
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1. With F_1 = s_0 + b_0 defined, the value of x at any time can be estimated from the above double exponential smoothing formulas.
2. The smoothing process for the right elbow joint coordinate is:
(1) initializing the parameters: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right elbow joint coordinate output s_n, the best estimate of the right elbow joint coordinate trend b_n, the best estimate of the final right elbow joint coordinate F_{n+1}, the right elbow joint coordinate v_n currently obtained from the Kinect and the counting variable n are each given an initial value; n is an integer variable that is incremented by 1 each time a right elbow joint coordinate is obtained;
(2) obtaining the right elbow joint coordinate v_0 from the Kinect and entering the first iteration, n = 0; the trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current right elbow joint coordinate v_0, and the final right elbow joint coordinate output is F_1 = s_0 + b_0;
(3) obtaining the right elbow joint coordinate v_1 from the Kinect and entering the second iteration, n = 1; the smoothed output s_1 is set to the mean of the current right elbow joint coordinate v_1 and the previous right elbow joint coordinate v_0, the trend estimate is b_1 = (s_1 - s_0)·β, and the final right elbow joint coordinate output is F_2 = s_1 + m·b_1;
(4) obtaining the right elbow joint coordinate v_2 from the Kinect and entering the third iteration, n = 2; the smoothed output s_2 is calculated from s_n = α·v_n + (1-α)·(s_{n-1} + b_{n-1}), the trend estimate b_2 is calculated from b_n = β·(s_n - s_{n-1}) + (1-β)·b_{n-1}, and the final right elbow joint coordinate output is calculated from F_{n+1} = s_n + m·b_n;
(5) continuing to iterate in the manner of step (4), increasing the iteration count and calculating the right elbow joint coordinate output; when n overflows it is reset and the process returns to step (2) to restart the iteration, so that the right elbow joint coordinate is smoothed continuously as the person's right arm moves.
The smoothing process for the remaining four right-upper-limb joint coordinates is identical to that for the right elbow joint coordinate.
3. The right-upper-limb joint angle calculation method is:
(1) establishing the Kinect coordinate system XYZ and obtaining, in this coordinate system, the three-dimensional coordinates H(x_h, y_h, z_h), S(x_s, y_s, z_s), E(x_e, y_e, z_e), W(x_w, y_w, z_w) and T(x_t, y_t, z_t) of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand joint T, respectively;
defining global variables A, B and C to hold the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle, respectively, all initialized to 0;
(2) calculating the vectors SH, SE, ES, EW, WT and WE;
(3) calculating ∠A, ∠B and ∠C and saving the results in the global variables A, B and C, respectively.
4. The basic modules of the main controller are initialized, including the clock, PWM, serial communication and interrupts. The controller waits for the arm-angle data packet sent by the host computer; when a packet arrives, the main controller is notified by an interrupt, reads the whole packet and computes its checksum. If the computed checksum matches the checksum sent by the host computer, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angle values are kept unchanged. The joint angle values read from the packet are converted into the corresponding PWM signals to control the motion of the manipulator.
5. The control methods for the manipulators corresponding to the left upper limb, the left lower limb and the right lower limb of the human body are identical to the control method for the manipulator corresponding to the right upper limb.
The advantages of the present invention are:
1. By accurately recognizing human motion, the method achieves flexible and precise control of a manipulator or mobile robot, making human-robot interaction friendlier and improving the intelligence of the robot.
2. Modular programming facilitates porting the program to different robot platforms.
3. The technology can be used for remote robot operation, for example remote control of an explosive-ordnance-disposal robot, thereby reducing unnecessary casualties.
4. The whole robot control interface is programmed in XAML, which separates the interface from the behavior logic, facilitating later extension and integration; developers can exploit their respective strengths and develop in parallel to complete more complex robot control.
Brief description of the drawings
Fig. 1 is the flow chart of the double exponential smoothing filter of embodiment 1;
Fig. 2 is a schematic diagram of the Kinect coordinate system of embodiment 1;
Fig. 3 is a geometric diagram of the right arm of embodiment 1;
Fig. 4 is a geometric diagram of the manipulator of embodiment 1;
Fig. 5a is manipulator control block diagram a of embodiment 1, and Fig. 5b is manipulator control block diagram b of embodiment 1.
Detailed description of the invention
The present invention is described in more detail below with reference to the accompanying drawings.
Embodiment 1:
With reference to Figs. 1 to 5, the general steps of the present embodiment are:
(1) The three-dimensional coordinates of 20 skeleton points of the human body are obtained with the Kinect sensor; the joint coordinates mainly used in the experiment are the right hip joint, right shoulder joint, right elbow joint, right wrist joint and right hand joint coordinates.
(2) The obtained right-arm joint coordinate data are smoothed with the double exponential smoothing filter to reduce jitter during right-arm joint motion.
(3) Vectors are constructed in the three-dimensional Kinect coordinate system from the filtered right-arm joint coordinates, and the right-arm joint angles, namely the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle, are obtained by calculating the included angles between the vectors. The results are saved.
(4) The angle information is fused into a data packet, a packet header and a checksum are added, and the packet is sent to the robot over a wireless serial port for manipulator control.
The specific flow is as follows.
(1) Principle and flow of the double exponential smoothing filter for the joint coordinates
The present invention adopts a double exponential smoothing filter. t denotes time, {x_t} denotes the original data sequence, {s_t} denotes the double exponential smoothing result at time t, {b_t} denotes the best estimate of the data-sequence trend at time t, F_{t+m} denotes the best estimate of x at time t+m, and m is the prediction factor, m > 0. The double exponential smoothing formulas are:
s_1 = x_0    (1)
b_1 = x_1 - x_0    (2)
s_t = α·x_t + (1-α)·(s_{t-1} + b_{t-1}), t > 1    (3)
b_t = β·(s_t - s_{t-1}) + (1-β)·b_{t-1}, t > 1    (4)
F_{t+m} = s_t + m·b_t    (5)
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1. With F_1 = s_0 + b_0 defined, the value of x at any time can be estimated from these formulas. With reference to Fig. 1, the smoothing of the right elbow joint coordinate is taken as an example; the other joint coordinates are smoothed in the same way. The specific smoothing steps are as follows.
Step 1: Initialize the parameters.
The data smoothing factor α is initialized to 0.5, the trend smoothing factor β to 0.25 and the prediction factor m to 0.5. s_n denotes the smoothed right elbow joint coordinate output and is initialized to (0,0,0); b_n denotes the best estimate of the right elbow joint coordinate trend and is initialized to (0,0,0); F_{n+1} denotes the best estimate of the final right elbow joint coordinate and is initialized to (0,0,0); v_n denotes the right elbow joint coordinate currently obtained from the Kinect and is initialized to (0,0,0); n is the counting variable, defined as an integer and initialized to n = 0, and is incremented by 1 each time a right elbow joint coordinate is obtained.
Step 2:
The right elbow joint coordinate v_0 is obtained from the Kinect and the first iteration, n = 0, is entered. The trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current right elbow joint coordinate v_0, the final right elbow joint coordinate output is F_1 = s_0 + b_0, and n is incremented by 1.
Step 3:
The right elbow joint coordinate v_1 is obtained from the Kinect and the second iteration, n = 1, is entered. The smoothed output s_1 is set to the mean of the current right elbow joint coordinate v_1 and the previous right elbow joint coordinate v_0, the trend estimate is b_1 = (s_1 - s_0)·β, the final right elbow joint coordinate output is F_2 = s_1 + m·b_1, and n is incremented by 1.
Step 4:
The right elbow joint coordinate v_2 is obtained from the Kinect and the third iteration, n = 2, is entered. The smoothed output s_2 is calculated from s_n = α·v_n + (1-α)·(s_{n-1} + b_{n-1}), the trend estimate b_2 is calculated from b_n = β·(s_n - s_{n-1}) + (1-β)·b_{n-1}, the final right elbow joint coordinate output is calculated from F_{n+1} = s_n + m·b_n, and n is incremented by 1.
Step 5: The next right elbow joint coordinate v_n is obtained from the Kinect and Step 4 is repeated, with the iteration counter n incremented each time. Because n is an integer variable, it eventually overflows and is reset, and the process returns to Step 2 to restart the iteration. In this way, as the person's right arm moves in front of the Kinect, the right elbow joint coordinate is smoothed continuously (a code sketch of this recursion is given below).
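For illustration only, the recursion of Steps 1-5 can be sketched in a few lines of Python. This is a minimal sketch under the parameter choices of Step 1 (α = 0.5, β = 0.25, m = 0.5), not the patented implementation; the input `coords` stands in for whatever sequence of 3D joint positions is read from the Kinect.

import numpy as np

def double_exponential_smooth(coords, alpha=0.5, beta=0.25, m=0.5):
    """Smooth a sequence of 3D joint coordinates with double exponential
    smoothing, following the iteration of Steps 1-5 above.
    Yields the predicted coordinate F_{n+1} = s_n + m * b_n for each sample."""
    s_prev = b_prev = None
    v_first = None
    for n, v in enumerate(np.asarray(c, dtype=float) for c in coords):
        if n == 0:                      # first iteration: s_0 = v_0, b_0 = 0
            s, b = v, np.zeros(len(v))
            v_first = v
        elif n == 1:                    # second iteration: s_1 is the mean of the first two samples
            s = (v + v_first) / 2.0
            b = beta * (s - s_prev)     # b_1 = (s_1 - s_0) * beta
        else:                           # general recursion, formulas (3) and (4)
            s = alpha * v + (1 - alpha) * (s_prev + b_prev)
            b = beta * (s - s_prev) + (1 - beta) * b_prev
        s_prev, b_prev = s, b
        yield s + m * b                 # formula (5): F_{n+1} = s_n + m * b_n

# Example: smooth a short synthetic, slightly jittery elbow trajectory.
raw = [(0.1 * n, 0.5, 2.0 + 0.01 * (-1) ** n) for n in range(10)]
for smoothed in double_exponential_smooth(raw):
    print(np.round(smoothed, 3))

The same routine is applied independently to each of the five right-upper-limb joint coordinates.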
(2) Arm joint angle calculation method
The Kinect coordinate system is shown in Fig. 2 and the right-arm geometry in Fig. 3. In the Kinect coordinate system the three-dimensional coordinates (x, y, z) of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand joint T are obtained, first smoothed by double exponential smoothing, and then used to construct vectors. According to the three-dimensional vector angle formula, for two vectors (x_1, y_1, z_1) and (x_2, y_2, z_2) the included angle is
θ = arccos[ (x_1·x_2 + y_1·y_2 + z_1·z_2) / ( sqrt(x_1² + y_1² + z_1²) · sqrt(x_2² + y_2² + z_2²) ) ],
from which each arm joint angle is calculated.
a. Right shoulder joint angle calculation: the right shoulder joint angle corresponds to angle A in Fig. 2. With the right shoulder joint S as the vertex and the vectors SH and SE, the right shoulder joint angle is
∠A = arccos[ ((x_h - x_s)·(x_e - x_s) + (y_h - y_s)·(y_e - y_s) + (z_h - z_s)·(z_e - z_s)) / ( sqrt((x_h - x_s)² + (y_h - y_s)² + (z_h - z_s)²) · sqrt((x_e - x_s)² + (y_e - y_s)² + (z_e - z_s)²) ) ]
b. Right elbow joint angle calculation: the right elbow joint angle corresponds to angle B in Fig. 2. With the right elbow joint E as the vertex and the vectors ES and EW, the right elbow joint angle is
∠B = arccos[ ((x_s - x_e)·(x_w - x_e) + (y_s - y_e)·(y_w - y_e) + (z_s - z_e)·(z_w - z_e)) / ( sqrt((x_s - x_e)² + (y_s - y_e)² + (z_s - z_e)²) · sqrt((x_w - x_e)² + (y_w - y_e)² + (z_w - z_e)²) ) ]
c. Right wrist joint angle calculation: the right wrist joint angle corresponds to angle C in Fig. 2. With the right wrist joint W as the vertex and the vectors WT and WE, the right wrist joint angle is
∠C = arccos[ ((x_t - x_w)·(x_e - x_w) + (y_t - y_w)·(y_e - y_w) + (z_t - z_w)·(z_e - z_w)) / ( sqrt((x_t - x_w)² + (y_t - y_w)² + (z_t - z_w)²) · sqrt((x_e - x_w)² + (y_e - y_w)² + (z_e - z_w)²) ) ]
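A compact sketch of this angle computation, assuming only that the smoothed joint positions H, S, E, W and T are available as 3D points; the helper name and the example coordinates below are illustrative and do not come from the patent or the Kinect SDK.

import numpy as np

def included_angle_deg(vertex, p1, p2):
    """Angle (degrees) at `vertex` between the vectors vertex->p1 and vertex->p2,
    using theta = arccos(u.v / (|u| |v|)) as in the formulas above."""
    u = np.asarray(p1, dtype=float) - np.asarray(vertex, dtype=float)
    v = np.asarray(p2, dtype=float) - np.asarray(vertex, dtype=float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Smoothed joint positions H, S, E, W, T (example values in Kinect space, metres).
H, S, E = (0.20, -0.30, 2.0), (0.25, 0.20, 2.0), (0.45, 0.15, 1.9)
W, T = (0.60, 0.25, 1.8), (0.68, 0.30, 1.75)

A = included_angle_deg(S, H, E)   # right shoulder angle: vertex S, vectors SH and SE
B = included_angle_deg(E, S, W)   # right elbow angle: vertex E, vectors ES and EW
C = included_angle_deg(W, T, E)   # right wrist angle: vertex W, vectors WT and WE
print(round(A, 1), round(B, 1), round(C, 1))

Clamping the cosine to [-1, 1] simply guards against small floating-point errors before taking the arccosine.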
Concrete steps:
Step 1:
Global variables A, B and C are defined to hold the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle, respectively, all initialized to 0.
Step 2:
The smoothed right hip joint coordinate H(x_h, y_h, z_h), right shoulder joint coordinate S(x_s, y_s, z_s), right elbow joint coordinate E(x_e, y_e, z_e), right wrist joint coordinate W(x_w, y_w, z_w) and right hand coordinate T(x_t, y_t, z_t) are obtained, and the vectors SH, SE, ES, EW, WT and WE are calculated.
Step 3:
∠A is calculated from the right shoulder joint angle formula above, and the right elbow joint angle ∠B and the right wrist joint angle ∠C are calculated in the same way; the results are saved in the global variables A, B and C, respectively. The angle information is assembled into a data packet, a packet header and a checksum are added, the packet is sent to the robot over the wireless serial port, and the process returns to Step 2.
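The patent does not specify the packet layout, so the following sketch only illustrates the idea of "header + angle payload + checksum" under assumed conventions: a two-byte header, one byte per joint angle, and a checksum equal to the low byte of the sum of the angle bytes. A real implementation would use whatever framing the host computer and the manipulator firmware agree on.

def build_angle_packet(shoulder, elbow, wrist, header=(0xAA, 0x55)):
    """Pack three joint angles (0-180 degrees) into a framed packet:
    [2 header bytes][3 angle bytes][checksum byte].  The checksum is the
    low byte of the sum of the angle bytes (an assumed convention)."""
    angles = [int(max(0, min(180, a))) for a in (shoulder, elbow, wrist)]
    checksum = sum(angles) & 0xFF
    return bytes(header) + bytes(angles) + bytes([checksum])

packet = build_angle_packet(90, 45, 120)
print(packet.hex())   # 'aa555a2d78ff'
# The packet would then be written to the wireless serial port, for example with
# pyserial: serial.Serial('COM3', 115200).write(packet)   # port name assumed

On the receiving side, the controller recomputes the same sum over the angle bytes and compares it with the transmitted checksum before accepting the packet, as described in Step 2 of the manipulator motion control flow below.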
(3) Manipulator motion control
The geometry of the manipulator is shown in Fig. 4. Its main controller is an MC9S12XS128 microcontroller. Each joint of the manipulator is driven by a servo: servo No. 1 corresponds to the right shoulder joint, servo No. 2 to the right elbow joint and servo No. 3 to the right wrist joint. When the arm hangs against the body, the manipulator coincides with the 0° reference line in the figure. The specific steps of manipulator motion control are as follows.
Step 1:
The basic modules of the MC9S12XS128 are initialized, including the clock, PWM, serial communication and interrupts.
Step 2:
The controller waits for the arm-angle data packet sent by the host computer. When a packet arrives, the main controller is notified by an interrupt, reads the whole packet and computes its checksum. If the computed checksum matches the checksum sent by the host computer, the packet is valid and is parsed to read the angle value of each manipulator joint; otherwise the packet is invalid and the previous joint angle values are kept unchanged.
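The MC9S12XS128 firmware is not given in the patent and would be written in C; purely to illustrate the validation logic of Step 2, the sketch below parses and checks the assumed packet layout from the earlier sketch, returning None (so the previous joint angles would be kept) when the header or checksum does not match.

def parse_angle_packet(packet, header=(0xAA, 0x55)):
    """Validate and parse the assumed layout [2 header bytes][3 angle bytes][checksum].
    Returns the three joint angles, or None if the header or checksum check fails."""
    if len(packet) != 6 or tuple(packet[:2]) != header:
        return None
    angles, checksum = packet[2:5], packet[5]
    if sum(angles) & 0xFF != checksum:
        return None
    return tuple(angles)

print(parse_angle_packet(bytes.fromhex("aa555a2d78ff")))   # (90, 45, 120)
print(parse_angle_packet(bytes.fromhex("aa555a2d7800")))   # None (bad checksum)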
Step 3:
Each manipulator joint angle value read from the packet is converted into the corresponding PWM signal to control the motion of the manipulator, and the process returns to Step 2.
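The angle-to-PWM conversion is hardware specific and not detailed in the patent; the sketch below assumes a common hobby-servo convention (20 ms period, roughly 500-2500 µs pulse width over 0-180°) purely as an illustration of the mapping.

def angle_to_pulse_us(angle_deg, min_us=500, max_us=2500, max_angle=180):
    """Map a joint angle in degrees to a servo pulse width in microseconds.
    The 500-2500 us range over 0-180 degrees is an assumed servo convention."""
    angle = max(0, min(max_angle, angle_deg))
    return min_us + (max_us - min_us) * angle / max_angle

def pulse_to_duty(pulse_us, period_us=20000):
    """Duty cycle (0-1) for a given pulse width at a 50 Hz (20 ms) PWM period."""
    return pulse_us / period_us

for joint, angle in (("shoulder", 90), ("elbow", 45), ("wrist", 120)):
    pulse = angle_to_pulse_us(angle)
    print(f"{joint}: {angle} deg -> {pulse:.0f} us, duty {pulse_to_duty(pulse):.3f}")

On the microcontroller, the resulting duty cycle would be written to the PWM channel driving the corresponding servo.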
By constantly repeating the above steps, the manipulator tracks the motion of the arm. The block diagram of the whole manipulator control system is shown in Fig. 5. In experiments, the manipulator tracked the arm motion smoothly and without jitter, verifying that the above filtering algorithm and arm angle calculation method are correct and effective.

Claims (5)

1. A Kinect-based motion-sensing control method for a manipulator, characterized in that it comprises the following steps:
(1) obtaining the three-dimensional coordinates of five joints of the right upper limb of a human body with a Kinect sensor, the five joints being the right hip joint, the right shoulder joint, the right elbow joint, the right wrist joint and the right hand joint;
(2) smoothing the obtained coordinate data of the five right-upper-limb joints with a double exponential smoothing filter;
(3) constructing vectors in the three-dimensional Kinect coordinate system from the smoothed coordinates of the five right-upper-limb joints and obtaining the right-upper-limb angles by calculating the included angles between the vectors, the angles comprising the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle;
(4) fusing the angle information into a data packet, adding a packet header and a checksum, and sending the packet to the robot over a wireless serial port to control the manipulator;
wherein the double exponential smoothing filter is:
t denotes time, {x_t} denotes the original data sequence, {s_t} denotes the double exponential smoothing result at time t, {b_t} denotes the best estimate of the data-sequence trend at time t, F_{t+m} denotes the best estimate of x at time t+m, and m is the prediction factor, m > 0; the double exponential smoothing formulas are:
s_1 = x_0
b_1 = x_1 - x_0
s_t = α·x_t + (1-α)·(s_{t-1} + b_{t-1}), t > 1
b_t = β·(s_t - s_{t-1}) + (1-β)·b_{t-1}, t > 1
F_{t+m} = s_t + m·b_t
where α is the data smoothing factor, 0 < α < 1, and β is the trend smoothing factor, 0 < β < 1; with F_1 = s_0 + b_0 defined, the value of x at any time can be estimated from the above double exponential smoothing formulas.
2. The Kinect-based motion-sensing control method for a manipulator according to claim 1, characterized in that the smoothing process for the right elbow joint coordinate is:
(1) initializing the parameters: the data smoothing factor α, the trend smoothing factor β, the prediction factor m, the smoothed right elbow joint coordinate output s_n, the best estimate of the right elbow joint coordinate trend b_n, the best estimate of the final right elbow joint coordinate F_{n+1}, the right elbow joint coordinate v_n currently obtained from the Kinect and the counting variable n are each given an initial value; n is an integer variable that is incremented by 1 each time a right elbow joint coordinate is obtained;
(2) obtaining the right elbow joint coordinate v_0 from the Kinect and entering the first iteration, n = 0; the trend estimate b_0 is set to 0, the smoothed output s_0 is set to the current right elbow joint coordinate v_0, and the final right elbow joint coordinate output is F_1 = s_0 + b_0;
(3) obtaining the right elbow joint coordinate v_1 from the Kinect and entering the second iteration, n = 1; the smoothed output s_1 is set to the mean of the current right elbow joint coordinate v_1 and the previous right elbow joint coordinate v_0, the trend estimate is b_1 = (s_1 - s_0)·β, and the final right elbow joint coordinate output is F_2 = s_1 + m·b_1;
(4) obtaining the right elbow joint coordinate v_2 from the Kinect and entering the third iteration, n = 2; the smoothed output s_2 is calculated from s_n = α·v_n + (1-α)·(s_{n-1} + b_{n-1}), the trend estimate b_2 is calculated from b_n = β·(s_n - s_{n-1}) + (1-β)·b_{n-1}, and the final right elbow joint coordinate output is calculated from F_{n+1} = s_n + m·b_n;
(5) continuing to iterate in the manner of step (4), increasing the iteration count and calculating the right elbow joint coordinate output; when n overflows it is reset and the process returns to step (2) to restart the iteration, so that the right elbow joint coordinate is smoothed continuously as the person's right arm moves;
wherein the smoothing process for the remaining four right-upper-limb joint coordinates is identical to that for the right elbow joint coordinate.
3. The Kinect-based motion-sensing control method for a manipulator according to claim 2, characterized in that the right-upper-limb joint angle calculation method is:
(1) establishing the Kinect coordinate system XYZ and obtaining, in this coordinate system, the three-dimensional coordinates H(x_h, y_h, z_h), S(x_s, y_s, z_s), E(x_e, y_e, z_e), W(x_w, y_w, z_w) and T(x_t, y_t, z_t) of the right hip joint H, right shoulder joint S, right elbow joint E, right wrist joint W and right hand joint T, respectively;
defining global variables A, B and C to hold the right shoulder joint angle, the right elbow joint angle and the right wrist joint angle, respectively, all initialized to 0;
(2) calculating the vectors SH, SE, ES, EW, WT and WE;
(3) calculating ∠A, ∠B and ∠C and saving the results in the global variables A, B and C, respectively.
4. The Kinect-based motion-sensing control method for a manipulator according to claim 3, characterized in that: the basic modules of the main controller are initialized, including the clock, PWM, serial communication and interrupts; the controller waits for the arm-angle data packet sent by the host computer; when a packet arrives, the main controller is notified by an interrupt, reads the whole packet and computes its checksum; if the computed checksum matches the checksum sent by the host computer, the packet is valid and is parsed to read the angle value of each manipulator joint, otherwise the packet is invalid and the previous joint angle values are kept unchanged; and each manipulator joint angle value read from the packet is converted into the corresponding PWM signal to control the motion of the manipulator.
5. The Kinect-based motion-sensing control method for a manipulator according to any one of claims 1 to 4, characterized in that: the control methods for the manipulators corresponding to the left upper limb, the left lower limb and the right lower limb of the human body are identical to the control method for the manipulator corresponding to the right upper limb.

