CN102814814A - Kinect-based man-machine interaction method for two-arm robot - Google Patents

Kinect-based man-machine interaction method for two-arm robot

Info

Publication number
CN102814814A
Authority
CN
China
Prior art keywords
arm
point
robot
mapping point
arm end
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2012102673153A
Other languages
Chinese (zh)
Other versions
CN102814814B (en)
Inventor
张平
杜广龙
李泽玲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN201210267315.3A priority Critical patent/CN102814814B/en
Publication of CN102814814A publication Critical patent/CN102814814A/en
Application granted granted Critical
Publication of CN102814814B publication Critical patent/CN102814814B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Manipulator (AREA)

Abstract

The invention discloses a Kinect-based man-machine interaction method for a two-arm robot. The method allows an operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete manipulation tasks, and the vision-based man-machine interaction helps the operator control the robot and understand its motion and surrounding environment. The method realizes semi-autonomous shared control based on visual guidance, through which the end effector performing an object-grasping task obtains a more accurate position and range, making grasping more precise.

Description

A Kinect-based man-machine interaction method for a two-arm robot
Technical field
The present invention relates to the field of human-computer interaction technology, and in particular to a Kinect-based man-machine interaction method for a two-arm robot.
Background technology
When humans cannot suitably be present at a robot's work site, it is essential to have an operator teleoperate the robot in an unstructured, dynamic environment.
To accomplish manipulation tasks, teleoperating machinery through contact interfaces requires unnatural human actions, and such interfaces tend to hinder human movement.
Existing vision-based man-machine interaction approaches either use only a few degrees of freedom of the human hand or require the operator to make large motions, making operation laborious.
Commonly used teleoperation interfaces include devices worn by the operator, such as exoskeleton-type mechanisms, instrumented gloves, electromagnetic or inertial motion-tracking sensors, and electromyographic sensors. However, these worn devices and sensors, along with their cabling, can hinder the operator's free movement. Other contact interfaces, such as master manipulator arms, dials, joysticks, computer mice, and computer graphical interfaces, likewise require unnatural actions or require the operator to learn how to operate them. Vision-based interaction techniques using a control mode that combines gesture recognition with speech recognition allow the operator to give commands more naturally and intuitively, without the constraints of a physical device. Under such control, however, the actions of the controlled robot are composed of simple commands, such as confirming the position and orientation of the end effector, grasping, and a series of smaller tasks such as moving up, moving down, rotating, pausing, resuming, and switching operating modes. When the required robot manipulation is relatively complex, this technique becomes difficult to operate. Single-camera hand-motion tracking has been used in simulated and real robot teleoperation, but the motion commands are confined to a subset of the end effector's true degrees of freedom. In addition, gestures unrelated to the specific operation task are used to switch the robot's operating mode, all of which makes the communication of the task unnatural. Ideally, what people need is a natural method that neither decomposes complex tasks into a limited set of instructions nor relies on restrictive contact devices.
Summary of the invention
The object of the present invention is to address the technical deficiencies of existing man-machine interaction methods by providing a Kinect-based man-machine interaction method for a two-arm robot.
To achieve the above object, the present invention adopts the following technical scheme:
A Kinect-based man-machine interaction method for a two-arm robot is provided, wherein the Kinect is a 3D motion-sensing camera; the method comprises the following steps:
S1. establishing a coordinate system for the operator's arm through the 3D motion-sensing camera;
S2. performing skeleton extraction on the image of the operator's arm through a computer;
S3. determining the positions of the index finger and thumb through analysis of the image containing the arm;
S4. using a critically damped vibration model to solve the abrupt-jump problem;
S5. using an overdamped vibration model to suppress jitter.
Preferably, said step S1 comprises:
S11. first establishing a basic coordinate system and calibrating the positions of the index fingertip, the thumb tip, the point between them, and the upper arm.
Preferably, said step S2 comprises the following step:
S21. obtaining a depth image with Kinect, and performing skeleton extraction on the depth image.
Preferably, said step S3 comprises the following steps:
S31. letting the operator's arm mapping points be P, labeled P1~PN for the different joints; first building the palm bounding box: connecting P4 and P5 to obtain line P45; then, with P5 as the bottom-center point and line P45 as the central axis, building a palm bounding box B of length L, width W, and height H; statistics show that for bounding box B to enclose the palm, L, W, and H need only take the following values:
[Equation (1); rendered as an image in the original], where |P45| denotes the length of line segment P45;
S32. for every point B' in palm bounding box B, computing its distance to P5, whereby point B2 is the point in bounding box B farthest from P5:
[Equation (2); image in the original];
S33. connecting P5 and B2 to obtain line PB, and for every point B' in bounding box B, computing its distance to line PB, whereby point B1 is the point in bounding box B farthest from line PB:
[Equation (3); image in the original];
S34. choosing the arm joint points as reference points; establishing a left reference frame on the left-arm point and a right reference frame on the right-arm point (the frame axes are given as equation images in the original);
then the coordinate of P8 with respect to the left reference frame is:
[Equation (4); image in the original];
and the coordinate of P5 with respect to the right reference frame is:
[Equation (5); image in the original];
S35. computing the mapping point of the left hand into the left robot's motion space, and the mapping point of the right hand into the right robot's motion space;
wherein, letting the workspace of the left hand relative to the left shoulder and the workspace of the left robot be as given (the workspace bounds are equation images in the original), the mapping point of the left hand into the left robot's motion space is:
[Equation (6); image in the original];
and letting the workspace of the right hand relative to the right shoulder and the workspace of the right robot be as given (likewise images in the original), the mapping point of the right hand into the right robot's motion space is:
[Equation (7); image in the original].
Preferably, said step S4 comprises the following steps:
S41. setting a virtual elastic force F between the arm mapping point P and the robot end E, and giving the virtual manipulator end a mass M; the position of arm mapping point P is not equated directly with arm end E; instead E is drawn toward it by the elastic force F; wherein the arm mapping point P acts as the driving source, the arm end E acts as the driven object, and the elastic force F acts on the arm end E to make it tend toward the arm mapping point P; the elastic force F satisfies the following relation:
[Equation (8); image in the original];
wherein K is the Hooke coefficient, r (shown as an image in the original) is the action radius of the elastic force F, and D is the distance between the arm mapping point and the robot end E; when this distance exceeds r, the arm end is no longer pulled by the elastic force F; thus, when joint-point extraction fails or is erroneous and the joint-point position jumps abruptly, the arm mapping point P jumps with it; at that moment the distance between arm mapping point P and arm end E necessarily exceeds r, the elastic force F is 0, and the arm end E is no longer dragged by the arm mapping point P, thereby filtering out the abrupt jumps caused by failed or erroneous joint-point extraction;
S42. if the arm end E keeps moving while the elastic force F is 0, the system becomes uncontrollable; to prevent this, applying a virtual resistance μ at the arm end E, the resistance μ satisfying the following relation:
[Equation (9); image in the original]; wherein C (an image in the original) is the damping coefficient;
when the distance between arm end E and arm mapping point P is less than the action radius r, the resistance μ is proportional to the relative velocity v of the two points, and when v is 0 the arm end E stops near the arm mapping point P; when the distance between arm end E and arm mapping point P is greater than r, the resistance is infinite and the arm end stops moving at once; the dynamic equation at this moment is:
[Equation (10); image in the original]; wherein [an auxiliary quantity, given as an image in the original];
S43. letting [a parameter choice given as an image in the original], with sampling period T; to let the arm end E reach the arm mapping point P within each sampling period, setting [a condition given as an image in the original], whereupon the distance function is:
[Equation (10); image in the original];
wherein [the two coefficients, shown as images in the original] are integration constants; at this moment the arm end E performs no periodic motion and returns to the equilibrium position, i.e., the arm mapping point, within one sampling period.
Preferably, in said step S4, when [the condition given as an image in the original] holds, it is judged that joint-point extraction has failed and the joint-point position has jumped abruptly; the traction force on the arm end E is then 0 and, the resistance being infinite, the arm end E stops moving at once; when [the complementary condition, an image in the original] holds, the robot control system behaves as a critically damped vibration.
Preferably, said step S5 uses the overdamped vibration model; the overdamped vibration model is: when the arm mapping point P oscillates near the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance that E receives, and so cannot drive the manipulator end E to move;
when [the condition given as an image in the original] holds, the distance function is:
[Equation (11); image in the original];
wherein [the coefficients, shown as images in the original] are integration constants determined by the initial conditions; this case is an overdamped vibration in which friction acts, thereby eliminating the jitter phenomenon.
Compared with the prior art, the present invention has the following beneficial effects. Kinect is a 3D motion-sensing camera (development code name "Project Natal") that integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction. The present invention proposes a contact-free Kinect-based man-machine interaction method for a two-arm robot, which allows the operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete manipulation tasks; this vision-based man-machine interaction helps the operator control the robot and understand its motion and surrounding environment. The invention realizes semi-autonomous shared control based on visual guidance, through which the end effector performing an object-grasping task obtains a more accurate position and range, making grasping more precise.
Description of drawings
Fig. 1 is a schematic diagram of the joint points of the human hand and arm;
Fig. 2 is a modeling diagram of the left-hand coordinate system;
Fig. 3 is a modeling diagram of the right-hand coordinate system.
The specific embodiment
The object of the present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments; the embodiments cannot all be elaborated here one by one, but the implementation of the present invention is not limited to the following examples. Unless otherwise stated, the materials and processing methods adopted by the present invention are those conventional in the art.
The present invention provides a Kinect-based man-machine interaction method for a two-arm robot, wherein the Kinect is a 3D motion-sensing camera; the method comprises the following steps:
S1. establishing a coordinate system for the operator's arm through the 3D motion-sensing camera;
S2. performing skeleton extraction on the image of the operator's arm through a computer;
S3. determining the positions of the index finger and thumb through analysis of the image containing the arm;
S4. using a critically damped vibration model to solve the abrupt-jump problem;
S5. using an overdamped vibration model to suppress jitter.
In step S1, a basic Kinect coordinate system is first established; the goal is to calibrate the positions of the index fingertip (I), the thumb tip (T), the point between the index fingertip and the thumb tip (B), and the upper arm (U).
In step S2, skeleton extraction is performed on the depth image obtained with Kinect, as shown in Fig. 1.
Said step S3 comprises the following steps:
S31. First, the palm bounding box is established: connect P4 and P5 to obtain line P45; then, with P5 as the bottom-center point and line P45 as the central axis, build a palm bounding box B of length L, width W, and height H, as shown in Figs. 2 and 3; bounding box B can enclose the palm, and statistics show that L, W, and H need only take the following values for the bounding box to enclose the palm:
[Equation (1); rendered as an image in the original], where |P45| denotes the length of line segment P45.
S32. For every point B' in bounding box B, compute its distance to P5; then point B2 is the point in bounding box B farthest from P5:
[Equation (2); image in the original];
S33. Connect P5 and B2 to obtain line PB; for every point B' in bounding box B, compute its distance to line PB; then point B1 is the point in bounding box B farthest from line PB:
[Equation (3); image in the original];
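The farthest-point searches of steps S32 and S33 can be sketched as follows. This is a minimal illustration rather than the patent's exact formulas (equations (2) and (3) appear only as images in the original); it assumes the candidate points inside bounding box B are supplied as an N×3 NumPy array, and the helper names are hypothetical:

```python
import numpy as np

def farthest_from_point(points, p5):
    """Step S32: return B2, the point in bounding box B farthest from wrist point P5."""
    dists = np.linalg.norm(points - p5, axis=1)
    return points[np.argmax(dists)]

def farthest_from_line(points, p5, b2):
    """Step S33: return B1, the point in bounding box B farthest from line P5-B2."""
    axis = (b2 - p5) / np.linalg.norm(b2 - p5)   # unit vector along the line
    rel = points - p5
    along = rel @ axis                            # signed projection onto the line
    perp = rel - np.outer(along, axis)            # component orthogonal to the line
    dists = np.linalg.norm(perp, axis=1)
    return points[np.argmax(dists)]
```

B2 and B1 roughly locate the two most protruding points of the hand, such as an extended index finger and the thumb, which is what steps S32 and S33 exploit to find the fingertip positions.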
S34. Choose the arm joint points as reference points; establish a left reference frame on the left-arm point and a right reference frame on the right-arm point (the frame axes are given as equation images in the original).
Then the coordinate of P8 with respect to the left reference frame is:
[Equation (4); image in the original];
and the coordinate of P5 with respect to the right reference frame is:
[Equation (5); image in the original].
S35. Let the workspace of the left hand relative to the left shoulder, the workspace of the right hand relative to the right shoulder, the workspace of the left robot, and the workspace of the right robot be as given (the workspace bounds are equation images in the original).
Then the mapping point of the left hand into the left robot's motion space is:
[Equation (6); image in the original];
and the mapping point of the right hand into the right robot's motion space is:
[Equation (7); image in the original].
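Equations (6) and (7), which appear only as images in the original, map each hand's position in its shoulder-relative workspace to a point in the corresponding robot's motion space. A common form for such a mapping is a per-axis linear rescaling between the two workspaces' bounds; the sketch below assumes that form, and the bound parameters are hypothetical:

```python
import numpy as np

def map_to_robot_space(hand_pos, hand_min, hand_max, robot_min, robot_max):
    """Step S35 (assumed linear form): rescale a shoulder-relative hand position
    into the robot's motion space, axis by axis. All arguments are 3-vectors."""
    hand_pos, hand_min, hand_max = map(np.asarray, (hand_pos, hand_min, hand_max))
    robot_min, robot_max = np.asarray(robot_min), np.asarray(robot_max)
    t = (hand_pos - hand_min) / (hand_max - hand_min)   # normalized position in [0, 1]^3
    return robot_min + t * (robot_max - robot_min)
```

The same function serves both hands: pass the left-shoulder and left-robot workspace bounds for equation (6), and the right-side bounds for equation (7).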
Said step S4 comprises the following steps:
S41. A virtual elastic force F is set between the arm mapping point P and the robot end E, and the virtual manipulator end is given a mass M. The position of the hand mapping point P is not equated directly with the arm end E; instead E is drawn toward it by the elastic force F. The hand mapping point P acts as the driving source, the arm end E acts as the driven object, and the elastic force F acts on the arm end E to make it tend toward the hand mapping point P. The elastic force F satisfies the following relation:
[Equation (8); image in the original];
wherein K is the Hooke coefficient, r (shown as an image in the original) is the action radius of the force, and D is the distance between the hand mapping point and the robot end. When this distance exceeds r, the arm end is no longer pulled by the elastic force, much as a spring disconnects when stretched beyond its limit. Thus, when joint-point extraction fails or is erroneous, the joint-point position jumps abruptly and the arm mapping point P jumps with it; the distance between arm mapping point P and arm end E then necessarily exceeds r, the elastic force is 0, and the arm end is no longer dragged by the arm mapping point, thereby filtering out the abrupt jumps caused by failed or erroneous joint-point extraction.
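The traction behaviour of step S41 can be sketched as follows. Equation (8) is an image in the original, so the Hooke-type form F = K·D inside the action radius is an assumption; the property the text does state, namely that the force vanishes once the separation exceeds the action radius r, is what filters out extraction glitches:

```python
def elastic_force(K, D, r):
    """Virtual elastic force pulling arm end E toward mapping point P (step S41).
    Assumed Hooke-type form: proportional to the separation D inside the action
    radius r, and zero beyond it, like a spring that disconnects when
    overstretched, so a sudden jump of the mapping point stops pulling the end."""
    if D > r:
        return 0.0   # extraction glitch: mapping point jumped out of range
    return K * D
```

A tracked mapping point that suddenly teleports (a failed skeleton extraction) lands outside the radius, the returned force is 0, and the arm end simply stays put.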
S42. When the arm end E is very close to the arm mapping point, or when their distance is greater than the action radius r, the elastic force is essentially 0; if the arm end still has some velocity and no other force acts on it, it will keep moving in some direction, making the system uncontrollable.
To address this problem, a virtual resistance μ is applied to the arm end while it moves; the resistance μ satisfies:
[Equation (9); image in the original]; wherein C (an image in the original) is the damping coefficient.
When the distance between arm end E and arm mapping point P is less than r, the resistance is proportional to the relative velocity v of the two points, and the arm end E stops once it has come sufficiently close to the arm mapping point; when the distance is greater than r, the resistance is infinite and the arm end stops moving at once. The dynamic equation at this moment is: [Equation (10); image in the original]; wherein [an auxiliary quantity, given as an image in the original].
S43. Let [a parameter choice given as an image in the original]. To let the arm end E reach the arm mapping point P within each sampling period, set [a condition given as an image in the original]; the distance function is then:
[Equation (10); image in the original];
wherein [the coefficients, shown as images in the original] are integration constants. At this moment the arm end performs no periodic motion and returns to the equilibrium position (the arm mapping point) within one sampling period.
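Steps S42 and S43 together amount to a mass-spring-damper tuned to critical damping, so the arm end settles on the mapping point as quickly as possible without oscillating. Since equation (10) and the patent's parameter choices are images in the original, the constants below (M, K, and the sampling period T) are purely illustrative:

```python
import math

def settle_once(target, M=1.0, K=40000.0, T=0.05, substeps=2000):
    """Integrate M*x'' = K*(target - x) - C*x' over one sampling period T,
    with C chosen for critical damping (C = 2*sqrt(K*M)). Starting from rest
    at x = 0, the end effector approaches the target without periodic motion."""
    C = 2.0 * math.sqrt(K * M)        # critical damping coefficient
    x, v = 0.0, 0.0
    dt = T / substeps
    for _ in range(substeps):
        a = (K * (target - x) - C * v) / M
        v += a * dt                    # semi-implicit Euler step
        x += v * dt
    return x
```

With these illustrative values the end effector covers essentially the whole distance to the mapping point within one sampling period and does not overshoot it.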
S44. Suppose the algorithm's sampling period is T (an image in the original) and the distance between arm end E and arm mapping point P is D. When [the condition given as an image in the original] holds, it is judged that joint-point extraction has failed and the joint-point position has jumped abruptly; the traction force on the arm end E is then 0 and, the resistance being infinite, the arm end E is regarded as stopping at once. When [the complementary condition, an image in the original] holds, the system behaves as a critically damped vibration.
Said step S5 uses the overdamped vibration model. When the arm mapping point P oscillates within a small range around the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance that E receives, and so cannot drive the manipulator end E to move: this is the overdamped vibration model. When [the condition given as an image in the original] holds, the algorithm model is the overdamped vibration model and the distance function is:
[Equation (11); image in the original];
wherein [the coefficients, shown as images in the original] are integration constants determined by the initial conditions. This case is an overdamped vibration in which friction acts, thereby eliminating the jitter phenomenon.
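The jitter suppression of step S5 can be demonstrated numerically: raising the damping ratio above 1 makes the same spring-damper follower overdamped, so it responds too sluggishly to track small, fast oscillations of the mapping point and settles on the equilibrium position instead. Equation (11) is an image in the original, so all parameter values here are illustrative:

```python
import math

def follow(targets, zeta, M=1.0, K=100.0, dt=0.01):
    """Track a sequence of mapping-point positions with a virtual spring-damper.
    zeta > 1 gives the overdamped model of step S5: the arm end ignores small,
    fast oscillations of the mapping point and settles near its mean position."""
    C = 2.0 * zeta * math.sqrt(K * M)   # damping coefficient for ratio zeta
    x, v = 0.0, 0.0
    trajectory = []
    for tgt in targets:
        a = (K * (tgt - x) - C * v) / M
        v += a * dt                      # semi-implicit Euler step
        x += v * dt
        trajectory.append(x)
    return trajectory

# mapping point jitters by +/- 0.05 around the arm-end equilibrium at 1.0
jittery = [1.0 + 0.05 * math.sin(i) for i in range(500)]
smooth = follow(jittery, zeta=3.0)
```

The follower's residual ripple is far smaller than the mapping point's swing, while slow, deliberate hand motions (low-frequency targets) would still be tracked because the model's steady-state gain is 1.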
The above embodiment is merely a preferred embodiment of the present invention and is not intended to limit the scope of practice of the present invention. All equivalent changes and modifications made according to the content of the present invention shall be covered by the scope claimed by the present invention.

Claims (7)

1. A Kinect-based man-machine interaction method for a two-arm robot, wherein the Kinect is a 3D motion-sensing camera, characterized by comprising the following steps:
S1. establishing a coordinate system for the operator's arm through the 3D motion-sensing camera;
S2. performing skeleton extraction on the image of the operator's arm through a computer;
S3. determining the positions of the index finger and thumb through analysis of the image containing the arm;
S4. using a critically damped vibration model to solve the abrupt-jump problem;
S5. using an overdamped vibration model to suppress jitter.
2. The Kinect-based man-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S1 comprises:
S11. first establishing a basic coordinate system and calibrating the positions of the index fingertip, the thumb tip, the point between them, and the upper arm.
3. The Kinect-based man-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S2 comprises the following step:
S21. obtaining a depth image with Kinect, and performing skeleton extraction on the depth image.
4. The Kinect-based man-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S3 comprises the following steps:
S31. letting the operator's arm mapping points be P, labeled P1~PN for the different joints; first building the palm bounding box: connecting P4 and P5 to obtain line P45; then, with P5 as the bottom-center point and line P45 as the central axis, building a palm bounding box B of length L, width W, and height H, where statistics show that for bounding box B to enclose the palm, L, W, and H need only take the following values:
[Equation (1); rendered as an image in the original], where |P45| denotes the length of line segment P45;
S32. for every point B' in palm bounding box B, computing its distance to P5, whereby point B2 is the point in bounding box B farthest from P5:
[Equation (2); image in the original];
S33. connecting P5 and B2 to obtain line PB, and for every point B' in bounding box B, computing its distance to line PB, whereby point B1 is the point in bounding box B farthest from line PB:
[Equation (3); image in the original];
S34. choosing the arm joint points as reference points; establishing a left reference frame on the left-arm point and a right reference frame on the right-arm point (the frame axes are given as equation images in the original);
whereby the coordinate of P8 with respect to the left reference frame is:
[Equation (4); image in the original];
and the coordinate of P5 with respect to the right reference frame is:
[Equation (5); image in the original];
S35. computing the mapping point of the left hand into the left robot's motion space, and the mapping point of the right hand into the right robot's motion space;
wherein, letting the workspace of the left hand relative to the left shoulder and the workspace of the left robot be as given (the workspace bounds are equation images in the original), the mapping point of the left hand into the left robot's motion space is:
[Equation (6); image in the original];
and letting the workspace of the right hand relative to the right shoulder and the workspace of the right robot be as given (likewise images in the original), the mapping point of the right hand into the right robot's motion space is:
[Equation (7); image in the original].
5. The Kinect-based man-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S4 comprises the following steps:
S41. setting a virtual elastic force F between the arm mapping point P and the robot end E, and giving the virtual manipulator end a mass M; the position of arm mapping point P is not equated directly with arm end E; instead E is drawn toward it by the elastic force F; wherein the arm mapping point P acts as the driving source, the arm end E acts as the driven object, and the elastic force F acts on the arm end E to make it tend toward the arm mapping point P; the elastic force F satisfying the following relation:
[Equation (8); image in the original];
wherein K is the Hooke coefficient, r (shown as an image in the original) is the action radius of the elastic force F, and D is the distance between the arm mapping point and the robot end E; when this distance exceeds r, the arm end is no longer pulled by the elastic force F; thus, when joint-point extraction fails or is erroneous and the joint-point position jumps abruptly, the arm mapping point P jumps with it; at that moment the distance between arm mapping point P and arm end E necessarily exceeds r, the elastic force F is 0, and the arm end E is no longer dragged by the arm mapping point P, thereby filtering out the abrupt jumps caused by failed or erroneous joint-point extraction;
S42. if the arm end E keeps moving while the elastic force F is 0, the system becomes uncontrollable; to prevent this, applying a virtual resistance μ at the arm end E, the resistance μ satisfying the following relation:
[Equation (9); image in the original]; wherein C (an image in the original) is the damping coefficient;
when the distance between arm end E and arm mapping point P is less than r, the resistance μ is proportional to the relative velocity v of the two points, and when v is 0 the arm end E stops near the arm mapping point P; when the distance between arm end E and arm mapping point P is greater than r, the resistance is infinite and the arm end stops moving at once; the dynamic equation at this moment being:
[Equation (10); image in the original]; wherein [an auxiliary quantity, given as an image in the original];
S43. letting [a parameter choice given as an image in the original], with sampling period T; to let the arm end E reach the arm mapping point P within each sampling period, setting [a condition given as an image in the original], whereupon the distance function is:
[Equation (10); image in the original];
wherein [the coefficients, shown as images in the original] are integration constants; at this moment the arm end E performs no periodic motion and returns to the equilibrium position, i.e., the arm mapping point, within one sampling period.
6. The Kinect-based man-machine interaction method for a two-arm robot according to claim 5, characterized in that, in said step S4, when [the condition given as an image in the original] holds, it is judged that joint-point extraction has failed and the joint-point position has jumped abruptly; the traction force on the arm end E is then 0 and, the resistance being infinite, the arm end E stops moving at once; when [the complementary condition, an image in the original] holds, the robot control system is in the critically damped vibration model.
7. The Kinect-based man-machine interaction method for a two-arm robot according to claim 5, characterized in that said step S5 uses the overdamped vibration model; the overdamped vibration model being: when the arm mapping point P oscillates near the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance that E receives, and so cannot drive the manipulator end E to move;
when [the condition given as an image in the original] holds, the distance function is:
[Equation (11); image in the original];
wherein [the coefficients, shown as images in the original] are integration constants determined by the initial conditions; this case is an overdamped vibration in which friction acts, thereby eliminating the jitter phenomenon.
CN201210267315.3A 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot Active CN102814814B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210267315.3A CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210267315.3A CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Publications (2)

Publication Number Publication Date
CN102814814A true CN102814814A (en) 2012-12-12
CN102814814B CN102814814B (en) 2015-07-01

Family

ID=47299436

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210267315.3A Active CN102814814B (en) 2012-07-31 2012-07-31 Kinect-based man-machine interaction method for two-arm robot

Country Status (1)

Country Link
CN (1) CN102814814B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102470530A (en) * 2009-11-24 2012-05-23 株式会社丰田自动织机 Method of producing teaching data of robot and robot teaching system
US8123622B1 (en) * 2011-06-03 2012-02-28 Nyko Technologies, Inc. Lens accessory for video game sensor device
CN202257985U (en) * 2011-10-27 2012-05-30 温萍萍 Image interaction device suitable for autism of children

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
GUANGLONG DU ET AL.: "Markerless Kinect-Based Hand Tracking for Robot Teleoperation", International Journal of Advanced Robotic Systems, vol. 9, no. 36, 13 July 2012 (2012-07-13), pages 3 - 2 *
YAO ZHENGWEI: "Research on Several Issues in Large-Scale Real-Time Interactive Augmented Reality Systems", China Doctoral Dissertations Full-text Database, 15 January 2011 (2011-01-15), pages 54 *
WANG MINGDONG: "Gesture Control of a PC Based on the Kinect Skeleton-Tracking Function", Journal of Zhangzhou Institute of Technology, vol. 14, no. 2, 30 June 2012 (2012-06-30) *

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103112007A (en) * 2013-02-06 2013-05-22 华南理工大学 Human-machine interaction method based on mixing sensor
CN103112007B (en) * 2013-02-06 2015-10-28 华南理工大学 Based on the man-machine interaction method of hybrid sensor
US11609631B2 (en) 2013-05-20 2023-03-21 Intel Corporation Natural human-computer interaction for virtual personal assistant systems
CN109584868A (en) * 2013-05-20 2019-04-05 英特尔公司 Natural Human-Computer Interaction for virtual personal assistant system
CN109584868B (en) * 2013-05-20 2022-12-13 英特尔公司 Natural human-computer interaction for virtual personal assistant system
CN103279206A (en) * 2013-06-15 2013-09-04 苏州时运机器人有限公司 Robot control system with gesture-sensing teaching machine
CN103386683A (en) * 2013-07-31 2013-11-13 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103386683B (en) * 2013-07-31 2015-04-08 哈尔滨工程大学 Kinect-based motion sensing-control method for manipulator
CN103398702B (en) * 2013-08-05 2015-08-19 青岛海通机器人系统有限公司 A kind of mobile robot's remote operation controller and manipulation technology thereof
CN103398702A (en) * 2013-08-05 2013-11-20 青岛海通机器人系统有限公司 Mobile-robot remote control apparatus and control technology
CN103440037B (en) * 2013-08-21 2017-02-08 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information
CN103440037A (en) * 2013-08-21 2013-12-11 中国人民解放军第二炮兵工程大学 Real-time interaction virtual human body motion control method based on limited input information
CN104308844A (en) * 2014-08-25 2015-01-28 中国石油大学(华东) Somatosensory control method of five-finger bionic mechanical arm
CN104440926A (en) * 2014-12-09 2015-03-25 重庆邮电大学 Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect
CN104680525B (en) * 2015-02-12 2017-05-10 南通大学 Automatic human body fall-down detection method based on Kinect depth image
CN104680525A (en) * 2015-02-12 2015-06-03 南通大学 Automatic human body fall-down detection method based on Kinect depth image
CN104850120A (en) * 2015-03-19 2015-08-19 武汉科技大学 Wheel type mobile robot navigation method based on IHDR self-learning frame
CN104850120B (en) * 2015-03-19 2017-11-10 武汉科技大学 Wheeled mobile robot air navigation aid based on IHDR autonomous learning frameworks
CN105291138A (en) * 2015-11-26 2016-02-03 华南理工大学 Visual feedback platform improving virtual reality immersion degree
CN105719279A (en) * 2016-01-15 2016-06-29 上海交通大学 Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method
CN105719279B (en) * 2016-01-15 2018-07-13 上海交通大学 Based on the modeling of cylindroid trunk and arm regions segmentation and arm framework extraction method
CN105904457A (en) * 2016-05-16 2016-08-31 西北工业大学 Heterogeneous redundant mechanical arm control method based on position tracker and data glove
CN105943163A (en) * 2016-06-27 2016-09-21 重庆金山科技(集团)有限公司 Minimally invasive surgery robot and control device thereof
CN106774896A (en) * 2016-12-19 2017-05-31 吉林大学 A kind of sitting posture hand assembly line model is worth evaluating system
CN106774896B (en) * 2016-12-19 2018-03-13 吉林大学 A kind of sitting posture hand assembly line model is worth evaluating system
CN106644090A (en) * 2016-12-29 2017-05-10 中南大学 Method and system of application deportment testing based on kinect
CN107030692A (en) * 2017-03-28 2017-08-11 浙江大学 One kind is based on the enhanced manipulator teleoperation method of perception and system
CN106971050A (en) * 2017-04-18 2017-07-21 华南理工大学 A kind of Darwin joint of robot Mapping Resolution methods based on Kinect
CN106971050B (en) * 2017-04-18 2020-04-28 华南理工大学 Kinect-based Darwin robot joint mapping analysis method
CN107272593A (en) * 2017-05-23 2017-10-20 陕西科技大学 A kind of robot body-sensing programmed method based on Kinect
WO2018219194A1 (en) * 2017-06-02 2018-12-06 东南大学 Cyber arm-based teleoperation system for space station robot
CN107225573A (en) * 2017-07-05 2017-10-03 上海未来伙伴机器人有限公司 The method of controlling operation and device of robot
CN108638069A (en) * 2018-05-18 2018-10-12 南昌大学 A kind of mechanical arm tail end precise motion control method
CN109108970A (en) * 2018-08-22 2019-01-01 南通大学 A kind of reciprocating mechanical arm control method based on bone nodal information
CN109108970B (en) * 2018-08-22 2021-11-09 南通大学 Interactive mechanical arm control method based on skeleton node information
CN109330688A (en) * 2018-12-10 2019-02-15 中山市环能缪特斯医疗器械科技有限公司 Safe self-monitoring endoscopic auxiliary manipulator and its intelligence control system

Also Published As

Publication number Publication date
CN102814814B (en) 2015-07-01

Similar Documents

Publication Publication Date Title
CN102814814A (en) Kinect-based man-machine interaction method for two-arm robot
Krupke et al. Comparison of multimodal heading and pointing gestures for co-located mixed reality human-robot interaction
US20210205986A1 (en) Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose
CN103112007B (en) Based on the man-machine interaction method of hybrid sensor
Li et al. Survey on mapping human hand motion to robotic hands for teleoperation
CN105677036A (en) Interactive type data glove
Naceri et al. Towards a virtual reality interface for remote robotic teleoperation
WO2019024577A1 (en) Natural human-computer interaction system based on multi-sensing data fusion
CN102830798A (en) Mark-free hand tracking method of single-arm robot based on Kinect
Ha et al. Whole-body multi-modal semi-autonomous teleoperation of mobile manipulator systems
Zubrycki et al. Using integrated vision systems: three gears and leap motion, to control a 3-finger dexterous gripper
Dwivedi et al. Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system
Liang et al. An Augmented Discrete‐Time Approach for Human‐Robot Collaboration
Liang et al. Robot teleoperation system based on mixed reality
Tran et al. A hands-free virtual-reality teleoperation interface for wizard-of-oz control
Falck et al. DE VITO: A dual-arm, high degree-of-freedom, lightweight, inexpensive, passive upper-limb exoskeleton for robot teleoperation
Zhou et al. Embodied robot teleoperation based on high-fidelity visual-haptic simulator: Pipe-fitting example
George et al. Openvr: Teleoperation for manipulation
CN108062102A (en) A kind of gesture control has the function of the Mobile Robot Teleoperation System Based of obstacle avoidance aiding
Sharma et al. Design and implementation of robotic hand control using gesture recognition
Liu et al. Virtual reality based tactile sensing enhancements for bilateral teleoperation system with in-hand manipulation
US20220101477A1 (en) Visual Interface And Communications Techniques For Use With Robots
Sylari et al. Hand gesture-based on-line programming of industrial robot manipulators
CN202507280U (en) Humanoid robot
KR20230100101A (en) Robot control system and method for robot setting and robot control using the same

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant