CN102814814A - Kinect-based man-machine interaction method for two-arm robot - Google Patents

- Publication number: CN102814814A (application CN201210267315)
- Authority: CN (China)
- Prior art keywords: arm, point, robot, mapping point, arm end
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
- Landscapes: Manipulator (AREA)
Abstract
The invention discloses a Kinect-based human-machine interaction method for a two-arm robot. The method allows an operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete an operation task, and the vision-based interaction helps the operator control the robot while understanding its actions and surrounding environment. The method realizes semi-autonomous shared control based on visual guidance: under this shared control, the end effector performing an object-grasping task obtains a more exact position and range, so the grasp is more accurate.
Description
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a Kinect-based human-machine interaction method for a two-arm robot.
Background technology
When humans cannot be present at a robot's work site, having an operator control a telerobot in an unstructured, dynamic environment is essential.

To accomplish an operation task, teleoperated manipulators driven through contact interfaces demand unnatural human motions, and such interfaces tend to hinder the operator's movement.

Existing vision-based interaction approaches either use only a few degrees of freedom of the human hand or require the operator to make large-amplitude motions, making operation laborious.

Commonly used teleoperation interfaces include devices worn by the operator, such as exoskeleton-type mechanisms, instrumented gloves, electromagnetic or inertial motion-tracking sensors, and electromyographic sensors. However, these worn devices, their sensors, and their cabling can hinder the operator from moving freely. Other contact interfaces, such as replica robot arms, dials, joysticks, computer mice, and computer graphical interfaces, likewise require unnatural actions or force the operator to learn how to operate them. Vision-based interaction techniques that combine gesture recognition with speech recognition let the operator issue commands more naturally and intuitively, without the restrictions of a physical device. Under such control, however, the actions of the controlled robot are composed of simple commands: confirming the position and orientation of the end effector, grasping, and a series of smaller tasks such as moving up or down, rotating, pausing, resuming, and switching operating modes. When the required actions of the manipulator are complex, this style of control becomes difficult. Single-camera hand tracking has been used in simulated and real robot teleoperation, yet its motion commands are confined to a subset of the end effector's true degrees of freedom; in addition, gestures unrelated to the specific operation task are used to switch the robot's operating mode, which makes the exchange of task commands unnatural. Ideally, people need a natural method that neither decomposes complex tasks into a limited instruction set nor imposes the constraints of a contact device.
Summary of the invention
The object of the present invention is to address the technical shortcomings of existing human-machine interaction methods by providing a Kinect-based human-machine interaction method for a two-arm robot.

To achieve the above object, the present invention adopts the following technical scheme:

A Kinect-based human-machine interaction method for a two-arm robot is provided, wherein the Kinect is a 3D motion-sensing camera. The method comprises the following steps:

S1, establishing a coordinate system for the operator's arms through the 3D motion-sensing camera;
S2, performing skeleton extraction on the image of the operator's arms through a computer;
S3, determining the positions of the forefinger and thumb through analysis of the image containing the arms;
S4, using a critically damped vibration model to solve the sudden-jump problem;
S5, using an overdamped vibration model to suppress jitter.
Preferably, said step S1 comprises:
S11, first establishing a basic coordinate system and calibrating the positions of the fingertip, the thumb tip, the point between the fingertip and the thumb tip, and the upper arm.
Preferably, said step S2 comprises the following step:
S21, using the Kinect to obtain a depth image and performing skeleton extraction on the depth image.
Preferably, said step S3 comprises the following steps:

S31, denoting the operator's arm mapping points as P, labelled P1 to PN for the different joint positions. First, the palm bounding box is established: connect P4 and P5 to obtain the line P45; then, with P5 as the bottom-centre point and the line P45 as the central axis, build a palm bounding box B of length L, width W, and height H. It is found through statistics that for this bounding box B to enclose the palm, L, W, and H need only take the following values:

S32, for every point B' in the palm bounding box B, compute its distance to P5; the point B2 is then the point in B farthest from P5:

S33, connect P5 and B2 to obtain the line PB; for every point B' in the bounding box B, compute its distance to the line PB; the point B1 is then the point in B farthest from the line PB:

S34, take the hand and arm joint points as reference points, establishing a left reference frame on the left-arm points and a right reference frame on the right-arm points;

S35, computing the mapping point of the left hand into the left robot's motion space and the mapping point of the right hand into the right robot's motion space. If the motion space of the left hand relative to the left shoulder and the motion space of the left robot are given, the mapping point of the left hand into the left robot's motion space follows; likewise, if the motion space of the right hand relative to the right shoulder and the motion space of the right robot are given, the mapping point of the right hand into the right robot's motion space follows.
Preferably, said step S4 comprises the following steps:

S41, a virtual elastic force F is established between the arm mapping point P and the robot end E, and the virtual end effector is given a mass M. The position of the arm mapping point P is not equated directly with the arm end E; instead, E is drawn toward it by the elastic force F. Here the arm mapping point P acts as the drive source, the arm end E is the driven object, and F acts on E so that E tends toward P. The elastic force F satisfies the following relation:

where K is the Hooke coefficient, r is the action radius of the elastic force F, and D is the distance between the arm mapping point and the robot end E. When this distance exceeds r, the arm end no longer receives the traction of F. Thus when joint-point extraction fails or is erroneous, the joint position jumps abruptly and the arm mapping point P jumps with it; at that moment the distance between P and E necessarily exceeds r, the elastic force F is 0, and E is no longer drawn by P, which filters out the jump caused by the failed or erroneous extraction.

S42, if the elastic force F is 0 while the arm end E keeps moving, the system becomes uncontrollable. To prevent this, a virtual resistance μ is applied at the arm end E, satisfying the following relation:

When the distance between E and P is less than r, the resistance μ is proportional to the relative velocity v of the two points; when v is 0, E stops near P. When the distance between E and P is greater than r, the resistance is infinite and the arm end stops at once. The dynamics equation at this time is:

S43, the damping coefficient is set to the critical value, with sampling time T. So that the arm end E reaches the arm mapping point P within each sampling period, the distance function at this time is:

where C1 and C2 are integral constants. The arm end E then performs no periodic motion and returns to the equilibrium position, i.e. the arm mapping point, within one sampling period.

Preferably, in said step S4, when the distance D exceeds the action radius r, it is judged that joint-point extraction has failed and the joint position has jumped; the traction on the arm end E is then 0 and, the resistance being infinite, E stops at once. When D does not exceed r, the robot control system is in critically damped vibration.
Preferably, said step S5 uses the overdamped vibration model. The overdamped vibration model is as follows: when the arm mapping point P vibrates in a small range around the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance E receives, so it cannot drive the end E of the manipulator to move. When the damping exceeds the critical value, the distance function is:

where C1 and C2 are integral constants determined by the initial conditions. This situation is overdamped vibration, in which the motion is attenuated; jitter is thereby eliminated.
Compared with the prior art, the present invention has the following beneficial effects. Kinect is a 3D motion-sensing camera (development code name "Project Natal") that integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction. The present invention proposes a contact-free Kinect-based human-machine interaction method for a two-arm robot that allows the operator to control a six-degree-of-freedom virtual robot through natural three-dimensional hand motions to complete an operation task; the vision-based interaction helps the operator control the robot and understand its motion and surrounding environment. The invention realizes semi-autonomous shared control based on visual guidance, whereby the end effector performing an object-grasping task obtains a more exact position and range, making the grasp more accurate.
Description of drawings
Fig. 1 is a sketch map of the human hand and arm joint points;
Fig. 2 is a modeling diagram of the left-hand coordinate system;
Fig. 3 is a modeling diagram of the right-hand coordinate system.
The specific embodiment
The object of the invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments. Not every detail of the embodiments is repeated here, but the implementation of the present invention is not limited to the following examples. Unless stated otherwise, the materials and processing methods adopted by the present invention are those conventional in the art.
The present invention provides a Kinect-based human-machine interaction method for a two-arm robot, wherein the Kinect is a 3D motion-sensing camera. The method comprises the following steps:

S1, establishing a coordinate system for the operator's arms through the 3D motion-sensing camera;
S2, performing skeleton extraction on the image of the operator's arms through a computer;
S3, determining the positions of the forefinger and thumb through analysis of the image containing the arms;
S4, using a critically damped vibration model to solve the sudden-jump problem;
S5, using an overdamped vibration model to suppress jitter.

In step S1, a basic Kinect coordinate system is first established; the targets are to calibrate the positions of the fingertip (I), the thumb tip (T), the point between the fingertip and the thumb tip (B), and the upper arm (U).
In step S2, the depth image obtained by the Kinect is used for skeleton extraction, as shown in Fig. 1.
Said step S3 comprises the following steps:

S31, first the palm bounding box is established: connect P4 and P5 to obtain the line P45; then, with P5 as the bottom-centre point and the line P45 as the central axis, build a palm bounding box B of length L, width W, and height H, as shown in Figs. 2 and 3. This palm bounding box B can enclose the palm; it is found through statistics that L, W, and H need only take the following values for the bounding box to enclose the palm:

S32, for every point B' in the bounding box B, compute its distance to P5; the point B2 is then the point in B farthest from P5:

S33, connect P5 and B2 to obtain the line PB; for every point B' in the bounding box B, compute its distance to the line PB; the point B1 is then the point in B farthest from the line PB:

S34, take the hand and arm joint points as reference points, establishing a left reference frame on the left-arm points and a right reference frame on the right-arm points.
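Steps S32 and S33 above amount to two farthest-point searches over the points inside the palm bounding box. The following is a minimal sketch in Python with NumPy, assuming the box contents are available as a point array; the function and variable names are illustrative, not from the patent:

```python
import numpy as np

def locate_finger_tips(box_points, p5):
    """Locate the two tip points described in steps S32-S33.

    box_points : (N, 3) array of hand points inside the palm bounding box B
    p5         : (3,) array, the wrist/palm-base joint used as box anchor

    Returns (b2, b1): b2 is the point farthest from P5 (step S32);
    b1 is the point farthest from the line P5-B2 (step S33).
    """
    pts = np.asarray(box_points, dtype=float)
    p5 = np.asarray(p5, dtype=float)

    # S32: farthest point from P5 inside the box.
    d_to_p5 = np.linalg.norm(pts - p5, axis=1)
    b2 = pts[np.argmax(d_to_p5)]

    # S33: farthest point from the line through P5 and B2, using the
    # point-to-line distance |(p - p5) x u| with u a unit direction.
    u = (b2 - p5) / np.linalg.norm(b2 - p5)
    d_to_line = np.linalg.norm(np.cross(pts - p5, u), axis=1)
    b1 = pts[np.argmax(d_to_line)]
    return b2, b1
```

With the hand roughly aligned along the box axis, B2 tends to land on the extended forefinger tip and B1 on the thumb tip, which is consistent with step S3's goal of locating the forefinger and thumb.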
S35, let the motion space of the left hand relative to the left shoulder, the motion space of the right hand relative to the right shoulder, the motion space of the left robot, and the motion space of the right robot be given.

Then the mapping point of the left hand into the left robot's motion space is:

Then the mapping point of the right hand into the right robot's motion space is:
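The mapping formulas (6) and (7) themselves are not reproduced in this text. One plausible reading of mapping the hand's space relative to the shoulder into the robot's motion space is a per-axis linear interpolation between the two boxes; the sketch below illustrates that assumption only, with all names illustrative:

```python
import numpy as np

def map_hand_to_robot(hand_pos, hand_min, hand_max, robot_min, robot_max):
    """Map a hand position (relative to the shoulder) into the robot's
    motion space by per-axis linear interpolation.

    This is an assumed proportional mapping, not the patent's exact
    formulas (6)/(7), which appear only as images in the original.
    """
    hand_pos = np.asarray(hand_pos, dtype=float)
    hand_min, hand_max = np.asarray(hand_min, float), np.asarray(hand_max, float)
    robot_min, robot_max = np.asarray(robot_min, float), np.asarray(robot_max, float)

    # Normalised coordinate of the hand inside its reachable box, in [0, 1].
    t = (hand_pos - hand_min) / (hand_max - hand_min)
    t = np.clip(t, 0.0, 1.0)  # keep the mapping point inside the robot workspace

    # Corresponding mapping point in the robot's motion space.
    return robot_min + t * (robot_max - robot_min)
```

The same function would serve both arms, called once with the left-shoulder/left-robot boxes and once with the right-shoulder/right-robot boxes.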
Said step S4 comprises the following steps:

S41, a virtual elastic force F is established between the arm mapping point P and the robot end E, and the virtual end effector is given a mass M. The position of the hand mapping point P is not equated directly with the arm end E; instead, E is drawn toward it by the elastic force F. The mapping point P of the hand acts as the drive source, the arm end E is the driven object, and F acts on E so that E tends toward the mapping point P. The elastic force F satisfies relation (8), where K is the Hooke coefficient, r is the action radius of the force, and D is the distance between the hand mapping point and the robot end. When this distance exceeds r, the arm end no longer receives the elastic traction, much as a spring breaks when stretched beyond its limit length. Thus when joint-point extraction fails or is erroneous, the joint position jumps abruptly and the arm mapping point P jumps with it; at that moment the distance between P and E necessarily exceeds r, the elastic force is 0, and the arm end is no longer drawn by the mapping point, which filters out the jump caused by the failed or erroneous extraction.

S42, when the arm end E is very close to the arm mapping point, or when their distance exceeds r, the elastic force is essentially 0; if the arm end still has some velocity and is subject to no other force, it will keep moving in some direction and the system becomes uncontrollable. To solve this problem, a virtual resistance μ is applied to the moving arm end, satisfying relation (9), where c is the damping coefficient. When the distance between E and P is less than r, the resistance is proportional to the relative velocity v of the two points, and E stops after drawing sufficiently near the mapping point; when the distance exceeds r, the resistance is infinite and the arm end stops at once. The dynamics equation at this time is (10).
S43, the damping coefficient is set to the critical value. So that the arm end E reaches the arm mapping point P within each sampling period, the distance function at this time is:

where C1 and C2 are integral constants. The arm end then performs no periodic motion and returns to the equilibrium position (the arm mapping point) within one sampling period.
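The virtual spring-damper of steps S41-S43 can be sketched as a discrete-time simulation. The sketch below assumes illustrative values for the mass M, Hooke coefficient K, action radius r, and sampling step, and uses semi-implicit Euler integration; it is an interpretation of the model, not the patent's implementation:

```python
import numpy as np

def step_end_effector(e_pos, e_vel, p_pos, m=1.0, k=100.0, r=0.5, dt=0.01):
    """One integration step of the virtual spring-damper of steps S41-S43.

    e_pos, e_vel : position / velocity of the virtual end effector E
    p_pos        : the arm mapping point P (the drive source)
    m, k, r, dt  : assumed mass, Hooke coefficient, action radius, step size

    Outside the radius r the spring force is zero and E stops at once
    (the "infinite resistance" rule), which filters sudden jumps in P.
    Inside r, damping is set to c = 2*sqrt(m*k), the critical value,
    so E settles onto P without oscillating.
    """
    d = p_pos - e_pos
    if np.linalg.norm(d) > r:
        # Mapping-point jump (e.g. failed joint extraction): F = 0,
        # resistance treated as infinite -> E stops immediately.
        return e_pos, np.zeros_like(e_vel)
    c = 2.0 * np.sqrt(m * k)           # critical damping coefficient
    force = k * d - c * e_vel          # spring pull minus viscous drag
    e_vel = e_vel + (force / m) * dt   # semi-implicit Euler integration
    e_pos = e_pos + e_vel * dt
    return e_pos, e_vel
```

Iterating this step with a fixed P pulls E onto P monotonically (no overshoot), while a jump of P beyond the radius freezes E instead of dragging it across the workspace.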
S44, supposing the algorithm sampling time is T and the distance between the arm end E and the arm mapping point P is D: when D exceeds the action radius r, it is judged that joint-point extraction has failed and the joint position has jumped; the traction on the arm end E is then 0 and, the resistance being infinite, E is taken to stop at once. When D does not exceed r, the system is in critically damped vibration.
Said step S5 uses the overdamped vibration model. When the arm mapping point P vibrates in a small range around the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance E receives, so it cannot drive the end E of the manipulator to move: this is the overdamped vibration model. When the damping is set above the critical value, the algorithm model is the overdamped vibration model, and the distance function is:

where C1 and C2 are integral constants determined by the initial conditions. This situation is overdamped vibration, in which the motion is attenuated; jitter is thereby eliminated.
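The distance functions referenced in steps S43 and S5 appear in the original only as images. They are, however, the standard solutions of the spring-damper equation implied by steps S41-S42, so a reconstruction under that assumption (with c the damping coefficient and C1, C2 the integral constants) reads:

```latex
% Governing equation of the virtual spring-damper (step S42):
M \ddot{D} + c \dot{D} + K D = 0
% Critical damping (c^2 = 4MK), step S43: no oscillation,
% E returns to P within one sampling period:
D(t) = \left(C_1 + C_2 t\right) e^{-\frac{c}{2M} t}
% Overdamping (c^2 > 4MK), step S5: the motion decays without
% oscillation, suppressing jitter:
D(t) = C_1 e^{\lambda_1 t} + C_2 e^{\lambda_2 t},
\qquad \lambda_{1,2} = \frac{-c \pm \sqrt{c^2 - 4MK}}{2M}
```

In the overdamped case both exponents are negative and real, so small oscillations of the mapping point P die out at the end effector instead of being reproduced as jitter.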
The foregoing embodiments are merely preferred embodiments of the present invention and are not intended to limit its scope of practice. All equivalent changes and modifications made according to the content of the present invention are covered by the scope claimed by the claims of the present invention.
Claims (7)
1. A Kinect-based human-machine interaction method for a two-arm robot, wherein the Kinect is a 3D motion-sensing camera, characterized by comprising the following steps:
S1, establishing a coordinate system for the operator's arms through the 3D motion-sensing camera;
S2, performing skeleton extraction on the image of the operator's arms through a computer;
S3, determining the positions of the forefinger and thumb through analysis of the image containing the arms;
S4, using a critically damped vibration model to solve the sudden-jump problem;
S5, using an overdamped vibration model to suppress jitter.
2. The Kinect-based human-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S1 comprises:
S11, first establishing a basic coordinate system and calibrating the positions of the fingertip, the thumb tip, the point between the fingertip and the thumb tip, and the upper arm.
3. The Kinect-based human-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S2 comprises the following step:
S21, using the Kinect to obtain a depth image and performing skeleton extraction on the depth image.
4. The Kinect-based human-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S3 comprises the following steps:
S31, denoting the operator's arm mapping points as P, labelled P1 to PN for the different joint positions; first establishing the palm bounding box: connect P4 and P5 to obtain the line P45; then, with P5 as the bottom-centre point and the line P45 as the central axis, build a palm bounding box B of length L, width W, and height H; it is found through statistics that for this bounding box B to enclose the palm, L, W, and H need only take the following values:
S32, for every point B' in the palm bounding box B, compute its distance to P5; the point B2 is then the point in B farthest from P5:
S33, connect P5 and B2 to obtain the line PB; for every point B' in the bounding box B, compute its distance to the line PB; the point B1 is then the point in B farthest from the line PB:
S34, taking the hand and arm joint points as reference points, establishing a left reference frame on the left-arm points and a right reference frame on the right-arm points;
S35, computing the mapping point of the left hand into the left robot's motion space and the mapping point of the right hand into the right robot's motion space; wherein, if the motion space of the left hand relative to the left shoulder and the motion space of the left robot are given, the mapping point of the left hand into the left robot's motion space is given by formula (6); and if the motion space of the right hand relative to the right shoulder and the motion space of the right robot are given, the mapping point of the right hand into the right robot's motion space is given by formula (7).
5. The Kinect-based human-machine interaction method for a two-arm robot according to claim 1, characterized in that said step S4 comprises the following steps:
S41, establishing a virtual elastic force F between the arm mapping point P and the robot end E, the virtual end effector being given a mass M; the position of the arm mapping point P is not equated directly with the arm end E, which is instead drawn toward it by the elastic force F; wherein the arm mapping point P acts as the drive source, the arm end E is the driven object, and F acts on E so that E tends toward P; the elastic force F satisfies the following relation:
wherein K is the Hooke coefficient, r is the action radius of the elastic force F, and D is the distance between the arm mapping point and the robot end E; when this distance exceeds r, the arm end no longer receives the traction of F, so that when joint-point extraction fails or is erroneous and the joint position jumps abruptly, causing the arm mapping point P to jump as well, the distance between P and E necessarily exceeds r, the elastic force F is 0, and E is no longer drawn by P, thereby filtering out the jump caused by the failed or erroneous extraction;
S42, if the elastic force F is 0 while the arm end E keeps moving, the system becomes uncontrollable; to prevent this, a virtual resistance μ is applied at the arm end E, satisfying the following relation:
when the distance between E and P is less than r, the resistance μ is proportional to the relative velocity v of the two points, and when v is 0 the arm end E stops near P; when the distance between E and P is greater than r, the resistance is infinite and the arm end stops at once; the dynamics equation at this time is:
S43, setting the damping to the critical value, with sampling time T, and letting the arm end E reach the arm mapping point P within each sampling period, the distance function at this time is:
6. The Kinect-based human-machine interaction method for a two-arm robot according to claim 5, characterized in that, in said step S4, when the distance D exceeds the action radius r, it is judged that joint-point extraction has failed and the joint position has jumped; the traction on the arm end E is then 0 and, the resistance being infinite, E stops at once; when D does not exceed r, the robot control system is in the critically damped vibration model.
7. The Kinect-based human-machine interaction method for a two-arm robot according to claim 5, characterized in that said step S5 uses the overdamped vibration model; the overdamped vibration model being: when the arm mapping point P vibrates around the arm end E, the attraction that P exerts on E is insufficient to overcome the resistance E receives, so it cannot drive the end E of the manipulator to move, as expressed by formula (11).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201210267315.3A CN102814814B (en) | 2012-07-31 | 2012-07-31 | Kinect-based man-machine interaction method for two-arm robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN102814814A (en) | 2012-12-12 |
CN102814814B (en) | 2015-07-01 |
Family
ID=47299436
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210267315.3A Active CN102814814B (en) | 2012-07-31 | 2012-07-31 | Kinect-based man-machine interaction method for two-arm robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102814814B (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103112007A (en) * | 2013-02-06 | 2013-05-22 | 华南理工大学 | Human-machine interaction method based on mixing sensor |
CN103279206A (en) * | 2013-06-15 | 2013-09-04 | 苏州时运机器人有限公司 | Robot control system with gesture-sensing teaching machine |
CN103386683A (en) * | 2013-07-31 | 2013-11-13 | 哈尔滨工程大学 | Kinect-based motion sensing-control method for manipulator |
CN103398702A (en) * | 2013-08-05 | 2013-11-20 | 青岛海通机器人系统有限公司 | Mobile-robot remote control apparatus and control technology |
CN103440037A (en) * | 2013-08-21 | 2013-12-11 | 中国人民解放军第二炮兵工程大学 | Real-time interaction virtual human body motion control method based on limited input information |
CN104308844A (en) * | 2014-08-25 | 2015-01-28 | 中国石油大学(华东) | Somatosensory control method of five-finger bionic mechanical arm |
CN104440926A (en) * | 2014-12-09 | 2015-03-25 | 重庆邮电大学 | Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect |
CN104680525A (en) * | 2015-02-12 | 2015-06-03 | 南通大学 | Automatic human body fall-down detection method based on Kinect depth image |
CN104850120A (en) * | 2015-03-19 | 2015-08-19 | 武汉科技大学 | Wheel type mobile robot navigation method based on IHDR self-learning frame |
CN105291138A (en) * | 2015-11-26 | 2016-02-03 | 华南理工大学 | Visual feedback platform improving virtual reality immersion degree |
CN105719279A (en) * | 2016-01-15 | 2016-06-29 | 上海交通大学 | Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method |
CN105904457A (en) * | 2016-05-16 | 2016-08-31 | 西北工业大学 | Heterogeneous redundant mechanical arm control method based on position tracker and data glove |
CN105943163A (en) * | 2016-06-27 | 2016-09-21 | 重庆金山科技(集团)有限公司 | Minimally invasive surgery robot and control device thereof |
CN106644090A (en) * | 2016-12-29 | 2017-05-10 | 中南大学 | Method and system of application deportment testing based on kinect |
CN106774896A (en) * | 2016-12-19 | 2017-05-31 | 吉林大学 | A kind of sitting posture hand assembly line model is worth evaluating system |
CN106971050A (en) * | 2017-04-18 | 2017-07-21 | 华南理工大学 | A kind of Darwin joint of robot Mapping Resolution methods based on Kinect |
CN107030692A (en) * | 2017-03-28 | 2017-08-11 | 浙江大学 | One kind is based on the enhanced manipulator teleoperation method of perception and system |
CN107225573A (en) * | 2017-07-05 | 2017-10-03 | 上海未来伙伴机器人有限公司 | The method of controlling operation and device of robot |
CN107272593A (en) * | 2017-05-23 | 2017-10-20 | 陕西科技大学 | A kind of robot body-sensing programmed method based on Kinect |
CN108638069A (en) * | 2018-05-18 | 2018-10-12 | 南昌大学 | A kind of mechanical arm tail end precise motion control method |
WO2018219194A1 (en) * | 2017-06-02 | 2018-12-06 | 东南大学 | Cyber arm-based teleoperation system for space station robot |
CN109108970A (en) * | 2018-08-22 | 2019-01-01 | 南通大学 | A kind of reciprocating mechanical arm control method based on bone nodal information |
CN109330688A (en) * | 2018-12-10 | 2019-02-15 | 中山市环能缪特斯医疗器械科技有限公司 | Safe self-monitoring endoscopic auxiliary manipulator and its intelligence control system |
CN109584868A (en) * | 2013-05-20 | 2019-04-05 | 英特尔公司 | Natural Human-Computer Interaction for virtual personal assistant system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8123622B1 (en) * | 2011-06-03 | 2012-02-28 | Nyko Technologies, Inc. | Lens accessory for video game sensor device |
CN102470530A (en) * | 2009-11-24 | 2012-05-23 | 株式会社丰田自动织机 | Method of producing teaching data of robot and robot teaching system |
CN202257985U (en) * | 2011-10-27 | 2012-05-30 | 温萍萍 | Image interaction device suitable for autism of children |
- 2012-07-31: CN application CN201210267315.3A filed; granted as patent CN102814814B (status: active)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102470530A (en) * | 2009-11-24 | 2012-05-23 | 株式会社丰田自动织机 | Method of producing teaching data of robot and robot teaching system |
US8123622B1 (en) * | 2011-06-03 | 2012-02-28 | Nyko Technologies, Inc. | Lens accessory for video game sensor device |
CN202257985U (en) * | 2011-10-27 | 2012-05-30 | 温萍萍 | Image interaction device suitable for autism of children |
Non-Patent Citations (3)
Title |
---|
GUANGLONG DU ET AL.: "Markerless Kinect-Based Hand Tracking for Robot Teleoperation", International Journal of Advanced Robotic Systems, vol. 9, no. 36, 13 July 2012 (2012-07-13), pages 3-2 * |
YAO ZHENGWEI: "Research on Several Issues in Large-Scale Real-Time Interactive Augmented Reality Systems", China Doctoral Dissertations Full-Text Database, 15 January 2011 (2011-01-15), page 54 * |
WANG MINGDONG: "Gesture Control of a PC Based on the Kinect Skeleton-Tracking Function", Journal of Zhangzhou Institute of Technology, vol. 14, no. 2, 30 June 2012 (2012-06-30) * |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103112007A (en) * | 2013-02-06 | 2013-05-22 | 华南理工大学 | Human-machine interaction method based on mixing sensor |
CN103112007B (en) * | 2013-02-06 | 2015-10-28 | 华南理工大学 | Based on the man-machine interaction method of hybrid sensor |
US11609631B2 (en) | 2013-05-20 | 2023-03-21 | Intel Corporation | Natural human-computer interaction for virtual personal assistant systems |
CN109584868A (en) * | 2013-05-20 | 2019-04-05 | 英特尔公司 | Natural Human-Computer Interaction for virtual personal assistant system |
CN109584868B (en) * | 2013-05-20 | 2022-12-13 | 英特尔公司 | Natural human-computer interaction for virtual personal assistant system |
CN103279206A (en) * | 2013-06-15 | 2013-09-04 | 苏州时运机器人有限公司 | Robot control system with gesture-sensing teaching machine |
CN103386683A (en) * | 2013-07-31 | 2013-11-13 | 哈尔滨工程大学 | Kinect-based motion sensing-control method for manipulator |
CN103386683B (en) * | 2013-07-31 | 2015-04-08 | 哈尔滨工程大学 | Kinect-based motion sensing-control method for manipulator |
CN103398702B (en) * | 2013-08-05 | 2015-08-19 | 青岛海通机器人系统有限公司 | A kind of mobile robot's remote operation controller and manipulation technology thereof |
CN103398702A (en) * | 2013-08-05 | 2013-11-20 | 青岛海通机器人系统有限公司 | Mobile-robot remote control apparatus and control technology |
CN103440037B (en) * | 2013-08-21 | 2017-02-08 | 中国人民解放军第二炮兵工程大学 | Real-time interaction virtual human body motion control method based on limited input information |
CN103440037A (en) * | 2013-08-21 | 2013-12-11 | 中国人民解放军第二炮兵工程大学 | Real-time interaction virtual human body motion control method based on limited input information |
CN104308844A (en) * | 2014-08-25 | 2015-01-28 | 中国石油大学(华东) | Somatosensory control method of five-finger bionic mechanical arm |
CN104440926A (en) * | 2014-12-09 | 2015-03-25 | 重庆邮电大学 | Mechanical arm somatic sense remote controlling method and mechanical arm somatic sense remote controlling system based on Kinect |
CN104680525B (en) * | 2015-02-12 | 2017-05-10 | 南通大学 | Automatic human body fall-down detection method based on Kinect depth image |
CN104680525A (en) * | 2015-02-12 | 2015-06-03 | 南通大学 | Automatic human body fall-down detection method based on Kinect depth image |
CN104850120A (en) * | 2015-03-19 | 2015-08-19 | 武汉科技大学 | Wheel type mobile robot navigation method based on IHDR self-learning frame |
CN104850120B (en) * | 2015-03-19 | 2017-11-10 | 武汉科技大学 | Wheeled mobile robot navigation method based on the IHDR autonomous learning framework |
CN105291138A (en) * | 2015-11-26 | 2016-02-03 | 华南理工大学 | Visual feedback platform improving virtual reality immersion degree |
CN105719279A (en) * | 2016-01-15 | 2016-06-29 | 上海交通大学 | Elliptic cylinder-based human trunk modeling, arm area segmentation and arm skeleton extraction method |
CN105719279B (en) * | 2016-01-15 | 2018-07-13 | 上海交通大学 | Elliptic-cylinder-based human trunk modeling, arm region segmentation, and arm skeleton extraction method |
CN105904457A (en) * | 2016-05-16 | 2016-08-31 | 西北工业大学 | Heterogeneous redundant mechanical arm control method based on position tracker and data glove |
CN105943163A (en) * | 2016-06-27 | 2016-09-21 | 重庆金山科技(集团)有限公司 | Minimally invasive surgery robot and control device thereof |
CN106774896A (en) * | 2016-12-19 | 2017-05-31 | 吉林大学 | Value evaluation system for a seated manual assembly line model |
CN106774896B (en) * | 2016-12-19 | 2018-03-13 | 吉林大学 | Value evaluation system for a seated manual assembly line model |
CN106644090A (en) * | 2016-12-29 | 2017-05-10 | 中南大学 | Kinect-based deportment testing method and system |
CN107030692A (en) * | 2017-03-28 | 2017-08-11 | 浙江大学 | Perception-enhanced manipulator teleoperation method and system |
CN106971050A (en) * | 2017-04-18 | 2017-07-21 | 华南理工大学 | Kinect-based Darwin robot joint mapping analysis method |
CN106971050B (en) * | 2017-04-18 | 2020-04-28 | 华南理工大学 | Kinect-based Darwin robot joint mapping analysis method |
CN107272593A (en) * | 2017-05-23 | 2017-10-20 | 陕西科技大学 | Kinect-based robot somatosensory programming method |
WO2018219194A1 (en) * | 2017-06-02 | 2018-12-06 | 东南大学 | Cyber arm-based teleoperation system for space station robot |
CN107225573A (en) * | 2017-07-05 | 2017-10-03 | 上海未来伙伴机器人有限公司 | The method of controlling operation and device of robot |
CN108638069A (en) * | 2018-05-18 | 2018-10-12 | 南昌大学 | Precise motion control method for a mechanical arm end effector |
CN109108970A (en) * | 2018-08-22 | 2019-01-01 | 南通大学 | Interactive mechanical arm control method based on skeleton node information |
CN109108970B (en) * | 2018-08-22 | 2021-11-09 | 南通大学 | Interactive mechanical arm control method based on skeleton node information |
CN109330688A (en) * | 2018-12-10 | 2019-02-15 | 中山市环能缪特斯医疗器械科技有限公司 | Safe self-monitoring endoscopic auxiliary manipulator and its intelligent control system |
Also Published As
Publication number | Publication date |
---|---|
CN102814814B (en) | 2015-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102814814A (en) | Kinect-based man-machine interaction method for two-arm robot | |
Krupke et al. | Comparison of multimodal heading and pointing gestures for co-located mixed reality human-robot interaction | |
US20210205986A1 (en) | Teleoperating Of Robots With Tasks By Mapping To Human Operator Pose | |
CN103112007B (en) | Human-machine interaction method based on hybrid sensors | |
Li et al. | Survey on mapping human hand motion to robotic hands for teleoperation | |
CN105677036A (en) | Interactive type data glove | |
Naceri et al. | Towards a virtual reality interface for remote robotic teleoperation | |
WO2019024577A1 (en) | Natural human-computer interaction system based on multi-sensing data fusion | |
CN102830798A (en) | Markerless Kinect-based hand tracking method for a single-arm robot | |
Ha et al. | Whole-body multi-modal semi-autonomous teleoperation of mobile manipulator systems | |
Zubrycki et al. | Using integrated vision systems: three gears and leap motion, to control a 3-finger dexterous gripper | |
Dwivedi et al. | Combining electromyography and fiducial marker based tracking for intuitive telemanipulation with a robot arm hand system | |
Liang et al. | An Augmented Discrete‐Time Approach for Human‐Robot Collaboration | |
Liang et al. | Robot teleoperation system based on mixed reality | |
Tran et al. | A hands-free virtual-reality teleoperation interface for wizard-of-oz control | |
Falck et al. | DE VITO: A dual-arm, high degree-of-freedom, lightweight, inexpensive, passive upper-limb exoskeleton for robot teleoperation | |
Zhou et al. | Embodied robot teleoperation based on high-fidelity visual-haptic simulator: Pipe-fitting example | |
George et al. | Openvr: Teleoperation for manipulation | |
CN108062102A (en) | Gesture-controlled mobile robot teleoperation system with obstacle-avoidance assistance | |
Sharma et al. | Design and implementation of robotic hand control using gesture recognition | |
Liu et al. | Virtual reality based tactile sensing enhancements for bilateral teleoperation system with in-hand manipulation | |
US20220101477A1 (en) | Visual Interface And Communications Techniques For Use With Robots | |
Sylari et al. | Hand gesture-based on-line programming of industrial robot manipulators | |
CN202507280U (en) | Humanoid robot | |
KR20230100101A (en) | Robot control system and method for robot setting and robot control using the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |