CN108274448A - Human-body-interactive robot teaching method and teaching system - Google Patents
- Publication number: CN108274448A (application CN201810094154.XA)
- Authority: CN (China)
- Prior art keywords: teaching, robot, demonstrator, gesture, robot control
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    - B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
      - B25J9/00—Programme-controlled manipulators
        - B25J9/0081—Programme-controlled manipulators with master teach-in means
        - B25J9/16—Programme controls
          - B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
            - B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Abstract
The present invention provides a human-body-interactive robot teaching method, characterised in that: the demonstrator's permission is confirmed and the human-body interaction mode is entered; the teaching mode is entered, and teaching started, by recognizing a gesture of the demonstrator's teaching hand; during teaching, images and human-skeleton information of the demonstrator are continuously acquired and parsed to form a teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system, and the robot control system changes the posture of the robot body in real time according to those instructions; teaching ends, and the teaching trajectory is output, when a gesture of the demonstrator's non-teaching hand is recognized; the teaching trajectory is edited; and a robot control program is generated and edited. The teaching actions of the method are intuitive, its programming efficiency is high, and its safety is good. The invention also provides a human-body-interactive robot teaching system with the same intuitive teaching actions, high programming efficiency, and good safety.
Description
Technical field
The present invention relates to the technical field of industrial robot teaching, and more specifically to a human-body-interactive robot teaching method and teaching system.
Background technology
With the rapid development of science and technology, robots are widely used in industrial manufacturing, production, and related fields to reduce labour costs and improve production efficiency. Most multi-joint industrial robots currently on the market are common six-axis (six-joint) robots: a motion trajectory is programmed and stored through teaching, after which the robot repeats that trajectory in operation. At present, robot teaching mainly takes two forms, teach-pendant teaching and drag (lead-through) teaching; each has its strengths, but both have shortcomings.
In teach-pendant teaching, the robot's motion is expressed as source code, so the teaching actions are not intuitive; going from input to a working program requires lengthy coding and calibration, so programming efficiency is low. Drag teaching places certain demands on space and environment: where space restricts the robot's range of motion it cannot be carried out, and in environments that must remain unmanned for safety reasons it cannot be used at all. Moreover, drag teaching records target-point positions and trajectories by having the demonstrator physically drag the robot; if the robot's actuators, sensors, or drive mechanisms fail while the demonstrator is in contact with it, the demonstrator is at risk, so drag teaching carries a safety hazard. Neither existing teaching mode is therefore ideal, and there is an urgent need for a robot teaching method and teaching system whose teaching actions are intuitive, whose programming efficiency is high, and which is safe.
Summary of the invention
To overcome the shortcomings and deficiencies of the prior art, one object of the present invention is to provide a human-body-interactive robot teaching method whose teaching actions are intuitive, whose programming efficiency is high, and which is safe. Another object of the present invention is to provide a human-body-interactive robot teaching system with the same advantages.
To achieve the above objects, the technical scheme of the invention is a human-body-interactive robot teaching method characterised in that: basic robot information is set; the demonstrator's permission is confirmed and the human-body interaction mode is entered; the teaching mode is entered, and teaching started, by recognizing a gesture of the demonstrator's teaching hand; during teaching, images and skeleton information of the demonstrator are continuously acquired and parsed to obtain the posture and motion of the demonstrator's teaching hand, forming a teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to those instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; teaching ends, and the teaching trajectory is output, when a gesture of the demonstrator's non-teaching hand is recognized; the teaching trajectory is edited; and a robot control program is generated and edited.
By acquiring and parsing the demonstrator's images and skeleton information and processing them into a robot control program, the method achieves teaching with high programming efficiency; the demonstrator never needs to touch the robot, which eliminates the risk of injury from a robot fault and gives good safety. During teaching, the demonstrator's posture and motion are mapped onto the robot body in real time, so the robot follows the demonstrator's posture changes with corresponding posture changes of its own: the teaching actions are intuitive, and the demonstrator can check at any time whether they meet requirements and adjust accordingly, further improving programming efficiency.
Preferably, the method described above (setting the basic robot information; confirming the demonstrator's permission; starting, performing, and ending teaching by gesture; editing the teaching trajectory; and generating and editing the robot control program) comprises the following steps:
S1: the basic robot information in the robot control system is set through the human-computer interaction interface;
S2: demonstrator permission verification is carried out: if the demonstrator is authenticated as an authorized person, the human-body interaction mode is entered; otherwise the teaching system raises an alarm and records the operation in a history log;
S3: the binocular camera acquires images and human-skeleton data of the demonstrator and sends them to the host-computer core algorithm module; the host-computer core algorithm module parses the gesture of the demonstrator's teaching hand and judges it: if the gesture is the teaching-start gesture, the teaching mode is entered;
S4: during teaching, the binocular camera continuously acquires images and skeleton information of the demonstrator and sends them to the host-computer core algorithm module; the module parses them in real time to obtain the posture and motion of the demonstrator's teaching hand and forms the teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to the instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; the module also judges the gesture of the demonstrator's non-teaching hand: if it is the teaching-finish gesture, teaching ends and the module outputs the teaching trajectory to the host-computer human-computer interaction interface;
S5: the teaching trajectory is edited; a robot control program is generated and edited.
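The S1-S5 flow above is essentially a small state machine driven by one gesture of each hand. Below is a minimal Python sketch under stated assumptions: the gesture labels, the `TeachingSession` class, and the pose representation are all illustrative and not taken from the patent.

```python
# Hypothetical sketch of the S2-S4 teaching flow as a state machine.
# Gesture labels, class name, and pose format are illustrative assumptions.

class TeachingSession:
    START_GESTURE = "fist_push_forward"   # bound to "start teaching" (teaching hand)
    FINISH_GESTURE = "fist_pull_back"     # bound to "finish teaching" (non-teaching hand)

    def __init__(self):
        self.state = "idle"
        self.trajectory = []              # recorded teaching track

    def on_frame(self, teaching_gesture, non_teaching_gesture, hand_pose):
        """Process one camera frame's parsed gestures and hand pose."""
        if self.state == "idle" and teaching_gesture == self.START_GESTURE:
            self.state = "teaching"
        elif self.state == "teaching":
            if non_teaching_gesture == self.FINISH_GESTURE:
                self.state = "done"       # teaching ends; trajectory is output
            else:
                self.trajectory.append(hand_pose)  # pose parsed from skeleton data
        return self.state
```

Each camera frame feeds one call to `on_frame`; only frames seen while in the `teaching` state contribute to the recorded trajectory.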
Preferably, in step S1, setting the basic robot information means: setting the Ethernet communication address of the robot control system and the basic information of the robot body, designating the teaching hand and the non-teaching hand, importing the demonstrator permission-classification data, and binding each gesture to its corresponding interaction.
Preferably, of the two gestures, pushing a clenched fist forward and pulling a clenched fist back, one is bound as the teaching-start gesture and the other as the teaching-finish gesture. The two gestures are easy to recognize, which improves the accuracy of the teaching method.
Preferably, in step S2, verifying the demonstrator's permission means: the binocular camera acquires images and skeleton information of the demonstrator and sends them to the host-computer core algorithm module, which verifies the demonstrator's permission using face recognition and body-feature recognition. The method thus has a permission-recognition function, implementing permission classification of demonstrators by combining face recognition with body-feature recognition.
Preferably, in step S3, when the host-computer core algorithm module parses and judges the gesture of the demonstrator's teaching hand, if the gesture is not the teaching-start gesture the module checks whether it is a robot-state control gesture: if so, the gesture is translated into the corresponding robot control instruction and sent to the robot control system, to control start/stop and acceleration/deceleration state switching of the robot body.
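Translating a state-control gesture into a robot control instruction, as described above, can be sketched as a simple lookup. The gesture names and instruction strings below are hypothetical; the patent does not enumerate them.

```python
# Hypothetical mapping from recognized state-control gestures to robot
# control instructions (step S3). Gesture and instruction names are assumptions.

STATE_GESTURES = {
    "open_palm": "STOP",
    "thumbs_up": "START",
    "swipe_up": "ACCELERATE",
    "swipe_down": "DECELERATE",
}

def translate_gesture(gesture, teaching_start_gesture="fist_push_forward"):
    """Return the control instruction for a gesture, or None if unrecognized."""
    if gesture == teaching_start_gesture:
        return "ENTER_TEACHING_MODE"
    return STATE_GESTURES.get(gesture)
```

A dictionary keeps the gesture-to-instruction binding data-driven, matching the patent's notion of importing gesture bindings in step S1.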
Preferably, in step S5, the teaching trajectory is edited, logic judgments and IO-device control are inserted into it, and a robot control program is generated; the robot control program is then edited, with logic-judgment and IO-device control code inserted into it.
Preferably, in step S5, editing the teaching trajectory means adding to, deleting from, modifying, accelerating, decelerating, or repeating the trajectory. The trajectory can thus be adjusted flexibly: segments can be extracted and processed, multiple teaching trajectories can be merged, and edits such as acceleration and deceleration can be applied. The method has good expansibility and compatibility, and teaching results can be converted into robot programs and point information compatible with other teaching technologies.
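The editing operations listed above (accelerate, decelerate, repeat, and so on) are easy to picture on a trajectory stored as timestamped waypoints. The sketch below is illustrative only; the waypoint representation and function names are assumptions, not the patent's.

```python
# Hypothetical trajectory-editing helpers (step S5). A trajectory is a list of
# (timestamp, pose) waypoints; all names here are illustrative assumptions.

def accelerate(trajectory, factor):
    """Time-scale a trajectory: factor > 1 plays it back faster."""
    return [(t / factor, pose) for t, pose in trajectory]

def repeat(trajectory, times):
    """Concatenate the trajectory with itself, shifting timestamps each pass."""
    if not trajectory:
        return []
    duration = trajectory[-1][0]
    out = []
    for i in range(times):
        out.extend((t + i * duration, pose) for t, pose in trajectory)
    return out
```

Deceleration is the same time-scaling with `factor < 1`, and segment extraction is ordinary list slicing on the waypoint list.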
A human-body-interactive robot teaching system, characterised by comprising:
a robot body with six axis joints;
a binocular camera for acquiring and sending images and skeleton information of the demonstrator;
a host-computer core algorithm module for receiving the images and skeleton information sent by the binocular camera and parsing them into the demonstrator's gestures, posture, and motion; for judging gestures; for forming the teaching trajectory, mapping it via a mapping algorithm into a robot-body motion trajectory, resolving that trajectory into robot control instructions, and sending them; and for editing the teaching trajectory and generating and editing the robot control program;
a robot control system for receiving the robot control instructions sent by the host-computer core algorithm module and changing the posture of the robot body in real time;
a host-computer human-computer interaction interface for setting the basic robot information in the robot control system and for displaying the teaching trajectory.
By acquiring and parsing the demonstrator's images and skeleton information and processing them into a robot control program, the system achieves teaching with high programming efficiency; the demonstrator never needs to touch the robot, which eliminates the risk of injury from a robot fault and gives good safety. During teaching, the demonstrator's posture and motion are mapped onto the robot body in real time, so the robot follows the demonstrator's posture changes with corresponding posture changes of its own: the teaching actions are intuitive, and the demonstrator can check at any time whether they meet requirements and adjust accordingly, further improving programming efficiency.
Preferably, the host-computer human-computer interaction interface and the host-computer core algorithm module communicate with the robot control system via Ethernet.
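The Ethernet link between the host-computer modules and the robot control system could, for example, carry newline-delimited messages over TCP. The wire format below (JSON) is purely an assumption for illustration; the patent does not specify one.

```python
# Hypothetical serialization of a robot control instruction for the Ethernet
# link between the host computer and the robot control system. JSON over TCP
# is an assumption; the patent only states that Ethernet is used.
import json
import socket

def encode_instruction(joint_angles):
    """Pack a six-joint target into a newline-delimited JSON message."""
    msg = {"cmd": "MOVE", "joints": list(joint_angles)}
    return (json.dumps(msg) + "\n").encode("utf-8")

def send_instruction(addr, joint_angles):
    """Open a TCP connection to the controller and send one instruction."""
    with socket.create_connection(addr, timeout=1.0) as sock:
        sock.sendall(encode_instruction(joint_angles))
```

During real-time teaching a persistent connection would normally be kept open rather than reconnecting per instruction; `send_instruction` is kept minimal for clarity.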
Compared with the prior art, the invention has the following advantages and beneficial effects:
1. The teaching method has high programming efficiency and good safety; its teaching actions are intuitive, and the demonstrator can check at any time whether they meet requirements and adjust accordingly, further improving programming efficiency.
2. The teaching method has a permission-recognition function, implementing permission classification of demonstrators by combining face recognition with body-feature recognition.
3. The teaching method has good expansibility and compatibility: teaching results can be converted into robot programs and point information compatible with other teaching technologies.
4. The teaching system likewise has high programming efficiency, good safety, and intuitive teaching actions, and the demonstrator can check and adjust the teaching actions at any time.
Description of the drawings
Fig. 1 is a flow chart of the teaching method of the present invention;
Fig. 2 is an application-scenario diagram of the teaching method of the present invention.
Specific embodiments
The present invention is described in further detail below in conjunction with the accompanying drawings and specific embodiments.
Embodiment one
As shown in Fig. 1, this embodiment provides a human-body-interactive robot teaching method: basic robot information is set; the demonstrator's permission is confirmed and the human-body interaction mode is entered; the teaching mode is entered, and teaching started, by recognizing a gesture of the demonstrator's teaching hand; during teaching, images and skeleton information of the demonstrator are continuously acquired and parsed to obtain the posture and motion of the demonstrator's teaching hand, forming a teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to those instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; teaching ends, and the teaching trajectory is output, when a gesture of the demonstrator's non-teaching hand is recognized; the teaching trajectory is edited; and a robot control program is generated and edited.
Specifically, the method includes the following steps:
S1: the basic robot information in the robot control system is set through the human-computer interaction interface;
S2: demonstrator permission verification is carried out: if the demonstrator is authenticated as an authorized person, the human-body interaction mode is entered; otherwise the teaching system raises an alarm and records the operation in a history log;
S3: the binocular camera acquires images and human-skeleton data of the demonstrator and sends them to the host-computer core algorithm module; the host-computer core algorithm module parses the gesture of the demonstrator's teaching hand and judges it: if the gesture is the teaching-start gesture, the teaching mode is entered;
S4: during teaching, the binocular camera continuously acquires images and skeleton information of the demonstrator and sends them to the host-computer core algorithm module; the module parses them in real time to obtain the posture and motion of the demonstrator's teaching hand and forms the teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to the instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; the module also judges the gesture of the demonstrator's non-teaching hand: if it is the teaching-finish gesture, teaching ends and the module outputs the teaching trajectory to the host-computer human-computer interaction interface;
S5: the teaching trajectory is edited; a robot control program is generated and edited.
By acquiring and parsing the demonstrator's images and skeleton information and processing them into a robot control program, the method achieves teaching with high programming efficiency; the demonstrator never needs to touch the robot, which eliminates the risk of injury from a robot fault and gives good safety. During teaching, the demonstrator's posture and motion are mapped onto the robot body in real time, so the robot follows the demonstrator's posture changes with corresponding posture changes of its own: the teaching actions are intuitive, and the demonstrator can check at any time whether they meet requirements and adjust accordingly, further improving programming efficiency.
Existing algorithms may be used to parse images and skeleton information into gestures; to parse images and skeleton information into hand posture and motion and form the teaching trajectory; and to resolve the robot-body motion trajectory into robot control instructions.
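The patent leaves the teaching-trajectory mapping to existing algorithms. The simplest conceivable instance is a uniform scale plus offset from the demonstrator's hand workspace into the robot's workspace; the sketch below is that illustrative linear map, not the patent's algorithm, and the default scale and offset values are arbitrary assumptions.

```python
# Minimal sketch of one possible teaching-track mapping: a uniform scale plus
# offset from the demonstrator's hand workspace into the robot workspace.
# The patent only requires "a mapping algorithm"; this linear map and its
# default parameters are illustrative assumptions.

def map_point(p, scale, offset):
    """Map one 3-D point component-wise: scaled, then translated."""
    return tuple(scale * c + o for c, o in zip(p, offset))

def map_trajectory(teaching_track, scale=0.5, offset=(0.25, 0.0, 0.5)):
    """Map each hand position (camera frame) to a robot-base-frame point."""
    return [map_point(p, scale, offset) for p in teaching_track]
```

A production mapping would also handle orientation and the robot's inverse kinematics; this sketch covers only the positional part of the idea.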
Preferably, in step S1, setting the basic robot information means: setting the Ethernet communication address of the robot control system and the basic information of the robot body, designating the teaching hand and the non-teaching hand, importing the demonstrator permission-classification data, and binding each gesture to its corresponding interaction.
Preferably, of the two gestures, pushing a clenched fist forward and pulling a clenched fist back, one is bound as the teaching-start gesture and the other as the teaching-finish gesture. The two gestures are easy to recognize, which improves the accuracy of the teaching method.
In step S2, verifying the demonstrator's permission means: the binocular camera acquires images and skeleton information of the demonstrator and sends them to the host-computer core algorithm module, which verifies the demonstrator's permission using face recognition and body-feature recognition. The method thus has a permission-recognition function, combining face recognition and body-feature recognition to classify demonstrators' permissions. Existing face-recognition and body-feature-recognition techniques may be used.
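One common way to implement such a permission check is to compare a face embedding against enrolled references by distance. The sketch below assumes a hypothetical embedding model, enrollment table, and threshold; the patent itself defers to existing recognition techniques.

```python
# Hypothetical permission check (step S2): compare a face embedding against
# enrolled demonstrators by Euclidean distance. The embedding vectors, the
# permission-level table, and the threshold are illustrative assumptions.
import math

ENROLLED = {
    # demonstrator id -> (reference embedding, permission level)
    "alice": ((0.1, 0.9, 0.3), "full"),
    "bob": ((0.8, 0.2, 0.5), "view_only"),
}

def verify_demonstrator(embedding, threshold=0.3):
    """Return (id, permission) of the closest enrolled match, or None."""
    best = None
    for name, (ref, level) in ENROLLED.items():
        dist = math.dist(embedding, ref)
        if dist <= threshold and (best is None or dist < best[0]):
            best = (dist, name, level)
    return (best[1], best[2]) if best else None
```

A `None` result corresponds to the patent's alarm-and-log branch in S2; a match yields the permission level used for classification.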
In step S3, when the host-computer core algorithm module parses and judges the gesture of the demonstrator's teaching hand, if the gesture is not the teaching-start gesture the module checks whether it is a robot-state control gesture: if so, the gesture is translated into the corresponding robot control instruction and sent to the robot control system, to control start/stop and acceleration/deceleration state switching of the robot body.
In step S5, the teaching trajectory is edited, logic judgments and IO-device control are inserted into it, and a robot control program is generated; the robot control program is then edited, with logic-judgment and IO-device control code inserted into it.
In step S5, editing the teaching trajectory means adding to, deleting from, modifying, accelerating, decelerating, or repeating the trajectory. The trajectory can thus be adjusted flexibly: segments can be extracted and processed, multiple teaching trajectories can be merged, and edits such as acceleration and deceleration can be applied. The method has good expansibility and compatibility, and teaching results can be converted into robot programs and point information compatible with other teaching technologies.
Embodiment two
This embodiment provides a human-body-interactive robot teaching system, characterised by comprising:
a robot body with six axis joints;
a binocular camera for acquiring and sending images and skeleton information of the demonstrator;
a host-computer core algorithm module for receiving the images and skeleton information sent by the binocular camera and parsing them into the demonstrator's gestures, posture, and motion; for judging gestures; for forming the teaching trajectory, mapping it via a mapping algorithm into a robot-body motion trajectory, resolving that trajectory into robot control instructions, and sending them; and for editing the teaching trajectory and generating and editing the robot control program;
a robot control system for receiving the robot control instructions sent by the host-computer core algorithm module and changing the posture of the robot body in real time;
a host-computer human-computer interaction interface for setting the basic robot information in the robot control system and for displaying the teaching trajectory.
By acquiring and parsing the demonstrator's images and skeleton information and processing them into a robot control program, the system achieves teaching with high programming efficiency; the demonstrator never needs to touch the robot, which eliminates the risk of injury from a robot fault and gives good safety. During teaching, the demonstrator's posture and motion are mapped onto the robot body in real time, so the robot follows the demonstrator's posture changes with corresponding posture changes of its own: the teaching actions are intuitive, and the demonstrator can check at any time whether they meet requirements and adjust accordingly, further improving programming efficiency.
The host-computer human-computer interaction interface and the host-computer core algorithm module communicate with the robot control system via Ethernet.
The above embodiments are preferred embodiments of the present invention, but embodiments of the invention are not limited by them; any change, modification, substitution, combination, or simplification made without departing from the spirit and principles of the invention shall be regarded as an equivalent replacement and is included within the scope of protection of the invention.
Claims (10)
1. A human-body-interactive robot teaching method, characterised in that: basic robot information is set; the demonstrator's permission is confirmed and the human-body interaction mode is entered; the teaching mode is entered, and teaching started, by recognizing a gesture of the demonstrator's teaching hand; during teaching, images and skeleton information of the demonstrator are continuously acquired and parsed to obtain the posture and motion of the demonstrator's teaching hand, forming a teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to those instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; teaching ends, and the teaching trajectory is output, when a gesture of the demonstrator's non-teaching hand is recognized; the teaching trajectory is edited; and a robot control program is generated and edited.
2. The human-body-interactive robot teaching method according to claim 1, characterised in that said setting of basic robot information, confirming the demonstrator's permission, entering the teaching mode by recognizing a gesture of the teaching hand, forming and mapping the teaching trajectory, ending teaching by recognizing a gesture of the non-teaching hand, editing the teaching trajectory, and generating and editing the robot control program comprise the following steps:
S1: the basic robot information in the robot control system is set through the human-computer interaction interface;
S2: demonstrator permission verification is carried out: if the demonstrator is authenticated as an authorized person, the human-body interaction mode is entered; otherwise the teaching system raises an alarm and records the operation in a history log;
S3: the binocular camera acquires images and human-skeleton data of the demonstrator and sends them to the host-computer core algorithm module; the host-computer core algorithm module parses the gesture of the demonstrator's teaching hand and judges it: if the gesture is the teaching-start gesture, the teaching mode is entered;
S4: during teaching, the binocular camera continuously acquires images and skeleton information of the demonstrator and sends them to the host-computer core algorithm module; the module parses them in real time to obtain the posture and motion of the demonstrator's teaching hand and forms the teaching trajectory; a mapping algorithm maps the teaching trajectory into a robot-body motion trajectory, which is resolved into robot control instructions and sent to the robot control system; the robot control system changes the posture of the robot body in real time according to the instructions, so that the demonstrator's posture and motion are mapped onto the robot body in real time; the module also judges the gesture of the demonstrator's non-teaching hand: if it is the teaching-finish gesture, teaching ends and the module outputs the teaching trajectory to the host-computer human-computer interaction interface;
S5: the teaching trajectory is edited; a robot control program is generated and edited.
3. The human-body-interactive robot teaching method according to claim 2, characterised in that in step S1, setting the basic robot information means: setting the Ethernet communication address of the robot control system and the basic information of the robot body, designating the teaching hand and the non-teaching hand, importing the demonstrator permission-classification data, and binding each gesture to its corresponding interaction.
4. The human-body-interactive robot teaching method according to claim 3, characterised in that of the two gestures, pushing a clenched fist forward and pulling a clenched fist back, one is bound as the teaching-start gesture and the other as the teaching-finish gesture.
5. The human-body-interaction robot teaching method according to claim 2, characterized in that: in step S2, performing demonstrator authority verification means that the binocular camera captures the image and skeleton information of the demonstrator and sends them to the host-computer core algorithm module, and the host-computer core algorithm module performs demonstrator authority verification using face recognition and human-body-feature recognition techniques.
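The authority check of claim 5 could reduce to a nearest-neighbour match of a face/body feature vector against enrolled demonstrators, as in this sketch; the Euclidean distance, the threshold, and the enrolled data are stand-ins for a real face-recognition pipeline.

```python
import math

# Hypothetical enrolled feature vectors from the authority classification data.
ENROLLED = {"alice": [0.1, 0.8, 0.3]}
THRESHOLD = 0.25   # maximum feature distance accepted as a match (assumed)

def verify_demonstrator(features):
    """Return the matched demonstrator name, or None if verification fails."""
    best, best_d = None, float("inf")
    for name, ref in ENROLLED.items():
        d = math.dist(features, ref)   # Euclidean distance between vectors
        if d < best_d:
            best, best_d = name, d
    return best if best_d <= THRESHOLD else None

user = verify_demonstrator([0.12, 0.79, 0.31])
```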
6. The human-body-interaction robot teaching method according to claim 2, characterized in that: in step S3, the host-computer core algorithm module parses and judges the gesture information of the demonstrator's teaching hand; if the gesture of the demonstrator's teaching hand is not the teaching-start gesture, it judges whether that gesture is a robot-state control gesture: if so, the gesture of the demonstrator's teaching hand is translated into the corresponding robot control instruction and sent to the robot control system, so as to control the start/stop and acceleration/deceleration state switching of the robot body.
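The branch in claim 6 could be sketched as a lookup: a teaching-hand gesture that is not the teaching-start gesture may still map to a state-control instruction. The gesture names and instruction strings are illustrative assumptions.

```python
# Assumed bindings from state-control gestures to control instructions.
STATE_GESTURES = {
    "open_palm": "STOP",
    "thumbs_up": "START",
    "two_fingers_up": "SPEED_UP",
    "two_fingers_down": "SLOW_DOWN",
}

def translate_gesture(gesture, teach_start_gesture="fist_push_forward"):
    """Return (is_teach_start, control_instruction_or_None) for a gesture."""
    if gesture == teach_start_gesture:
        return True, None            # enter teaching mode instead of sending
    return False, STATE_GESTURES.get(gesture)
```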
7. The human-body-interaction robot teaching method according to claim 2, characterized in that: in step S5, the teaching trajectory is edited, logic judgments and I/O device control are inserted into the teaching trajectory, and the robot control program is generated; the robot control program is edited, and logic-judgment and I/O-device-control routines are inserted into the robot control program.
8. The human-body-interaction robot teaching method according to claim 7, characterized in that: in step S5, editing the teaching trajectory refers to adding to, deleting, modifying, accelerating, decelerating, or repeating the teaching trajectory.
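The editing operations named in claims 7 and 8 could be sketched over a teaching trajectory represented as timestamped points and a control program represented as instruction strings; the representations and names are assumptions for illustration.

```python
def accelerate(traj, factor=2.0):
    """Scale timestamps down so the same path is replayed faster."""
    return [(t / factor, p) for t, p in traj]

def repeat(traj, times=2):
    """Append time-shifted copies of the trajectory to replay it repeatedly."""
    if not traj:
        return []
    span = traj[-1][0]
    out = list(traj)
    for k in range(1, times):
        out += [(t + k * span, p) for t, p in traj]
    return out

def insert_io_control(program, index, line="SET DO1 ON"):
    """Insert an I/O-device control line into the generated program."""
    return program[:index] + [line] + program[index:]

traj = [(0.0, (0, 0, 0)), (1.0, (1, 0, 0))]
fast = accelerate(traj)
looped = repeat(traj, times=2)
prog = insert_io_control(["MOVE A", "MOVE B"], 1)
```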
9. A human-body-interaction robot teaching system, characterized by comprising:
a robot body having six axis joints;
a binocular camera for capturing and sending the image and skeleton information of the demonstrator;
a host-computer core algorithm module for receiving the image and skeleton information of the demonstrator sent by the binocular camera, parsing them to obtain the gesture, posture, and movement information of the demonstrator, performing gesture judgment, forming the teaching trajectory, mapping the teaching trajectory into a robot body motion trajectory by a mapping algorithm, and resolving the robot body motion trajectory into robot control instructions and sending them; and for editing the teaching trajectory and generating and editing the robot control program;
a robot control system for receiving the robot control instructions sent by the host-computer core algorithm module and changing the posture of the robot body in real time;
a host-computer human-machine interaction interface for setting the robot basic information in the robot control system and for displaying the teaching trajectory.
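How the claim-9 components might be wired into a processing loop is sketched below with minimal stand-in classes for the camera, the core algorithm module, and the control system; nothing here is taken from the patent's implementation.

```python
class BinocularCamera:
    """Stand-in camera that yields pre-recorded skeleton frames."""
    def __init__(self, frames):
        self._frames = list(frames)
    def capture(self):
        return self._frames.pop(0) if self._frames else None

class RobotControlSystem:
    """Stand-in controller that records the instructions it receives."""
    def __init__(self):
        self.executed = []
    def execute(self, instruction):
        self.executed.append(instruction)   # would change robot posture in real time

def core_algorithm_loop(camera, controller):
    """Pull frames, turn each tracked hand point into an instruction, send it."""
    while (frame := camera.capture()) is not None:
        x, y, z = frame["hand"]
        controller.execute(f"MOVE {x} {y} {z}")

cam = BinocularCamera([{"hand": (1, 2, 3)}, {"hand": (4, 5, 6)}])
ctl = RobotControlSystem()
core_algorithm_loop(cam, ctl)
```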
10. The human-body-interaction robot teaching system according to claim 9, characterized in that: the host-computer human-machine interaction interface and the host-computer core algorithm module communicate with the robot control system through Ethernet.
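The Ethernet link of claim 10 could be exercised with ordinary stream sockets; this sketch uses a local socket pair in place of the real host-to-controller TCP connection, and the newline-delimited instruction framing is an assumption.

```python
import socket

def send_instruction(sock, instruction):
    """Frame one control instruction as a newline-terminated line."""
    sock.sendall(instruction.encode("utf-8") + b"\n")

# A local socket pair stands in for the Ethernet TCP link between the
# host computer and the robot control system.
host_side, robot_side = socket.socketpair()
send_instruction(host_side, "MOVE 0.300 -0.100 0.300")
with robot_side.makefile("rb") as f:
    received = f.readline().decode("utf-8").rstrip("\n")
host_side.close()
robot_side.close()
```

In a deployed system `host_side` would instead be a `socket.create_connection((controller_ip, port))` to the controller's Ethernet address configured in step S1.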
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810094154.XA CN108274448A (en) | 2018-01-31 | 2018-01-31 | A kind of the robot teaching method and teaching system of human body interaction |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108274448A true CN108274448A (en) | 2018-07-13 |
Family
ID=62807197
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810094154.XA Pending CN108274448A (en) | 2018-01-31 | 2018-01-31 | A kind of the robot teaching method and teaching system of human body interaction |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108274448A (en) |
- 2018-01-31: CN application CN201810094154.XA filed; published as CN108274448A (status: Pending)
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60205720A (en) * | 1984-03-30 | 1985-10-17 | Matsushita Electric Ind Co Ltd | Robot operation teaching device |
US20160303737A1 (en) * | 2015-04-15 | 2016-10-20 | Abb Technology Ltd. | Method and apparatus for robot path teaching |
CN105328701A (en) * | 2015-11-12 | 2016-02-17 | 东北大学 | Teaching programming method for series mechanical arms |
CN105500370A (en) * | 2015-12-21 | 2016-04-20 | 华中科技大学 | Robot offline teaching programming system and method based on somatosensory technology |
CN105677031A (en) * | 2016-01-04 | 2016-06-15 | 广州华欣电子科技有限公司 | Control method and device based on gesture track recognition |
CN106142092A (en) * | 2016-07-26 | 2016-11-23 | 张扬 | A kind of method robot being carried out teaching based on stereovision technique |
CN107272593A (en) * | 2017-05-23 | 2017-10-20 | 陕西科技大学 | A kind of robot body-sensing programmed method based on Kinect |
Non-Patent Citations (1)
Title |
---|
张新星 (Zhang Xinxing): "《工业机器人应用基础》" [Fundamentals of Industrial Robot Applications], 30 June 2017 *
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109483517A (en) * | 2018-10-22 | 2019-03-19 | 天津扬天科技有限公司 | A kind of cooperation robot teaching method based on the tracking of hand appearance |
CN109318232A (en) * | 2018-10-22 | 2019-02-12 | 佛山智能装备技术研究院 | A kind of polynary sensory perceptual system of industrial robot |
CN109807870A (en) * | 2019-03-20 | 2019-05-28 | 昆山艾派科技有限公司 | Robot demonstrator |
CN110347243A (en) * | 2019-05-30 | 2019-10-18 | 深圳乐行天下科技有限公司 | A kind of working method and robot of robot |
CN110141498A (en) * | 2019-06-04 | 2019-08-20 | 辰耀智能装备(厦门)有限公司 | A kind of moxa-moxibustion smart collaboration robot and its operating method |
CN110146044B (en) * | 2019-06-14 | 2021-12-28 | 上海航天设备制造总厂有限公司 | TCP precision measurement and calibration method |
CN110146044A (en) * | 2019-06-14 | 2019-08-20 | 上海航天设备制造总厂有限公司 | A kind of TCP precision measure and calibration method |
CN110253583A (en) * | 2019-07-02 | 2019-09-20 | 北京科技大学 | The human body attitude robot teaching method and device of video is taken based on wearing teaching |
CN110788860A (en) * | 2019-11-11 | 2020-02-14 | 路邦科技授权有限公司 | Bionic robot action control method based on voice control |
CN111002289A (en) * | 2019-11-25 | 2020-04-14 | 华中科技大学 | Robot online teaching method and device, terminal device and storage medium |
CN111002289B (en) * | 2019-11-25 | 2021-08-17 | 华中科技大学 | Robot online teaching method and device, terminal device and storage medium |
CN111300402A (en) * | 2019-11-26 | 2020-06-19 | 爱菲力斯(深圳)科技有限公司 | Robot control method based on gesture recognition |
CN111203854B (en) * | 2019-12-27 | 2021-05-25 | 深圳市越疆科技有限公司 | Robot track reproduction method, control device, equipment and readable storage medium |
CN111203854A (en) * | 2019-12-27 | 2020-05-29 | 深圳市越疆科技有限公司 | Robot track reproduction method, control device, equipment and readable storage medium |
CN112936282A (en) * | 2021-03-08 | 2021-06-11 | 常州刘国钧高等职业技术学校 | Method and system for improving motion sensing control accuracy of industrial robot |
CN112936282B (en) * | 2021-03-08 | 2022-01-07 | 常州刘国钧高等职业技术学校 | Method and system for improving motion sensing control accuracy of industrial robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108274448A (en) | A kind of the robot teaching method and teaching system of human body interaction | |
Yu et al. | Deep learning-based traffic safety solution for a mixture of autonomous and manual vehicles in a 5G-enabled intelligent transportation system | |
CN102581850B (en) | GSK-Link bus based modular robot control device and control method | |
CN108921200A (en) | Method, apparatus, equipment and medium for classifying to Driving Scene data | |
CN103456203B (en) | A kind of portable entity programmed method and system | |
CN102245356B (en) | For optimizing the method and system of the parameter of the robot for assembling in production | |
CN104008465B (en) | Grid switching operation bill safety implemented systems | |
CN106528744A (en) | Format conversion method and system | |
CN108306804A (en) | A kind of Ethercat main station controllers and its communication means and system | |
CN112613534B (en) | Multi-mode information processing and interaction system | |
Wu et al. | Scenario-based modeling of the on-board of a satellite-based train control system with colored petri nets | |
CN110428702A (en) | Building block system programing system based on resistance classification | |
Viethen et al. | Graphs and spatial relations in the generation of referring expressions | |
CN106445153A (en) | Man-machine interaction method and device for intelligent robot | |
KR101811395B1 (en) | Method and apparatus for associating information with each other for controlling traffic light and tram light at a crossroads | |
CN113561173B (en) | Coding, decoding and track planning method of motion controller based on WINCE platform | |
US11897134B2 (en) | Configuring a simulator for robotic machine learning | |
US20220048191A1 (en) | Robotic activity decomposition | |
CN114783560A (en) | Recovered line system of wisdom | |
KR102251867B1 (en) | System and method for providing coding education using star topology in Internet of things | |
Wu et al. | Comparing LLMs for Prompt-Enhanced ACT-R and Soar Model Development: A Case Study in Cognitive Simulation | |
Kanis | Interactive HamNoSys notation editor for signed speech annotation | |
Saod et al. | Speech-controlled vehicle for manufacturing operation | |
CN107256181A (en) | A kind of service bus platform for accessing polymorphic type groupware | |
Yan | Next Generation Wireless Sensor Network Based Japanese Remote interactive practical teaching platform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20180713 |