CN104827474B - Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator - Google Patents

Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator

Info

Publication number
CN104827474B
CN104827474B (application CN201510221175.XA, published as CN104827474A)
Authority
CN
China
Prior art keywords
robot
feature point
tool
first feature
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510221175.XA
Other languages
Chinese (zh)
Other versions
CN104827474A (en)
Inventor
刘永
顾伟国
闫瑾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CN201510221175.XA priority Critical patent/CN104827474B/en
Publication of CN104827474A publication Critical patent/CN104827474A/en
Application granted granted Critical
Publication of CN104827474B publication Critical patent/CN104827474B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Numerical Control (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides a virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator, the robot end being provided with a work tool. The device comprises: a teaching tool that is identical to the work tool mounted on the robot end and is provided with first feature points; a camera unit that captures and records, in time order, the trajectories of the feature points on the teaching tool, the joint angle information of the teaching tool, and the placement of the work targets; and a robot controller that generates the robot work program. The work tool is provided with second feature points corresponding to the first feature points on the teaching tool.

Description

Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator
Technical field
The present invention relates to intelligent programming technology for industrial robots, and in particular to a virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator.
Background technology
With the development of science and technology, more and more industrial robots are being widely used in various industrial fields, such as robot spray painting, robot material handling, and robot assembly. In these fields, industrial robots can now replace humans in completing repetitive, precise operations, thereby ensuring product quality. When a robot is used to perform a task in place of a human, instructions must be given to the robot in advance, specifying the actions to be performed and the details of the task. Along with the development of industrial robots, robot programming technology has also developed and matured, and robot programming has become an important part of robotics; apart from the support of the robot hardware, a considerable part of a robot's functionality relies on the robot programming language.
There are many methods of robot programming; by location they are divided into online programming and offline programming:
In online programming, the robot motion is controlled through the robot's hand-held teach pendant. Online teaching consists of moving the tool mounted on the robot end to its working position and recording the robot coordinates at that position; the robot then moves automatically along the trajectory recorded during teaching and completes the specified task. The accuracy of the teaching is determined entirely by the demonstrator's experience and visual estimation; such hands-on teaching may fail to achieve the desired result and may even injure the teaching personnel, which is dangerous.
Offline programming is more accurate. The robot is not needed during programming and can perform other work in the meantime; the program can be verified by simulation, the operation plan and cycle time can be optimized in advance, and previously completed processes or subroutines can be incorporated into the program being written, which avoids injuring operators. Compared with teach-and-playback programming, however, an offline programming system places higher demands on the operator, who must have specialized robotics and programming knowledge; it is also less convenient to use, in particular because robot tasks cannot be described simply and directly. If a teaching playback capability were introduced into the robot offline programming system, so that the operator guides the online teaching of the robot on the computer screen through a suitable human-machine interface, generating the robot task trajectory and then the robot motion program with simulation and optimization, it would undoubtedly greatly enhance the operability of robot offline programming and the friendliness of the user interface, making robot programming more convenient and simple and promoting the application and popularization of robot technology in actual production.
The content of the invention
The object of the present invention is to provide an intelligent robot programming method that learns from a person's virtual operation, with the advantages of simple operation, high flexibility, and intelligence.
A virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator, where the robot end is provided with a work tool, comprises a teaching tool, a camera unit, and a robot controller. The teaching tool is identical to the work tool mounted on the robot end and is provided with first feature points; the camera unit captures and records, in time order, the trajectories of the feature points on the teaching tool and the joint angle information of the teaching tool; the robot controller generates the robot work program. The work tool is provided with second feature points corresponding to the first feature points on the teaching tool.
The virtual-teaching intelligent robot programming method that learns from a human demonstrator is realized with the above device and comprises the following:
The teaching tool performs the virtual operation; the camera unit captures, in time order, the trajectories and joint angles of the first feature points and generates the camera-space two-dimensional coordinates of the first feature points.
The robot controller obtains, from the camera-space coordinates and joint angles of the first feature points, an instruction program containing the robot-space three-dimensional coordinates and joint angles of the first feature points.
The robot controller inputs the instruction program into the robot.
The robot controller controls the trajectory of the second feature points on the work tool according to the robot-space three-dimensional coordinate information of the first feature points in the instruction program.
Before the teaching tool performs the virtual operation, the mapping relationship between robot space and camera space is first established; the specific method is:
The camera unit captures three or more groups of images of the second feature points during operation and obtains the camera-space two-dimensional coordinates of the second feature points;
The robot joint angle parameters are recorded, and the robot-space three-dimensional coordinates of the second feature points are obtained by robot forward kinematics;
The mapping relationship between robot space and camera space is established:

x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5
y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6

where (x_c, y_c) are the camera-space two-dimensional coordinates of the second feature point (11), (p_x, p_y, p_z) are the three-dimensional coordinates of the second feature point in robot space, and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters.
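By way of illustration only (the patent does not specify an implementation language, and the function name below is an assumption), a short Python sketch of applying this mapping to one robot-space point:

```python
def project_to_camera(p, C):
    """Map a robot-space point p = (px, py, pz) to camera-space (xc, yc)
    using the mapping parameters C = [C1, C2, C3, C4, C5, C6]."""
    px, py, pz = p
    C1, C2, C3, C4, C5, C6 = C
    xc = ((C1**2 + C2**2 - C3**2 - C4**2) * px
          + 2 * (C2 * C3 + C1 * C4) * py
          + 2 * (C2 * C4 - C1 * C3) * pz + C5)
    yc = (2 * (C2 * C3 - C1 * C4) * px
          + (C1**2 - C2**2 + C3**2 - C4**2) * py
          + 2 * (C3 * C4 + C1 * C2) * pz + C6)
    return xc, yc
```

The quadratic terms in C_1 to C_4 have the form of the first two rows of an unnormalized quaternion rotation, with C_5 and C_6 acting as image-plane offsets.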
Compared with the prior art, the present invention has the following advantages: (1) Previously the operator had to hold the robot end and teach it by hand, which required considerable effort to simulate the working posture and made accurate teaching difficult; with the present invention no such effort is needed, and teaching is easy and dexterous. (2) The operator has a sense of being "on the scene" while interacting with the robot, which provides the user with a brand-new, harmonious human-machine interaction environment. (3) The arrangement of several feature points on the handheld teaching device uniquely determines its attitude, and the feature points can be LEDs, which solves the problem of restrictions imposed by the operating environment.
The present invention is described further below with reference to the accompanying drawings.
Brief description of the drawings
Fig. 1 is a schematic diagram of the system composition of the present invention.
Fig. 2 is a structural diagram of the handheld virtual teaching tool of the present invention.
Fig. 3 is a schematic diagram of the present invention welding a car door.
Fig. 4 is a flow chart of the method of the present invention.
Specific embodiment
With reference to Fig. 1 and Fig. 2, a virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator comprises a teaching tool 2, a camera unit 3, and a robot controller 4. The teaching tool 2 is identical to the work tool 1 mounted on the robot end and is provided with first feature points 21. The camera unit 3 captures and records, in time order, the trajectories of the feature points on the teaching tool 2 and the joint angle information of the teaching tool 2. The robot controller 4 generates the robot work program. The work tool 1 is provided with second feature points 11 corresponding to the first feature points 21 of the teaching tool 2. The second feature points 11 are LEDs, black dots, or white dots.
With reference to Fig. 1 and Fig. 2, the robot used in the present invention is a six-degree-of-freedom industrial robot 6 whose end is fitted with a welding tool; the welding tool comprises a welding tool upper jaw 12 and a movable welding tool lower jaw 13. The handheld teaching tool 2 simulates the welding tool upper jaw 12 and has the same shape and size, but the teaching tool 2 is very light, so the operator can handle it flexibly and adjust its pose conveniently.
With reference to Fig. 2, there are several first feature points 21 on the teaching tool 2; they may be located on a feature point mounting block extending from the rear end of the simulated welding tool upper jaw. The arrangement of the first feature points 21 must uniquely identify the attitude of the teaching tool 2. The present invention takes an L-shaped arrangement as an example: there are three first feature points 21, the line A between the first and the second first feature point is perpendicular to the line B between the second and the third first feature point, and the length of line A is twice the length of line B. The front port of the teaching tool 2 simulates the welding gun head.
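As an illustration only (not part of the patent; the tolerances and function names are assumptions), a small Python sketch of how three detected image points could be ordered into this L-shaped pattern by checking the perpendicularity of lines A and B and the 2:1 length ratio:

```python
import itertools
import numpy as np

def order_l_shape(points, ratio_tol=0.3, angle_tol_deg=10.0):
    """Return the three 2D points ordered as (first, second, third) feature point so that
    line A = first->second is roughly perpendicular to line B = second->third and
    |A| is roughly twice |B|; returns None if no ordering fits."""
    for p1, p2, p3 in itertools.permutations(np.asarray(points, dtype=float)):
        a, b = p1 - p2, p3 - p2                       # line A and line B
        len_a, len_b = np.linalg.norm(a), np.linalg.norm(b)
        if len_a < 1e-9 or len_b < 1e-9:
            continue
        cos_ab = abs(np.dot(a, b)) / (len_a * len_b)  # ~0 when A and B are perpendicular
        if abs(len_a / len_b - 2.0) < ratio_tol and cos_ab < np.sin(np.radians(angle_tol_deg)):
            return p1, p2, p3
    return None
```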
Three second feature points 11 corresponding to the three first feature points 21 are likewise provided on the welding tool upper jaw 12. The distance from each first feature point 21 to the simulated welding gun head at the front end of the teaching tool is equal to the distance from the corresponding second feature point 11 on the welding tool at the robot end to the welding gun head at the front end of the welding tool upper jaw 12.
As shown in Fig. 3, the field of view of the camera unit 3 covers the working area of the teaching tool 2, so the teaching tool 2 is imaged from all directions without blind spots. The camera unit 3 consists of two or more cameras 31; the distance between the cameras and the angle between them need not be fixed. Taking the welding of a car door 51 as an example, there are welding spots 52 on the car door 51; the teaching tool 2 is brought to a welding spot 52 in the working attitude, and all welding spots 52 on the car door 51 (i.e., the work targets) and the feature points 21 on the teaching tool 2 can be observed by at least some of the cameras 31.
With reference to Fig. 4, a virtual-teaching intelligent robot programming method that learns from a human demonstrator comprises the following steps.
Step 1: establish the mapping relationship between robot space and camera space.
The field of view of the camera unit 3 covers the working area. Several groups of images of the second feature points 11 on the welding tool at the robot end are collected within the working area, and the camera-space coordinates of the feature points are obtained by image recognition. The corresponding robot joint angle information is recorded, and the robot-space coordinates of the feature points are obtained by robot forward kinematics. The mapping relationship between robot space and camera space is then established as follows:

x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5
y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6

where (x_c, y_c) are the two-dimensional coordinates of a feature point in camera space, (p_x, p_y, p_z) are its three-dimensional coordinates in robot space, and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters. This process is the training step for the mapping parameters: the more groups of second feature points are captured, the more accurate the mapping parameters C become.
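A minimal Python sketch of this calibration step, assuming NumPy and SciPy (neither is prescribed by the patent) and a hypothetical forward_kinematics(q) helper for obtaining robot-space coordinates from the recorded joint angles:

```python
import numpy as np
from scipy.optimize import least_squares

def fit_mapping(camera_pts, robot_pts):
    """Estimate the mapping parameters C = [C1..C6] of one camera from three or more
    correspondences: camera_pts is an Nx2 array of image coordinates of the second
    feature points, robot_pts an Nx3 array of their robot-space coordinates,
    e.g. robot_pts = [forward_kinematics(q) for q in recorded_joint_angles]."""
    camera_pts = np.asarray(camera_pts, dtype=float)
    robot_pts = np.asarray(robot_pts, dtype=float)

    def residuals(C):
        C1, C2, C3, C4, C5, C6 = C
        px, py, pz = robot_pts.T
        xc = (C1**2 + C2**2 - C3**2 - C4**2)*px + 2*(C2*C3 + C1*C4)*py + 2*(C2*C4 - C1*C3)*pz + C5
        yc = 2*(C2*C3 - C1*C4)*px + (C1**2 - C2**2 + C3**2 - C4**2)*py + 2*(C3*C4 + C1*C2)*pz + C6
        return np.concatenate([xc - camera_pts[:, 0], yc - camera_pts[:, 1]])

    # Start from an identity-like rotation (C1 = 1) with zero image offsets.
    return least_squares(residuals, x0=[1.0, 0.0, 0.0, 0.0, 0.0, 0.0]).x
```

The more correspondence groups are collected, the better conditioned this fit becomes, which matches the remark above that more captured groups give a more accurate C.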
Step 2: the teaching tool 2 performs the virtual operation.
The operator holds the teaching tool 2 and moves it to the position of a welding spot 52 of the robot task, simulating the robot's working attitude while keeping the welding gun head at the front port of the teaching tool on the welding spot 52; the first feature points 21 on the teaching tool must be observable by the cameras 31 of the camera unit 3. The operator then moves the handheld teaching tool 2 to the other welding spot positions in turn.
Step 3: collect the virtual teaching tool information.
All cameras of the camera unit 3 acquire the first feature points 21 on the teaching tool 2 during Step 2; the camera-space images of the feature points at each working position are saved, and the camera-space two-dimensional coordinates of the first feature points 21 are generated.
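For illustration only (the patent does not prescribe an image-processing library; the OpenCV 4 API, the brightness threshold, and the minimum blob area below are assumptions), a sketch of extracting the camera-space 2D coordinates of bright LED feature points from one grayscale frame:

```python
import cv2

def detect_led_points(image_gray, thresh=200, min_area=5.0):
    """Return the (x, y) centroids of bright blobs in a grayscale frame."""
    _, mask = cv2.threshold(image_gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] >= min_area:                      # m00 is the blob area
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```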
Step 4: calculate the position coordinates in robot space.
Based on the mapping relationship between robot space and camera space established in Step 1, the camera-space coordinates of the first feature points 21 on the teaching tool 2 (obtained by image recognition of the feature points) are converted into the three-dimensional coordinate information and joint angle information in robot space, and the joint angle information is recorded, as in the sketch below.
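Under the assumption that a separate mapping C is trained per camera in Step 1 (the patent does not spell this out; the function name is illustrative), the robot-space coordinates of a first feature point can be recovered from its 2D observations in two or more cameras by stacking the linear equations of the mapping and solving them in the least-squares sense:

```python
import numpy as np

def camera_to_robot(observations, mappings):
    """Recover a robot-space point (px, py, pz) from its 2D image observations.
    observations: list of (xc, yc), one per camera; mappings: list of C = [C1..C6],
    one per camera. At least two cameras are required for a well-posed solution."""
    A_rows, b_rows = [], []
    for (xc, yc), C in zip(observations, mappings):
        C1, C2, C3, C4, C5, C6 = C
        A_rows.append([C1**2 + C2**2 - C3**2 - C4**2, 2*(C2*C3 + C1*C4), 2*(C2*C4 - C1*C3)])
        b_rows.append(xc - C5)
        A_rows.append([2*(C2*C3 - C1*C4), C1**2 - C2**2 + C3**2 - C4**2, 2*(C3*C4 + C1*C2)])
        b_rows.append(yc - C6)
    p, *_ = np.linalg.lstsq(np.array(A_rows), np.array(b_rows), rcond=None)
    return p  # (px, py, pz) in robot space
```

The joint angle information that the patent records alongside these coordinates is not covered by this sketch.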
Step 5: automatically generate the robot program.
After the teaching ends, the robot controller 4 saves, in order, the joint angle information recorded in Step 4 and generates an instruction program, i.e. the robot program, containing the robot-space three-dimensional coordinates and joint angles of the first feature points 21.
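Purely as an illustration of this step (the patent does not specify a robot instruction language; the MOVJ-style syntax below is invented for the sketch), a Python fragment that turns the recorded joint-angle sequence into a textual instruction program:

```python
def generate_program(joint_angle_sequence):
    """Emit a simple joint-move instruction program from the teaching record."""
    lines = ["NOP"]
    for i, q in enumerate(joint_angle_sequence):
        angles = ", ".join(f"{a:.3f}" for a in q)
        lines.append(f"MOVJ J{i} = ({angles}) VJ=20.0")  # move to the taught joint configuration
    lines.append("END")
    return "\n".join(lines)
```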
Finally, the robot controller 4 executes the program generated in Step 5 and controls the robot to move automatically to all working positions with the position and attitude taught by the operator, completing the welding task. After the task is completed, the robot returns to its original position and waits for the operator's next teaching session, thereby realizing intelligent robot programming.

Claims (6)

1. A virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator, the robot end being provided with a work tool (1), characterized by comprising:
a teaching tool (2) that is identical to the work tool (1) mounted on the robot end and is provided with first feature points (21);
a camera unit (3) that captures and records, in time order, the trajectories of the feature points on the teaching tool (2), the joint angle information of the teaching tool (2), and the placement of the work targets; and
a robot controller (4) that generates the robot work program;
wherein the work tool (1) is provided with second feature points (11) corresponding to the first feature points (21) of the teaching tool (2).
2. The virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator according to claim 1, wherein the first feature points (21) comprise two or more feature points, and the arrangement of the feature points uniquely identifies the attitude of the teaching tool (2).
3. The virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator according to claim 2, wherein there are three first feature points (21), the line A between the first and the second first feature point is perpendicular to the line B between the second and the third first feature point, and the length of line A is twice the length of line B.
4. The virtual-teaching intelligent robot programming auxiliary device that learns from a human demonstrator according to claim 2 or 3, wherein the feature points among the first feature points (21) and the second feature points (11) are LEDs, black dots, or white dots.
5. A virtual-teaching intelligent robot programming method that learns from a human demonstrator, using the auxiliary device according to any one of the preceding claims, characterized by comprising:
the teaching tool (2) performing the virtual operation, and the camera unit (3) capturing, in time order, the trajectories of the first feature points (21), the joint angles, and the placement of the work targets, and generating the camera-space two-dimensional coordinates of the first feature points (21);
the robot controller (4) obtaining, from the camera-space coordinates, joint angles, and work target points of the first feature points (21), an instruction program containing the robot-space three-dimensional coordinates and joint angles of the first feature points (21);
the robot controller (4) inputting the instruction program into the robot; and
the robot controller (4) controlling the trajectory of the second feature points (11) on the work tool (1) according to the robot-space three-dimensional coordinate information of the first feature points (21) in the instruction program.
6. The virtual-teaching intelligent robot programming method that learns from a human demonstrator according to claim 5, characterized in that before the teaching tool (2) performs the virtual operation, the mapping relationship between robot space and camera space is first established; the specific method is:
the camera unit (3) captures three or more groups of images of the second feature points (11) during operation and obtains the camera-space two-dimensional coordinates of the second feature points (11);
the robot joint angle parameters are recorded, and the robot-space three-dimensional coordinates of the second feature points (11) are obtained by robot forward kinematics;
the mapping relationship between robot space and camera space is established:
x_c = (C_1^2 + C_2^2 - C_3^2 - C_4^2) p_x + 2(C_2 C_3 + C_1 C_4) p_y + 2(C_2 C_4 - C_1 C_3) p_z + C_5
y_c = 2(C_2 C_3 - C_1 C_4) p_x + (C_1^2 - C_2^2 + C_3^2 - C_4^2) p_y + 2(C_3 C_4 + C_1 C_2) p_z + C_6
where (x_c, y_c) are the camera-space two-dimensional coordinates of the second feature point (11), (p_x, p_y, p_z) are the three-dimensional coordinates of the second feature point (11) in robot space, and C = [C_1, C_2, C_3, C_4, C_5, C_6] are the mapping parameters.
CN201510221175.XA 2015-05-04 2015-05-04 Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator Active CN104827474B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510221175.XA CN104827474B (en) 2015-05-04 2015-05-04 Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510221175.XA CN104827474B (en) 2015-05-04 2015-05-04 Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator

Publications (2)

Publication Number Publication Date
CN104827474A CN104827474A (en) 2015-08-12
CN104827474B true CN104827474B (en) 2017-06-27

Family

ID=53805845

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510221175.XA Active CN104827474B (en) 2015-05-04 2015-05-04 Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator

Country Status (1)

Country Link
CN (1) CN104827474B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444739A (en) * 2016-07-15 2017-02-22 鹿龙 Multi-industrial-robot virtual offline co-simulation system and method
CN108000499B (en) * 2016-10-27 2020-07-31 达明机器人股份有限公司 Programming method of robot visual coordinate
JP6534126B2 (en) * 2016-11-22 2019-06-26 パナソニックIpマネジメント株式会社 Picking system and control method therefor
CN107220099A (en) * 2017-06-20 2017-09-29 华中科技大学 A kind of robot visualization virtual teaching system and method based on threedimensional model
CN107331279A (en) * 2017-08-16 2017-11-07 嘉兴锐视智能科技有限公司 Teaching apparatus and system
CN108427282A (en) * 2018-03-30 2018-08-21 华中科技大学 A kind of solution of Inverse Kinematics method based on learning from instruction
CN108705536A (en) * 2018-06-05 2018-10-26 雅客智慧(北京)科技有限公司 A kind of the dentistry robot path planning system and method for view-based access control model navigation

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09225872A (en) * 1996-02-23 1997-09-02 Yaskawa Electric Corp Robot teaching device
JP4449410B2 (en) * 2003-10-27 2010-04-14 ソニー株式会社 Robot apparatus and object learning method thereof
JP4267005B2 (en) * 2006-07-03 2009-05-27 ファナック株式会社 Measuring apparatus and calibration method
JP2011048621A (en) * 2009-08-27 2011-03-10 Honda Motor Co Ltd Robot off-line teaching method
JP2011110621A (en) * 2009-11-24 2011-06-09 Toyota Industries Corp Method of producing teaching data of robot and robot teaching system
JP2011224745A (en) * 2010-04-21 2011-11-10 Yaskawa Electric Corp Robot teaching device and controller for the same, and program
CN102350700A (en) * 2011-09-19 2012-02-15 华南理工大学 Method for controlling robot based on visual sense

Also Published As

Publication number Publication date
CN104827474A (en) 2015-08-12

Similar Documents

Publication Publication Date Title
CN104827474B (en) Virtual-teaching intelligent robot programming method and auxiliary device that learn from a human demonstrator
CN110238831B (en) Robot teaching system and method based on RGB-D image and teaching device
Quintero et al. Robot programming through augmented trajectories in augmented reality
Pettersen et al. Augmented reality for programming industrial robots
CN104858876B (en) Visual debugging of robotic tasks
CN110561450B (en) Robot assembly offline example learning system and method based on dynamic capture
CN108161882A (en) A kind of robot teaching reproducting method and device based on augmented reality
CN108161904A (en) Robot online teaching device, system, method and equipment based on augmented reality
CN108481323A (en) Augmented reality-based robot motion trajectory automatic programming system and method
CN203449306U (en) Master-slave-type double-industrial-robot coordination operation control system
CN107220099A (en) A kind of robot visualization virtual teaching system and method based on threedimensional model
CN108340351A (en) A kind of robot teaching apparatus, method and teaching robot
Pan et al. Augmented reality-based robot teleoperation system using RGB-D imaging and attitude teaching device
Ganapathyraju Hand gesture recognition using convexity hull defects to control an industrial robot
CN104325268A (en) Industrial robot three-dimensional space independent assembly method based on intelligent learning
CN104002297A (en) Teaching system, teaching method and robot system
CN210361314U (en) Robot teaching device based on augmented reality technology
Waymouth et al. Demonstrating cloth folding to robots: Design and evaluation of a 2d and a 3d user interface
CN106881717A (en) A kind of surface of robot spacing follows method for paint spraying
Thoo et al. Online and offline robot programming via augmented reality workspaces
CN110948467A (en) Handheld teaching device and method based on stereoscopic vision
Bai et al. Parallel calligraphy robot: Framework and system implementation
CN205353759U (en) Three -dimensional route intelligent recognition tracker of robot
Marín et al. A predictive interface based on virtual and augmented reality for task specification in a Web telerobotic system
CN103009388B (en) Light wave transmitter as well as robot track locating system and robot track locating method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
EXSB Decision made by sipo to initiate substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant