CN108656120A - Teaching and processing method based on image comparison - Google Patents
- Publication number
- CN108656120A CN108656120A CN201810319631.8A CN201810319631A CN108656120A CN 108656120 A CN108656120 A CN 108656120A CN 201810319631 A CN201810319631 A CN 201810319631A CN 108656120 A CN108656120 A CN 108656120A
- Authority
- CN
- China
- Prior art keywords
- teaching
- data
- robot
- processing
- equipment
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0081—Programme-controlled manipulators with master teach-in means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/04—Viewing devices
Abstract
Description
Technical field
The present invention relates to the field of robot teaching, and more particularly to a teaching and processing method based on image comparison.
Background technology
Teach programming of current industrial robots requires operators who are very familiar with the robot and have a solid command of the relevant programming knowledge. Training such operators therefore takes considerable time and money, which raises both the entry threshold and the overall cost of applying industrial robots.
At present, most vision systems are applied to problems in specific professional domains. Vision systems used for teach programming suffer from low repeat accuracy of the industrial robot system, a heavy computational load, slow response, and susceptibility to occlusion.
Summary of the invention
The technical problem to be solved by the present invention is to provide a teaching and processing method based on image comparison.

The solution adopted by the present invention to solve this technical problem is as follows:
A teaching method based on image comparison, characterized in that it comprises the following steps:
Step a) Provide a teaching machining device, a teaching system device with a teaching work coordinate system, and a teaching robot located beside the teaching system device. The teaching system device includes a teaching visual response module that moves within the teaching work coordinate system, with a teaching worktable beside the teaching visual response module. Beside and/or above the teaching worktable there is a teaching construction module, which carries a teaching construction camera used to build the teaching work coordinate system and to record the running trajectory of the teaching visual response module. The teaching visual response module is equipped with a teaching attitude sensor and a teaching given-image camera. The teaching construction camera is a binocular or multi-lens camera; the teaching given-image camera is a binocular or multi-lens camera.
A workpiece is placed on the teaching worktable, and the teaching visual response module is fixed to the teaching machining device.
Step b) Start the teaching construction module, the teaching attitude sensor, and the teaching given-image camera. An operator holds the teaching machining device by hand and machines the workpiece. The teaching construction camera of the teaching construction module records the trajectory of the teaching visual response module in the teaching work coordinate system, forming trajectory data that contains multiple trajectory points. The teaching given-image camera photographs the workpiece to form given image data, and the teaching attitude sensor senses the posture of the teaching visual response module to form attitude data. The trajectory data, together with the given image data and the attitude data, is sent to the programming system to build a comparison data set. In the comparison data set each trajectory point of the trajectory data serves as a piece of mother data, and each piece of mother data corresponds to one group of sub-data; the sub-data contains the given image data and attitude data captured by the teaching visual response module at the position of that trajectory point.
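The mother-data/sub-data organization of the comparison data set described in step b) can be sketched as plain records. The class and field names below are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SubData:
    # One sub-data group: what was captured at a single trajectory point.
    given_image: list                   # given image data photographed at this point
    pose: Tuple[float, float, float]    # attitude-sensor reading (e.g. roll, pitch, yaw)

@dataclass
class MotherData:
    # One trajectory point of the trajectory data, in the teaching work coordinate system.
    point: Tuple[float, float, float]
    sub: SubData

@dataclass
class ComparisonDataSet:
    points: List[MotherData] = field(default_factory=list)

    def add(self, xyz, image, pose):
        # Append one mother-data record together with its sub-data group.
        self.points.append(MotherData(point=xyz, sub=SubData(given_image=image, pose=pose)))
```

Each mother data point thus carries exactly one sub-data group, mirroring the one-to-one correspondence the step describes.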
Step c) Connect the teaching machining device, together with the teaching visual response module, to the teaching robot. The trajectory data is used to guide the teaching robot to drive the teaching visual response module through the real spatial points determined by all the mother data in sequence, adjusting the posture of the teaching visual response module to the given posture whenever it reaches the position of a piece of mother data; each given posture is set according to the attitude data of the sub-data corresponding to that mother data. Denote two adjacent pieces of mother data as mother data M and mother data N. Before the teaching visual response module moves from the position of mother data M to the position of mother data N, the teaching given-image camera photographs the workpiece to form current image data, which is compared with the given image data in the sub-data corresponding to mother data M to obtain a given-image deviation. The robot is driven according to the given-image deviation and the teaching attitude deviation to adjust the position of the teaching visual response module relative to the workpiece, until the deviation between the current image data and the given image data in the sub-data corresponding to mother data M falls within the tolerance range; the robot pose data of the teaching robot at that moment is then recorded, and this recorded robot pose data is called robot base data. All the robot base data are integrated to form machining trajectory data.
As a further improvement of the above solution, in step c), when the teaching robot drives the teaching visual response module from the position of mother data M to the position of mother data N, the position of the teaching visual response module changes as follows:
Process a) Let P be the current position of the teaching visual response module. Obtain the current image data at P, compare it with the given image data in the sub-data corresponding to mother data M to generate a given-image deviation, and execute process b).
Process b) Evaluate the given-image deviation. If it exceeds the tolerance range, execute process c); if it is within the tolerance range, record the robot pose data of the current teaching robot and drive the teaching robot to move the teaching visual response module to the position determined by mother data N.
Process c) According to the given-image deviation, let the teaching robot drive the teaching visual response module to adjust its position and/or posture, then execute process a).
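Processes a) to c) amount to a per-point visual-servo loop. The Python sketch below illustrates that control flow under stated assumptions: `image_deviation` is a toy stand-in for a real image-comparison routine, and the `robot` and `camera` objects are hypothetical stand-ins for the teaching robot and the teaching given-image camera, not interfaces defined by the patent:

```python
def image_deviation(current, given):
    # Toy stand-in for image comparison: mean absolute pixel difference.
    return sum(abs(a - b) for a, b in zip(current, given)) / max(len(given), 1)

def follow_mother_data(robot, camera, data_set, tolerance):
    # For each piece of mother data: move there (process a), compare the current
    # image against the stored given image (process b), and keep adjusting the
    # robot (process c) until the deviation is within tolerance, then record
    # the robot pose as one piece of robot base data.
    base_data = []
    for mother in data_set.points:
        robot.move_to(mother.point)                      # process a)
        while True:
            dev = image_deviation(camera.capture(), mother.sub.given_image)
            if dev <= tolerance:                         # process b): within tolerance
                base_data.append(robot.current_pose())
                break
            robot.adjust(dev, mother.sub.pose)           # process c): correct pose
    return base_data  # integrated into the machining trajectory data
```

The inner loop terminates only once the image comparison succeeds, which is what makes the recorded robot base data a closed-loop correction of the robot's own positioning error.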
As a further improvement of the above solution, the method further includes step a1) after step a), step b1) after step b) and after step a1), and step c1) before step c) and after step b1).
Step a1) Use the teaching machining device, with the teaching system device fixed to it, as a modeling device, and use the teaching robot as a guidance robot. Mount the modeling device on the guidance robot and set several constraint points around the workpiece. The guidance robot drives the modeling device, based on the robot coordinates of the guidance robot, along a specified path and with specified postures through the constraint points in sequence while photographing the workpiece, so that each trajectory point of the specified path carries three-dimensional basic data consisting of posture information and image information. All the three-dimensional basic data are integrated to generate a physical three-dimensional model of the workpiece.
Step b1) Present the attitude data points/lines in all the mother data and/or sub-data visually together with the physical three-dimensional model; the visualized mother data and/or sub-data are called simulation data. Part of the simulation data may then be adjusted, and simulation data may be added or removed.
Step c1) Let the guidance robot drive the modeling device through the real spatial points determined by the simulation data in the robot coordinates of the guidance robot; these real spatial points are called simulation points. The posture of the modeling device at each simulation point is determined by the simulation data, so that the mother data, and the sub-data corresponding to the mother data, are regenerated in the teaching work coordinate system.
As a further improvement of the above solution, each teaching construction module is equipped with an attitude sensor, and there are at least two teaching construction modules: at least one on the left, right, front, or rear side of the teaching worktable and at least one above the teaching worktable. At least one teaching construction module is called the global construction module; the monitoring range of its teaching construction camera covers all the other teaching construction modules and intersects the monitoring range of the teaching construction camera of each of the other teaching construction modules. In step c), after each piece of robot base data is formed, the teaching construction camera of whichever teaching construction module can monitor the teaching robot photographs the teaching robot, forming robot pose image data in one-to-one correspondence with the robot base data.
A processing method based on image comparison, in which the following step is executed after all the steps of any of the above teaching methods based on image comparison:

Step d1) Provide a machining worktable and a machining robot located beside the machining worktable, with an inline machining device mounted on the machining robot. Place the workpiece on the machining worktable and drive the machining robot according to the machining trajectory data, so that the machining device it carries machines the workpiece.
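Step d1) is a straightforward playback of the recorded robot base data. A minimal sketch follows; the `robot` and `tool` interfaces are assumptions for illustration, not specified by the patent:

```python
def playback_machining(robot, tool, machining_trajectory):
    # Drive the machining robot through the recorded robot base data so that
    # the mounted inline machining device follows the taught path.
    tool.start()
    for base_pose in machining_trajectory:
        robot.move_to_pose(base_pose)   # each pose is one robot base record from step c)
    tool.stop()
```

Because the machining trajectory data is already expressed in robot coordinates, no image comparison is needed at playback time in this variant.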
A processing method based on image comparison, in which the following steps are executed after all the steps of any of the above teaching methods based on image comparison:

Step d2) Provide an inline machining device, a machining system device with a machining work coordinate system, and a machining robot located beside the machining system device. The machining system device includes a machining visual response module that moves within the machining work coordinate system, with a machining worktable beside the machining visual response module. Beside and/or above the machining worktable there is a machining construction module, used to build the machining work coordinate system and to record the running trajectory of the machining visual response module; the machining construction module has the same structure as the teaching construction module. The machining visual response module is equipped with a machining attitude sensor and a machining current-image camera; the machining current-image camera is a binocular or multi-lens camera.

Place the workpiece on the machining worktable, fix the machining visual response module to the inline machining device, and connect the inline machining device to the machining robot.

Step e) Place the workpiece on the machining worktable and let the machining construction module build the machining work coordinate system with the workpiece as its base point. The machining robot drives the machining visual response module through the trajectory points in sequence according to the machining trajectory data.

After the machining visual response module reaches a trajectory point of the comparison data set, the machining robot confirms or adjusts the posture of the machining visual response module according to the attitude data in the sub-data corresponding to that trajectory point, so that the posture of the machining visual response module during machining is identical to the posture of the teaching visual response module at the same trajectory point.

The machining current-image camera photographs the workpiece to form current image data, which is compared with the given image data in the sub-data corresponding to the trajectory point where the machining visual response module is located. If the comparison result between the current image data and the given image data is within the tolerance range, the inline machining device starts or continues machining the workpiece. If the comparison result exceeds the tolerance range, the inline machining device pauses machining, the machining robot drives the machining visual response module to adjust its posture and position, and new current image data is continually acquired and compared against the given image data until the posture and position of the machining visual response module match the sub-data, whereupon the inline machining device resumes machining the workpiece.
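The gating in step e) — machine only while the current image matches the given image, pause and re-align otherwise — can be sketched as follows. As before, `image_deviation` and the `robot`, `camera`, and `tool` interfaces are illustrative assumptions rather than details from the patent:

```python
def image_deviation(current, given):
    # Toy stand-in for the picture comparison: mean absolute pixel difference.
    return sum(abs(a - b) for a, b in zip(current, given)) / max(len(given), 1)

def machine_with_image_gating(robot, camera, tool, data_set, tolerance):
    # At each trajectory point: set the taught posture from the sub-data, then
    # hold machining until the current image agrees with the stored given image.
    for mother in data_set.points:
        robot.move_to(mother.point)
        robot.set_pose(mother.sub.pose)            # confirm/adjust posture from sub-data
        while image_deviation(camera.capture(), mother.sub.given_image) > tolerance:
            tool.pause()                           # out of tolerance: pause machining
            robot.adjust(mother.sub.given_image)   # re-align position/posture
        tool.start()                               # within tolerance: start/keep machining
```

The design choice worth noting is that the tool, not the robot, is what the comparison gates: motion and re-alignment continue while machining is suspended.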
After connection, the relative position of the teaching machining device and the teaching visual response module is identical to the relative position of the inline machining device and the machining visual response module. The teaching system device has the same structure as the machining system device, and the teaching machining device has the same structure as the inline machining device.
As a further improvement of the above solution, the method further includes a movable worktable, the upper surface of which serves as the machining worktable.
As a further improvement of the above solution, the teaching system device and the machining system device are the same device, and the teaching machining device and the inline machining device are the same device.
As a further improvement of the above solution, the machining robot and the teaching robot are the same device, the machining worktable and the teaching worktable are the same device, and the teaching construction module and the machining construction module are the same device.

The beneficial effects of the present invention are as follows. Because the teaching visual response module carries an attitude sensor, when the visual response module is moved along the workpiece during teaching, each trajectory point of the visual response module can be associated with attitude data and image data. After manual teaching is completed, the teaching machining device with the teaching visual response module is mounted on the robot, and the conversion from the work coordinate system to the robot coordinate system is achieved by image-comparison confirmation, which greatly reduces both the expertise demanded of operators and the computational load of system data conversion. The present invention is used for robot teaching.
Detailed description of embodiments
The concept, concrete structure, and resulting technical effects of the present invention are described clearly and completely below with reference to embodiments, so that its purpose, features, and effects may be fully understood. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention; all other embodiments obtained by those skilled in the art from these embodiments without creative effort fall within the scope of protection of the present invention. In addition, references to connections or connection relations in the text do not refer only to direct connections between components, but also to better connection structures formed by adding or removing connecting auxiliaries according to the specific implementation. The technical features of the present invention may be combined with one another provided they do not conflict.
This is an embodiment of the present invention, specifically:
A teaching method based on image comparison, characterized in that it comprises the following steps:
Step a) Provide a teaching machining device, a teaching system device with a teaching work coordinate system, and a teaching robot located beside the teaching system device. The teaching system device includes a teaching visual response module that moves within the teaching work coordinate system, with a teaching worktable beside the teaching visual response module. Beside and/or above the teaching worktable there is a teaching construction module, which carries a teaching construction camera used to build the teaching work coordinate system and to record the running trajectory of the teaching visual response module. The teaching visual response module is equipped with a teaching attitude sensor and a teaching given-image camera. The teaching construction camera is a binocular or multi-lens camera; the teaching given-image camera is a binocular or multi-lens camera.

A workpiece is placed on the teaching worktable, and the teaching visual response module is fixed to the teaching machining device.
Step b) Start the teaching construction module, the teaching attitude sensor, and the teaching given-image camera. An operator holds the teaching machining device by hand and machines the workpiece. The teaching construction camera of the teaching construction module records the trajectory of the teaching visual response module in the teaching work coordinate system; as the teaching visual response module moves in the teaching work coordinate system, the attitude data of its teaching attitude sensor is generated synchronously, forming trajectory data that contains multiple trajectory points. The teaching given-image camera photographs the workpiece to form given image data, and the teaching attitude sensor senses the posture of the teaching visual response module to form attitude data. The trajectory data, together with the given image data and the attitude data, is sent to the programming system to build a comparison data set. In the comparison data set each trajectory point of the trajectory data serves as a piece of mother data, and each piece of mother data corresponds to one group of sub-data; the sub-data contains the given image data and attitude data captured by the teaching visual response module at the position of that trajectory point.
Step c) Connect the teaching machining device, together with the teaching visual response module, to the teaching robot. The trajectory data is used to guide the teaching robot to drive the teaching visual response module through the real spatial points determined by all the mother data in sequence, adjusting the posture of the teaching visual response module to the given posture whenever it reaches the position of a piece of mother data; each given posture is set according to the attitude data of the sub-data corresponding to that mother data. Denote two adjacent pieces of mother data as mother data M and mother data N. Before the teaching visual response module moves from the position of mother data M to the position of mother data N, the teaching given-image camera photographs the workpiece to form current image data, which is compared with the given image data in the sub-data corresponding to mother data M to obtain a given-image deviation. The robot is driven according to the given-image deviation and the teaching attitude deviation to adjust the position of the teaching visual response module relative to the workpiece, until the deviation between the current image data and the given image data in the sub-data corresponding to mother data M falls within the tolerance range; the robot pose data of the teaching robot at that moment is then recorded, and this recorded robot pose data is called robot base data. All the robot base data are integrated to form machining trajectory data.
Because the teaching visual response module carries an attitude sensor, when the visual response module is moved along the workpiece during teaching, each trajectory point of the visual response module can be associated with attitude data and image data. After manual teaching is completed, the teaching machining device with the teaching visual response module is mounted on the robot, and the conversion from the work coordinate system to the robot coordinate system is achieved by image-comparison confirmation, which greatly reduces both the expertise demanded of operators and the computational load of system data conversion. Operators need not understand teach programming: simply executing the above steps in order produces the trajectory of the machining trajectory data, and since the trajectory of the machining trajectory data in step c) is established on the robot coordinates of the teaching robot, it can drive the robot directly.
The comparison data set, robot pose data, and robot base data can be read out to drive the inline machining device along the machining robot's running trajectory to machine the workpiece. This makes it convenient to transplant the data to other robots for machining, avoiding the prior-art need to perform at least one teaching for every robot.
In step c), when the teaching robot drives the teaching visual response module from the position of mother data M to the position of mother data N, the position of the teaching visual response module changes as follows:

Process a) Let P be the current position of the teaching visual response module. Obtain the current image data at P, compare it with the given image data in the sub-data corresponding to mother data M to generate a given-image deviation, and execute process b).

Process b) Evaluate the given-image deviation. If it exceeds the tolerance range, execute process c); if it is within the tolerance range, record the robot pose data of the current teaching robot and drive the teaching robot to move the teaching visual response module to the position determined by mother data N.

Process c) According to the given-image deviation, let the teaching robot drive the teaching visual response module to adjust its position and/or posture, then execute process a).
Such an adjustment scheme quickly adjusts the robot pose of the robot and greatly reduces the computational burden on the system.
The invention further includes step a1) after step a), step b1) after step b) and after step a1), and step c1) before step c) and after step b1).
Step a1) Use the teaching machining device, with the teaching system device fixed to it, as a modeling device, and use the teaching robot as a guidance robot. Mount the modeling device on the guidance robot and set several constraint points around the workpiece. The guidance robot drives the modeling device, based on the robot coordinates of the guidance robot, along a specified path and with specified postures through the constraint points in sequence while photographing the workpiece, so that each trajectory point of the specified path carries three-dimensional basic data consisting of posture information and image information. All the three-dimensional basic data are integrated to generate a physical three-dimensional model of the workpiece.
The physical three-dimensional model generated in this way can be displayed directly on a computer screen, and each trajectory point can also be displayed on the screen. This readily enables human-computer interaction and lowers the professional difficulty of adjusting trajectory points: anyone, without extensive education or rich programming experience and ability, can adjust the position data after simple training, thereby achieving precise positioning in the machining process.
Step b1) Present the attitude data points/lines in all the mother data and/or sub-data visually together with the physical three-dimensional model; the visualized mother data and/or sub-data are called simulation data. Part of the simulation data may then be adjusted, and simulation data may be added or removed.
The physical three-dimensional model in the present invention is a solid model, or a surface model corresponding to the upper surface of the workpiece. Generating the physical three-dimensional model and presenting the simulation data, trajectory points included, visually together with it avoids specialized program modification: the simulation data can be adjusted through visual operations, reducing the professional difficulty of teach programming.
Because each pose of the teaching visual response module is determined by the posture sensed by the attitude sensor together with the corresponding physical image, the control is closed-loop, with higher precision than the robot's own control, so the repeat accuracy of the robot is substantially improved. Moreover, since the attitude data is measured directly by the attitude sensor to align the posture, during image comparison the robot only needs to make up-down, left-right, and front-back translational movements, eliminating a large amount of complicated computation and enabling the system to run efficiently.
Since the simulation data is established in robot coordinates, step c1) makes it possible to rapidly convert the simulation data into mother data, and the sub-data corresponding to the mother data, in the teaching work coordinate system.
Step c1) Let the guidance robot drive the modeling device through the real spatial points determined by the simulation data in the robot coordinates of the guidance robot; these real spatial points are called simulation points. The posture of the modeling device at each simulation point is determined by the simulation data, so that the mother data, and the sub-data corresponding to the mother data, are regenerated in the teaching work coordinate system.
Step c1) avoids traditional coordinate-mapping computation, reduces the computational burden on the system, avoids accumulated error, and improves repeat accuracy.
The present invention does not use the original three-dimensional model of the workpiece but the physical three-dimensional model generated from the workpiece, which allows the mother data and sub-data to be adjusted precisely: the draft angles and material shrinkage of the physical three-dimensional model and of the workpiece's original three-dimensional model all differ, so mother data and sub-data adjusted from the original three-dimensional model are not suitable for high-precision machining. Steps a1), b1), and c1), through the conversion of coordinate systems, reduce the computational burden on the system and avoid accumulated error, thereby achieving high-precision machining; they also avoid the problem of insufficient machining accuracy caused by the limited precision of the robot itself, greatly reducing the large amount of compensation computation otherwise needed to make up for the robot's own operating precision.
Each teaching construction module is equipped with an attitude sensor, and there are at least two teaching construction modules: at least one on the left, right, front, or rear side of the teaching worktable and at least one above the teaching worktable. At least one teaching construction module is called the global construction module; the monitoring range of its teaching construction camera covers all the other teaching construction modules and intersects the monitoring range of the teaching construction camera of each of the other teaching construction modules. In step c), after each piece of robot base data is formed, the teaching construction camera of whichever teaching construction module can monitor the teaching robot photographs the teaching robot, forming robot pose image data in one-to-one correspondence with the robot base data. Such an arrangement prevents the monitoring of the robot from being interrupted and allows the robot to be positioned at multiple machining positions; during machining, the real-time picture obtained by the machining construction camera can also be compared against the robot pose image data by picture comparison, achieving rapid robot-position navigation and pose adjustment of the robot itself.
A processing method based on image comparison, in which the following step is executed after all the steps of any of the above teaching methods based on image comparison:

Step d1) Provide a machining worktable and a machining robot located beside the machining worktable, with an inline machining device mounted on the machining robot. Place the workpiece on the machining worktable and drive the machining robot according to the machining trajectory data, so that the machining device it carries machines the workpiece.
Such a processing method is suitable where the required machining accuracy is ordinary or moderately high; where very high machining accuracy is required, the following method is needed:
A processing method based on image comparison, in which the following steps are executed after all the steps of any one of the above teaching methods based on image comparison:
Step d2) equip a flow-line processing device, a processing system device having a processing work coordinate system, and a machining robot located at a side of the processing system device. The processing system device includes a processing visual response module that moves within the processing work coordinate system, and a processing worktable at the side of the processing visual response module; at a side of the processing worktable and/or above it, processing construction modules are provided for building the processing work coordinate system and for recording the running trajectory of the processing visual response module. The processing construction module has the same structure as the teaching construction module. The processing visual response module is equipped with a machining attitude sensor and a processing current camera, and the processing current camera is a binocular camera or a multi-view camera;
The workpiece is placed on the processing worktable, the processing visual response module is fixedly connected to the flow-line processing device, and the flow-line processing device is connected to the machining robot;
Step e) place the workpiece on the processing worktable and let the processing construction module build the processing work coordinate system with the workpiece as the base point; the machining robot drives the processing visual response module through the trajectory points in sequence according to the machining trajectory data;
After the processing visual response module reaches a trajectory point of the corrected data set, the machining robot confirms or adjusts the posture of the processing visual response module according to the attitude data in the sub-data corresponding to that trajectory point, so that the posture of the processing visual response module during machining is identical to the posture of the teaching visual response module at the same trajectory point;
The processing current camera photographs the workpiece to form current image data, and the current image data are compared with the given image data in the sub-data corresponding to the trajectory point where the processing visual response module is located. If the picture comparison result of the current image data and the given image data is within the tolerance range, the flow-line processing device starts or continues machining the workpiece. If the comparison result exceeds the tolerance range, the flow-line processing device pauses machining, the machining robot drives the processing visual response module to adjust its posture and position, and current image data continue to be acquired and compared with the given image data until the posture and position of the processing visual response module match the sub-data, after which the flow-line processing device resumes machining the workpiece;
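The closed-loop behaviour of this step can be sketched as follows. This is a minimal Python illustration in which the comparison metric, the callback names, and the retry limit are all assumptions rather than the patent's method: machining proceeds at a trajectory point only while the current image matches the taught given image within tolerance; otherwise machining pauses and the pose is adjusted before re-checking.

```python
def diff(img_a, img_b):
    """Toy picture-comparison metric: mean absolute grayscale difference."""
    return sum(abs(a - b) for a, b in zip(img_a, img_b)) / len(img_a)

def machine_with_image_feedback(trajectory, sub_data, grab_image, compare,
                                adjust_pose, machine, tolerance, max_tries=50):
    """At each trajectory point, machine only while the current image matches
    the taught given image within tolerance; otherwise pause and re-align."""
    for point in trajectory:
        ref = sub_data[point]["given_image"]
        for _ in range(max_tries):
            if compare(grab_image(), ref) <= tolerance:
                machine(point)     # start or keep the processing device running
                break
            adjust_pose(point)     # pause, adjust posture/position, retry
        else:
            raise RuntimeError(f"could not re-align at point {point}")

# Demo: the pose starts misaligned; each adjustment reduces the image error.
state = {"offset": 6}
grab_image = lambda: [10 + state["offset"], 10 + state["offset"]]
def adjust_pose(point):
    state["offset"] -= 2           # each correction brings the pose closer

done = []
machine_with_image_feedback([0], {0: {"given_image": [10, 10]}},
                            grab_image, diff, adjust_pose,
                            machine=done.append, tolerance=1.0)
print(done)  # the point is machined once the comparison falls within tolerance
```

This loop structure is also why the method compensates robot errors automatically: the stop condition depends on what the camera sees relative to the taught image, not on the robot's own positional accuracy.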
The relative position after the teaching processing device and the teaching visual response module are connected is identical to the relative position after the flow-line processing device and the processing visual response module are connected; the teaching system device has the same structure as the processing system device; and the teaching processing device has the same structure as the flow-line processing device.
This processing method tracks the workpiece at all times, so regardless of which robot is used in the actual flow-line machining, and regardless of whether the workpiece is moving, the machining robot can accurately perform the actions prescribed by the teaching program and faithfully reproduce the taught actions. Moreover, the teaching program is easy to transplant: one teaching session can serve multiple processing systems, and if the robot beside a processing system is damaged, a spare machining robot loaded with the teaching program can simply be installed in place of the original machining robot and machining can continue. The processing construction camera allows the processing visual response module to be positioned quickly, and the picture comparison between the current image data and the given image data allows the machining robot to perform accurate position and pose adjustment of the processing visual response module, precisely reproducing the taught actions even when the teaching site and the machining site differ. Because this image-comparison approach compensates machining errors automatically, it avoids the heavy system computation load of reducing errors algorithmically and greatly improves repeatability. Such errors include the kinematic error of the robot, the accumulated error of the trajectory data, the error caused by the robot's own vibration during machining, the error caused by wear after long robot use, and the error caused by deflection of the robot's arms when the robot grips a heavy load.
To facilitate continuous machining, this embodiment further includes a mobile worktable whose upper surface serves as the processing worktable. It is precisely the workpiece-tracking teaching and processing method based on image comparison of the present invention that makes accurate machining of workpieces on a moving worktable possible.
The teaching system device and the processing system device may be the same device; the teaching processing device and the flow-line processing device may be the same device.
The machining robot and the teaching robot may be the same device, the processing worktable and the teaching worktable may be the same device, and the teaching construction module and the processing construction module may be the same device.
The preferred embodiments of the present invention have been described above, but the present invention is not limited to these embodiments. Those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the invention, and such equivalent modifications or substitutions are all included within the scope defined by the claims of this application.
Claims (9)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810319631.8A CN108656120B (en) | 2018-04-11 | 2018-04-11 | Teaching and processing method based on image contrast |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108656120A true CN108656120A (en) | 2018-10-16 |
CN108656120B CN108656120B (en) | 2020-10-30 |
Family
ID=63783299
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810319631.8A CN108656120B (en) | 2018-04-11 | 2018-04-11 | Teaching and processing method based on image contrast |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108656120B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103406905A (en) * | 2013-08-20 | 2013-11-27 | 西北工业大学 | Robot system with visual servo and detection functions |
CN106327569A (en) * | 2015-06-30 | 2017-01-11 | 遵义林棣科技发展有限公司 | Three-dimensional modeling method of digital controlled lathe workpiece |
CN107309882A (en) * | 2017-08-14 | 2017-11-03 | 青岛理工大学 | A kind of robot teaching programming system and method |
2018-04-11: CN application CN201810319631.8A granted as patent CN108656120B (en), IP right active.
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||