CN104180753A - Rapid calibration method of robot visual system - Google Patents


Info

Publication number
CN104180753A
Authority
CN
China
Prior art keywords
coordinate
vision
mechanical arm
point
induction
Prior art date
Application number
CN201410371760.3A
Other languages
Chinese (zh)
Inventor
卢盛林
郭龙
Original Assignee
东莞市奥普特自动化科技有限公司
Priority date
Filing date
Publication date
Application filed by 东莞市奥普特自动化科技有限公司
Priority to CN201410371760.3A
Publication of CN104180753A


Abstract

The invention relates to a rapid calibration method for a robot vision system, in particular to calibration between a camera and a manipulator and between two cameras. The method comprises: arranging a calibration plate; setting movement rules for the manipulator; obtaining the coordinates of the sensing points on the calibration plate and the corresponding vision-system coordinates; establishing matrix mapping relations from these coordinates; and solving the transformation matrix.

Description

Rapid calibration method for a robot vision system

Technical field

The present invention relates to the technical field of robot coordinate calibration, and in particular to a rapid calibration method for a robot vision system.

Background technology

At present, industrial robots play an increasingly important role in global manufacturing. For a robot to be capable of more complex work, it needs not only a better control system but also richer perception of changes in its environment. Robot vision, with its large and complete information content, has become the most important robot perception function. Taking the precision welding of circuit-board electronic components as an example: during welding, the robot can use the camera of its vision system to automatically locate the workpiece or work area and to evaluate the position of the work scene relative to the robot, assisting the robot in completing the task.

Calibration of the robot vision system, which establishes the transformation between the robot coordinate system and the camera's image coordinate system, is one of the key technologies in robot operation. However, existing calibration methods for robot vision systems generally suffer from many preconditions, high complexity, and heavy computation, and cannot meet the need for simple and efficient calibration in production applications.

Summary of the invention

The object of the present invention is to overcome the deficiencies of the prior art by providing a rapid calibration method for a robot vision system that is simple, requires little information processing, and is easy to operate.

The technical solution used in the present invention is:

A rapid calibration method for a robot vision system, for transforming robot coordinates into the coordinates of a robot vision system, comprising the following steps:

Step 1: Provide a calibration plate, arrange on it a plurality of preset points to be sensed by the vision system, and establish a coordinate system on the calibration plate.

Step 2: Provide a sensing point on the manipulator for calibrating the manipulator's movement. The manipulator's movement rules are set according to the positions of the preset points, and the manipulator is controlled by a processor so that, after each move, the sensing point is positioned directly above a preset point.

Step 3: The manipulator starts moving while the camera of the vision system captures images of it. The vision system divides the display window into coordinates and simultaneously computes the grid trajectory of the sensing point in the camera window.

Step 4: The manipulator moves to one of the preset points and pauses for t seconds. The vision system stores the current display-window coordinates of the sensing point, and also stores the coordinates of the preset point.

Step 5: Repeat steps 3 and 4 to obtain the display-window coordinates of a plurality of sensing points, (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn), together with the coordinates of the corresponding preset points, (x1, y1), (x2, y2) … (xn, yn).

Step 6: Establish the coordinate mapping relations:

Solve for the transformation matrix to obtain the transformation between the coordinates of the robot vision system and those of the robot coordinate system, thereby completing the coordinate calibration of the vision system.

No communication link is required between the manipulator and the robot vision system: once the manipulator has visited all preset dwell points, the robot vision system automatically acquires the image coordinates and computes the transformation between the manipulator's coordinates and the vision system's coordinates.
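The point-pair collection and transformation-matrix solve described above can be sketched in Python with NumPy. This is an illustrative reconstruction, not code from the patent: the function names are my own, and the fit is shown as a generic source-to-destination affine map (the patent fits robot and vision coordinates in the same way):

```python
import numpy as np

def calibrate_affine(src_pts, dst_pts):
    """Fit a 2x3 affine transition matrix T mapping source coordinates
    (e.g. the preset points) to destination coordinates (e.g. the sensing
    point's display-window coordinates) by least squares."""
    S = np.asarray(src_pts, dtype=float)            # shape (n, 2)
    D = np.asarray(dst_pts, dtype=float)            # shape (n, 2)
    A = np.hstack([S, np.ones((len(S), 1))])        # rows [xi, yi, 1]
    # Solves A @ X = D[:, 0] and A @ X' = D[:, 1] simultaneously
    coeffs, *_ = np.linalg.lstsq(A, D, rcond=None)  # shape (3, 2)
    return coeffs.T                                 # rows (a, b, c), (d, e, f)

def apply_affine(T, pt):
    """Map a single 2D point through the fitted transition matrix."""
    return T @ np.array([pt[0], pt[1], 1.0])
```

With three non-collinear point pairs the system is exactly determined; more pairs overdetermine it, and the least-squares fit then averages out measurement noise.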

Further, in step 5, coordinates are obtained for no fewer than 3 sensing points.

Further, in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.

Further, when solving for the transformation matrix in step 6, the mapping relations are rewritten as follows:

Let A = [x1 y1 1; x2 y2 1; …; xn yn 1], X = (a, b, c)^T, X' = (d, e, f)^T, b = (X1, X2, …, Xn)^T, and b' = (Y1, Y2, …, Yn)^T.

Then AX = b and AX' = b'. These two matrix equations are each solved by the least-squares method for X and X', giving the transformation matrix T = [a b c; d e f].

When solving for the approximate transformation matrix, matrix A is first factored by QR decomposition, A = QR, where Q is an orthogonal matrix and R is an upper triangular matrix. Then min||AX - b|| = min||QRX - b|| = min||RX - Q^(-1)b|| (with Q^(-1) = Q^T, since Q is orthogonal), which is solved by the least-squares method. The computation can of course also be carried out directly with MATLAB.
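The QR route can be sketched as follows. This is a minimal illustration under the assumption of the conventional n x 3 design matrix with rows (xi, yi, 1); since Q is orthogonal, the triangular system R X = Q^T b is solved directly:

```python
import numpy as np

def lstsq_via_qr(A, b):
    """Least-squares solve of A X = b via QR decomposition:
    min ||A X - b|| = min ||R X - Q^T b||, R upper triangular."""
    Q, R = np.linalg.qr(A)              # reduced QR: A = Q R
    return np.linalg.solve(R, Q.T @ b)  # back substitution on the 3x3 system

# Example: design matrix from four non-collinear calibration points
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
A = np.hstack([pts, np.ones((len(pts), 1))])
b = A @ np.array([2.0, -1.0, 0.5])      # synthetic, consistent right-hand side
x = lstsq_via_qr(A, b)
```

In practice `np.linalg.lstsq` performs an equivalent factorization internally; the explicit QR form mirrors the derivation in the text.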

A rapid calibration method for a robot vision system, for converting the coordinates of one vision system into those of another vision system, comprising the following steps:

Step 1: Provide a calibration plate and arrange on it a plurality of preset points to be sensed by the two vision systems.

Step 2: Provide a manipulator with a sensing point for calibrating the manipulator's movement. The manipulator's movement rules are set according to the positions of the preset points, and the manipulator is controlled by a processor so that, after each move, the sensing point is positioned directly above a preset point.

Step 3: The manipulator starts moving while the first camera of one vision system and the second camera of the other each capture images of it. The display windows of the two vision systems are each divided into coordinates, and the grid trajectory of the sensing point in each display window is computed.

Step 4: The manipulator moves to one of the preset points and pauses for t seconds. Each camera stores the current window coordinates of the sensing point, and the coordinates of the preset point are also stored; the manipulator then moves to the next preset point.

Step 5: Repeat steps 3 and 4 to obtain the window coordinates of a plurality of sensing points. In one vision system they are (R1, T1), (R2, T2), (R3, T3) … (Rn, Tn); in the other, (r1, t1), (r2, t2) … (rn, tn).

Step 6: Establish the coordinate mapping relations:

Solve for the transformation matrix to obtain the coordinate transformation between the two vision systems, thereby completing the coordinate calibration of the two vision systems.

Further, in step 5, coordinates are obtained for no fewer than 3 sensing points.

Further, in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.

Further, when solving for the transformation matrix in step 6, the mapping relations are rewritten as follows:

Let B = [R1 T1 1; R2 T2 1; …; Rn Tn 1], Z = (a, b, c)^T, Z' = (d, e, f)^T, m = (r1, r2, …, rn)^T, and m' = (t1, t2, …, tn)^T. Then BZ = m and BZ' = m'. These two matrix equations are each solved by the least-squares method for Z and Z', giving the transformation matrix.

The beneficial effects obtained by the present invention are: by collecting coordinates with the vision system and converting them through a transformation matrix, both the calibration between a vision system and a manipulator and the calibration between two vision systems can be performed automatically and rapidly, reducing manual involvement. The method is simple, fast, and practical.

Brief description of the drawings

Fig. 1 is a schematic flowchart of embodiment 1 of the present invention.

Fig. 2 is a schematic flowchart of embodiment 2 of the present invention.

Detailed description of the embodiments

The present invention is further described below with reference to Fig. 1, Fig. 2, and the embodiments.

Embodiment 1: referring to Fig. 1.

At present, a manipulator is generally provided with its own stepping coordinate system and stepping programs, such as those for line cutting. When a vision system is used to control other transmission devices in coordination with the manipulator, the coordinate system of the vision system and that of the manipulator need to be unified.

A rapid calibration method for a robot vision system, for transforming robot coordinates into the coordinates of a robot vision system, comprising the following steps:

Step 1: Provide a calibration plate, arrange on it a plurality of preset points to be sensed by the camera of the vision system, and establish a coordinate system on the calibration plate. This coordinate system is based on the robot device itself; the two coordinate systems are treated as identical.

Step 2: Provide a sensing point on the manipulator for calibrating the manipulator's movement. The manipulator's movement rules are set in its own coordinate system according to the positions of the preset points, so that after each move the sensing point is positioned directly above a preset point.

Step 3: The manipulator starts moving while the camera of the vision system captures images of it. The vision system establishes coordinates on the display window (or establishes its own coordinate system and displays it in the window), and simultaneously computes the corresponding grid trajectory of the sensing point in the display window.

Step 4: When the manipulator moves to one of the preset points, it pauses for t seconds, where t is set according to the required sensing sensitivity. Although the vision system has computed the grid trajectory of the sensing point in the display window, if the sensing point were always in motion, the coordinate points along the trajectory would carry errors. With the pause step, the range of coordinate variation read during the pause is very small, so the vision system can accurately capture the coordinates of the sensing point in the display window. The vision system then stores the current display-window coordinates of the sensing point, and also stores the coordinates of the corresponding preset point in the robot coordinate system. Pausing for t seconds thus facilitates the acquisition of the coordinate information.
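The pause-and-capture logic of step 4 can be sketched as follows. This is an illustrative reconstruction: the patent gives no code, and the tolerance, window size, and function name are assumptions. A coordinate is accepted only once the tracked point's recent positions vary within a small tolerance:

```python
import numpy as np

def capture_when_stable(samples, eps=0.5, window=5):
    """Return the sensing point's coordinate once it has settled:
    the mean of the first `window` consecutive samples whose spread
    (per-axis peak-to-peak range) is below `eps` pixels.
    `samples` is an iterable of (X, Y) display-window coordinates."""
    buf = []
    for pt in samples:
        buf.append(np.asarray(pt, dtype=float))
        if len(buf) >= window:
            recent = np.array(buf[-window:])
            if np.ptp(recent, axis=0).max() < eps:  # still within tolerance?
                return recent.mean(axis=0)
    return None  # the point never settled
```

The pause duration t plays the role of `window` here: it must be long enough for the tracked coordinate to satisfy the stability test at the required sensitivity.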

Step 5: Repeat steps 3 and 4 to obtain the window coordinates of a plurality of sensing points, (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn), together with the coordinates of the corresponding preset points, (x1, y1), (x2, y2) … (xn, yn).

To calibrate the coordinates, the mapping between the two coordinate systems must be obtained; this mapping can be derived from the coordinate matrix and the mapped-point matrix:

Step 6: Establish the coordinate mapping relations:

Solve for the transformation matrix to obtain the coordinate transformation between the camera and the manipulator, thereby completing the coordinate calibration of the vision system.

Further, in step 5, coordinates are obtained for no fewer than 3 sensing points.

To obtain a more accurate transformation matrix, coordinate information should be obtained for at least 3 sensing points; preferably, 5 to 10 sensing points are used.

Further, in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.

Coordinate information taken from collinear points is defective in the mapping process, since the fit is then not uniquely determined, and should be avoided as far as possible.
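A simple rank check makes the collinearity defect concrete. This is a sketch under the assumption of the standard design matrix: if all points lie on one line, the matrix with rows (x, y, 1) has rank 2, and the affine fit has no unique solution:

```python
import numpy as np

def spans_plane(pts, tol=1e-8):
    """True if the calibration points do NOT all lie on a single line,
    i.e. the design matrix with rows [x, y, 1] has full column rank 3."""
    P = np.asarray(pts, dtype=float)
    A = np.hstack([P, np.ones((len(P), 1))])
    return np.linalg.matrix_rank(A, tol=tol) == 3
```

Such a check could be run on the collected preset points before step 6, rejecting a degenerate point set instead of producing an ill-conditioned transformation matrix.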

Further, when solving for the transformation matrix in step 6, the mapping relations are rewritten as follows:

Let A = [x1 y1 1; x2 y2 1; …; xn yn 1], X = (a, b, c)^T, X' = (d, e, f)^T, b = (X1, X2, …, Xn)^T, and b' = (Y1, Y2, …, Yn)^T.

Then AX = b and AX' = b'. These two matrix equations are each solved by the least-squares method for X and X', giving the transformation matrix T = [a b c; d e f].

Because the computed transformation matrix may not be an exact value, a transformation matrix with lower error must be sought; the problem is therefore split into the two equations above, and each is solved by the least-squares method.

Embodiment 2: referring to Fig. 2.

A rapid calibration method for a robot vision system, for converting the coordinates of one vision system into those of another vision system, comprising the following steps:

Step 1: Provide a calibration plate and arrange on it a plurality of preset points to be sensed by the vision systems.

Step 2: Provide a manipulator with a sensing point for calibrating the manipulator's movement. The manipulator's movement rules are set according to the positions of the preset points, and the manipulator is controlled by a processor so that, after each move, the sensing point is positioned directly above a preset point.

Step 3: The manipulator starts moving while the first camera of one vision system and the second camera of the other each capture images of it. The display windows of the two vision systems are each divided into coordinates (or each system's coordinates are displayed in its corresponding window), and the grid trajectory of the sensing point in each display window is computed.

Step 4: The manipulator moves to one of the preset points and pauses for t seconds. With the pause step, the range of coordinate variation after capture is very small, which allows the system to automatically determine the sensing point's coordinates. Both vision systems store the current coordinates of the sensing point in their display windows; the manipulator then moves to the next preset point.

Step 5: Repeat steps 3 and 4 to obtain the coordinates of a plurality of sensing points in the corresponding display windows. In one vision system they are (R1, T1), (R2, T2), (R3, T3) … (Rn, Tn); in the other, (r1, t1), (r2, t2) … (rn, tn).

Step 6: Establish the coordinate mapping relations:

Solve for the transformation matrix to obtain the coordinate transformation between the two vision systems, thereby completing the coordinate calibration of the two vision systems.

Further, in step 5, coordinates are obtained for no fewer than 3 sensing points.

Further, in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.

Further, when solving for the transformation matrix in step 6, the mapping relations are rewritten as follows:

Let B = [R1 T1 1; R2 T2 1; …; Rn Tn 1], Z = (a, b, c)^T, Z' = (d, e, f)^T, m = (r1, r2, …, rn)^T, and m' = (t1, t2, …, tn)^T. Then BZ = m and BZ' = m'. These two matrix equations are each solved by the least-squares method for Z and Z', giving the transformation matrix.
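Embodiment 2 reduces to the same least-squares fit, now between the two cameras' window coordinates. A sketch (the function name and NumPy usage are my own, not from the patent):

```python
import numpy as np

def calibrate_cam_to_cam(pts_cam1, pts_cam2):
    """Fit the 2x3 affine transition matrix mapping one camera's window
    coordinates (Ri, Ti) to the other camera's (ri, ti) by least squares."""
    P1 = np.asarray(pts_cam1, dtype=float)
    P2 = np.asarray(pts_cam2, dtype=float)
    B = np.hstack([P1, np.ones((len(P1), 1))])  # rows [Ri, Ti, 1]
    # Solves B Z = m and B Z' = m' jointly by least squares
    Z, *_ = np.linalg.lstsq(B, P2, rcond=None)  # shape (3, 2)
    return Z.T                                  # the transition matrix
```

Because both cameras observe the same physical preset points, this cross-camera map needs no robot-to-camera calibration and, consistent with the method above, no communication link between the two vision systems.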

The above are only preferred embodiments of the present application; equivalent technical solutions based on them still fall within the scope of protection of the application.

Claims (6)

1. A rapid calibration method for a robot vision system, for transforming robot coordinates into the coordinates of a robot vision system, characterized in that it comprises the following steps:
Step 1: Provide a calibration plate and arrange on it a plurality of preset points for calibration by the vision system;
Step 2: Provide a sensing point on the manipulator for calibrating the manipulator's movement; set the manipulator's movement rules according to the positions of the preset points, so that after each move the sensing point is positioned directly above a preset point;
Step 3: The manipulator starts moving while the camera of the vision system captures images of it; the vision system divides the display window into coordinates (or establishes its own coordinate system and displays it in the window) and simultaneously computes the grid trajectory of the sensing point in the camera window;
Step 4: The manipulator moves to one of the preset points and pauses for t seconds; the vision system stores the current display-window coordinates of the sensing point, and also stores the coordinates of the preset point;
Step 5: Repeat steps 3 and 4 to obtain the display-window coordinates of a plurality of sensing points, (X1, Y1), (X2, Y2), (X3, Y3) … (Xn, Yn), and the coordinates of the corresponding preset points, (x1, y1), (x2, y2) … (xn, yn);
Step 6: Establish the coordinate mapping relations and solve for the transformation matrix, obtaining the transformation between the coordinates of the robot vision system and those of the robot coordinate system, thereby completing the coordinate calibration of the vision system.
2. The rapid calibration method for a robot vision system according to claim 1, characterized in that in step 5, coordinates are obtained for no fewer than 3 sensing points.
3. The rapid calibration method for a robot vision system according to claim 2, characterized in that in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.
4. A rapid calibration method for a robot vision system, for converting the coordinates of one vision system into those of another vision system, characterized in that it comprises the following steps:
Step 1: Provide a calibration plate and arrange on it a plurality of preset points for calibration by the two vision systems;
Step 2: Provide a manipulator with a sensing point for calibrating the manipulator's movement; set the manipulator's movement rules according to the positions of the preset points, so that after each move the sensing point is positioned directly above a preset point;
Step 3: The manipulator starts moving while the first camera of one vision system and the second camera of the other each capture images of it; the display windows of the two vision systems are each divided into coordinates (or each system's coordinates are displayed in its corresponding window), and the grid trajectory of the sensing point in each display window is computed;
Step 4: The manipulator moves to one of the preset points and pauses for t seconds; each camera stores the current window coordinates of the sensing point, and the coordinates of the preset point are also stored; the manipulator then moves to the next preset point;
Step 5: Repeat steps 3 and 4 to obtain the window coordinates of a plurality of sensing points: in one vision system, (R1, T1), (R2, T2), (R3, T3) … (Rn, Tn); in the other, (r1, t1), (r2, t2) … (rn, tn);
Step 6: Establish the coordinate mapping relations and solve for the transformation matrix, obtaining the coordinate transformation between the two vision systems, thereby completing the coordinate calibration of the two vision systems.
5. The rapid calibration method for a robot vision system according to claim 4, characterized in that in step 5, coordinates are obtained for no fewer than 3 sensing points.
6. The rapid calibration method for a robot vision system according to claim 5, characterized in that in step 5, the sensing points whose coordinates are obtained do not all lie on a single straight line.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410371760.3A CN104180753A (en) 2014-07-31 2014-07-31 Rapid calibration method of robot visual system


Publications (1)

Publication Number Publication Date
CN104180753A (en) 2014-12-03

Family

ID=51961969

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410371760.3A CN104180753A (en) 2014-07-31 2014-07-31 Rapid calibration method of robot visual system

Country Status (1)

Country Link
CN (1) CN104180753A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1727839A (en) * 2004-07-28 2006-02-01 发那科株式会社 Method of and device for re-calibrating three-dimensional visual sensor in robot system
CN101285676A (en) * 2008-06-10 2008-10-15 北京航空航天大学 Multi-visual sense sensor calibration method based on one-dimensional target
JP2010172986A (en) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd Robot vision system and automatic calibration method
JP2011011321A (en) * 2009-07-06 2011-01-20 Fuji Electric Holdings Co Ltd Robot system and calibration method for the same
CN102848389A (en) * 2012-08-22 2013-01-02 浙江大学 Realization method for mechanical arm calibrating and tracking system based on visual motion capture
CN103170973A (en) * 2013-03-28 2013-06-26 上海理工大学 Man-machine cooperation device and method based on Kinect video camera
CN103170980A (en) * 2013-03-11 2013-06-26 常州铭赛机器人科技有限公司 Positioning system and positioning method for household service robot
CN103363899A (en) * 2013-07-05 2013-10-23 科瑞自动化技术(深圳)有限公司 Calibration device and calibration method for calibrating coordinate system of robot arm


Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104864807A (en) * 2015-04-10 2015-08-26 深圳大学 Manipulator hand-eye calibration method based on active binocular vision
CN104864807B (en) * 2015-04-10 2017-11-10 深圳大学 A kind of manipulator hand and eye calibrating method based on active binocular vision
CN105066984A (en) * 2015-07-16 2015-11-18 深圳訾岽科技有限公司 Vision positioning method and system
CN105066984B (en) * 2015-07-16 2019-03-12 深圳訾岽科技有限公司 A kind of vision positioning method and system
CN106767393A (en) * 2015-11-20 2017-05-31 沈阳新松机器人自动化股份有限公司 The hand and eye calibrating apparatus and method of robot
CN106767393B (en) * 2015-11-20 2020-01-03 沈阳新松机器人自动化股份有限公司 Hand-eye calibration device and method for robot
CN105364927A (en) * 2015-12-15 2016-03-02 天津立德尔智能装备科技有限公司 Robot carrying visual system and robot carrying quick positioning method
CN105631875A (en) * 2015-12-25 2016-06-01 广州视源电子科技股份有限公司 Method and system for determining mapping relations between camera coordinates and arm gripper coordinates
CN105654474A (en) * 2015-12-28 2016-06-08 深圳先进技术研究院 Mechanical arm positioning method based on visual guidance and device thereof
CN105844670B (en) * 2016-03-30 2018-12-18 广东速美达自动化股份有限公司 Horizontal machine people moves camera Multipoint movable scaling method
CN105844670A (en) * 2016-03-30 2016-08-10 东莞市速美达自动化有限公司 Horizontal robot mobile camera multi-point mobile calibration method
CN106524910B (en) * 2016-10-31 2018-10-30 潍坊路加精工有限公司 Executing agency's vision alignment method
CN106524910A (en) * 2016-10-31 2017-03-22 潍坊路加精工有限公司 Execution mechanism visual calibration method
WO2018214147A1 (en) * 2017-05-26 2018-11-29 深圳配天智能技术研究院有限公司 Robot calibration method and system, robot and storage medium
CN108527360A (en) * 2018-02-07 2018-09-14 唐山英莱科技有限公司 A kind of location position system and method
CN108436909A (en) * 2018-03-13 2018-08-24 南京理工大学 A kind of hand and eye calibrating method of camera and robot based on ROS
CN109325980A (en) * 2018-07-27 2019-02-12 深圳大学 A kind of method, apparatus and manipulator for manipulator positioning target
CN109910016A (en) * 2019-04-22 2019-06-21 亿嘉和科技股份有限公司 Vision collecting scaling method, apparatus and system based on multi-degree-of-freemechanical mechanical arm
CN110111394A (en) * 2019-05-16 2019-08-09 湖南三一快而居住宅工业有限公司 Based on manipulator feature to the method and device of video camera automatic Calibration


Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20141203)