CN102135776B - Industrial robot control method based on visual positioning - Google Patents


Info

Publication number: CN102135776B
Application number: CN2011100265297A
Authority: CN (China)
Prior art keywords: robot, tool, point, coordinate, luminescent marking
Other versions: CN102135776A (in Chinese)
Inventors: 解则晓, 于浩源, 王旭
Original assignee: 解则晓, 于浩源, 王旭
Priority date: (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Application filed by 解则晓, 于浩源, 王旭; priority to CN2011100265297A
Publication of application CN102135776A; application granted and published as CN102135776B

Abstract

The invention relates to the field of robotics, in particular to an industrial robot control method based on visual positioning. In the manual teaching process, the positioning motion control process, and the orientation motion control process of the control method provided by the invention, the luminescent marker points and the end of the operating tool are located in the coordinate system o_w x_w y_w z_w of a vision measuring device. The control system therefore places no requirement on the motion control accuracy of the robot or on the relative position of the tool coordinate system and the robot coordinate system; it only requires that, according to the theoretical model of the robot, rotation of the first to third axes realizes linear motion and rotation of the fourth to sixth axes realizes rotary motion. This fundamentally avoids the extremely strict requirements that conventional off-line programming teaching places on the accuracy of the robot model and on the calibration accuracy of the tool coordinate system, and meets the operating requirements of high-accuracy complex trajectories.

Description

Industrial robot control method based on visual positioning

Technical field

The present invention relates to the field of robotics, and in particular to an industrial robot control method based on visual positioning.

Background art

Industrial robots are widely used in many industrial fields, such as robotic welding, robotic spray painting, and robotic handling. In these fields industrial robots replace manual labor by performing repeatable, precise motions, thereby guaranteeing consistent product quality. To accomplish a specific task a robot must move along a particular trajectory, and at present there are two ways to plan a robot's trajectory: online teaching and off-line programming teaching.

In online teaching, the robot is controlled through its hand-held teach pendant. The process comprises moving a tool mounted on the robot, such as the tip of a spray gun or welding gun, to its operating position and recording the robot's coordinates at that position; the robot is moved to as many positions as required and the resulting trajectory is recorded; the robot then automatically moves along the taught trajectory to complete the specific task. The advantage of this teaching method is that it places low demands on model accuracy: the theoretical model of the robot suffices, and the robot's tool coordinate system need not be calibrated. Its disadvantages are: (1) online teach programming is tedious and inefficient, occupies valuable production time, and turns the robot system into a relatively isolated unit that is difficult to integrate seamlessly with other systems or with the production process; (2) teaching accuracy depends entirely on the operator's visual judgment and experience, so accuracy is low and it is difficult to obtain satisfactory results for complex weld seams and the like; (3) during teaching the operator must manually drive the robot to each specified point, the operation is complicated and fatiguing, and the operator works in the hazardous environment of the robot's workspace. The method is therefore only suitable for simple trajectories with few taught points and modest accuracy requirements, and cannot meet the needs of today's small-batch, multi-variety flexible production.

Off-line programming is usually adopted for teaching complex trajectories. Off-line programming teaching is based on an accurate robot model and an established tool coordinate system: the coordinates of the tool tip are transformed exactly into the robot base coordinate system, and the continuous trajectory of the tool tip during the actual operation is planned from the CAD model of the workpiece. Realizing this approach places very high accuracy requirements on both the robot model and the calibration of the robot's tool coordinate system.

The robot model expresses the pose of the robot end-effector in the robot base coordinate system and usually refers to the theoretical model of the robot, i.e. the design model: the link lengths and joint relations determined at design time. Manufacturing and assembly errors, however, make the robot's actual model differ considerably from the theoretical one, so using the theoretical model in place of the actual model inevitably causes position errors at the robot end. To obtain the actual model, the robot must be calibrated, i.e. the true joint relations and link lengths must be determined. The calibration method commonly used at present is to measure the rotation of each joint with a coordinate measuring machine, an articulated-arm measuring machine, or a laser tracker, determine the axis direction and position of each joint, and rebuild the robot model from them. Typically a marker point is fixed on the arm of one joint, the robot is turned about that joint to several positions, and the measuring machine or laser tracker measures the marker's position at each one; fitting a circle to the coordinates of the same marker at the different positions yields the circle's center and normal, and the normal represents the joint's axis direction. The errors of the measuring machine or laser tracker propagate into the marker positions and hence into the joint axes. Existing calibration methods therefore do not greatly improve the robot's calibration accuracy, and the position error of a calibrated robot can still exceed 1 millimeter.

Whenever a different operating tool is mounted on the robot end, the coordinates of the tool tip must be transformed into the robot base coordinate system on the basis of the robot model determined as above; only the tool coordinate system needs to be transformed into the end-effector coordinate system. The method commonly used at present is the three-point method, in which the robot is guided by hand so that the tool tip contacts three points in space, thereby calibrating the tool coordinate system. In practice, however, it is difficult to bring the tool tip into exact contact with a fixed point in space: either a gap remains short of full contact, or the contact is excessive and a collision occurs. The solved tool coordinate system parameters therefore inevitably contain errors.

In summary, existing robot teaching methods have the following problems: (1) online teaching is tedious, its efficiency and accuracy are both low, and it is only suitable for simple trajectories with modest accuracy requirements; (2) off-line programming teaching demands very high accuracy of the robot model and of the solved tool coordinate system parameters, while the achievable calibration accuracy of both is low; consequently the trajectory accuracy generated by off-line programming is limited and cannot meet the operating needs of high-accuracy complex trajectories.

Summary of the invention

The object of the present invention is to propose an industrial robot control method based on visual positioning that is safe, makes the motion state of the operating tool easier to monitor, achieves real-time processing with high accuracy, and can broaden the range of robot applications.

The present invention is realized by the following technical scheme: an industrial robot control method based on visual positioning, comprising the following steps:

(1) Modeling and calibration of the vision measuring device:

A pin-hole imaging model is established for each camera of the vision measuring device, as follows:

ρ [u, v, 1]^T = M [x_w, y_w, z_w, 1]^T

where

        | f N_x r_1 + u_0 r_7    f N_x r_2 + u_0 r_8    f N_x r_3 + u_0 r_9    f N_x t_x + u_0 t_z |
    M = | f N_y r_4 + v_0 r_7    f N_y r_5 + v_0 r_8    f N_y r_6 + v_0 r_9    f N_y t_y + v_0 t_z |
        | r_7                    r_8                    r_9                    t_z                  |

where r_1 to r_9 are the entries of the rotation matrix from the camera coordinate system to the world coordinate system; t_x, t_y, t_z are the translation components; f is the focal length; (u_0, v_0) is the image center; N_x, N_y are the pixel scale factors; (u, v) are the pixel coordinates in the computer image; and (x_w, y_w, z_w) are the coordinates of the point in the three-dimensional world coordinate system. All cameras are calibrated simultaneously with the same planar grid target to obtain the unknown parameters of each model, and the vision measuring device coordinate system o_w x_w y_w z_w, formed by the cameras for measuring spatial points, is determined; all three-dimensional coordinates measured by the vision measuring device are expressed in this coordinate system;
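The model above is the camera's intrinsic matrix composed with its rigid-body pose. As an informal sketch (not part of the patent; the parameter values in the test are invented), the projection can be written as:

```python
import numpy as np

def projection_matrix(R, t, f, Nx, Ny, u0, v0):
    # The 3x4 matrix M of the pin-hole model above: intrinsics K composed
    # with the rotation R (entries r1..r9) and translation t (tx, ty, tz).
    K = np.array([[f * Nx, 0.0, u0],
                  [0.0, f * Ny, v0],
                  [0.0, 0.0, 1.0]])
    return K @ np.hstack([R, np.reshape(t, (3, 1))])

def project(P, xw):
    # rho [u, v, 1]^T = P [xw, yw, zw, 1]^T; divide out the scale rho.
    ph = P @ np.append(np.asarray(xw, float), 1.0)
    return ph[:2] / ph[2]
```

With an identity rotation and a translation of one unit along the optical axis, a point on the axis projects to the image center, as expected.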

(2) Identification of the luminescent marker points and determination of their three-dimensional coordinates:

Each video camera is taken the luminescent marking point respectively, the position of identification luminescent marking point in each width of cloth image, and mate respectively, each monocular vision model simultaneous solution can be obtained each luminescent marking point at vision measurement device coordinate system o wx wy wz wIn three-dimensional coordinate;

(3) Determination of the three-dimensional coordinates of the operating tool tip:

A two-dimensional coordinate system O_T X_T Y_T is established in the plane of the luminescent marker points: the line through the third and fourth marker points is the X_T axis, and the line through the first and second marker points is the Y_T axis. The tip of the operating tool is then placed in a conical hole and the tool is rotated about the hole to at least four positions; the tool tip stays fixed while the marker points move. From the three-dimensional coordinates of the marker points at each position, the coordinates of O_T at that position can be determined; fitting a sphere to the O_T coordinates at the (at least four) positions yields the sphere center O_S, which is the three-dimensional coordinate of the operating tool tip in the vision measuring device coordinate system o_w x_w y_w z_w;
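Since only the center O_S is needed, the sphere fit in this step can be posed as a linear least-squares problem. A minimal sketch, assuming the measured O_T positions are available as 3-D points (the data in the test are invented):

```python
import numpy as np

def fit_sphere_center(pts):
    # |p - c|^2 = r^2 for every measured O_T position p.  Subtracting the
    # first equation from the others eliminates r^2 and leaves the linear
    # system 2*(p_i - p_0) . c = |p_i|^2 - |p_0|^2, solved by least squares.
    pts = np.asarray(pts, float)
    A = 2.0 * (pts[1:] - pts[0])
    b = (pts[1:] ** 2).sum(axis=1) - (pts[0] ** 2).sum()
    c, *_ = np.linalg.lstsq(A, b, rcond=None)
    return c
```

At least four non-coplanar positions are required for the system to have full rank, which matches the patent's requirement of rotating to at least four positions.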

(4) Manual teaching process:

The operating tool is taken off the robot end and moved flexibly by hand through the trajectory required by the actual task, for example along a weld seam. During this process the vision measuring device records in real time the trajectory of each luminescent marker point on the operating tool; the trajectory is a sequence of discrete three-dimensional points in the vision measuring device coordinate system o_w x_w y_w z_w. From the trajectories of the marker points, the trajectory and attitude of the operating tool tip can be determined by the method of step (3);

(5) Robot learning process:

After manual teaching is finished, the operating tool is fixed back onto the end of the robot and the robot is controlled so that the tool retraces the trajectory produced during manual teaching. During this process the vision measuring device observes the positions of the luminescent marker points and inputs them into the robot controller, which computes from them the current position of the operating tool tip and the current attitude of the tool. The difference between the current tool-tip position and the position recorded during manual teaching, together with the difference between the current tool attitude and the taught attitude, serves as the feedback for robot control. By alternating the positioning motion control process and the orientation motion control process, the tool tip is driven along the taught trajectory while the tool attitude at each position is made identical to the attitude during manual teaching.

In the present invention, in the positioning motion control process, the current position of the operating tool tip is defined as the current point, and the tool tip is driven from the current point toward the target point; the coordinates of both points are expressed in the vision measuring device coordinate system o_w x_w y_w z_w. According to the ideal model of the robot, the first to third axes can be used to establish models of linear motion along the three coordinate directions, i.e. rotation of the first to third axes realizes linear motion along the three directions. The first to third axes of the robot carrying the operating tool are first controlled to move it along the Z direction while the tool-tip position is computed in real time and the difference between the current position and the target position is monitored; when the tool tip is closest to the target point, the Z coordinate at that moment is recorded. The first to third axes are then controlled to move the robot along the X direction while the distance from the tool tip to the target point is monitored, and the X coordinate at which this distance is smallest is recorded. Finally the robot is moved along the Y direction, and the Y coordinate at which the tool tip is closest to the target point is recorded. These successive steps bring the operating tool tip close to the target point.
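The axis-by-axis search just described amounts to one coordinate-descent pass over Z, X, and Y. The sketch below imitates it in simulation, sweeping one coordinate at a time and keeping the value that minimizes the measured distance to the target; the step size, search range, and the idealized "move exactly along one axis" assumption are all illustrative, not from the patent:

```python
import numpy as np

def position_step(current, target, step=1.0):
    # One pass of the axis-by-axis search described above: move along Z,
    # then X, then Y, each time keeping the coordinate value at which the
    # measured distance to the target point is smallest.
    p = np.asarray(current, float).copy()
    t = np.asarray(target, float)
    for axis in (2, 0, 1):                       # Z, then X, then Y
        candidates = np.arange(p[axis] - 10.0, p[axis] + 10.0 + step, step)
        dists = [np.linalg.norm(np.where(np.arange(3) == axis, c, p) - t)
                 for c in candidates]
        p[axis] = candidates[int(np.argmin(dists))]
    return p
```

In the real system the "candidates" are produced by actually stepping the first three joints and watching the vision-measured distance shrink and then grow again.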

In the orientation motion control process, the attitude of the operating tool is represented by the direction of Y_T. Starting from the current point, the fourth axis of the robot is rotated first, and the fourth-axis angle is recorded when the angle between Y_T and the direction vector of the target point is smallest; the fifth axis is then rotated and the fifth-axis angle is recorded when that angle is smallest; finally the sixth axis is rotated and the sixth-axis angle is recorded when the angle is smallest. At this point the direction of Y_T is close to the direction of the target point.
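The orientation search can be sketched the same way: sweep one rotation at a time and keep the angle that best aligns Y_T with the target direction. For illustration only, the three wrist axes are assumed here to coincide with the world X, Y, Z axes, which a real six-axis wrist does not satisfy:

```python
import numpy as np

def axis_rot(axis, ang):
    # Rotation about one wrist axis; for illustration the three wrist axes
    # are taken to be the world X, Y and Z axes (an assumption, see above).
    c, s = np.cos(ang), np.sin(ang)
    mats = {"x": [[1, 0, 0], [0, c, -s], [0, s, c]],
            "y": [[c, 0, s], [0, 1, 0], [-s, 0, c]],
            "z": [[c, -s, 0], [s, c, 0], [0, 0, 1]]}
    return np.array(mats[axis])

def orient_step(y_t, target_dir, n=721):
    # Sweep axes 4, 5, 6 in turn; after each sweep keep the angle at which
    # the angle between Y_T and the target direction is smallest.
    d = np.asarray(target_dir, float); d = d / np.linalg.norm(d)
    y = np.asarray(y_t, float); y = y / np.linalg.norm(y)
    for ax in ("x", "y", "z"):
        angles = np.linspace(-np.pi, np.pi, n)
        dots = [float(axis_rot(ax, a) @ y @ d) for a in angles]
        y = axis_rot(ax, angles[int(np.argmax(dots))]) @ y
    return y
```

As in the positioning case, the real controller sweeps the joints physically and reads the alignment from the vision measurement rather than from a model.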

Each step of the positioning motion control process brings the operating tool tip closer to the target point, and likewise each step of the orientation motion control process brings the tool attitude closer to the taught attitude; alternating the two processes therefore drives both the position and the attitude of the robot toward the taught position and attitude. When the deviations of the tool-tip position and tool direction from the manual teaching values are within the allowed range, the current angles of the robot's six axes are recorded; the robot has then learned one position, and the other positions are learned in the same way. The continuous trajectory of manual teaching is discretized into a number of positions, and once the robot has learned them it can retrace the taught trajectory; after learning, the robot's motion simply replays the recorded positions, and the vision measuring device no longer needs to monitor it during motion.

The beneficial effects of the invention are as follows. First, the control system of the invention is simple in structure and easy to operate, with high efficiency and accuracy. Second, in the manual teaching process, the positioning motion control process, and the orientation motion control process of the control method, the luminescent marker points and the operating tool tip are all located in the vision measuring device coordinate system o_w x_w y_w z_w; the method therefore places no requirement on the motion control accuracy of the robot or on the relative position of the tool coordinate system and the robot coordinate system. It only requires that, according to the theoretical model of the robot, rotation of the first to third axes realizes linear motion and rotation of the fourth to sixth axes realizes rotary motion, fundamentally avoiding the extremely strict requirements that conventional off-line programming teaching places on the accuracy of the robot model and the calibration accuracy of the tool coordinate system. Third, the robot model is relatively simple: the theoretical design model is used directly as the control model, and the robot's accuracy needs no special tuning. Fourth, the vision measuring device can be shared by several robots: once a stable motion path has been formed through the positioning and orientation motion control processes, a robot with good repeatability and a reliably fixed operating tool can, provided its location does not change, repeatedly move along that path and complete the same task, so several robots can share one binocular vision measuring device, saving cost. Fifth, the method is highly adaptable to workpiece location and placement: as long as the workpiece is within the robot's working range the method works, overcoming the off-line programming requirement of accurately determining the initial point. Sixth, the method takes the vision measuring system as its basis and closes the control loop at the highest level, greatly reducing the accuracy requirements on intermediate links such as the robot model and the tool coordinate system parameters, and broadening the range of robot applications. Seventh, the method is safe: the state of the tool is easier to monitor and is processed in real time, reducing economic losses caused by control or coordination faults. In summary, the operating process of the invention is simple, its efficiency and accuracy are high, and it can meet the operating requirements of high-accuracy complex trajectories.

Description of drawings

Fig. 1 is a structural schematic diagram of the industrial robot control system based on visual positioning;

Fig. 2 is a schematic diagram of the distribution of the luminescent marker points in embodiment 1;

Fig. 3 is a structural schematic diagram of the determination of the three-dimensional coordinates of the operating tool tip;

Fig. 4 shows the coordinates of the operating tool tip in the marker-point three-dimensional coordinate system O_T X_T Y_T Z_T in embodiment 1;

Fig. 5 shows the trajectory of the operating tool tip in the positioning motion control process in embodiment 1;

Fig. 6 shows the trajectory of the operating tool in the orientation motion process in embodiment 1;

Fig. 7 is a schematic diagram of the distribution of the luminescent marker points in embodiment 2;

Fig. 8 is a schematic diagram of the distribution of the luminescent marker points in embodiment 3.

Embodiment

The present invention is further described below with reference to the accompanying drawings.

Embodiment 1

Figs. 1 to 6 relate to the industrial robot control system based on visual positioning of embodiment 1 and its control method. The industrial robot control system is shown in Fig. 1. The system comprises a robot 1, a robot controller 2, and a vision measuring device; the vision measuring device is a binocular vision measuring device 3, the robot 1 is a six-axis robot, and an operating tool 5 is mounted on its end. The robot 1 is connected to the binocular vision measuring device 3 through the robot controller 2, and the robot controller 2 uses the positioning information of the binocular vision measuring device 3 as feedback. Luminescent marker points 4 are fixed on the operating tool 5 of the robot; the binocular vision measuring device 3 measures the marker points 4 in real time, and from these measurements the trajectory of the operating tool on the robot end can be determined. The binocular vision measuring device 3 inputs the measured position of the tip of the operating tool 5 into the robot controller 2; the robot controller 2 performs the inverse solution according to the robot's theoretical model, sends the result to the robot 1, and controls its motion. A flat plate is fixed on the operating tool, and four luminescent marker points 4 are arranged on it, namely the first luminescent marker point 4A, the second luminescent marker point 4B, the third luminescent marker point 4C, and the fourth luminescent marker point 4D. The line through the first and second marker points 4A, 4B and the line through the third and fourth marker points 4C, 4D are perpendicular and intersect at a point O, and the distances from the first marker point 4A and the second marker point 4B to the point O are different. The binocular vision measuring device 3 consists of two cameras placed 1-2 m apart, with an angle of 15-25° between them.

The method of industrial robot motion control using the above system is as follows:

(1) Modeling and calibration of the binocular vision measuring device:

The two cameras are used to measure the three-dimensional coordinates of spatial points and must first be calibrated. A pin-hole imaging model is established for each camera, as follows:

ρ [u, v, 1]^T = M [x_w, y_w, z_w, 1]^T

where

        | f N_x r_1 + u_0 r_7    f N_x r_2 + u_0 r_8    f N_x r_3 + u_0 r_9    f N_x t_x + u_0 t_z |
    M = | f N_y r_4 + v_0 r_7    f N_y r_5 + v_0 r_8    f N_y r_6 + v_0 r_9    f N_y t_y + v_0 t_z |
        | r_7                    r_8                    r_9                    t_z                  |

where r_1 to r_9 are the entries of the rotation matrix from the camera coordinate system to the world coordinate system, t_x, t_y, t_z are the translation components, f is the focal length, and (u_0, v_0) is the image center; these parameters are unknown and must be determined by calibration. (u, v) are the pixel coordinates in the computer image, and (x_w, y_w, z_w) are the coordinates in the three-dimensional world coordinate system. The two cameras are calibrated simultaneously with the same planar grid target, the unknown parameters of each model are obtained, and the binocular vision measuring device coordinate system o_w x_w y_w z_w is determined.

(2) Identification of the luminescent marker points and determination of their three-dimensional coordinates:

The distribution of the luminescent marker points is shown in Fig. 2. The distances from the four marker points 4A, 4B, 4C, 4D to the intersection point O are all known. The four marker points are photographed simultaneously by the two cameras; because the distances from 4A and 4B to O differ, each marker point can be identified in the two images and matched accordingly. Solving the two monocular vision models simultaneously yields the three-dimensional coordinates of each luminescent marker point in the binocular vision measuring device coordinate system o_w x_w y_w z_w;

(3) Determination of the three-dimensional coordinates of the operating tool tip:

The three-dimensional coordinates of the operating tool tip can be determined from the coordinates of the four luminescent marker points. As in Fig. 2, a two-dimensional coordinate system O_T X_T Y_T is established in the plane of the marker points: the line through the third and fourth marker points 4C, 4D is the X_T axis, and the line through the first and second marker points 4A, 4B is the Y_T axis. The tip of the operating tool is then placed in a conical hole and the tool is rotated about the hole; the tool tip stays fixed while the marker points rotate to at least four positions, as in Fig. 3. From the three-dimensional coordinates of the four marker points at each position the coordinates of O_T can be determined, and fitting a sphere to the O_T coordinates at the four positions yields the sphere center O_S, which is the coordinate of the operating tool tip in o_w x_w y_w z_w;

(4) Manual teaching process:

The teaching process is manual teaching. With the luminescent marker points fixed on the operating tool, the tool can be taken off the robot end and moved flexibly by hand through the trajectory required by the actual task; as shown in Fig. 1, during welding the welding gun tip moves along the track defined by the weld seam 7 of the workpiece 6. During manual teaching, the binocular vision measuring device records in real time the trajectory of each luminescent marker point on the operating tool; this trajectory is a sequence of discrete three-dimensional points in the visual coordinate system. The same taught trajectory can be used on individual robots of the same type with different accuracies, and even on robots of different types or entirely different structures;

As in Fig. 4, at the position of the first luminescent marker point 4A, the two-dimensional marker coordinate system is extended into a three-dimensional coordinate system O_T X_T Y_T Z_T using the distance from the sphere center O_S to the coordinate plane O_T X_T Y_T: that distance is the Z_T coordinate of the tool tip, and the coordinates of the projection of O_S onto the X_T Y_T plane are its X and Y coordinates in the marker system. In this way the coordinates of the operating tool tip in the marker three-dimensional coordinate system O_T X_T Y_T Z_T are determined, so the trajectory of the tool tip can be derived from the trajectory of the luminescent marker points;
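The construction of O_T X_T Y_T Z_T from the four markers, and the recovery of the tool tip from new marker positions, can be sketched as follows; the marker layout is assumed as in Fig. 2 (line AB perpendicular to line CD, intersecting at O), and the coordinates in the test are invented:

```python
import numpy as np

def marker_frame(pA, pB, pC, pD):
    # Y_T along A->B, X_T along C->D, Z_T by cross product.  With the two
    # lines perpendicular and intersecting, the origin O_T is the foot of
    # the perpendicular from A onto line CD.
    y = pB - pA; y = y / np.linalg.norm(y)
    x = pD - pC; x = x / np.linalg.norm(x)
    z = np.cross(x, y); z = z / np.linalg.norm(z)
    o = pC + np.dot(pA - pC, x) * x
    return o, np.column_stack([x, y, z])

def tip_in_marker_frame(o, R, tip_world):
    # Constant coordinates of the tool tip O_S in O_T X_T Y_T Z_T.
    return R.T @ (tip_world - o)

def tip_from_markers(pA, pB, pC, pD, tip_local):
    # Recover the world position of the tool tip from the current markers.
    o, R = marker_frame(pA, pB, pC, pD)
    return o + R @ tip_local
```

Because `tip_local` is fixed once calibrated, the measured marker positions alone suffice to track the tool tip during teaching and learning.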

The taught trajectory generated in the manual teaching process is faired (smoothed) by CAD or CAM software and then input into the robot controller 2 for use by the motion control in the robot learning process;
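The patent leaves the fairing method to the CAD/CAM software. As a stand-in illustration only, a simple moving-average smoothing of the discrete taught points might look like:

```python
import numpy as np

def smooth_track(pts, window=5):
    # Moving-average fairing of the discrete taught track (an illustrative
    # stand-in for the CAD/CAM fairing, not the patent's method).
    pts = np.asarray(pts, float)
    k = np.ones(window) / window
    return np.column_stack([np.convolve(pts[:, i], k, mode="valid")
                            for i in range(3)])
```

Points already on a straight line are left on that line, and measurement jitter is averaged out at the cost of slightly shortening the track ends.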

(5) Robot learning process:

After manual teaching is finished, the operating tool 5 is fixed onto the end of the robot 1, and the robot 1 is controlled so that the operating tool 5 retraces the trajectory produced during manual teaching. During this process the binocular vision measuring device 3 observes the positions of the luminescent marker points and inputs them into the robot controller 2. The robot controller 2 computes the trajectory of the marker points 4 in real time and uses the difference between the current positions of the marker points 4 and their positions during manual teaching to drive, through positioning motion control and orientation motion control, the tip of the operating tool 5 along the taught trajectory, while also guaranteeing that the attitude of the operating tool 5 is identical to its attitude during manual teaching. The robot used in the present invention is a six-axis robot: the first to third axes mainly accomplish the positioning motion and the fourth to sixth axes mainly accomplish the orientation motion. Using the theoretical robot model established for the first to third axes, the rotations of those three axes can be converted into linear motion; using the model established for the fourth to sixth axes, the rotation of the operating tool is realized. The positioning motion control process and the orientation motion control process are described in detail below.

In the positioning motion control process, the curve shown in Fig. 5 is the operating-tool-tip trajectory formed by manual teaching. The current position of the operating tool is defined as the current point, and the tool is driven from the current point toward the target point E; the coordinates of both points are expressed in the binocular vision measuring device coordinate system o_w x_w y_w z_w. In the motion control process, the first to third axes of the robot 1 are controlled to move the robot along the Z direction while the position of the tool tip is computed in real time and the difference between the current position and the target position is monitored; as in Fig. 5, this difference decreases as the Z value increases, and when the difference is smallest the tool tip is closest to the target point, at which moment the Z coordinate is recorded. The first to third axes are then controlled to move the robot along the X direction while the distance from the tool tip to the target point is monitored, and the X coordinate at which the distance is smallest is recorded. Next the first to third axes move the robot along the Y direction; after the preceding moves along Z and X, the move along Y brings the operating tool tip close to the target point.

In the orientation motion control process, as shown in Fig. 6, rotation of the fourth to sixth axes of the robot changes the direction, i.e. the attitude, of the operating tool 5. The direction of the operating tool is represented by Y_T. Starting from the current point, the fourth axis is rotated first, and its angle is recorded when the angle between Y_T and the direction vector of the target point E is smallest; the fifth axis is then rotated and its angle recorded when that angle is smallest; finally the sixth axis is rotated and its angle recorded when the angle is smallest. The direction of Y_T is then very close to the direction of the target point E.

Each step of the positioning motion control process brings the operating tool tip closer to the target point, and likewise each step of the orientation motion control process brings the attitude of the operating tool closer to the taught attitude. Alternating the positioning and orientation motion control processes drives both the position and the attitude of the robot toward the position and attitude of manual teaching.

In the manual teaching process, the positioning motion control process, and the orientation motion control process, the luminescent marker points 4 and the tip of the operating tool 5 are all located in the binocular vision measuring device coordinate system o_w x_w y_w z_w, and the approximation algorithm for the motion from the current point to the target point also operates in that coordinate system. The method therefore places no requirement on the motion control accuracy of the robot and no requirement on the relative position of the tool coordinate system and the robot coordinate system; it suffices that, according to the theoretical model of the robot, the first to third axes realize linear motion and the fourth to sixth axes realize rotary motion. This fundamentally avoids the extremely strict calibration of the robot model and the tool coordinate system required by conventional off-line programming teaching. At the same time the robot model is relatively simple: the theoretical design model is used directly as the control model, and the robot's accuracy needs no special tuning.

The vision measurement device can be shared by several robots. Once the positioning and orientation motion control processes have formed a stable motion path, a robot with good repeatability and a reliably fixed operation tool can, absent any position change, move repeatedly along this path and complete the same task. Several robots can therefore share one binocular vision measurement device, saving cost.

The six-axis robot adopted by the present invention can be replaced by a seven-axis robot or a parallel robot. In this embodiment, the binocular vision measurement device can likewise be replaced by a monocular vision measurement device, a trinocular vision measurement device, or the like. When a monocular vision measurement device is used, the distances between the luminous marker points are strictly prescribed; when a binocular, trinocular or higher-order multi-view vision measurement device is used, the distances between the luminous marker points are unconstrained. Moreover, the more cameras the vision measurement device has, the more viewpoints from which the luminous marker points can be observed, and the larger the range over which the robot's pose is allowed to change.

Embodiment 2

As shown in Figure 7, this embodiment differs from embodiment 1 in that five luminous marker points are fixed on the operation tool: a first marker point 4A, a second marker point 4B, a third marker point 4C, a fourth marker point 4D and a fifth marker point 4E. The line through the first marker point 4A and the second marker point 4B is perpendicular to the line through the third marker point 4C and the fourth marker point 4D; the two lines intersect at a point, and the fifth marker point 4E is located at this intersection. The more marker points are distributed, the smaller the influence of any single luminous marker point on the motion accuracy.
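Given measured marker coordinates, the tool-plane frame implied by this layout can be recovered directly: X_T along the 4C-4D line, Y_T along the 4A-4B line, origin at 4E. A sketch assuming ideal, noise-free markers (`tool_frame_from_markers` is an illustrative helper, not named in the patent):

```python
import numpy as np

def tool_frame_from_markers(pA, pB, pC, pD, pE):
    """Build the tool-plane frame O_T-X_T-Y_T from the marker layout of
    embodiment 2: X_T along the C-D line, Y_T along the A-B line, with the
    fifth marker E at their intersection (the origin)."""
    x = pD - pC
    x = x / np.linalg.norm(x)
    y = pB - pA
    # remove any small non-orthogonality left by measurement noise
    y = y - (y @ x) * x
    y = y / np.linalg.norm(y)
    z = np.cross(x, y)  # plane normal, completing a right-handed frame
    return pE, np.column_stack([x, y, z])  # origin and 3x3 rotation matrix
```

The returned rotation and origin give the tool's attitude and a reference point in the vision frame at every camera sample.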

Everything else is as in embodiment 1.

Embodiment 3

As shown in Figure 8, this embodiment differs from embodiment 1 in that six luminous marker points are fixed on the operation tool: a first marker point 4A, a second marker point 4B, a third marker point 4C, a fourth marker point 4D, a fifth marker point 4E and a sixth marker point 4F. The line through the first marker point 4A and the second marker point 4B is perpendicular to the line through the third marker point 4C and the fourth marker point 4D; the two lines intersect at a point, the fifth marker point 4E is located at this intersection, and the sixth marker point 4F is located directly below the second marker point 4B.

More luminous marker points can also be arranged on the operation tool, provided they all lie along the line through the first marker point 4A and the second marker point 4B or along the line through the third marker point 4C and the fourth marker point 4D.

Everything else is as in embodiment 1.

Claims (3)

1. An industrial robot control method based on visual positioning, characterized in that the method comprises the following steps:
(1) Modeling and calibrating the vision measurement device:
A pinhole imaging model is established for each camera of the vision measurement device, as follows:
$$\rho \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} fN_x r_1 + r_7 u_0 & fN_x r_2 + r_8 u_0 & fN_x r_3 + r_9 u_0 & fN_x t_x + t_z u_0 \\ fN_y r_4 + r_7 v_0 & fN_y r_5 + r_8 v_0 & fN_y r_6 + r_9 v_0 & fN_y t_y + t_z v_0 \\ r_7 & r_8 & r_9 & t_z \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix}$$
where r_1 to r_9 are the elements of the rotation from the camera coordinate system to the world coordinate system, t_x, t_y and t_z are the translation components, f is the focal length, (u_0, v_0) is the computer image center, (u, v) is a computer image pixel, and (x_w, y_w, z_w) is a coordinate in the three-dimensional world coordinate system. All cameras are calibrated simultaneously with the same planar grid target to obtain the parameters of each model, and the vision measurement device coordinate system o_w x_w y_w z_w, in which the cameras jointly measure spatial points, is determined;
(2) Identifying the luminous marker points and determining their three-dimensional coordinates:
Each camera photographs the luminous marker points separately; the position of each luminous marker point is identified in every image and the points are matched between images; solving the monocular vision models of the cameras simultaneously yields the three-dimensional coordinates of each luminous marker point in the vision measurement device coordinate system o_w x_w y_w z_w;
(3) Determining the three-dimensional coordinates of the operation tool tip:
A two-dimensional coordinate system O_T X_T Y_T is established in the plane of the luminous marker points, taking the line through the third luminous marker point (4C) and the fourth luminous marker point (4D) as the X_T axis and the line through the first luminous marker point (4A) and the second luminous marker point (4B) as the Y_T axis. The tip of the operation tool is then placed in a taper hole so that its position is fixed, and the operation tool is rotated about this taper hole to at least four positions. From the three-dimensional coordinates of the luminous marker points at each position the coordinates of O_T are determined, and fitting a sphere to the O_T coordinates of the at least four positions gives the sphere center coordinate O_S; this sphere center is the three-dimensional coordinate of the operation tool tip in the vision measurement device coordinate system o_w x_w y_w z_w;
(4) Manual teaching process:
The operation tool is taken off the robot and moved flexibly by hand through the motion trajectory required by the actual task; during this process the vision measurement device records in real time the motion trajectory of each luminous marker point as three-dimensional discrete points in the vision measurement device coordinate system o_w x_w y_w z_w, and the trajectory of the operation tool tip and the attitude of the operation tool are determined from the trajectories of the luminous marker points;
(5) Robot learning process:
After manual teaching ends, the operation tool (5) is fixed again to the end of the robot (1), and the motion of the robot (1) is controlled so that the operation tool (5) once more walks out the trajectory of the manual teaching. During this process the vision measurement device observes the positions of the luminous marker points (4) and feeds them into the robot controller (2); from these positions the robot controller (2) computes in real time the position of the operation tool (5) tip and the attitude of the operation tool (5), and uses as feedback for robot control the difference between the current tip position of the operation tool (5) and the taught tip position, together with the difference between the current attitude of the operation tool (5) and the taught attitude. By alternating the positioning motion control process and the orientation motion control process, the tip of the operation tool (5) is made to move along the trajectory of the manual teaching while the attitude of the operation tool (5) at each tip position is made identical to the attitude during the manual teaching process.
2. The industrial robot control method based on visual positioning according to claim 1, characterized in that, in the positioning motion control process, the current position of the operation tool tip is defined as the current point, and the operation tool tip is moved from the current point toward the target point, the coordinates of both points being expressed in the vision measurement device coordinate system o_w x_w y_w z_w. Axes one to three of the robot (1) carrying the operation tool (5) are controlled to move the robot along the Z direction while the position of the tool tip is computed in real time, giving the difference between the current position and the target position; when the tool tip is nearest the target point, the Z coordinate at that moment is recorded. Axes one to three are then controlled to move the robot along the X direction while the distance from the tool tip to the target point is recorded, and the X coordinate at which this distance is smallest is recorded. Finally, axes one to three move the robot along the Y direction; after the motion along Y, the operation tool tip is close to the target point.
3. The industrial robot control method based on visual positioning according to claim 1, characterized in that, in the orientation motion control process, the direction of the operation tool, denoted Y_T, is its attitude. Starting from the current point, the robot first rotates about its fourth axis, and the fourth-axis angle at which the angle between Y_T and the direction vector of the target point is smallest is recorded; it then rotates about the fifth axis and records the fifth-axis angle at which that angle is smallest; finally it rotates about the sixth axis and records the sixth-axis angle at which that angle is smallest, at which point the direction Y_T is close to the direction of the target point.
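The tool-tip calibration of step (3) of claim 1 comes down to fitting a sphere center to the O_T positions recorded while the tool pivots about the fixed tip in the taper hole. One standard linear least-squares formulation is sketched below; this is an assumed implementation, since the patent does not specify its fitting algorithm:

```python
import numpy as np

def fit_sphere_center(points):
    """Linear least-squares sphere fit: |p - c|^2 = r^2 rearranges to
    2 p.c + (r^2 - |c|^2) = |p|^2, which is linear in c and k = r^2 - |c|^2."""
    P = np.asarray(points, float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, k = sol[:3], sol[3]
    radius = np.sqrt(k + center @ center)
    return center, radius

# simulated O_T positions while the tool pivots about a fixed tip at (1, 2, 3);
# the values below are illustrative, not measurements from the patent
rng = np.random.default_rng(0)
true_c, true_r = np.array([1.0, 2.0, 3.0]), 0.25
dirs = rng.normal(size=(8, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = true_c + true_r * dirs
c, r = fit_sphere_center(pts)  # c ≈ (1, 2, 3): the recovered tool-tip position
```

With four or more non-coplanar O_T positions the system is determined, which is why the claim requires rotating the tool to at least four positions.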
CN2011100265297A 2011-01-25 2011-01-25 Industrial robot control method based on visual positioning CN102135776B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2011100265297A CN102135776B (en) 2011-01-25 2011-01-25 Industrial robot control method based on visual positioning


Publications (2)

Publication Number Publication Date
CN102135776A CN102135776A (en) 2011-07-27
CN102135776B true CN102135776B (en) 2012-06-13

Family

ID=44295580


Country Status (1)

Country Link
CN (1) CN102135776B (en)


Families Citing this family (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102608969B (en) * 2012-03-02 2014-04-23 北京理工大学 Cascade robot hand and eye registration control method based on infrared optical alignment
CN102764929B (en) * 2012-07-23 2014-09-24 清华大学 Elliptical orbit directional tangential line constant-speed welding robot device
CN102806560B (en) * 2012-08-24 2014-08-06 电子科技大学 Method capable of automatically eliminating motion accumulated error of robot
CN102841559B (en) * 2012-09-26 2015-09-02 苏州工业园区职业技术学院 Low speed adhesive dispensing robot servo-control system in four axles
CN102962549B (en) * 2012-11-26 2014-04-02 清华大学 Robot control method for welding along any curve trace in vertical plane
CN103077640A (en) * 2013-01-09 2013-05-01 北京石油化工学院 Three-dimensional (3D) auxiliary demonstration system for welding of robot
CN103495807B (en) * 2013-09-27 2015-08-19 重庆大学 Multirobot optical-fiber laser diced system
CN104515502A (en) * 2013-09-28 2015-04-15 沈阳新松机器人自动化股份有限公司 Robot hand-eye stereo vision measurement method
CN103759635B (en) * 2013-12-25 2016-10-26 合肥工业大学 The scanning survey robot detection method that a kind of precision is unrelated with robot
CN105333819A (en) * 2014-08-15 2016-02-17 苏州北硕检测技术有限公司 Robot workpiece assembly and form and location tolerance detection system and method based on face laser sensor
CN104438095A (en) * 2014-12-12 2015-03-25 杭州郝姆斯食品有限公司 Automatic sorting and shipping equipment and sorting and shipping method
CN106610666A (en) * 2015-10-22 2017-05-03 沈阳新松机器人自动化股份有限公司 Assistant robot based on binocular vision, and control method of assistant robot
CN106643479A (en) * 2015-10-30 2017-05-10 柯昆(昆山)自动化有限公司 Robot TCP precision detection system based on machine vision
JP6710946B2 (en) * 2015-12-01 2020-06-17 セイコーエプソン株式会社 Controllers, robots and robot systems
CN105547172B (en) * 2016-01-06 2018-05-01 上海大学 System and method based on acupuncture luminescence method measurement industrial robot repeatable accuracy
CN105499865A (en) * 2016-01-22 2016-04-20 广西大学 Planar welding manipulator with function of automatic track seeking
CN105598970B (en) * 2016-02-01 2017-11-14 北京航空航天大学 Closed-loop robot system and its control method
CN105834557B (en) * 2016-05-27 2018-10-16 廊坊智通机器人系统有限公司 The straight bead off-line tracking method of arc welding robot welding
CN106247932B (en) * 2016-07-25 2019-03-12 天津大学 A kind of online error-compensating apparatus of robot based on camera chain and method
CN106228563B (en) * 2016-07-29 2019-02-26 杭州鹰睿科技有限公司 Automatic setup system based on 3D vision
CN106468572B (en) * 2016-08-30 2018-10-16 凌云光技术集团有限责任公司 Articulated robot dynamic property evaluation system
CN106546168B (en) * 2016-09-29 2019-07-05 上海航天精密机械研究所 One-axis laser scanning machine people's method for precisely marking based on ball fitting
CN106541419B (en) * 2016-10-13 2019-01-25 同济大学 A kind of measurement method of robot trajectory's error
CN106780494A (en) * 2017-02-10 2017-05-31 云南电网有限责任公司电力科学研究院 A kind of electrical verification hanging ground-wire robot visual orientation method
CN106844729A (en) * 2017-02-10 2017-06-13 云南电网有限责任公司电力科学研究院 A kind of database of electrical verification hanging ground-wire Robot visual location system
CN106846418A (en) * 2017-02-10 2017-06-13 云南电网有限责任公司电力科学研究院 A kind of electrical verification hanging ground-wire Robot visual location system
CN107727026A (en) * 2017-10-18 2018-02-23 北方工业大学 The scaling method of the workpiece coordinate system to be cooperated towards double industrial robots
CN108057937A (en) * 2017-11-10 2018-05-22 江苏新光数控技术有限公司 A kind of numerical control four-axis welding manipulator
CN108008631B (en) * 2017-11-29 2020-01-17 中国地质大学(武汉) Underwater robot path tracking method, device and storage device
TWI653130B (en) * 2017-12-05 2019-03-11 財團法人工業技術研究院 Correction device of mechanical arm and correction method thereof
CN108204811B (en) * 2018-01-09 2020-02-07 山东大学 Handheld space complex curve welding track coordinate measuring instrument and measuring method
CN108620281A (en) * 2018-05-10 2018-10-09 康凯 It is a kind of to be suitable for high-rise or super-high storied building door and window adhesive supplier
CN108705536A (en) * 2018-06-05 2018-10-26 雅客智慧(北京)科技有限公司 A kind of the dentistry robot path planning system and method for view-based access control model navigation
CN108722746A (en) * 2018-07-05 2018-11-02 大连事事达数控机械科技有限公司 A kind of working method of planer-type five-axle linkage intelligent vision paint spraying machine
CN109095096A (en) * 2018-07-09 2018-12-28 湖南驰众机器人有限公司 A kind of control method of automatic loading/unloading AGV vehicle
CN109079789B (en) * 2018-08-23 2020-09-25 顺德职业技术学院 Method and system for remotely acquiring operation video of industrial robot
CN109040688B (en) * 2018-08-23 2020-09-25 顺德职业技术学院 Method and system for storing acquired industrial robot operation video
CN109571487B (en) * 2018-09-12 2020-08-28 河南工程学院 Robot demonstration learning method based on vision
CN109530984A (en) * 2018-11-19 2019-03-29 施努卡(苏州)智能装备有限公司 Vision positioning welding and assembling method
CN111551111B (en) * 2020-05-13 2021-02-05 华中科技大学 Part feature robot rapid visual positioning method based on standard ball array


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1559356A (en) * 2004-03-11 2005-01-05 上海交通大学 Method for connecting space object positioned by visible marking points
CN1256072C (en) * 2004-03-11 2006-05-17 上海交通大学 Method for connecting space object positioned by visible marking points
CN101015915A (en) * 2007-03-08 2007-08-15 上海交通大学 Magnetic density detecting robot with self-locate function and telescope joint
CN101226640A (en) * 2007-12-21 2008-07-23 西北工业大学 Method for capturing movement based on multiple binocular stereovision
CN101261118A (en) * 2008-04-17 2008-09-10 天津大学 Rapid automatized three-dimensional appearance on-line measurement method and system based on robot
WO2010003289A1 (en) * 2008-07-11 2010-01-14 中国科学院沈阳自动化研究所 Apparatus and method for robots tracking appointed path with high precision
CN101949687A (en) * 2010-09-19 2011-01-19 天津大学 Detection method of automobile door based on vision measurement

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014041180A1 (en) * 2012-09-17 2014-03-20 Areva Wind Gmbh Lifter and method for handling a rotor blade, rotor blade for a wind generator and method for marking a rotor blade, system comprising a lifter and a rotor blade
EP2708487B1 (en) * 2012-09-17 2016-01-20 Areva Wind GmbH Lifter and method for handling a rotor blade and system comprising a lifter and a rotor blade
CN104942789A (en) * 2015-06-12 2015-09-30 邓莉莉 Four-shaft transfer robot
CN104942789B (en) * 2015-06-12 2017-04-19 邓莉莉 Four-shaft transfer robot



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120613

Termination date: 20180125