CN101372098A - Robot device and control method thereof - Google Patents


Info

Publication number
CN101372098A
Authority
CN
China
Prior art keywords
image
manipulator
gripper
imaging unit
robot device
Prior art date
Legal status
Pending
Application number
CNA2007101466511A
Other languages
Chinese (zh)
Inventor
孙增圻
王宏
郝淼
藤井正和
河野幸弘
Current Assignee
IHI Corp
Original Assignee
IHI Corp
Priority date
Filing date
Publication date
Application filed by IHI Corp
Priority to CNA2007101466511A (published as CN101372098A)
Priority to JP2008215926A (published as JP5292998B2)
Publication of CN101372098A
Legal status: Pending

Landscapes

  • Manipulator (AREA)

Abstract

The invention provides a control method for a robot device. When the manipulator of the robot device is controlled, no set of reference images has to be prepared in advance, the computational load is light enough for rapid control, and the position and posture of the manipulator relative to the object can be controlled in three-dimensional space. The direction of the gripper of the manipulator (1) is controlled according to the amount by which the image of the object (101) moves within the image taken by a camera (6) when the gripper direction is rotated, so that the image of the object (101) comes to the center of the field of view. The distance between the gripper of the manipulator (1) and the object (101) is then calculated from the size of the image of the object (101) obtained by the camera (6), and the gripper of the manipulator (1) is moved along the optical axis of the camera (6) so as to approach the object (101).

Description

Control method for a robot device, and robot device
The present invention relates to a control method for a robot device, and to a robot device, in which the robot device has a camera at the front end of a manipulator (Manipulator) and the manipulator is operated so that a feature quantity obtained from the image acquired by the camera matches a target feature quantity calculated and stored in advance from the target state.
Background Art
Robot devices equipped with a manipulator have been proposed, as have control methods for operating the manipulator of such a robot device. Various control methods have also been proposed in which the gripper of the manipulator is moved toward an object to be grasped by the manipulator.
For example, Patent Document 1 describes a robot device that prepares reference images of a plurality of objects in advance, acquires image information with a camera mounted on the manipulator gripper, searches the reference images for the one whose shape is closest to that of the object in the image information, and thereby calculates the positional relation between the object and the manipulator gripper.
Patent Document 2 describes a robot device that prepares a reference image of the object in advance and, while taking images with a camera mounted on the manipulator gripper, moves the manipulator by a fixed amount at a time until an image close to the reference image is obtained.
[Patent Document 1] Japanese Patent Laid-Open No. 2003-231078
[Patent Document 2] Japanese Patent Laid-Open No. 2000-263482
Summary of the invention
However, in the robot device described in Patent Document 1, many reference images must be prepared in advance to obtain good control accuracy, and all of them must be searched every time the manipulator operates; the computational load is therefore high and rapid control is not possible.
In the robot device described in Patent Document 2, only the position and posture of the manipulator within a two-dimensional plane can be adjusted; the position and posture of the manipulator relative to the object cannot be controlled in three-dimensional space. Moreover, imaging, measurement, and manipulator motion must be repeated many times, so rapid control is again not possible.
The present invention was made in view of these circumstances, and its object is to provide a control method for a robot device, and a robot device, with which the manipulator can be controlled rapidly and with a light computational load, without preparing a plurality of reference images, and with which the position and posture of the manipulator relative to the object can be controlled in three-dimensional space.
To solve the above problem and achieve the above object, the control method for a robot device according to the present invention has any one of the following structures.
(structure 1)
A control method for a robot device that has a manipulator with six or more degrees of freedom and an imaging unit at the gripper of the manipulator, characterized in that: the image of the object obtained by the imaging unit is compared with information on the shape of the object stored in advance; the direction of the manipulator gripper is controlled, according to the amount by which the image of the object moves within the field of view of the imaging unit when the direction of the gripper is rotated, so that the image of the object comes to the center of the field of view of the imaging unit; the image of the object obtained by the imaging unit is then compared again with the stored shape information, the distance between the gripper of the manipulator and the object is calculated from the size of the image of the object obtained by the imaging unit, and the gripper of the manipulator is moved along the optical axis of the lens of the imaging unit so as to approach the object.
(structure 2)
The control method for a robot device according to structure 1, characterized in that the object has a circular or nearly circular feature and its upper surface is a measurement surface.
(structure 3)
The control method for a robot device according to structure 1, characterized in that the object has an arbitrary non-circular shape and has a measurement surface with a circular or nearly circular feature.
(structure 4)
The control method for a robot device according to any one of structures 1 to 3, characterized in that the image of the object obtained by the imaging unit is compared with the shape information stored in advance and the direction of the manipulator gripper is set perpendicular to the measurement surface of the object.
(structure 5)
The control method for a robot device according to structure 4, characterized in that the image of the object obtained by the imaging unit is compared with the shape information stored in advance, the direction of the manipulator gripper is controlled so that the image of the object comes to the center of the field of view of the imaging unit, and the gripper is then rotated so that the image of the object assumes a prescribed direction within the field of view of the imaging unit.
The robot device according to the present invention has any one of the following structures.
(structure 6)
A robot device that has a manipulator with six or more degrees of freedom and an imaging unit at the gripper of the manipulator, characterized by comprising a control unit that controls the operation of the manipulator, wherein the control unit compares the image of the object obtained by the imaging unit with information on the shape of the object stored in advance, controls the direction of the manipulator gripper, according to the amount by which the image of the object moves within the field of view of the imaging unit when the direction of the gripper is rotated, so that the image of the object comes to the center of the field of view, then compares the image of the object obtained by the imaging unit with the stored shape information again, calculates the distance between the gripper and the object from the size of the image of the object obtained by the imaging unit, and moves the gripper of the manipulator along the optical axis of the lens of the imaging unit so as to approach the object.
(structure 7)
The robot device according to structure 6, characterized in that the object has a circular or nearly circular feature and its upper surface is a measurement surface.
(structure 8)
The robot device according to structure 6, characterized in that the object has an arbitrary non-circular shape and has a measurement surface with a circular or nearly circular feature.
(structure 9)
The robot device according to any one of structures 6 to 8, characterized in that the control unit compares the image of the object obtained by the imaging unit with the shape information stored in advance and sets the direction of the manipulator gripper perpendicular to the measurement surface of the object.
(structure 10)
The robot device according to structure 9, characterized in that the control unit compares the image of the object obtained by the imaging unit with the shape information stored in advance, controls the direction of the manipulator gripper so that the image of the object comes to the center of the field of view of the imaging unit, and then rotates the gripper so that the image of the object assumes a prescribed direction within the field of view of the imaging unit.
With structure 1 of the control method for a robot device according to the present invention, the direction of the manipulator gripper is controlled according to the amount by which the image of the object moves in the imaging unit when the gripper direction is rotated, so that the image of the object comes to the center of the field of view of the imaging unit; the distance between the manipulator gripper and the object is then calculated from the size of the image of the object obtained by the imaging unit, and the gripper can be brought close to the object by moving it along the optical axis of the lens of the imaging unit.
With structure 2, the object has a circular or nearly circular feature and its upper surface is the measurement surface, so the image of the object obtained by the imaging unit can be compared rapidly with the shape information stored in advance.
With structure 3, the object has an arbitrary non-circular shape and a measurement surface with a circular or nearly circular feature, so the image of the object obtained by the imaging unit can likewise be compared rapidly with the stored shape information.
With structure 4, the direction of the manipulator gripper is made perpendicular to the measurement surface of the object, so the position of the manipulator relative to the object can be adjusted accurately and the manipulator can grasp the object reliably.
With structure 5, the direction of the manipulator gripper is controlled so that the image of the object comes to the center of the field of view of the imaging unit, and the gripper is then rotated so that the image of the object in the field of view assumes a prescribed direction; the position of the manipulator relative to the object can therefore be adjusted accurately and the manipulator can grasp the object reliably.
That is, with the control method for a robot device according to the present invention, the position and posture of the manipulator relative to the object can be adjusted in three-dimensional space by repeating motions that an ordinary industrial robot already provides, such as rotating a given joint by a specified angle and moving along the optical axis of the lens of the imaging unit; the structure of the control unit can therefore be simplified and the computational load reduced.
With structure 6 of the robot device according to the present invention, the control unit controls the direction of the manipulator gripper according to the amount by which the image of the object moves in the imaging unit when the gripper direction is rotated, so that the image of the object comes to the center of the field of view, then calculates the distance between the gripper and the object from the size of the image of the object obtained by the imaging unit and moves the gripper along the optical axis of the lens of the imaging unit; the gripper can thus be brought close to the object by repeating motions that an ordinary industrial robot already provides, such as rotating a given joint by a specified angle and moving along the optical axis.
With structure 7, the object has a circular or nearly circular feature and its upper surface is the measurement surface, so the control unit can compare the image of the object obtained by the imaging unit rapidly with the shape information stored in advance.
With structure 8, the object has an arbitrary non-circular shape and a measurement surface with a circular or nearly circular feature, so the control unit can likewise compare the image of the object obtained by the imaging unit rapidly with the stored shape information.
With structure 9, the control unit makes the direction of the manipulator gripper perpendicular to the measurement surface of the object, so the position of the manipulator relative to the object can be adjusted accurately and the manipulator can grasp the object reliably.
With structure 10, the control unit controls the direction of the manipulator gripper so that the image of the object comes to the center of the field of view of the imaging unit, and then rotates the gripper so that the image of the object in the field of view assumes a prescribed direction; the position of the manipulator relative to the object can therefore be adjusted accurately and the manipulator can grasp the object reliably.
That is, in the robot device according to the present invention, the position and posture of the manipulator relative to the object can be adjusted in three-dimensional space by repeating motions that an ordinary industrial robot already provides, such as rotating a given joint by a specified angle and moving along the optical axis of the lens of the imaging unit; the structure of the control unit can therefore be simplified and the computational load reduced.
In other words, the present invention provides a control method for a robot device, and a robot device, with which the manipulator can be controlled rapidly and with a light computational load, without preparing a plurality of reference images, and with which the position and posture of the manipulator relative to the object can be controlled in three-dimensional space.
Description of drawings
Fig. 1 is a schematic side view of a robot device according to the present invention.
Fig. 2 shows flow charts of the advance preparation (a) and the control operation (b) of the control method for the robot device according to the present invention.
Fig. 3 is a front view of the captured images corresponding to the stages of executing the control method for the robot device according to the present invention.
Fig. 4 is a perspective view showing the camera of the robot device according to the present invention modeled as a pinhole camera.
Fig. 5 is a front view of the captured image during centering in the control method for the robot device according to the present invention.
Fig. 6 is a flow chart of the sequence of the centering operation in the control method for the robot device according to the present invention.
Fig. 7 is a flow chart of the sequence of two-dimensional position control and phase adjustment in the control method for the robot device according to the present invention.
Symbol description
1 manipulator, 1a first link, 1b second link, 1c third link, 1d fourth link, 1e fifth link, 1f sixth link, 2a first joint, 2b second joint, 2c third joint, 2d fourth joint, 2e fifth joint, 2f sixth joint, 3 computer, 5 end tool mechanism, 6 camera, 101 object.
Detailed Description of the Embodiment
A preferred embodiment of the present invention is described below with reference to the drawings.
(Structure of the robot device)
Fig. 1 is a schematic side view showing the structure of a robot device according to the present invention.
As shown in Fig. 1, the robot device of the present invention has a manipulator 1, with which an object 101 can be grasped, conveyed, and assembled to another part.
The manipulator 1 is composed of a plurality of actuators (drive units) and links (rigid structures) and has six degrees of freedom. The links are connected to one another through bending or rotating joints 2a, 2b, 2c, 2d, 2e, and 2f and are driven relative to one another by the actuators. Each actuator is controlled by a computer 3 serving as the control unit.
In the manipulator 1, the base end of the first link 1a is connected to a base portion 4 through the first joint 2a. The first joint 2a can rotate about the vertical axis (z axis). The base end of the second link 1b is connected to the front end of the first link 1a through the second joint 2b, which allows the second link 1b to rotate about a horizontal axis. The base end of the third link 1c is connected to the front end of the second link 1b through the third joint 2c, which allows the third link 1c to rotate about a horizontal axis.
The base end of the fourth link 1d is connected to the front end of the third link 1c through the fourth joint 2d, which allows the fourth link 1d to rotate about the axis of the third link 1c. The base end of the fifth link 1e is connected to the front end of the fourth link 1d through the fifth joint 2e, which allows the fifth link 1e to rotate about an axis perpendicular to the axis of the fourth link 1d. The base end of the sixth link 1f is connected to the front end of the fifth link 1e through the sixth joint 2f, which allows the sixth link 1f to rotate about the axis of the fifth link 1e.
In this way, a total of six bending and rotating joints are arranged alternately in the manipulator 1, securing six degrees of freedom.
An end tool mechanism 5 for grasping or processing the object 101 is provided at the front end (hereinafter, the "gripper") of the sixth link 1f. The end tool mechanism 5 is controlled by the computer 3 or by another control system, not shown. A camera 6, an imaging unit composed of an imaging lens and a CCD (solid-state image sensor), is also attached to the gripper.
The computer 3 has an image feature data table 3a, which stores image data and numerical data representing the shapes of a plurality of objects 101. The computer 3 also has an image processing part 3b, which extracts the shape of the object 101 from the image signal output by the camera 6 and compares it with the image data and numerical data stored in the image feature data table 3a. The computer 3 further has a motion command generating part 3c, which generates motion command signals for the actuators of the manipulator 1 according to the result of the comparison between the image signal in the image processing part 3b and the data in the image feature data table 3a, and sends them to the actuators.
In the manipulator 1, the position of the second link 1b relative to the first link 1a, the position of the third link 1c relative to the second link 1b, the position of the fourth link 1d relative to the third link 1c, and the positions of the links closer to the front end are controlled in turn; the position of the gripper can thereby be controlled, and a workpiece grasped by the end tool mechanism 5 at the gripper can be conveyed to a prescribed position.
By executing in this robot device the control method for a robot device according to the present invention described below, the manipulator 1 is controlled so that the measurement surface of the object 101 becomes perpendicular to the lens optical axis of the camera 6. In the following embodiment, the description is centered on the object 101.
Fig. 2 shows flow charts of the advance preparation (a) and the control operation (b) of the control method for the robot device according to the present invention.
As advance preparation for controlling the manipulator 1, as shown in (a) of Fig. 2, when the preparation starts in S1, the intrinsic parameters of the manipulator 1 of the robot device to be controlled are calculated in S2 and stored in advance as a table in the storage of the computer 3.
That is, from the numbers of effective pixels in the horizontal and vertical directions (Xpx, Ypx) (the resolution, in pixels) of the imaging element of the camera 6 and the effective size of the imaging element (Xln, Yln) (a physical length), the conversion between pixels and physical length on the imaging element is calculated as follows.
(Kxpl, Kypl) = (Xln/Xpx, Yln/Ypx)
Fig. 4 is a perspective view showing the camera 6 modeled as a pinhole camera.
As shown in Fig. 4, the relation between a length PS in the captured image and the actual physical length RS is calculated from the distance Dpl (a physical length) from the imaging lens 6a of the camera 6 to the imaging element 6b and the distance L from the imaging lens 6a to the object 101. The relation between the distance Dpl from the imaging lens 6a to the imaging element 6b and the distance L from the imaging lens 6a to the object 101 can be calculated from the focal length of the imaging lens 6a. The relation between the length PS in the captured image and the actual physical length RS can therefore be calculated from the distance Dpl and the focal length of the imaging lens 6a.
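For illustration, the two relations above can be written as the following short calculation (Python); the sensor and lens values used are assumptions, not values from the embodiment.

```python
# Numeric sketch of the pixel-to-length scale and the pinhole relation; all
# numerical values are placeholder assumptions, not figures from the embodiment.
def pixel_scale(x_px, y_px, x_len, y_len):
    """(Kxpl, Kypl): physical size of one pixel on the imaging element."""
    return x_len / x_px, y_len / y_px

def object_distance(ps, rs, dpl, kxpl):
    """Pinhole relation  RS / L = Kxpl*PS / Dpl,  hence  L = Dpl*RS / (Kxpl*PS)."""
    return dpl * rs / (kxpl * ps)

kxpl, kypl = pixel_scale(640, 480, 4.8, 3.6)                 # e.g. a small CCD, sizes in mm (assumed)
print(object_distance(ps=120, rs=50.0, dpl=6.0, kxpl=kxpl))  # distance to a 50 mm object
```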
Further, as an intrinsic parameter of the robot device, the transformation matrix T from the gripper coordinate system to the camera coordinate system is calculated.
Next, in S3, the intrinsic parameters of the object 101 are calculated and stored in the image feature data table 3a of the computer 3.
That is, as the image feature quantities of the target object 101, the shape features of the object 101 obtained by imaging it in advance with the camera 6 are stored. Specifically, the target position (Px, Py) of the object 101 on the image (its center, center of gravity, or the like) is stored in pixel units; in addition, a posture θ indicating the direction (in degrees or radians) and a size PS (in pixels) used for judging the distance are stored in advance as a table. For example, when the imaged object 101 is elliptical, the center (Px, Py) of the ellipse on the image, PS (the major or minor axis), θ (the angle of the major axis), and so on are stored in advance.
The object 101 is approximated by a circle on the image, for example by covering it with a circle, and the center and radius of that circle are obtained. The area of the object 101 (the area of its upper surface) may also be calculated and stored in advance.
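One possible way to obtain the stored feature quantities (center, size PS, posture θ, area) from a camera frame is an ordinary contour fit, sketched below with OpenCV; the patent names no library, so the functions used here are an assumption for illustration only.

```python
# Hypothetical sketch of building the image feature data table 3a with OpenCV 4;
# function and variable names are illustrative, not part of the patent.
import cv2

def extract_object_features(gray_image):
    """Return (Px, Py), PS, theta and the area of the largest roughly circular blob."""
    _, binary = cv2.threshold(gray_image, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contour = max(contours, key=cv2.contourArea)        # assume object 101 is the largest blob
    (px, py), radius = cv2.minEnclosingCircle(contour)  # covering circle: center and radius
    _, axes, theta = cv2.fitEllipse(contour)            # axis lengths and orientation angle
    ps = max(axes)                                      # use the major axis as the size PS
    return (px, py), ps, theta, cv2.contourArea(contour)
```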
The advance preparation then ends in S4. Once the advance preparation is complete, the control operation shown in (b) of Fig. 2 can be executed. That is, when the control operation starts in S5, centering is performed in S6.
Fig. 3 is a front view of the captured images corresponding to the stages of executing the control method for the robot device according to the present invention.
As shown in stage (phase) 1 of Fig. 3, centering is control that makes the manipulator 1 move so that the image of the object 101 comes to the center of the image taken by the camera 6.
Fig. 5 is a front view of the captured image while the object 101 is being centered.
For this centering, as shown in Fig. 5, the center of the image taken by the camera 6 is denoted C0 (X0, Y0), and the initial position of the object 101 on the image is denoted C1 (Px1, Py1).
Fig. 6 is a flow chart of the sequence of the centering operation.
Centering is performed in the following sequence. As shown in Fig. 6, when centering starts in S11, the center C0 of the image taken by the camera 6 and the initial position C1 of the object 101 on the image are obtained in S12, after which the flow proceeds to S13. In S13 the fourth joint 2d is selected, and in S14 it is driven by an appropriate amount q4t. The driving amount may be any angle in either direction; about 3 deg to 5 deg is desirable, for example.
The camera 6 then takes an image again as shown in Fig. 5; the position of the object 101 in the image obtained at this moment is denoted C2 (Px2, Py2), and, as shown in S14 of Fig. 6, C2 is obtained and stored.
The flow then proceeds to S15, where, as shown in Fig. 5, the distance d between the image center C0 and the position C2 of the object 101 on the image and the distance d12 between the positions C1 and C2 are obtained. Since the positions of these points (their image coordinates) are known, the distances d and d12 can be obtained from the following formulas.
d = sqrt((X0 - Px2)² + (Y0 - Py2)²)
d12 = sqrt((Px1 - Px2)² + (Py1 - Py2)²)
The flow then proceeds to S16 of Fig. 6, where the slope k1 of the straight line C0-C1, the slope k2 of the straight line C0-C2, and the slope k12 of the straight line C1-C2 are obtained. Since the positions of the points (their image coordinates) are known, these slopes are given by the following formulas.
k1 = (Py1 - Y0)/(Px1 - X0)
k2 = (Py2 - Y0)/(Px2 - X0)
k12 = (Py1 - Py2)/(Px1 - Px2)
Here, as shown in Fig. 5, the point where the straight line through C0 perpendicular to the straight line C1-C2 intersects the straight line C1-C2 is denoted A. The flow proceeds to S17 of Fig. 6, where the distance d0A between the point A and the center C0 is obtained. Since this is the distance from the point C0 to the straight line C1-C2, it is given by the following formula.
d0A = |k12·X0 - Y0 + Py1 - k12·Px1| / sqrt(k12² + 1)
The flow then proceeds to S18, where the distance d2A from the position C2 to the point A is obtained. From the side lengths of the right triangle C0-A-C2, d2A is obtained as follows.
d2A = sqrt(d² - d0A²)
The flow proceeds to S19, where the fourth joint 2d is rotated by a set amount q4. The magnitude of q4 is obtained from the following formula.
q4=d2A·q4t/d12
That is, since rotating the fourth joint 2d by q4t moved the position on the image by d12, the rotation angle q4 needed to move it by d2A is obtained in proportion.
The sign of the set amount q4 is decided by the following rule: if (1/k12 - k1) and (k2 - k1) have the same sign, q4 is taken in the same direction as q4t; if their signs differ, q4 is taken in the direction opposite to q4t.
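Gathering S15 to S19, the computation of the follow-up rotation q4 can be sketched as follows; the helper below assumes the hypotenuse of the right triangle C0-A-C2 to be d = |C0C2| and non-degenerate, non-vertical line geometry.

```python
# The S15-S19 geometry written out as a small helper (a sketch only).
import math

def centering_step(c0, c1, c2, q4t):
    """c0: image center (X0, Y0); c1, c2: object position before/after the trial
    rotation q4t of the fourth joint.  Returns the follow-up rotation q4."""
    (x0, y0), (px1, py1), (px2, py2) = c0, c1, c2
    d   = math.hypot(x0 - px2, y0 - py2)                 # |C0 C2|
    d12 = math.hypot(px1 - px2, py1 - py2)               # |C1 C2|: image shift caused by q4t
    k1  = (py1 - y0) / (px1 - x0)                        # slope of C0-C1
    k2  = (py2 - y0) / (px2 - x0)                        # slope of C0-C2
    k12 = (py1 - py2) / (px1 - px2)                      # slope of C1-C2
    # A: foot of the perpendicular dropped from C0 onto the line C1-C2
    d0a = abs(k12 * x0 - y0 + py1 - k12 * px1) / math.sqrt(k12 ** 2 + 1)
    d2a = math.sqrt(max(d ** 2 - d0a ** 2, 0.0))         # |C2 A|
    q4  = d2a * q4t / d12                                # same shift-per-angle ratio as the trial move
    if (1.0 / k12 - k1) * (k2 - k1) < 0:                 # sign rule from S19
        q4 = -q4
    return q4
```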
The flow then proceeds to S20, where the same control as S12 to S19 is performed for the fifth joint 2e. Then, if the distance between the center C0 of the image taken by the camera 6 and the position (Px, Py) of the object 101 is below a certain value, the flow proceeds to S21 and the centering operation ends. If that distance is not below the certain value, the operations from S12 to S20 are performed again.
When centering has been completed in this way, the flow proceeds to S7 in (b) of Fig. 2, and the camera 6 is moved toward the object 101.
As shown in stage 2 of Fig. 3, once centering is complete, the position of the object 101 in the acquired image remains at the center of the image even when the gripper is moved along the lens optical axis of the camera 6. In this state the gripper is moved along the lens optical axis of the camera 6 so as to approach the object 101.
To move the gripper along the lens optical axis of the camera 6, the positional relation between the manipulator 1 and the lens optical axis of the camera 6 must be known. As shown in Fig. 1, let the optical-axis coordinate system be Tc and the gripper coordinate system be T6; if the matrix T that transforms between them is obtained in advance, the optical-axis coordinate system Tc can be obtained from T·T6 = Tc, and the gripper can be moved along the lens optical axis of the camera 6 (the zc-axis direction).
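A minimal sketch of this coordinate transformation is given below, assuming T6 and T are 4x4 homogeneous matrices composed as written above; the composition order may need to be adapted to the convention of a particular robot.

```python
# Sketch of the frame bookkeeping for moving the gripper along zc.  T6 is taken here
# as the 4x4 gripper pose and T as the fixed gripper-to-camera transform (an assumption).
import numpy as np

def step_along_optical_axis(T6, T, L):
    """Return a gripper pose translated by the distance L along the optical axis zc."""
    Tc = T @ T6                                   # optical-axis (camera) coordinate system, as in the text
    zc = Tc[:3, 2]                                # third column of the rotation part = zc direction
    T6_new = T6.copy()
    T6_new[:3, 3] += L * zc / np.linalg.norm(zc)  # translate the gripper position along zc
    return T6_new
```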
The amount of movement L along the zc axis can be obtained as follows from the relation, shown in Fig. 4, between the length PS in the captured image and the actual physical length RS.
L=Dpl/(Kxpl·PS/cos(θ1))·RS
This applies when Kxpl and Kypl are equal. When Kxpl and Kypl are not equal, the x-axis component PSx and the y-axis component PSy of PS are taken and L is obtained as follows.
L = Dpl/{sqrt((Kxpl·PSx)² + (Kypl·PSy)²)/cos(θ1)}·RS
Here, θ1 expresses how much the gripper axis z6 is inclined (the angle between z6 and zc), and it can be obtained from the following formula.
θ1 = arccos((z6·zc)/(|z6|·|zc|))
Here, (z6·zc) is the inner product of the two vectors z6 and zc, and |z6| and |zc| are the norms (lengths) of the vectors z6 and zc. arccos(a) is the θ for which cos(θ) = a, where 0 ≤ θ ≤ 180 deg.
This is a correction for the fact that, because the object 101 is viewed obliquely, it appears smaller than it actually is. When the object 101 is circular or nearly circular, obtaining the major axis reduces the influence of the oblique view, so this correction by cos(θ1) is unnecessary.
According to the distance L obtained in this way, the camera 6 and the gripper are moved toward the object 101 along the lens optical axis zc of the camera 6.
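The distance calculation above can be sketched as follows; the argument names follow the text, and the function is an illustration rather than the implementation used in the embodiment.

```python
# The approach distance L from the two formulas above (a sketch; psx/psy are only
# needed when Kxpl and Kypl differ).
import numpy as np

def approach_distance(ps, rs, dpl, kxpl, kypl, z6, zc, psx=None, psy=None):
    theta1 = np.arccos(np.dot(z6, zc) / (np.linalg.norm(z6) * np.linalg.norm(zc)))
    if psx is None:                               # square pixels: Kxpl equals Kypl
        return dpl / (kxpl * ps / np.cos(theta1)) * rs
    size = np.sqrt((kxpl * psx) ** 2 + (kypl * psy) ** 2)
    return dpl / (size / np.cos(theta1)) * rs
```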
Next, in (b) of Fig. 2, the flow proceeds to S8 and posture adjustment is performed. As shown in stage 3 of Fig. 3, this posture adjustment controls the posture of the gripper so that the lens optical axis of the camera 6 becomes perpendicular to the measurement surface (the upper surface) of the object 101.
For example, when the measurement surface is parallel to the ground, the manipulator 1 is placed on a surface parallel to the measurement surface, and the gripper axis z6 is parallel to the lens optical axis zc of the camera, the posture of the gripper of the manipulator 1 is controlled so that the gripper axis z6 becomes perpendicular to the ground, that is, so that the camera 6 points vertically downward.
Next, in (b) of Fig. 2, the flow proceeds to S9, and two-dimensional position control and phase adjustment are performed.
Fig. 7 shows the sequence of the two-dimensional position control and phase adjustment.
As described above, because the position adjustment has been performed, the image of the object 101 has departed from the center of the image obtained by the camera 6, as shown in stage 4 of Fig. 3. The lens optical axis of the camera 6 is, however, perpendicular to the measurement surface. In this state, therefore, the position and phase can be corrected by feedback from the two-dimensional image. Here, the phase is the direction of the image of the object 101 in the field of view of the camera 6, and it is corrected by rotating the gripper of the manipulator about the lens optical axis of the camera 6.
That is, when two-dimensional position control and phase adjustment control start in S22 of Fig. 7, the flow proceeds to S23, and the stored size PS of the object 101 on the image is compared with the size PS1 obtained by image processing. If the size PS1 obtained by image processing is smaller, the gripper is still too far from the object 101 and is moved toward it; if PS1 is larger, the gripper is too close to the object 101 and is moved away from it.
In S24 it is judged whether the deviation between the size PS and the size PS1 is below a prescribed value; if it is not, the flow returns to S23 and the movement of the gripper is repeated. When the deviation between the size PS and the size PS1 falls below the prescribed value, the flow proceeds to S25.
In S25, the position (Px, Py) of the object 101 is obtained by image processing, and the amount of movement of the gripper is decided according to the following formulas.
[gripper movement along the camera x axis] = Kxpl·(Px - X0)
[gripper movement along the camera y axis] = Kypl·(Py - Y0)
The actual direction of movement is obtained from the relation between the axis directions on the image and the directions of the gripper. The flow then proceeds to S26, where it is judged whether the deviation (Px - X0, Py - Y0) from the center is below a prescribed value; if it is not, the flow returns to S25 and the movement of the gripper is repeated. When the deviation (Px - X0, Py - Y0) from the center falls below the prescribed value, the flow proceeds to S27.
In S27, the sixth joint 2f is rotated by θ - θ1 so that the phase θ1 of the object 101 becomes the target angle θ. The flow then proceeds to S28, and the two-dimensional position control and phase adjustment control end. That is, in (b) of Fig. 2, the flow proceeds to S10 and the control operation ends.
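A condensed sketch of one correction cycle of S22 to S28 is given below; the measurement dictionaries and names are hypothetical stand-ins, since the patent does not define a software interface.

```python
# One correction cycle of S22-S28, reduced to the quantities computed above (a sketch).
def correction_step(meas, target, x0, y0, kxpl, kypl):
    """meas/target: {'PS': size in pixels, 'center': (Px, Py), 'theta': angle}."""
    toward = meas["PS"] < target["PS"]           # image smaller than stored PS: still too far
    dx = kxpl * (meas["center"][0] - x0)         # movement along the camera x axis
    dy = kypl * (meas["center"][1] - y0)         # movement along the camera y axis
    dq6 = target["theta"] - meas["theta"]        # rotation of the sixth joint 2f
    return toward, dx, dy, dq6
```

In the actual sequence such corrections are applied repeatedly until the deviations fall below the prescribed values, as judged in S24 and S26.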
In the embodiment described above, the shape of the object 101 has been described as a circle. The shape of the object 101 is not, however, limited to a circle; as long as the object 101 has some circular feature on its surface, the circular portion can be extracted by image processing. For example, the object may have a polygonal outline containing a circular hole.

Claims (10)

1. A control method for a robot device, the robot device having a manipulator with six or more degrees of freedom and an imaging unit at the gripper of the manipulator, characterized in that:
the image of the object obtained by said imaging unit is compared with information on the shape of said object stored in advance, and the direction of the gripper of said manipulator is controlled, according to the amount by which the image of said object moves within the field of view of said imaging unit when the direction of the gripper of said manipulator is rotated, so that the image of said object comes to the center of the field of view of said imaging unit;
then, the image of the object obtained by said imaging unit is compared with the information on the shape of said object stored in advance, the distance between the gripper of said manipulator and said object is calculated from the size of the image of the object obtained by said imaging unit, and the gripper of said manipulator is moved along the optical axis of the lens of said imaging unit so as to approach said object.
2. The control method for a robot device according to claim 1, characterized in that
said object has a circular or nearly circular feature and its upper surface is a measurement surface.
3. The control method for a robot device according to claim 1, characterized in that
said object has an arbitrary non-circular shape and has a measurement surface with a circular or nearly circular feature.
4. The control method for a robot device according to any one of claims 1 to 3, characterized in that
the image of the object obtained by said imaging unit is compared with the information on the shape of said object stored in advance, and the direction of the gripper of said manipulator is set perpendicular to the measurement surface of said object.
5. The control method for a robot device according to claim 4, characterized in that
the image of the object obtained by said imaging unit is compared with the information on the shape of said object stored in advance, and the direction of the gripper of said manipulator is controlled so that the image of said object comes to the center of the field of view of said imaging unit;
then, the gripper of said manipulator is rotated so that the image of said object within the field of view of said imaging unit assumes a prescribed direction.
6. A robot device having a manipulator with six or more degrees of freedom and an imaging unit at the gripper of the manipulator, characterized in that
it comprises a control unit that controls the operation of said manipulator,
wherein said control unit compares the image of the object obtained by said imaging unit with information on the shape of said object stored in advance, controls the direction of the gripper of said manipulator, according to the amount by which the image of said object moves within the field of view of said imaging unit when the direction of the gripper of said manipulator is rotated, so that the image of said object comes to the center of the field of view of said imaging unit, then compares the image of the object obtained by said imaging unit with the information on the shape of said object stored in advance, calculates the distance between the gripper of said manipulator and said object from the size of the image of the object obtained by said imaging unit, and moves the gripper of said manipulator along the optical axis of the lens of said imaging unit so as to approach said object.
7. The robot device according to claim 6, characterized in that
said object has a circular or nearly circular feature and its upper surface is a measurement surface.
8. The robot device according to claim 6, characterized in that
said object has an arbitrary non-circular shape and has a measurement surface with a circular or nearly circular feature.
9. The robot device according to any one of claims 6 to 8, characterized in that
said control unit compares the image of the object obtained by said imaging unit with the information on the shape of said object stored in advance and sets the direction of the gripper of said manipulator perpendicular to the measurement surface of said object.
10. The robot device according to claim 9, characterized in that
said control unit compares the image of the object obtained by said imaging unit with the information on the shape of said object stored in advance, controls the direction of the gripper of said manipulator so that the image of said object comes to the center of the field of view of said imaging unit, and then rotates the gripper of said manipulator so that the image of said object within the field of view of said imaging unit assumes a prescribed direction.
CNA2007101466511A 2007-08-23 2007-08-23 Robot device and control method thereof Pending CN101372098A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CNA2007101466511A CN101372098A (en) 2007-08-23 2007-08-23 Robot device and control method thereof
JP2008215926A JP5292998B2 (en) 2007-08-23 2008-08-25 Robot apparatus control method and robot apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNA2007101466511A CN101372098A (en) 2007-08-23 2007-08-23 Robot device and control method thereof

Publications (1)

Publication Number Publication Date
CN101372098A true CN101372098A (en) 2009-02-25

Family

ID=40446571

Family Applications (1)

Application Number Title Priority Date Filing Date
CNA2007101466511A Pending CN101372098A (en) 2007-08-23 2007-08-23 Robot device and control method thereof

Country Status (2)

Country Link
JP (1) JP5292998B2 (en)
CN (1) CN101372098A (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TR200909397A2 (en) 2009-12-14 2011-07-21 Vargin Gök Gökhan A robot with multiple axes.
JP5868257B2 (en) * 2012-05-10 2016-02-24 株式会社エイチアンドエフ Work position detection method and work transfer method using the same
JP2013255972A (en) * 2012-06-14 2013-12-26 Shinnichi Kogyo Co Ltd Workpiece conveying device and method for controlling the same
CN109227540A (en) * 2018-09-28 2019-01-18 深圳蓝胖子机器人有限公司 A kind of robot control method, robot and computer readable storage medium
KR20230084970A (en) * 2021-12-06 2023-06-13 네이버랩스 주식회사 Method and system for controlling serving robots

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3208953B2 (en) * 1993-09-29 2001-09-17 株式会社デンソー Three-dimensional position and posture recognition method based on vision and three-dimensional position and posture recognition device based on vision
JP3466340B2 (en) * 1995-09-07 2003-11-10 アシスト シンコー株式会社 A 3D position and orientation calibration method for a self-contained traveling robot
JP3220706B2 (en) * 1996-02-29 2001-10-22 三井造船株式会社 Vision support device for manipulator operation
JPH1011146A (en) * 1996-06-25 1998-01-16 Shinko Electric Co Ltd Device for correcting stop posture of mobile object
JP2000263482A (en) * 1999-03-17 2000-09-26 Denso Corp Attitude searching method and attitude searching device of work, and work grasping method and work grasping device by robot
JP2003211381A (en) * 2002-01-16 2003-07-29 Denso Wave Inc Robot control device
JP2003231078A (en) * 2002-02-14 2003-08-19 Denso Wave Inc Position control method for robot arm and robot device
JP2003326485A (en) * 2002-05-08 2003-11-18 Mitsubishi Heavy Ind Ltd Manipulator with sensor for capture
JP4825964B2 (en) * 2003-08-18 2011-11-30 独立行政法人 宇宙航空研究開発機構 Non-contact measuring device
CN101396830A (en) * 2007-09-29 2009-04-01 株式会社Ihi Robot control method and robot

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102785249A (en) * 2011-05-16 2012-11-21 精工爱普生株式会社 Robot control system, robot system and program
CN102848387A (en) * 2011-06-28 2013-01-02 苏州经贸职业技术学院 Conveying manipulator system
US11740703B2 (en) 2013-05-31 2023-08-29 Pixart Imaging Inc. Apparatus having gesture sensor
CN109343709A (en) * 2013-06-13 2019-02-15 原相科技股份有限公司 Device with gesture sensor
CN109343709B (en) * 2013-06-13 2022-05-06 原相科技股份有限公司 Device with gesture sensor
US9393696B2 (en) 2013-06-17 2016-07-19 Canon Kabushiki Kaisha Robot system and robot control method
CN104227722A (en) * 2013-06-17 2014-12-24 佳能株式会社 Robot system and robot control method
CN107972065B (en) * 2016-10-21 2020-06-16 和硕联合科技股份有限公司 Mechanical arm positioning method and system applying same
CN107972065A (en) * 2016-10-21 2018-05-01 和硕联合科技股份有限公司 Mechanical arm positioning method and system applying same
CN109382849A (en) * 2017-08-07 2019-02-26 刘海云 Using the robot eyes of cone coordinate system vibration zoom
CN108858186A (en) * 2018-05-30 2018-11-23 南昌大学 A kind of trolley is to infrared object detection, identification and tracking
CN108858186B (en) * 2018-05-30 2021-05-07 南昌大学 Method for detecting, identifying and tracking infrared object by using trolley
CN112793983A (en) * 2020-12-30 2021-05-14 易思维(杭州)科技有限公司 Automatic loading method suitable for clamping groove type material rack
CN113580147A (en) * 2021-09-02 2021-11-02 乐聚(深圳)机器人技术有限公司 Robot control method, device, equipment and storage medium

Also Published As

Publication number Publication date
JP5292998B2 (en) 2013-09-18
JP2009051003A (en) 2009-03-12


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20090225