JP2009000782A - System for controlling robot, and robot hand - Google Patents

System for controlling robot, and robot hand

Info

Publication number
JP2009000782A
Authority
JP
Japan
Prior art keywords
robot
unit
camera
control system
teaching
Prior art date
Legal status
Pending
Application number
JP2007164426A
Other languages
Japanese (ja)
Inventor
Hiroyuki Hayashi
Nobuo Higuchi
Hiroshi Yonezawa
弘之 林
伸夫 樋口
浩 米澤
Original Assignee
Idec Corp
Idec株式会社
Priority date
Filing date
Publication date
Application filed by Idec Corp (Idec株式会社)
Priority to JP2007164426A
Publication of JP2009000782A
Application status is Pending

Abstract

Provided is a robot control system capable of simplifying teaching work, shortening teaching time, and preventing the occurrence of minor stoppages ("choco stops").
In a robot control system 1, robots 3 and 4 each include robot arms 30 and 40 and robot hands 31 and 41 that are provided at the tips of the robot arms 30 and 40 and that have a chuck portion for gripping a workpiece and a small camera for photographing the workpiece. The system further includes a teaching pendant, equipped with a touch panel on which the images taken by the small cameras are displayed, for inputting teaching to the robots 3 and 4, and controllers A and B that drive and control the robots 3 and 4 so that a predetermined coordinate position input with the teaching pendant is corrected based on an image photographed by a small camera and displayed on the touch panel, and the robots move to the corrected coordinate position.
[Selection] Figure 1

Description

  The present invention relates to a robot control system, and more particularly, to a robot control system capable of performing visual control using imaging by a camera.

  Automatic assembling apparatuses have been proposed that include an industrial robot for automatically assembling equipment, such as electronic equipment composed of a large number of parts, in an assembly area of a factory (see Japanese Patent Laid-Open Nos. 2000-354919 and 2006-43844).

  An industrial robot generally has a robot arm that can be swiveled and moved up and down, a robot hand that is provided at the tip of the robot arm and has a chuck portion for gripping a workpiece (part), and a control unit that drives the robot arm and the robot hand according to a control program. The automatic assembling apparatus further includes a plurality of component supply trays that are arranged at predetermined positions on the work table surface and accommodate a large number of components, and an assembly jig for assembling the components.

  During operation of the automatic assembly apparatus, the robot arm is driven, the robot hand is moved to the position of a predetermined component supply tray, and a component is gripped by the chuck portion at the tip. From this state, the robot arm is driven again, the robot hand is moved to the position of the assembly jig, and the part held by the chuck portion is assembled onto the assembly jig. Thereafter, by repeating the same operation, a desired device is automatically assembled.

In such an automatic assembling apparatus including an industrial robot, before automatic operation can begin, an operation called teaching is required in which the operation reference coordinates and the operation procedure are taught to the robot. In conventional teaching work, an approximate coordinate position is first obtained by computer simulation on a personal computer or the like, and the operator then manually jogs the robot hand using a teaching pendant or teaching box to set the fine coordinate position.
JP 2000-354919 A (see paragraph [0022] and FIG. 1)
JP 2006-43844 A (see paragraph [0039] and FIG. 1)

  In the conventional apparatus, setting the fine coordinate position in the teaching work described above is very cumbersome. In addition, there are errors due to individual differences between robots, errors that occur when installing various modules such as robot hands, component supply trays, and assembly jigs, and errors due to individual differences between the chuck portions and assembly jigs themselves, so teaching had to be performed every time a module was re-installed, and the teaching work took a long time. In particular, in the case of equipment for assembling small devices, the operator may have to push his or her head into the equipment and perform the teaching work in an awkward posture, which was neither a safe nor an operator-friendly working environment. Furthermore, if automatic assembly by the robot is continued, displacement of the robot arm itself and displacement of the robot hand occur and accumulate, or assembly cannot be continued during automatic operation because of displacement of the workpiece, so that a so-called "choco stop" (that is, a machine stoppage caused by a temporary trouble rather than a machine failure) has occurred.

  The present invention has been made in view of such conventional circumstances, and an object of the present invention is to provide a robot control system capable of simplifying teaching work, shortening teaching time, and preventing the occurrence of so-called "choco stops" during automatic operation.

  A robot control system according to a first aspect of the present invention includes: a robot including a robot arm and a robot hand provided at the tip of the robot arm and having a chuck portion for gripping a workpiece; an operation unit for performing teaching input to the robot; a camera unit capable of photographing at least the workpiece; and a control unit that corrects a predetermined coordinate position input by teaching on the operation unit based on an image captured by the camera unit and drives and controls the robot so that it moves to the corrected coordinate position.

  In the first aspect of the invention, the robot arm and the robot hand are moved to a predetermined coordinate position input by teaching (for example, the coordinate position immediately before a workpiece gripping or grip releasing operation), and after the movement the workpiece is photographed by the camera unit. The control unit then corrects the predetermined coordinate position based on the image photographed by the camera unit.

  Here, when the correction is performed at the time of teaching input, even if there are errors due to the installation of various modules such as robot hands, component supply trays, and assembly jigs, or errors due to individual differences in the chuck portions and assembly jigs themselves, corrections for these various errors are performed automatically and all at once based on the image photographed by the camera unit. This enables automatic correction by self-calibration, so the teaching work can be simplified and the teaching time can be shortened.
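
  As an illustration of this kind of self-calibrating correction, the following Python sketch shifts a taught position by the offset between the detected workpiece and the centre of the hand-camera image; the pixel-to-millimetre scale, the detection coordinates, and the function names are assumed values for illustration, not figures taken from this publication.

    def correct_taught_position(taught_xy, detected_px, image_size_px, mm_per_pixel=0.1):
        # Offset of the detected workpiece from the image centre, in pixels.
        cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0
        dx_mm = (detected_px[0] - cx) * mm_per_pixel
        dy_mm = (detected_px[1] - cy) * mm_per_pixel
        # Apply the camera-derived offset to the taught coordinate.
        return taught_xy[0] + dx_mm, taught_xy[1] + dy_mm

    # Example: a workpiece seen 40 px right of and 20 px below centre in a 640 x 480 image.
    print(correct_taught_position((250.0, 120.0), (360, 260), (640, 480)))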

  In addition, since the correction is performed based on images captured by the camera unit, the operator does not need to put his or her head into the apparatus during teaching work, and a safe and operator-friendly working environment can be realized.

  On the other hand, when the correction is performed during automatic operation after teaching, the control unit drives and controls the robot so that the robot arm and the robot hand move to the corrected coordinate position, and the occurrence of "choco stops" during automatic operation can thereby be prevented.

  According to a second aspect of the present invention, in the first aspect, the camera unit is provided on a side surface of the support base that supports the chuck unit.

  In this case, an image of the tip of the chuck portion can be obtained; a small vision camera or a CCD camera, for example, is suitable as such a camera unit. Further, since the camera unit is provided on the side surface of the support base of the chuck portion, there is the advantage that the camera unit is not affected by the opening and closing operation of the chuck portion.

  According to a third aspect of the present invention, in the second aspect, the support base of the chuck portion is provided so as to be movable.

  In this case, since the support base itself, which carries the camera unit, can be moved based on the image photographed by the camera unit, the position of the chuck portion can easily be finely adjusted.

  According to a fourth aspect of the present invention, in the first aspect, the camera section is provided at the center position of the support surface of the support base that supports the chuck section.

  In this case, since the optical axis of the camera unit can coincide with the work axis of the chuck unit, the coordinate position can be easily set.

  According to a fifth aspect of the present invention, in the first aspect, the operation unit has a touch-panel display for displaying the image photographed by the camera unit, and the correction is performed when the operator touches the display.

  In this case, the image photographed by the camera unit is displayed on the display of the operation unit. While viewing the image on the display, the operator touches the display and operates the operation unit so that the current position of the robot hand is the start point and the target coordinate position (for example, the position of the assembly jig) is the end point, thereby making the correction.

  In this case, even if there are errors due to the installation of various modules such as robot hands, component supply trays, and assembly jigs, or errors due to individual differences in the chuck portions or assembly jigs themselves, the operator simply looks at the image on the display and inputs the coordinates of the start point and end point, and the various errors are corrected automatically and all at once. This enables automatic correction by self-calibration, so the teaching work can be simplified and the teaching time can be shortened.

  Also, in this case, the operator does not need to put his or her head into the apparatus during teaching work and only has to operate while looking at the display of the operation unit, so a safe and operator-friendly working environment can be realized.

  Furthermore, in this case, the operator can easily perform correction with one touch using the display of the operation unit.

  According to a sixth aspect of the present invention, in the first aspect, the control unit drives and controls the robot hand based on the image photographed by the camera unit so that the robot hand moves to an appropriate handling position during automatic operation.

  According to the invention of claim 6, during automatic operation, when the robot hand moves to the vicinity of a preset handling position, the camera unit captures an image and the control unit corrects the handling position based on the image. As a result, the robot hand always moves to an appropriate handling position during automatic operation.

  As a result, even if, for example, a position error of the robot arm itself or of the robot hand occurs, a workpiece housed in the component supply tray is displaced or tilted, or an assembly error occurs when assembling a workpiece onto the assembly jig, the image taken by the camera unit is processed to correct the coordinate position of the workpiece and the robot hand is moved to a position where the workpiece can be reliably gripped, so handling errors of the robot hand (that is, failures to grip the workpiece) can be prevented. Consequently, the occurrence of "choco stops" during automatic operation can be prevented, and the robustness of the entire system can be greatly improved.

  According to a seventh aspect of the present invention, in the first aspect, the robot hand further includes a removal device for removing work-in-process when a choco stop occurs during operation.

  According to the invention of claim 7, even if a "choco stop" does occur during automatic operation, the robot hand moves to the position of the work-in-process based on the image captured by the camera unit and the removal device provided on the robot hand automatically removes the work-in-process, so automatic recovery after the occurrence of a "choco stop" is possible.

  According to an eighth aspect of the present invention, in the first aspect, the control unit stores an image pattern of a part of the operator's body, and when an image photographed by the camera unit includes the image pattern, the control unit urgently stops the operation of the robot arm. This improves the safety of the entire system.
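
  As a rough sketch of how such a check might be implemented (using OpenCV template matching as one possible approach; the threshold value and the controller interface are assumptions, not details from this publication):

    import cv2

    def body_pattern_detected(frame_gray, template_gray, threshold=0.8):
        # Normalised cross-correlation between the camera frame and the stored
        # image pattern of a part of the operator's body.
        result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, _ = cv2.minMaxLoc(result)
        return max_val >= threshold

    # For each captured frame, the control unit would then do something like:
    # if body_pattern_detected(frame, hand_template):
    #     controller.emergency_stop()   # hypothetical controller interface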

  According to a ninth aspect of the present invention, in the first aspect, the robot hand includes a base detachably attached to the tip of the robot arm and a chuck drive unit that moves the chuck portion relative to the base, and the control unit includes a first control unit that drives and controls the chuck drive unit based on an image captured by the camera unit and a second control unit that performs drive control of the robot other than the chuck drive unit.

  In this case, the chuck portion moves with respect to the base when the first control unit drives and controls the chuck drive unit based on the image taken by the camera unit. The position of the chuck portion can thereby be corrected independently of the robot drive control performed by the second control unit, which makes it easier, for example, to design the robot operation program.

  In a tenth aspect of the present invention, in the first aspect, when the control unit determines, based on an image photographed by the camera unit, that the workpiece cannot be gripped, it drives and controls another operation for gripping the workpiece.

  In this case, the control unit drives the removal device to remove the work-in-process, skips the workpiece that cannot be gripped and grips the next workpiece, or displays an alarm to notify the operator. The operation rate of the entire system can thereby be improved.

  According to an eleventh aspect of the present invention, in the first aspect, the camera unit includes a portable camera.

  In this case, the camera unit can be shared by a plurality of robots by moving it to a suitable position for each robot during teaching.

  A robot hand according to a twelfth aspect of the present invention is a robot hand provided at the tip of a robot arm, and includes: a base detachably attached to the tip of the robot arm; a chuck portion, provided on the base, for gripping a workpiece; a camera unit capable of photographing at least the workpiece; and a drive unit that moves the chuck portion with respect to the base based on an image photographed by the camera unit.

  In this case, based on the image photographed by the camera unit, the chuck unit moves with respect to the base by driving the driving unit. As a result, the position of the chuck portion can be corrected independently of the movement of the robot arm.

  As described above, according to the robot control system of the present invention, the predetermined coordinate position input on the operation unit is corrected based on the image photographed by the camera unit, so corrections for various errors are performed automatically and all at once, and automatic correction by self-calibration becomes possible; the teaching work can thereby be simplified and the teaching time shortened. Further, according to the present invention, during automatic operation after teaching, the control unit drives and controls the robot so that the robot arm and the robot hand can be moved to the corrected coordinate position.

Embodiments of the present invention will be described below with reference to the accompanying drawings.
FIGS. 1 to 8 are views for explaining a robot control system according to an embodiment of the present invention. FIG. 1 is a schematic plan view of the robot control system, FIG. 2 is a side perspective view of the robot, FIG. 3 is an enlarged perspective view of the robot hand, FIG. 4 is an enlarged front view of the teaching pendant, FIG. 5 is a block diagram of the control unit of the robot control system, FIG. 6 is a diagram showing an example of operation of the teaching pendant during teaching, FIG. 7 is a diagram showing an example of handling control of the robot hand during automatic operation, and FIG. 8 is a diagram comparing an example of the operation of the system of this embodiment with that of a conventional apparatus.

  As shown in FIG. 1, the robot control system 1 includes two assembly robots 3 and 4 that are spaced apart from each other on a work table surface 2. The robots 3 and 4 are both articulated robots, each having a plurality of robot arms 30 and 40, respectively. Robot hands 31 and 41 are provided at the tips of the distal-side robot arms 30 and 40, respectively.

  On one side of the work table surface 2, a plurality of component supply trays 5 and 6 for accommodating a large number of components (workpieces) (not shown) are arranged. These component supply trays 5 and 6 are carried onto the work table surface 2 by a carry-in device (not shown). On the worktable surface 2, an assembly jig 7 for assembling parts by the robots 3 and 4 is attached at a substantially central position. Below the work table 2, there are provided controllers A and B for driving and controlling the robots 3 and 4, respectively. A portable camera 8 for photographing the entire movable range of the robot arms 30 and 40 is disposed on one side of the work table 2. The portable camera 8 preferably has a wide-angle lens and has a zoom function and a pan function.

  As shown in FIG. 2, the robot 3 has a detachable robot hand 31 at the tip of a robot arm 30 on the tip side. The robot hand 31 is provided with a plurality of workpiece gripping chuck portions 32.

  As shown in FIG. 3, the chuck portion 32 is supported by a support base 33. The support base 33 has a built-in drive mechanism for driving the chuck portion 32. The robot hand 31 has a base plate 31A. One of the four support bases 33 shown in the figure is provided on the base plate 31A so as to be movable in the direction of the arrow in the figure, and a small camera 34, for example a vision camera or a CCD camera, for photographing the tip of the chuck portion 32 is attached to the side surface of this movable support base 33.

  Since the robot hand 41 of the robot 4 has the same configuration, the description thereof is omitted here.

  The robot control system 1 includes a teaching pendant (operation unit) 10 as shown in FIG. 4. The teaching pendant 10 includes an LCD (liquid crystal) display 11 for displaying the image taken by the camera 34 of the robot hand 31, left and right grips 12 and 13 for the operator to hold, and an emergency stop button 14.

  The display 11 is a touch-panel display. In addition, a plurality of push-button switches 11a for robot operations such as starting, stopping, and teaching are provided on the left and right sides of the display 11. The back surface of the grip 12 is provided with a three-position enable switch (not shown) that allows the operator to avoid danger during non-steady work such as teaching or trial operation.

Next, the control unit of the robot control system 1 will be described with reference to FIG. 5.
The control unit 100 includes a controller A of the robot 3 and a controller B of the robot 4. The controllers A and B are configured to be able to communicate with each other.

  The teaching pendant 10 is wirelessly connected to the input side of the controller A. As described above, the teaching pendant 10 is provided with the touch panel 11, the emergency stop switch 14, and the enable switch 15. The small camera 34 and the portable camera 8 are also connected to the input side of the controller A.

Motors m1 to mn for driving the robot arm 30 and the robot hand 31 are connected to the output side of the controller A. A removal device 105 is also connected to the output side of the controller A for removing work-in-process from the work table surface 2 when a "choco stop" occurs during automatic operation. The removal device 105 is provided in the robot hand 31, and preferably one of the chuck portions 32 in the robot hand 31 functions as the removal device 105. An air nozzle for ejecting compressed air toward the work table surface 2 may also be provided in the removal device 105.

  A personal computer 110 is further connected to the output side of the controller A. The personal computer 110 is used not only to perform simulations during teaching but also to monitor images taken by the portable camera 8 during automatic operation. The computer 110 is connected to the controller A via a wireless LAN. The image during automatic operation may also be displayed on the teaching pendant 10.

Similarly, a small camera 44 provided on the support base of the chuck portion of the robot hand 41 of the robot 4 is connected to the input side of the controller B. Motors m1′ to mn′ for driving the robot arm 40 and the robot hand 41, and a removal device 106 similar to the removal device 105, are connected to the output side of the controller B.

Next, the teaching work of the robot control system configured as described above will be described.
Here, the teaching work of the robot 3 will be described, but the same applies to the robot 4, and the description of the teaching work of the robot 4 is omitted here.

  When teaching is performed, the operator first moves the robot arm 30 by operating the teaching pendant 10; at this time, the operator inputs approximate coordinate positions so as to bring the robot arm 30 and the robot hand 31 closer to the target position. After the robot arm 30 and the robot hand 31 have moved, an image is taken by the small camera 34 of the robot hand 31, and the photographed image is displayed on the touch panel 11 of the teaching pendant 10.

  Assume that the image displayed on the touch panel 11 after the movement is as shown in FIG. 6A. This image shows a state in which the chuck portion 32 of the robot hand 31 holding the workpiece W has stopped immediately in front of the assembly jig 7 on the work table surface 2.

  Next, the operator touches and inputs the current coordinate position X and the target coordinate position Y on the touch panel 11 with a pen or the like, or traces the outline of the workpiece W held by the chuck portion 32 with the touch pen and then similarly traces the corresponding contour of the recess in the assembly jig 7 into which the workpiece W is to be assembled, thereby indicating the portions to be matched. When the teaching button at the side of the touch panel is pressed after these two pen inputs, the coordinate position obtained by adding the difference between the two specified coordinates to the current coordinate position X is stored in the memory of the controller A.
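
  A minimal numeric sketch of this correction (the 0.1 mm/pixel scale is an assumed calibration factor, not a value from this publication): the difference between the two touched points X and Y is converted to millimetres and added to the robot's current coordinate before it is stored.

    def corrected_teach_point(current_robot_xy, touched_x_px, touched_y_px, mm_per_pixel=0.1):
        # Difference between the target point Y and the current point X on the touch panel.
        dx_mm = (touched_y_px[0] - touched_x_px[0]) * mm_per_pixel
        dy_mm = (touched_y_px[1] - touched_x_px[1]) * mm_per_pixel
        # Coordinate position to be stored in the memory of the controller.
        return current_robot_xy[0] + dx_mm, current_robot_xy[1] + dy_mm

    # Example: X touched at (200, 300) and Y at (260, 320) shift the stored point by (6.0, 2.0) mm.
    print(corrected_teach_point((150.0, 80.0), (200, 300), (260, 320)))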

  In this way, after a predetermined coordinate position is input by moving the robot arm 30 with the teaching pendant 10, the position coordinate is corrected with the touch pen based on the image displayed on the touch panel 11, and teaching is thereby completed easily. The control unit then drives and controls the robot based on the corrected coordinate position. Instead of correction by pen input, the correction method described later with reference to FIG. 7 may be used.

  During automatic operation after teaching, the controller A drives and controls the robot arm 30 and the robot hand 31, whereby the robot hand 31 and the chuck portion 32 move to the predetermined coordinate positions taught with the teaching pendant 10 and the workpiece W is assembled onto the assembly jig 7 (see FIG. 6B).

  In this case, even if there are errors due to the installation of various modules such as the robot hand 31 and the assembly jig 7, or errors due to individual differences in the chuck portion 32 and the assembly jig 7 themselves, the operator only has to look at the image on the teaching pendant 10 and input the coordinate positions of the start point and end point, and corrections for these various errors are performed automatically and all at once. This enables automatic correction by self-calibration, so the teaching work can be simplified and the teaching time shortened.

  Further, in this case, the operator does not need to put his or her head into the apparatus during the teaching work and only needs to operate while looking at the touch panel 11 of the teaching pendant 10, so a safe and operator-friendly working environment can be realized.

Next, handling control of the chuck portion 32 using the small camera 34 during automatic operation will be described with reference to FIG. 7.
In the drawing, the image taken by the small camera 34 when the workpiece W is taken out of the component supply tray 5 is shown as displayed on the touch panel 11 of the teaching pendant 10.

  Now, as shown in FIG. 7A, assume that a workpiece W′ accommodated in the component supply tray 5 is tilted from its normal position. The image of FIG. 7A photographed by the small camera 34 is read into the controller A and image-processed by the controller A, and the movement coordinates of the chuck portion 32 are calculated so that the chuck portion 32 moves to an appropriate handling position for gripping the workpiece W′.

  The robot hand 31 is driven and controlled based on the result of this image processing in the controller A and moves to the appropriate handling position (see FIG. 7B), so the chuck portion 32 can reliably grip the workpiece W′.
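
  One plausible way to perform image processing of this kind is to estimate the workpiece centre and tilt angle from a binarized camera image, for example with OpenCV; this is an illustrative sketch only, not the processing actually used by the controller A in this publication.

    import cv2

    def handling_pose_from_image(binary_image):
        # Find the outlines of objects in the binarized camera image.
        contours, _ = cv2.findContours(binary_image, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        largest = max(contours, key=cv2.contourArea)   # assume the largest blob is the workpiece
        # Centre, size and tilt of the minimum-area bounding rectangle.
        (cx, cy), (w, h), angle = cv2.minAreaRect(largest)
        return cx, cy, angle

    # The controller would convert (cx, cy, angle) from image coordinates into robot
    # coordinates and move the chuck portion to that corrected handling position.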

  In this way, during automatic operation, images taken by the small camera 34 are read into the controller A one after another and the handling position is corrected in real time as necessary. Even if a workpiece W accommodated in the component supply tray 5 is displaced or tilted, or an assembly error occurs when assembling the workpiece W onto the assembly jig 7, the control unit corrects the coordinate position entered during teaching and moves the robot hand 31 to an appropriate handling position, so handling errors of the robot hand 31 (failures to grip the workpiece) can be prevented. As a result, the occurrence of "choco stops" during automatic operation can be prevented, and the robustness of the entire system can be greatly improved.

  In this case, since the small camera 34 is provided on the side surface of the support base 33 that supports the chuck portion 32 (see FIG. 3), an image of the tip of the chuck portion 32 can be obtained, and there is also the advantage that the small camera 34 is not affected by the opening and closing operation of the chuck portion. Further, since the support base 33 of the chuck portion 32 is movably provided, the position of the chuck portion 32 can easily be finely adjusted.

Next, the operation of this robot control system will be described with reference to FIG.
FIG. 8A shows the operation of the conventional apparatus, and FIG. 8B shows the operation of the system of this embodiment. In each figure, the horizontal axis represents time, and the vertical axis represents the number of units assembled per unit time. Therefore, the area surrounded by diagonal lines represents the total production volume, indicating that the machine was actually in operation. Conversely, the white area indicates that the machine has stopped.

In FIG. 8, the times Δt1 and ΔT1 from when the equipment is started up until operation actually begins indicate the teaching time. As shown in the figure, the teaching time ΔT1 in the robot control system of this embodiment is a fraction of the teaching time Δt1 of the conventional apparatus.

  This is because, as described above, the images captured by the small cameras 34 and 44 are displayed on the touch panel 11 of the teaching pendant 10, so that teaching of the robot hands 31 and 41 can be performed easily by operating on the image on the touch panel 11.

  In addition, the occurrence of "choco stops" during automatic operation is greatly reduced. This is because, as described above, handling errors of the chuck portion 32 during automatic operation are greatly reduced compared with the conventional apparatus, so "choco stops" are avoided in advance.

Furthermore, the recovery time ΔT2 required for the machine to return to operation after a "choco stop" is shorter than the recovery times Δt2 and Δt3 of the conventional apparatus. In the conventional apparatus, after a "choco stop" occurred, the operator had to remove the work-in-process by hand and then press the start button manually, whereas in this embodiment the removal device 105 automatically removes the work-in-process after the "choco stop" occurs, so the machine recovers automatically.

Further, the setup time ΔT3 required for a setup change is also reduced compared with the setup time Δt4 of the conventional apparatus. A setup change involves replacing the robot hand, the assembly jig, and so on; the accompanying coordinate-position correction was conventionally performed by manual operation of the robot hand, whereas in this embodiment it can be performed easily by operating the teaching pendant 10 using the image from the small camera 34.

  As can be seen from FIG. 8, the time required from equipment start-up to the first setup change is shorter in the system of this embodiment than in the conventional apparatus, so productivity is improved. As described above, this is because, in the system of this embodiment, firstly the teaching time is shorter, secondly "choco stops" occur less frequently, and thirdly the time required to recover after a "choco stop" is shorter.

  As shown in FIG. 8, the hatched area from the start of operation until the first setup change is the same for the system of this embodiment and the conventional apparatus; in other words, the two assemble the same number of products up to the first setup change.

  Each memory of the controllers A and B of the robot control system 1 also stores an image pattern of a part of the operator's body. When the image pattern of the operator's body is included in an image taken by the portable camera 8 during automatic operation, the controllers A and B urgently stop the operation of the robot arms 30 and 40. A highly safe system can thereby be realized. Instead of the portable camera 8, the emergency stop may similarly be performed based on an image taken by the small camera attached to the robot hand 31.

  According to the present embodiment, the small camera 34 is provided on the workpiece gripping chuck portion 32 of the robot hand 31, and an image photographed by the small camera 34 is displayed on the touch panel 11 of the teaching pendant 10. Since the coordinate position is corrected by operating the teaching pendant 10 based on this, correction for various errors is automatically performed at one time, and automatic correction by self-calibration becomes possible. Therefore, teaching work can be simplified and teaching time can be shortened.

  In the above-described embodiment, the small camera 34 provided on the robot hand 31 is used as the camera unit for photographing the workpiece, but the portable camera 8 may be used instead of the small camera 34. In this case, the coordinate position is corrected based on the image captured by the portable camera 8.

  In the above embodiment, the small camera 34 is provided on the side surface of the support base 33 of the chuck portion 32. FIGS. 9 to 11, by contrast, show examples in which a camera is attached at a position different from that of the above embodiment, either in place of or in addition to the small camera 34. In these drawings, the same reference numerals as in the previous embodiment denote the same or corresponding parts.

  In the example shown in FIG. 9, the small camera 34A is provided at the center position of the support surface of the support base 33 that supports the chuck portion 32.

  In this case, although there is the disadvantage that the field of view of the small camera 34A is narrowed when the chuck portion 32 is closed, the optical axis of the small camera 34A can coincide with the work axis of the chuck portion 32, which makes it easy to set the coordinate position.

  In the example shown in FIG. 10, the camera 34B is arranged at a position between the chuck portions 32 and 32′, that is, at a substantially central position of the base plate 31A. In the figure, the chuck portion 32′ is a chuck portion that functions as a removal device, and a small camera 34C is provided on the side surface of its support base 33.

  In this case, since the camera 34B is disposed to the side of the chuck portions 32, it is used to capture an overall image when a "choco stop" occurs. Since the small camera 34C can take an image of the tip of the chuck portion 32′, the removal work performed by the chuck portion 32′ when removing work-in-process can proceed smoothly.

  In the example shown in FIG. 11, the small camera 34D is provided on a camera support 33A arranged to the side of the chuck portion 32. In this case, since the small camera 34D is located to the side of the chuck portion 32, an image of the tip of the chuck portion cannot be taken, but the state of the equipment can be photographed when a "choco stop" occurs.

  In the example shown in FIG. 12, a small camera 34A similar to that in FIG. 9 is provided at the center position of the support surface of each support base 33 that supports each chuck portion 32. The robot hand 31 has a controller 100′ that drives and controls only the robot hand 31; the controller 100′ performs control independently of the controllers A and B described above. Each support base 33 of each chuck portion 32 is attached to a moving base 35 that is movable in the X and Y directions in the horizontal plane (and in the Z direction, the vertical direction). In FIG. 12, reference numeral 31B indicates a base that is detachably attached to the tip of the robot arm 30 (FIG. 2).

  In this case, when the movement coordinate position of the chuck portion 32 needs to be corrected based on the image photographed by the small camera 34A, the controller 100′ drives and controls the moving base 35, whereby the chuck portion 32 moves in the X, Y, Z, XY, YZ, XZ, or XYZ direction and the coordinate position of the chuck portion 32 is corrected.
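
  As a small illustrative safeguard (an assumption for illustration, not something stated in this publication), the hand-side controller might clamp the camera-derived correction before commanding the moving base 35, so that a vision error cannot cause a large unintended jump of the chuck portion:

    def clamp_correction(offset_mm, limit_mm=5.0):
        # Limit each of the (dx, dy, dz) components derived from the camera image
        # to +/- limit_mm before the moving base is driven.
        return tuple(max(-limit_mm, min(limit_mm, v)) for v in offset_mm)

    print(clamp_correction((1.2, -7.5, 0.4)))   # -> (1.2, -5.0, 0.4)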

  In this case, the control for gripping the workpiece can be performed independently of the other drive control of the robot.

  FIG. 13 shows an automatic-recovery robot hand 31. This robot hand 31 is provided with a camera 34B similar to that shown in FIG. 10, as well as a nozzle 36 that ejects compressed air and a suction unit 37 that sucks in air. The automatic-recovery robot hand 31 is placed at the positions indicated by reference numerals 51 and 52 in the drawing.

  If the controller determines during automatic operation that the workpiece cannot be gripped by the chuck portion, the controller drives the robot arm, moves the robot hand to the positions indicated by reference numerals 51 and 52 in the drawing, and replaces it with the automatic-recovery robot hand.

  The controller then recognizes from the image captured by the camera 34B that work-in-process is present and first tries to remove it by driving the removal-dedicated chuck portion 32′. If, after this removal work, the image taken by the camera 34B shows no work-in-process, the controller judges that the work-in-process has been removed and resumes the assembly operation.

  On the other hand, if it is determined from the image of the camera 34B that the work-in-process has not been removed, the controller alternately repeats the compressed-air ejection from the nozzle 36 and the air suction by the suction unit 37 several times. If the image of the camera 34B then shows no work-in-process, the controller judges that the work-in-process has been removed and restarts the assembly operation.

  If it is still determined from the image captured by the camera 34B that the work-in-process has not been removed, the controller notifies the operator, for example by sounding an alarm or showing an alarm display. The operator removes the work-in-process, and automatic operation is then resumed.
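
  The recovery sequence described in the last three paragraphs can be summarised by the following sketch; every call used here (work_present, capture, remove_with_chuck, blow_air, suck_air, raise_alarm) is a hypothetical placeholder standing in for a controller operation, not an interface defined in this publication.

    def recover_from_choco_stop(hand, camera, max_air_cycles=3):
        if not work_present(camera.capture()):
            return True                          # nothing left on the work table
        hand.remove_with_chuck()                 # first attempt: removal-dedicated chuck 32'
        if not work_present(camera.capture()):
            return True
        for _ in range(max_air_cycles):          # then alternate air ejection and suction
            hand.blow_air()
            hand.suck_air()
            if not work_present(camera.capture()):
                return True
        raise_alarm("work-in-process could not be removed")
        return False                             # operator intervention is required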

  In addition to the teaching pendant, the operation unit described above may be a terminal provided in an office away from the site where the robot is placed. In this case, it is possible to perform monitoring and teaching of the robot during operation by displaying an image captured by the camera on the display of the terminal.

  The wirelessly connected teaching pendant 10 and the portable camera 8 can be carried by the operator. Therefore, when a plurality of robot systems as shown in FIG. 1 are installed, the teaching pendant 10 and the portable camera 8 can be shared among them.

FIG. 1 is a schematic plan view of a robot control system according to an embodiment of the present invention.
FIG. 2 is a side perspective view of the robot.
FIG. 3 is an enlarged perspective view of the robot hand.
FIG. 4 is an enlarged front view of the teaching pendant.
FIG. 5 is a block diagram of the control unit of the robot control system.
FIG. 6 is a diagram showing an example of operation of the teaching pendant during teaching.
FIG. 7 is a diagram showing an example of handling control of the robot hand during automatic operation.
FIG. 8 is a diagram comparing an example of the operation of the system of this embodiment with a conventional apparatus.
FIGS. 9 to 13 are enlarged perspective views of robot hands according to other embodiments of the present invention.

Explanation of symbols

1: Robot control system

3: Robot 30: Robot arm 31: Robot hand 32: Chuck part 33: Support base 34: Small camera (camera part)

4: Robot 40: Robot arm 41: Robot hand 44: Small camera

8: Portable camera (camera part)

10: Teaching pendant (operation unit)
11: Touch panel (display)

100: Control part
A, B: Controllers

105: Removal device 106: Removal device

W: Workpiece

Claims (12)

  1. A robot control system,
    A robot including a robot arm and a robot hand provided at a tip of the robot arm and having a workpiece gripping portion;
    An operation unit for inputting teaching to the robot;
    A camera unit capable of photographing at least the workpiece; and
    A control unit that corrects a predetermined coordinate position input on the operation unit based on an image photographed by the camera unit, and drives and controls the robot so that the robot moves to the corrected coordinate position;
    Robot control system equipped with.
  2. In claim 1,
    The camera unit is provided on a side surface of a support base that supports the chuck unit.
    A robot control system characterized by that.
  3. In claim 2,
    The support is movably provided;
    A robot control system characterized by that.
  4. In claim 1,
    The camera unit is provided at a center position of a support surface of a support table that supports the chuck unit.
    A robot control system characterized by that.
  5. In claim 1,
    The operation unit has a display for displaying the image taken by the camera unit, the display being a touch-panel display, and the correction is performed by the operator touching the display,
    A robot control system characterized by that.
  6. In claim 1,
    In order for the robot hand to move to an appropriate handling position during automatic operation, the control unit performs the correction based on the image photographed by the camera unit, and drives and controls the robot hand.
    A robot control system characterized by that.
  7. In claim 1,
    The robot hand further includes a removal device for removing work-in-process when a choco stop occurs during operation.
    A robot control system characterized by that.
  8. In claim 1,
    An image pattern of a part of the operator's body is stored in the control unit, and when the image captured by the camera unit includes the image pattern, the control unit urgently stops the operation of the robot arm,
    A robot control system characterized by that.
  9. In claim 1,
    The robot hand is
    A base detachably attached to the tip of the robot arm;
    A chuck drive unit that moves the chuck unit relative to the base,
    The control unit is
    A first control unit that drives and controls the chuck driving unit based on an image captured by the camera unit;
    A second control unit that performs drive control other than the chuck drive unit in the robot;
    A robot control system characterized by that.
  10. In claim 1,
    When the control unit determines that the workpiece cannot be gripped based on the image captured by the camera unit, it drives and controls another operation for gripping the workpiece.
    A robot control system characterized by that.
  11. In claim 1,
    The camera unit includes a portable camera;
    A robot control system characterized by that.
  12. A robot hand provided at the tip of a robot arm,
    A base removably attached to the tip of the robot arm;
    A chuck for gripping a workpiece provided on the base;
    A camera unit provided in the robot hand and capable of photographing at least a workpiece;
    A drive unit that moves the chuck unit relative to the base, based on an image captured by the camera unit;
    Robot hand equipped with.
JP2007164426A 2007-06-21 2007-06-21 System for controlling robot, and robot hand Pending JP2009000782A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007164426A JP2009000782A (en) 2007-06-21 2007-06-21 System for controlling robot, and robot hand

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007164426A JP2009000782A (en) 2007-06-21 2007-06-21 System for controlling robot, and robot hand

Publications (1)

Publication Number Publication Date
JP2009000782A true JP2009000782A (en) 2009-01-08

Family

ID=40317729

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007164426A Pending JP2009000782A (en) 2007-06-21 2007-06-21 System for controlling robot, and robot hand

Country Status (1)

Country Link
JP (1) JP2009000782A (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6114890A (en) * 1984-06-26 1986-01-23 Kobe Steel Ltd Inspection device for work of robot
JPS61142088A (en) * 1984-12-12 1986-06-28 Toshiba Corp Handling device for part under disordered state
JPH02249421A (en) * 1989-03-24 1990-10-05 Iseki & Co Ltd Harvester of fruit and the like
JPH05108125A (en) * 1991-10-16 1993-04-30 Kobe Steel Ltd Reproduction controller of robot
JPH07132475A (en) * 1993-11-09 1995-05-23 Matsushita Electric Ind Co Ltd Robot position teaching device
JPH07136864A (en) * 1993-11-11 1995-05-30 Matsushita Electric Ind Co Ltd Trouble cause eliminating system
JPH08286701A (en) * 1995-04-11 1996-11-01 Nissan Motor Co Ltd Multi-robot control method and system
JPH08305433A (en) * 1995-04-27 1996-11-22 Nissan Motor Co Ltd Depalletizing controller
WO1997024206A1 (en) * 1995-12-27 1997-07-10 Fanuc Ltd Composite sensor robot system
JP2000263480A (en) * 1999-03-12 2000-09-26 Meidensha Corp Bin picking device
JP2001320698A (en) * 2000-05-12 2001-11-16 Nippon Signal Co Ltd:The Image type monitoring method, and image type monitoring device and safety system using it
JP2004230513A (en) * 2003-01-30 2004-08-19 Fanuc Ltd Work fetching device
JP2004351570A (en) * 2003-05-29 2004-12-16 Fanuc Ltd Robot system
JP2007125624A (en) * 2005-11-01 2007-05-24 Sharp Corp Hand device

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010228064A (en) * 2009-03-27 2010-10-14 National Institute Of Advanced Industrial Science & Technology Robot arm operating method of robot device for welfare, robot arm operation program, and recording medium
NL1038363C2 (en) * 2010-11-05 2011-09-06 Lely Patent Nv Automatic milk device with camera control.
JP2012210675A (en) * 2011-03-31 2012-11-01 Ihi Corp Hand guide device, and control method therefor
JP2013158844A (en) * 2012-02-01 2013-08-19 Ihi Corp Work assembling device
WO2014020739A1 (en) * 2012-08-02 2014-02-06 富士機械製造株式会社 Work machine provided with articulated robot and electric component mounting machine
JPWO2014020739A1 (en) * 2012-08-02 2016-07-11 富士機械製造株式会社 Working machine equipped with an articulated robot and electrical component mounting machine
US10099365B2 (en) 2012-08-02 2018-10-16 Fuji Corporation Work machine provided with articulated robot and electric component mounting machine
CN104797386A (en) * 2012-11-30 2015-07-22 株式会社安川电机 Robotic system
WO2014083695A1 (en) * 2012-11-30 2014-06-05 株式会社安川電機 Robotic system
JP5983763B2 (en) * 2012-11-30 2016-09-06 株式会社安川電機 Robot system
JP2015085411A (en) * 2013-10-29 2015-05-07 セイコーエプソン株式会社 Robot, and working method of robot
DE102016006252A1 (en) 2015-05-29 2016-12-01 Fanuc Corporation Manufacturing system with a robot having a position correction function
US10031515B2 (en) 2015-05-29 2018-07-24 Fanuc Corporation Production system including robot with position correction function that supplies or ejects workpieces to or from a machine tool
DE102016006252B4 (en) 2015-05-29 2019-08-08 Fanuc Corporation Manufacturing system with a robot having a position correction function


Legal Events

Date Code Title Description
A621 Written request for application examination

Effective date: 20100531

Free format text: JAPANESE INTERMEDIATE CODE: A621

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110404

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20111226

A131 Notification of reasons for refusal

Effective date: 20120104

Free format text: JAPANESE INTERMEDIATE CODE: A131

A521 Written amendment

Effective date: 20120305

Free format text: JAPANESE INTERMEDIATE CODE: A523

A131 Notification of reasons for refusal

Effective date: 20121009

Free format text: JAPANESE INTERMEDIATE CODE: A131

A02 Decision of refusal

Effective date: 20130225

Free format text: JAPANESE INTERMEDIATE CODE: A02