WO2022195938A1 - Robot system positioning accuracy measurement method - Google Patents

Robot system positioning accuracy measurement method

Info

Publication number
WO2022195938A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
tip
machine tool
identification
camera
Prior art date
Application number
PCT/JP2021/037694
Other languages
French (fr)
Japanese (ja)
Inventor
秀樹 長末
Original Assignee
DMG MORI CO., LTD. (DMG森精機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DMG MORI CO., LTD.
Publication of WO2022195938A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00: Manipulators mounted on wheels or on carriages

Definitions

  • the present invention relates to a method for measuring the positioning accuracy of the robot tip in a robot system composed of a robot that performs work on a machine tool, an automatic guided vehicle that carries the robot and travels through a work position set with respect to the machine tool, and a control device that controls the operation of the robot and the automatic guided vehicle.
  • a self-propelled robot disclosed in International Publication No. 2018/92222 (Patent Document 1 below) is known as an example of the above-described robot system.
  • This self-propelled robot consists of an unmanned guided vehicle and a manipulator with three or more degrees of freedom mounted on it. The vehicle moves to the vicinity of the target position while recognizing its own approximate position, and then performs precise positioning by recognizing the target, or a marker attached to the target, with a camera.
  • on receiving work requests transmitted from a plurality of machine tools, such as transport work for conveying objects and maintenance work including chip disposal, tool replacement, and lubrication, this self-propelled robot moves to the requesting machine tool and performs the requested work.
  • the manipulator (robot) described above has a structure in which a plurality of arms (links) are connected in series by joints, so the positioning error of the hand (end effector) provided at the tip is governed by the rotational accuracy of each joint, and the motion errors of the joints accumulate. For this reason, when the motion errors of the joints grow large, for example through deterioration over time, they accumulate into a large positioning error of the hand, and as a result the motion accuracy of the robot deteriorates.
  • the above-mentioned unmanned guided vehicle moves on wheels, which are conventionally made of resin. When the automatic guided vehicle operates for a long period, the wheels wear; such wear degrades the positioning accuracy of the automatic guided vehicle and, in turn, the positioning accuracy of the hand.
  • to keep the positioning accuracy of the hand in the robot system within the allowable range, the positioning accuracy must be measured, for example periodically, and when it falls outside the allowable range, maintenance work becomes necessary, such as replacing the parts that make up the robot's joints or replacing the wheels of the automatic guided vehicle.
  • measuring the positioning accuracy of the hand has conventionally required a dedicated special measuring device, and such devices are generally bulky and expensive, making it impractical and difficult to prepare one.
  • the present invention has been made in view of the above circumstances, and its object is to provide a measuring method capable of measuring the positioning accuracy of the tip of the robot in the robot system without using a special measuring device.
  • the present invention for solving the above problems concerns a robot system comprising: a robot having an end effector at its distal end, which uses the end effector to perform work on a machine tool; an automatic guided vehicle that carries the robot and travels through a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot, wherein an identification figure is arranged on one of the moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other.
  • the invention is a method of measuring, using the machine tool, the positioning accuracy of the tip of the robot positioned by the operation of at least one of the robot and the automatic guided vehicle, the method comprising: a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the camera can capture the identification figure, capturing the identification figure with the camera, and calculating, based on the obtained image, the relative positional relationship between the camera and the identification figure at the first position; and a second step of positioning the tip of the robot at a second position moved by a predetermined target distance in a predetermined direction, and moving the moving body of the machine tool by the same target distance in the same direction.
  • the present invention relates to a positioning accuracy measuring method for a robot system.
  • in this robot system, an identification figure is disposed on one of the moving body of the machine tool and the tip of the robot, and a camera that captures the identification figure is disposed on the other.
  • the movable body of the machine tool and the tip of the robot are positioned at a first position where the identification figure can be imaged by the camera, and the identification figure is imaged by the camera.
  • the relative positional relationship between the camera and the identification figure at the first position is calculated (first step). For example, the relative positional relationship between the camera and the identification figure is calculated (recognized) by calculating the center position of the identification figure on the image based on the image including the identification figure captured by the camera.
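As an aside for implementers, the centre-position calculation of the first step can be sketched in a few lines. The following is an illustration only, not the patent's method: it assumes the captured frame has already been binarised so the identification figure appears as black (0) pixels on a white (1) background, and simply takes the centroid of the black pixels.

```python
# First step, sketched: estimate the identification figure's centre on the
# captured image. Assumes a binarised frame: 0 = black marker pixel,
# 1 = white background. Names are illustrative, not from the patent.

def figure_center(image):
    """Return the (row, col) centroid of the black (0) pixels."""
    rows = cols = 0.0
    count = 0
    for r, line in enumerate(image):
        for c, px in enumerate(line):
            if px == 0:
                rows += r
                cols += c
                count += 1
    if count == 0:
        raise ValueError("identification figure not found in image")
    return rows / count, cols / count

# Example: a 2x2 black block centred at (1.5, 1.5) in a 4x4 frame
frame = [
    [1, 1, 1, 1],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]
print(figure_center(frame))  # (1.5, 1.5)
```

A real implementation would more likely use a proper fiducial detector, but a centroid of this kind already yields the sub-pixel centre position compared in the later steps.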
  • the distal end of the robot is moved in a predetermined direction by a predetermined target distance and positioned at a second position.
  • the identification figure is imaged by the camera, and based on the obtained image, the relative positional relationship between the camera and the identification figure at the second position is calculated (third step).
  • the relative positional relationship between the camera and the identification figure at the second position can be obtained by calculating the center position of the identification figure on the image based on the image containing the identification figure captured by the camera, as described above.
  • the moving distance of the moving body can be directly measured using a general-purpose measuring device called a linear scale, for example, and by moving the moving body while measuring with such a device, the moving body can be moved by the target distance with high accuracy.
  • if the machine tool is provided with a numerical controller and the operation of the moving body is controlled by this numerical controller, the moving body can be moved by the target distance with high accuracy under its control.
  • then, the positioning accuracy of the tip of the robot in the movement direction is calculated (fourth step). That is, if the moving distance of the tip of the robot to the second position and the moving distance of the moving body match with high accuracy, the (center) position of the identification figure on the image captured at the first position coincides with the (center) position of the identification figure on the image captured at the second position. Therefore, the positioning accuracy of the tip of the robot can be measured by calculating the difference between the two (center) positions.
  • in this way, the positioning accuracy of the tip of the robot can be measured inexpensively and easily.
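The fourth step then reduces to a subtraction in image space followed by a scale conversion. A minimal sketch, with the function name and the mm-per-pixel factor as illustrative assumptions rather than anything specified in the patent:

```python
# Fourth step, sketched: if the robot tip and the moving body travelled
# identical distances, the figure centre would not move between the two
# images; any residual shift is the robot's positioning error.

def positioning_error(center_1, center_2, mm_per_pixel):
    """Positioning error of the robot tip, in mm, from two image-space
    centre positions (pixel coordinates) of the identification figure."""
    dx = (center_2[0] - center_1[0]) * mm_per_pixel
    dz = (center_2[1] - center_1[1]) * mm_per_pixel
    return dx, dz

# Example: the centre shifted by (4, -2) pixels at 0.05 mm/pixel
err = positioning_error((320.0, 240.0), (324.0, 238.0), 0.05)
print(err)  # (0.2, -0.1)
```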
  • the present invention also provides a robot system comprising: a robot having an end effector at its distal end, which uses the end effector to perform work on a machine tool; an automatic guided vehicle that carries the robot and travels through a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot, wherein an identification figure is arranged on one of the moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other.
  • in this robot system, an identification figure is provided on one of the moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is disposed on the other.
  • in this method, the moving body of the machine tool and the tip of the robot are positioned at a first position where the identification figure can be imaged by the camera, and the identification figure is imaged by the camera (first step). Then, for example, the position of the identification figure on the captured image is recognized by calculating its center position based on the captured image.
  • the tip of the robot is moved in a predetermined direction by a predetermined target distance and positioned at a second position (second step).
  • the moving body of the machine tool is moved in the movement direction of the tip of the robot and positioned so that the position, on the camera frame (captured image), of the identification figure captured by the camera coincides with the position of the identification figure on the image captured at the first position, and the movement distance of the moving body is detected (third step).
  • the moving distance of the moving body can be directly measured by using a general-purpose measuring device called a linear scale, or, if the machine tool is equipped with a numerical control device as described above, it can be calculated based on the position information of the moving body obtained from the numerical control device. By doing so, the movement distance of the moving body can be recognized with high accuracy.
  • according to this aspect as well, the positioning accuracy of the robot tip can be measured by using the machine tool to which the robot system is applied, without an expensive dedicated measuring device. Therefore, according to this measuring method, the positioning accuracy of the tip of the robot can be measured inexpensively and easily.
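In this second aspect, the robot's error appears directly as the difference between the target distance commanded to the robot and the distance the moving body actually had to travel to bring the identification figure back to its first-position location on the frame. A one-line sketch (names illustrative, not from the patent):

```python
def accuracy_from_travel(target_distance_mm, moving_body_travel_mm):
    """Second aspect, sketched: the moving body is driven until the
    identification figure returns to its first-position location on the
    camera frame. The robot's positioning error is the difference between
    the distance the moving body travelled (read from a linear scale or
    the NC position readout) and the target distance given to the robot."""
    return moving_body_travel_mm - target_distance_mm

# Example: robot commanded to move 100.000 mm; the moving body had to
# travel 100.035 mm to re-centre the figure, an error of +0.035 mm
print(round(accuracy_from_travel(100.000, 100.035), 3))  # 0.035
```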
  • the present invention further provides a robot system comprising: a robot having an end effector at its distal end, which uses the end effector to perform work on a machine tool; an automatic guided vehicle that carries the robot and travels through a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot, wherein a detection unit is disposed on one of the moving body of the machine tool and the tip of the robot, and a detected portion to be detected by the detection unit is disposed on the other.
  • the present invention relates to a positioning accuracy measuring method using this robot system.
  • in this robot system, a state is set in which a detection unit is disposed on one of the moving body of the machine tool and the tip of the robot, and a detected portion to be detected by the detection unit is disposed on the other.
  • the moving body of the machine tool and the distal end portion of the robot are positioned at a first position where the detected portion can be detected by the detection unit, and the detected portion is detected by the detection unit (first step).
  • the tip of the robot is moved in a predetermined direction by a predetermined target distance and positioned at a second position (second step).
  • the moving body of the machine tool is moved in the moving direction of the tip of the robot and positioned so that the detection unit detects the detected portion, and the moving distance of the moving body is detected (third step).
  • the moving distance of the moving body can be directly measured by using a general-purpose measuring device called a linear scale, in the same manner as described above, or alternatively, if the machine tool is equipped with a numerical controller that controls the moving body, the moving distance can be calculated based on the position information of the moving body obtained from the numerical controller. By doing so, the movement distance of the moving body can be recognized with high accuracy.
  • according to this aspect as well, the positioning accuracy of the robot tip can be measured without the conventional expensive measuring device; even where a measuring instrument is used, a general-purpose one suffices. Therefore, according to this measuring method, the positioning accuracy of the tip of the robot can be measured inexpensively and easily.
  • as described above, according to the present invention, the positioning accuracy of the tip of the robot can be measured by using the machine tool to which the robot system is applied, without using an expensive measuring device as in the conventional art. Moreover, even if the movement distance of the moving body is to be measured directly, a general-purpose inexpensive measuring device can be used. Therefore, according to this measuring method, the positioning accuracy of the tip of the robot can be measured inexpensively and easily.
  • FIG. 1 is a block diagram showing the configuration of a robot system according to a first embodiment.
  • FIG. 2 is a perspective view showing the main configuration of a machine tool according to the first embodiment.
  • FIG. 3 is a perspective view showing an automatic guided vehicle and a robot according to the first embodiment.
  • FIG. 4 is an explanatory diagram showing an identification figure according to the first embodiment.
  • FIG. 5 is an explanatory diagram for explaining a positioning accuracy measuring method according to the first embodiment.
  • FIG. 6 is an explanatory diagram for explaining the positioning accuracy measuring method according to the first embodiment.
  • FIG. 7 is a perspective view showing the main configuration of a machine tool according to a modification of the first embodiment and a second embodiment.
  • FIG. 8 is a perspective view showing an automatic guided vehicle and a robot according to modifications of the first and second embodiments.
  • FIG. 9 is a perspective view showing the main configuration of a machine tool according to a third embodiment.
  • FIG. 10 is an explanatory diagram for explaining a positioning accuracy measuring method according to the third embodiment.
  • FIG. 11 is a perspective view showing an automatic guided vehicle and a robot according to a modification of the third embodiment.
  • FIG. 12 is a perspective view showing the main configuration of a machine tool according to the modification of the third embodiment.
  • a production system 1 of this example is composed of a robot system 20 and, as peripheral devices, a machine tool 100, a material stocker 120, a product stocker 121, and the like. The robot system 20 is composed of an automatic guided vehicle 35, a robot 25 mounted on the automatic guided vehicle 35, a control device 40 for controlling the robot 25 and the automatic guided vehicle 35, and the like.
  • the machine tool 100 is a compound machining type NC machine tool controlled by a numerical controller (not shown). As shown in the figure, it includes a first main spindle 101 and a second main spindle 103 that are coaxial and face each other, a tool spindle 105, a turret 108 capable of mounting a plurality of tools, a tool rest 107 rotatably holding the turret 108, and the like.
  • the tool spindle 105 is driven by a first Xm-axis feed mechanism, a first Ym-axis feed mechanism, and a first Zm-axis feed mechanism (none of which are shown) under the control of the numerical controller, and moves in the Xm-, Ym-, and Zm-axis directions; similarly, the tool rest 107 is moved in the Xm- and Zm-axis directions by a second Xm-axis feed mechanism and a second Zm-axis feed mechanism (not shown).
  • the Zm axis is a feed axis parallel to the axes of the first spindle 101 and the second spindle 103, the Xm axis is a feed axis perpendicular to the Zm axis in the horizontal plane, and the Ym axis is a feed axis in the vertical direction.
  • a first chuck 102 is attached to the first spindle 101 and a second chuck 104 is attached to the second spindle 103.
  • these chucks grip the workpiece W to be machined and both ends of the machined workpiece W'.
  • the first main spindle 101 and the second main spindle 103 are rotated about their axes, and the tool spindle 105 and the tool post 107 are moved in the Xm-, Ym-, and Zm-axis directions, whereby the workpiece W to be machined is machined by the tool mounted on the tool spindle 105 and the tool mounted on the turret 108.
  • FIG. 3 shows a state in which the holder 110 to which the identification figure 111 is adhered is attached to the tool spindle 105.
  • the holder 110 is stored in a tool magazine (not shown) serving as a tool storage section, and is taken out from the tool magazine and mounted on the tool spindle 105 when the positioning accuracy of the robot system 20 is measured.
  • the material stocker 120 is arranged on the left side of the machine tool 100 in the figure and stores the unmachined workpieces W.
  • the product stocker 121 is arranged on the right side of the machine tool 100 in the figure and stores the machined workpieces W'.
  • the automatic guided vehicle 35 has the robot 25 mounted on a mounting surface 36, which is the upper surface thereof, and is provided with an operation panel 37 that can be carried by an operator.
  • the operation panel 37 includes an input/output unit for inputting/outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of displaying a screen.
  • the automatic guided vehicle 35 is equipped with a sensor (for example, a distance measurement sensor using laser light) capable of recognizing its own position in the factory, and, under the control of the control device 40, travels tracklessly in the factory, including the area around the machine tool 100, through the work positions set for the machine tool 100, the material stocker 120, and the product stocker 121.
  • the control device 40 is arranged inside the automatic guided vehicle 35 .
  • the robot 25 is an articulated robot having three arms, a first arm 26, a second arm 27, and a third arm 28, as its manipulator unit. A support shaft 29 is attached to the tip of the third arm 28, a hand 30 as an end effector is attached to the support shaft 29, and a support bar 31 carrying a camera 32 is also attached to the support shaft 29. Under the control of the control device 40, the robot 25 moves the hand 30 and the camera 32 within a three-dimensional space defined by three orthogonal axes xr, yr, and zr. In this example, the xr axis is set substantially parallel to the front surface of the automatic guided vehicle 35.
  • the control device 40 includes an operation program storage unit 41, a movement position storage unit 42, an operation posture storage unit 43, a map information storage unit 44, a manual operation control unit 45, an automatic operation control unit 46, a map It is composed of an information generation unit 47, a position recognition unit 48, an input/output interface 49, and the like.
  • the control device 40 is connected to the machine tool 100 , the material stocker 120 , the product stocker 121 , the robot 25 , the automatic guided vehicle 35 and the operation panel 37 via the input/output interface 49 .
  • the control device 40 is composed of a computer including a CPU, RAM, ROM, etc.
  • the manual operation control unit 45, the automatic operation control unit 46, the map information generation unit 47, the position recognition unit 48, and the input/output interface 49 are functional units whose functions are realized by computer programs, and they execute the processing described later.
  • the motion program storage unit 41, the movement position storage unit 42, the motion posture storage unit 43, and the map information storage unit 44 are composed of appropriate storage media such as RAM.
  • the manual operation control unit 45 is a functional unit that operates the unmanned guided vehicle 35 and the robot 25 according to operation signals input from the operation panel 37 by the operator. That is, the operator can manually operate the automatic guided vehicle 35 and the robot 25 using the operation panel 37 under the control of the manual operation control unit 45 .
  • for example, when a signal for moving the automatic guided vehicle 35 along two orthogonal axes (the xr and yr axes) is input from the operation panel 37, the manual operation control unit 45 moves the automatic guided vehicle 35 in the direction corresponding to the input signal by the corresponding distance, and when a turning signal about the axis orthogonal to the xr and yr axes is input, it turns the automatic guided vehicle 35 according to the input signal.
  • similarly, when a movement signal for the robot 25 is input, the manual operation control unit 45 moves the tip of the robot 25 in the direction corresponding to the signal by the corresponding distance. Further, when a signal for opening and closing the hand 30 is input from the operation panel 37, it opens and closes the hand 30 accordingly, and when a signal for operating the camera 32 is input from the operation panel 37, it operates the camera 32 accordingly.
  • the operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during automatic production, and a map generation program for operating the automatic guided vehicle 35 when generating the factory map information described later.
  • the automatic driving program and the map generating program are input from, for example, an input/output unit provided on the operation panel 37 and stored in the operation program storage unit 41 .
  • the automatic operation program includes command codes relating to the movement position as a target position to which the automatic guided vehicle 35 moves, the movement speed, and the orientation of the automatic guided vehicle 35, as well as command codes for operating the camera 32.
  • the map generation program includes command codes for causing the automatic guided vehicle 35 to travel all over the factory without a track so that the map information generation unit 47 can generate map information.
  • the map information storage unit 44 is a functional unit that stores map information including arrangement information of the machines, devices, equipment, and the like (hereinafter, devices and the like) arranged in the factory where the automatic guided vehicle 35 travels. This map information is generated by the map information generation unit 47.
  • the map information generation unit 47 causes the automatic guided vehicle 35 to travel according to the map generation program stored in the operation program storage unit 41 under the control of the automatic operation control unit 46 of the control device 40, acquires spatial information in the factory from the distance data detected by the sensor, and recognizes the planar shapes of the devices and the like installed in the factory. Based on, for example, the pre-registered planar shapes of the devices and the like, it recognizes the positions, planar shapes, and so on (arrangement information) of the devices arranged in the factory, in this example the machine tool 100, the material stocker 120, and the product stocker 121.
  • the map information generating unit 47 stores the obtained spatial information and the arrangement information of the devices, etc. in the map information storage unit 44 as map information of the factory.
  • the position recognition unit 48 recognizes the position and posture of the automatic guided vehicle 35 in the factory based on the distance data detected by the sensor and the map information in the factory stored in the map information storage unit 44. Based on the position and orientation of the automatic guided vehicle 35 recognized by the position recognition section 48 , the operation of the automatic guided vehicle 35 is controlled by the automatic operation control section 46 .
  • the movement position storage unit 42 is a functional unit that stores specific movement positions, target positions to which the automatic guided vehicle 35 moves, corresponding to the command codes in the operation program. These movement positions include the work positions set for the machine tool 100, the material stocker 120, and the product stocker 121 described above.
  • This movement position is set, for example, by manually operating the automatic guided vehicle 35 using the operation panel 37 under the control of the manual operation control unit 45 to move it to each target position, and then storing the position data recognized by the position recognition unit 48 in the movement position storage unit 42, an operation known as teaching.
  • the motion posture storage unit 43 is a functional unit that stores, corresponding to command codes in the operation program, data relating to the motion postures of the robot 25, which change sequentially in a predetermined order as the robot 25 operates. The data relating to the motion postures are the rotation angle data of each joint (motor) of the robot 25 in each target posture, obtained when the robot 25 is manually operated into each target posture by teaching using the operation panel 37 under the control of the manual operation control unit 45; this rotation angle data is stored in the motion posture storage unit 43 as the data relating to the motion postures.
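The teaching flow described above, recording per-joint rotation angles for each named posture and replaying them in the taught order, can be sketched as follows; the class and method names are illustrative, not from the patent:

```python
# Sketch of a posture store in the spirit of the motion posture storage
# unit 43: each taught posture is a named tuple of joint rotation angles,
# recorded during teaching and replayed in order during automatic operation.

class PostureStore:
    def __init__(self):
        self._postures = []  # (name, joint_angles) in taught order

    def teach(self, name, joint_angles):
        """Record the joint angles of the current, manually set posture."""
        self._postures.append((name, tuple(joint_angles)))

    def replay(self):
        """Yield postures in the order they were taught."""
        for name, angles in self._postures:
            yield name, angles

store = PostureStore()
store.teach("takeout_start", (0.0, 45.0, -30.0, 0.0, 15.0, 0.0))
store.teach("takeout_grip", (5.0, 50.0, -35.0, 0.0, 20.0, 0.0))
for name, angles in store.replay():
    print(name, angles)
```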
  • specific motion postures of the robot 25 are set for each of the material stocker 120, the machine tool 100, and the product stocker 121. For the material stocker 120, for example, a take-out start posture, work postures (take-out postures) for gripping an unmachined workpiece W stored in the material stocker 120 with the hand 30 and taking it out of the material stocker 120, and a posture at the completion of take-out (take-out completion posture, which in this example is the same as the take-out start posture) are set as the take-out motion postures.
  • for the machine tool 100, a workpiece take-out motion posture for taking out the machined workpiece W' from the machine tool 100 and a workpiece mounting motion posture for attaching the unmachined workpiece W to the machine tool 100 are set.
  • the automatic operation control unit 46 is a functional unit that uses either the automatic operation program or the map generation program stored in the operation program storage unit 41 and operates the automatic guided vehicle 35 and the robot 25 according to that program. At that time, the data stored in the movement position storage unit 42 and the motion posture storage unit 43 are used as necessary.
  • when the automatic operation program stored in the operation program storage unit 41 is executed under the control of the automatic operation control unit 46 of the control device 40, the automatic guided vehicle 35 and the robot 25 operate according to this automatic operation program, and unmanned automatic production is executed.
  • when measuring the positioning accuracy, the holder 110 to which the identification figure 111 is attached is taken out from the tool magazine (not shown) and mounted on the tool spindle 105, which is the moving body of the machine tool 100, and the automatic guided vehicle 35 and the robot 25 are operated so that the tool spindle 105, the automatic guided vehicle 35, and the robot 25 are positioned at a position where the identification figure 111 can be imaged by the camera 32 attached to the robot 25 (this position is referred to as the "first position"). At this time, the automatic guided vehicle 35 is positioned at the first position such that the xr axis set for itself is parallel to the Zm axis of the machine tool 100. The positional relationship between the holder 110 positioned in this way and the tip of the robot 25 is indicated by the solid line in FIG. 6(a).
  • the identification figure 111 has a matrix structure in which a plurality of square pixels are arranged two-dimensionally, and each pixel is displayed in white or black. In FIGS. 3 and 5, the black pixels are shaded.
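Such a two-dimensional matrix of black and white square cells is similar in spirit to common fiducial markers (ArUco or AprilTag, for example). The following small sketch, which expands a bit matrix into a pixel image, is purely illustrative; the cell size and bit pattern are arbitrary, not the patent's figure:

```python
def render_marker(bits, cell_px):
    """Expand a matrix of 0/1 marker cells into a pixel image with
    cell_px x cell_px pixels per cell (0 = black, 1 = white)."""
    image = []
    for row in bits:
        pixel_row = []
        for cell in row:
            pixel_row.extend([cell] * cell_px)
        for _ in range(cell_px):
            image.append(list(pixel_row))
    return image

# Example: a 2x2 pattern rendered at 2 pixels per cell
img = render_marker([[0, 1], [1, 0]], 2)
print(img[0])  # [0, 0, 1, 1]
```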
  • FIG. 7(a) shows an image of the identification figure 111 captured by the camera 32 at the first position.
  • next, the robot 25 is driven to position its tip at a position moved by a predetermined target distance in the xr-axis plus direction (this position is referred to as the "second position").
  • the tool spindle 105 is moved by the same target distance in the Zm -axis positive direction, which is the same direction as the moving direction of the tip of the robot 25 .
  • This state is indicated by a dashed line in FIG. 6(b).
  • the movement distance of the tip of the robot 25 is recognized based on control position information in the manual operation control unit 45.
  • the movement distance of the tool spindle 105 is recognized based on the position information of the numerical controller (not shown) that controls it.
  • FIG. 7(b) shows an image of the identification figure 111 captured by the camera 32 at the second position.
  • based on the relative positional relationship (xr1, zr1) between the camera 32 and the identification figure 111 at the first position and the relative positional relationship (xr2, zr2) at the second position, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated.
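Converting the difference between (xr1, zr1) and (xr2, zr2) from pixels into machine units requires the camera scale, which can be estimated from the known physical size of the identification figure. A sketch with illustrative numbers only:

```python
def camera_scale(marker_size_mm, marker_size_px):
    """Estimate mm per pixel from the known physical size of the
    identification figure and its measured size on the image."""
    return marker_size_mm / marker_size_px

# Example: a figure 40 mm wide spanning 400 px gives 0.1 mm/px, so a
# 3 px shift of the figure centre corresponds to 0.3 mm of tip error.
scale = camera_scale(40.0, 400.0)
print(round(3 * scale, 3))  # 0.3
```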
  • as described above, according to this embodiment, the positioning accuracy of the tip of the robot 25 can be measured easily and inexpensively by using the machine tool 100 to which the robot system 20 is applied, without using an expensive measuring device as in the conventional art. The achievable measurement accuracy depends on the resolution of the camera 32; by using a high-resolution camera 32, the positioning accuracy can be measured with high precision.
  • in the above example, the camera 32 is arranged on the robot 25 and the identification figure 111 is arranged on the tool spindle 105 of the machine tool 100, but the arrangement of the camera and the identification figure is relative: as shown in FIG. 8, a camera 112 may be arranged on the tool spindle 105 and an identification figure on the robot 25. With this arrangement as well, the positioning accuracy of the tip of the robot 25 can be measured.
  • a holder 110 holding a camera 112 is attached to the tool spindle 105.
  • an identification figure 111 is attached to the front surface of the support bar 31 of the robot 25 .
  • the holder 110 to which the identification figure 111 is attached is taken out of a tool magazine (not shown) and mounted on the tool spindle 105, which is a moving body of the machine tool 100 (see FIG. 3). The automatic guided vehicle 35 and the robot 25 are then operated to position the tool spindle 105, the automatic guided vehicle 35, and the robot 25 at a position (first position) where the identification figure 111 can be imaged by the camera 32 attached to the robot 25. At this time, the automatic guided vehicle 35 is positioned at the first position such that the xr-axis set for itself is parallel to the Zm-axis of the machine tool 100 (see FIG. 6). Then, at this first position, the identification figure 111 is imaged by the camera 32.
  • the robot 25 is driven to position its tip at the second position, moved in the xr-axis plus direction by a predetermined target distance.
  • the tool spindle 105 is moved in the Zm-axis plus direction, which is the same direction as the moving direction of the tip of the robot 25, until the position of the identification figure 111 captured by the camera 32 on the camera frame coincides with the position of the identification figure 111 imaged at the first position, and the movement distance of the tool spindle 105 is detected.
  • the position of the identification figure 111 on the camera frame is recognized, for example, by its center position on the captured image ((xr1, zr1), (xr2, zr2)), as shown in FIG. 7. For example, the center position (xr1, zr1) of the identification figure 111 is calculated from the image captured at the first position, and the tool spindle 105 is then positioned such that the center position (xr2, zr2) of the identification figure 111 imaged by the camera 32 at the second position coincides with the center position (xr1, zr1) at the first position.
  • the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated as the difference between the detected movement distance of the tool spindle 105 and the movement distance of the tip of the robot 25.
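The coincidence procedure above can be mimicked with a small simulation (everything here, the jog increment, the camera scale, and the simulated cell itself, is an assumption for illustration, not part of the patent): the spindle is jogged until the figure returns to its first-position pixel location, and the NC-reported travel is then compared with the commanded tip travel:

```python
class SimulatedCell:
    """Hypothetical stand-in for the camera and NC interfaces."""
    MM_PER_STEP = 0.01   # assumed NC jog increment
    PX_PER_MM = 20.0     # assumed camera scale

    def __init__(self, tip_actual_travel_mm):
        self.offset_mm = tip_actual_travel_mm  # spindle travel still needed
        self.moved_mm = 0.0                    # NC-reported spindle travel

    def figure_offset_px(self):
        """Pixel distance of the figure from its first-position location."""
        return self.offset_mm * self.PX_PER_MM

    def jog(self):
        """Move the spindle one increment in the Zm plus direction."""
        self.offset_mm -= self.MM_PER_STEP
        self.moved_mm += self.MM_PER_STEP

# Made-up numbers: the tip was commanded to move 100.0 mm but actually
# moved 100.12 mm, so the spindle must travel 100.12 mm to coincidence.
cell = SimulatedCell(tip_actual_travel_mm=100.12)
while cell.figure_offset_px() > 0.5 * cell.PX_PER_MM * cell.MM_PER_STEP:
    cell.jog()                       # jog until the images coincide

error_mm = cell.moved_mm - 100.0     # difference from the commanded travel
```

The difference between the NC-reported travel and the commanded tip travel (about 0.12 mm here) is the positioning accuracy figure this aspect produces.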
  • in this aspect as well, the positioning accuracy of the tip of the robot 25 can be measured easily and inexpensively by using the machine tool 100 to which the robot system 20 is applied. Also in this aspect, by using a high-resolution camera 32, the positioning accuracy can be measured with high accuracy.
  • the camera 32 is arranged on the robot 25 and the identification figure 111 is arranged on the tool spindle 105 of the machine tool 100, but the arrangement of the camera 32 and the identification figure 111 is relative.
  • a camera 112 may be arranged on the tool spindle 105 as shown in FIG. 8, and an identification figure may be arranged on the robot 25. Even in this arrangement, the positioning accuracy of the tip of the robot 25 can be measured by performing the same operation as described above.
  • the holder 110 holding the touch probe 114 as the detection unit is taken out of a tool magazine (not shown) and mounted on the tool spindle 105, which is the moving body of the machine tool 100.
  • the automatic guided vehicle 35 and the robot 25 are operated so that the right end face of the support bar 31 of the robot 25 moves to a position near the plus side of the touch probe 114 in the Zm-axis direction (first position).
  • the automatic guided vehicle 35 is positioned at the first position such that the xr-axis set for itself is parallel to the Zm-axis of the machine tool 100 (see FIG. 11).
  • the tool spindle 105 is moved in the Zm-axis plus direction until the touch probe 114 touches the end face of the support bar 31 and detects the support bar 31 (see FIG. 11(a)).
  • the support bar 31 functions as the detected portion for the touch probe 114.
  • the robot 25 is driven to move its tip in the xr-axis plus direction by a predetermined target distance xra.
  • the tool spindle 105 is moved in the Zm-axis plus direction, which is the same direction as the moving direction of the tip of the robot 25, until the touch probe 114 touches the end face of the support bar 31 and detects the support bar 31 (see FIG. 11(b)).
  • the movement distance Zma of the tool spindle 105 is detected based on the control position information in the numerical controller (not shown), and the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated as the difference between the detected movement distance Zma of the tool spindle 105 and the movement distance xra of the tip of the robot 25.
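The probe-based variant can be sketched the same way (a simulation under stated assumptions, not the patent's implementation): the spindle is fed in steps until a simulated probe trips against the support-bar face, before and after the tip's move, and the difference between the two NC travels stands in for Zma:

```python
def travel_until_trip(trip_position_mm, step_mm=0.01):
    """Feed the spindle in +Zm increments until the (simulated) touch
    probe trips, and return the NC-reported travel.  trip_position_mm
    stands in for the unknown position of the support-bar end face."""
    z = 0.0
    while z < trip_position_mm:
        z += step_mm
    return z

# Made-up numbers: bar face at 10.00 mm before the move, 110.13 mm after,
# so the tip actually moved about 100.13 mm against a commanded 100.0 mm.
zma = travel_until_trip(110.13) - travel_until_trip(10.00)
xra = 100.0          # commanded tip travel
error = zma - xra    # positioning error in the xr direction
```

In a real cell the feed resolution and the probe's repeatability bound the measurement accuracy, just as the camera resolution does in the imaging-based aspects.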
  • in this aspect as well, the positioning accuracy of the tip of the robot 25 can be measured easily and inexpensively by using the machine tool 100 to which the robot system 20 is applied, without using a conventional expensive measuring device.
  • in the above example, the detected portion is set (that is, arranged) on the robot 25, and the touch probe 114 serving as the detection unit is arranged on the tool spindle 105 of the machine tool 100; however, this arrangement is also relative. As shown in FIG. 12, the hand 30 of the robot 25 may grip the touch probe 114, and as shown in FIG. 13, a holder 115 on which a block-shaped detected portion 116 is formed may be attached to the tool spindle 105. Also in this mode, the positioning accuracy of the tip of the robot 25 can be measured by performing the same operation as described above.
  • in the above examples, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is measured.
  • the positioning accuracy of the tip of the robot 25 in the zr-axis direction can be measured in the same manner.
  • not only for the robot 25 but also for the automatic guided vehicle 35, the positioning accuracy in the xr-axis direction can be measured by a similar operation.
  • the positioning accuracy of the tip of the robot 25 in the xr-axis or zr-axis direction in the combined operation of the automatic guided vehicle 35 and the robot 25, that is, in a mode in which both are moved, can also be measured by the same operation as described above.
  • the moving distance of the tool spindle 105 as a moving body can be directly measured using, for example, a general-purpose inexpensive measuring instrument called a linear scale.
  • the identification figure is not limited to the above example, and may be of any shape as long as the relative positional relationship between the camera and the identification figure can be obtained from the captured image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

According to the present invention, an identification figure (111) is disposed on one of a moving body (105) or a tip of a robot of a machine tool, and a camera is disposed on the other of the moving body (105) or the tip of the robot. The moving body (105) and the tip of the robot are positioned at a first position, the identification figure (111) is imaged by the camera, and on the basis of the image, the relative positions of the camera and the identification figure (111) at the first position are calculated. Next, at least one of the robot and an automatic guided vehicle is operated to position the tip of the robot at a predetermined second position and move the moving body (105) in the same direction by the same distance to image the identification figure (111). Next, on the basis of the obtained image, the relative positions of the camera and the identification figure (111) at the second position are calculated, and on the basis of the relative positions of the camera and the identification figure at the first position and the second position, the positioning accuracy of the tip of the robot is calculated.

Description

Robot system positioning accuracy measurement method
The present invention relates to a method for measuring the positioning accuracy of the tip of a robot in a robot system composed of a robot that performs work on a machine tool, an automatic guided vehicle that carries the robot and travels to a work position set for the machine tool, and a control device that controls the operations of the robot and the automatic guided vehicle.
Conventionally, as an example of the robot system described above, a self-propelled robot disclosed in International Publication No. WO 2018/92222 (Patent Document 1 below) is known. This self-propelled robot comprises an automatic guided vehicle and a manipulator with three or more degrees of freedom mounted on the vehicle; it moves to the vicinity of a target position while recognizing its own approximate position by observing radio waves or laser light serving as position reference information, and then performs precise positioning by recognizing the target, or a marker attached to the target, with a camera.
This self-propelled robot receives work requests transmitted from a plurality of machine tools, concerning transport operations for conveying objects as well as maintenance operations such as chip disposal, tool replacement, and lubrication, and executes the requested work on the requesting machine tool.
Patent Document 1: WO 2018/92222
The manipulator (robot) described above has a structure in which a plurality of arms (links) are connected in series by joints. The positioning error of the hand (end effector) provided at its tip is governed by the rotational accuracy of each joint and is the accumulation of the motion errors of the individual joints. Consequently, when the motion error of each joint grows, for example through aging, the accumulated errors enlarge the positioning error of the hand, and the motion accuracy of the robot deteriorates as a result.
The automatic guided vehicle described above moves on wheels, which have conventionally been made of resin. When the vehicle operates for a long time the wheels wear, and this wear degrades the positioning accuracy of the vehicle; as a result, the positioning accuracy of the hand of the robot mounted on the vehicle also deteriorates.
When the positioning accuracy of the robot hand deteriorates in this way, errors arise in the operations planned for the robot system, and those operations cannot be completed.
Therefore, to guarantee that the positioning accuracy of the hand in the robot system remains within an allowable range, the positioning accuracy should, for example, be measured periodically, and when it falls outside the allowable range, maintenance work such as replacing the components of the robot joints or the wheels of the automatic guided vehicle must be performed.
Conventionally, however, measuring the positioning accuracy of the hand has required a dedicated special measuring device. Such devices are generally expensive, so it has been impractical and difficult for ordinary users to prepare such costly equipment.
The present invention has been made in view of the above circumstances, and its object is to provide a measuring method capable of measuring the positioning accuracy of the robot tip in the robot system without using a special measuring device.
The present invention for solving the above problem relates to a method for measuring the positioning accuracy of a robot system comprising: a robot having an end effector at its tip, which performs work on a machine tool using the end effector; an automatic guided vehicle that carries the robot and travels to a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot; wherein an identification figure is arranged on one of a moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other. The method measures, using the machine tool, the positioning accuracy of the robot tip positioned by the operation of at least one of the robot and the automatic guided vehicle, and sequentially performs:
a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the identification figure can be imaged by the camera, imaging the identification figure with the camera, and calculating, from the obtained image, the relative positional relationship between the camera and the identification figure at the first position;
a second step of operating at least one of the robot and the automatic guided vehicle to position the tip of the robot at a second position, moved in a predetermined direction by a predetermined target distance, and moving the moving body of the machine tool by the same target distance in the moving direction of the robot tip;
a third step of then imaging the identification figure with the camera and calculating, from the obtained image, the relative positional relationship between the camera and the identification figure at the second position; and
a fourth step of calculating the positioning accuracy of the robot tip in the moving direction from the relative positional relationships between the camera and the identification figure at the first and second positions.
According to the measuring method of this aspect (the first aspect), the robot system is first set up with the identification figure arranged on one of the moving body of the machine tool and the tip of the robot, and the camera for imaging the identification figure arranged on the other.
Next, the moving body of the machine tool and the tip of the robot are positioned at a first position where the identification figure can be imaged by the camera; the identification figure is imaged by the camera, and the relative positional relationship between the camera and the identification figure at the first position is calculated from the obtained image (first step). For example, the relative positional relationship is calculated (recognized) by computing the center position of the identification figure on the image captured by the camera.
Next, by operating at least one of the robot and the automatic guided vehicle, the tip of the robot is positioned at a second position, moved in a predetermined direction by a predetermined target distance, and the moving body of the machine tool is moved by the same target distance in the moving direction of the robot tip (second step); the identification figure is then imaged by the camera, and the relative positional relationship between the camera and the identification figure at the second position is calculated from the obtained image (third step).
The relative positional relationship at the second position can likewise be obtained by computing the center position of the identification figure on the captured image. The moving distance of the moving body can be measured directly using, for example, a general-purpose measuring instrument called a linear scale, and by moving the moving body with such an instrument, it can be moved by the target distance with high accuracy. Alternatively, when the machine tool has a numerical controller that controls the motion of the moving body, moving the moving body under the control of the numerical controller likewise moves it by the target distance with high accuracy.
Then, the positioning accuracy of the robot tip in the moving direction is calculated from the relative positional relationships between the camera and the identification figure at the first and second positions (fourth step). That is, if the moving distance of the robot tip to the second position and the moving distance of the moving body agree with high accuracy, the (center) position of the identification figure on the image captured at the first position coincides with that on the image captured at the second position. Therefore, the positioning accuracy of the robot tip can be calculated from the difference between the (center) position of the identification figure on the image captured at the first position and that on the image captured at the second position.
As described above, according to the measuring method of the first aspect of the present invention, the positioning accuracy of the robot tip can be measured by using the machine tool to which the robot system is applied, without using an expensive measuring device as in the prior art; even when the moving distance of the moving body must be measured, a general-purpose, inexpensive instrument suffices. Thus, this method allows the positioning accuracy of the robot tip to be measured easily and at low cost.
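The center-position computation used in the first and third steps can be illustrated with a minimal sketch (an assumption for illustration: a dark identification figure on a bright background, located by a simple intensity threshold rather than the patent's actual image processing):

```python
import numpy as np

def figure_center(image, threshold=128):
    """Return the (x, z) centroid, in pixels, of the dark identification
    figure in a grayscale image -- i.e. the camera-figure relative
    position compared between the first and second positions."""
    ys, xs = np.nonzero(image < threshold)  # pixels belonging to the figure
    return xs.mean(), ys.mean()

# Synthetic 100x100 image: white background, 10x10 black square
img = np.full((100, 100), 255, dtype=np.uint8)
img[60:70, 40:50] = 0
cx, cy = figure_center(img)  # centroid of the square: (44.5, 64.5)
```

A real identification figure would be located more robustly (for example with a coded-marker detector), but any method that yields a sub-pixel center position serves the role described here.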
The present invention also relates to a method for measuring the positioning accuracy of a robot system comprising: a robot having an end effector at its tip, which performs work on a machine tool using the end effector; an automatic guided vehicle that carries the robot and travels to a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot; wherein an identification figure is arranged on one of a moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other. The method measures, using the machine tool, the positioning accuracy of the robot tip positioned by the operation of at least one of the robot and the automatic guided vehicle, and sequentially performs:
a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the identification figure can be imaged by the camera, and imaging the identification figure with the camera;
a second step of operating at least one of the robot and the automatic guided vehicle to move the tip of the robot in a predetermined direction by a predetermined target distance and position it at a second position;
a third step of moving the moving body of the machine tool in the moving direction of the robot tip until the position of the identification figure on the camera frame coincides with the position of the identification figure imaged at the first position, and detecting the moving distance of the moving body; and
a fourth step of calculating the difference between the moving distance of the robot tip and the detected moving distance of the moving body, and taking it as the positioning accuracy of the robot tip in the moving direction.
According to the measuring method of this aspect (the second aspect), the robot system is first set up with the identification figure arranged on one of the moving body of the machine tool and the tip of the robot, and the camera for imaging the identification figure arranged on the other.
Next, the moving body of the machine tool and the tip of the robot are positioned at a first position where the identification figure can be imaged by the camera, and the identification figure is imaged by the camera (first step). The position of the identification figure on the captured image is then recognized, for example, by calculating its center position on the captured image.
Next, by operating at least one of the robot and the automatic guided vehicle, the tip of the robot is moved in a predetermined direction by a predetermined target distance and positioned at a second position (second step).
After this, the moving body of the machine tool is moved in the moving direction of the robot tip until the position of the identification figure on the camera frame (captured image) coincides with the position of the identification figure imaged at the first position, and the moving distance of the moving body is detected (third step). As before, the moving distance of the moving body can be measured directly using, for example, a general-purpose measuring instrument called a linear scale; alternatively, when the machine tool has a numerical controller that controls the motion of the moving body, the moving distance can be calculated from the position information of the moving body obtained from the numerical controller. In this way, the moving distance of the moving body can be recognized with high accuracy.
Next, the difference between the moving distance of the robot tip and the detected moving distance of the moving body is calculated and taken as the positioning accuracy of the robot tip in the moving direction (fourth step).
As described above, according to the measuring method of the second aspect of the present invention, as with the first aspect, the positioning accuracy of the robot tip can be measured by using the machine tool to which the robot system is applied, without using an expensive measuring device as in the prior art; even when the moving distance of the moving body is measured directly, a general-purpose, inexpensive instrument can be used. Thus, this method allows the positioning accuracy of the robot tip to be measured easily and at low cost.
The present invention further relates to a method for measuring the positioning accuracy of a robot system comprising: a robot having an end effector at its tip, which performs work on a machine tool using the end effector; an automatic guided vehicle that carries the robot and travels to a work position set for the machine tool; and a control device that controls the automatic guided vehicle and the robot; wherein a detection unit is arranged on one of a moving body of the machine tool and the tip of the robot, and a detected portion to be detected by the detection unit is set on the other. The method measures, using the machine tool, the positioning accuracy of the robot tip positioned by the operation of at least one of the robot and the automatic guided vehicle, and sequentially performs:
a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the detected portion can be detected by the detection unit, and detecting the detected portion with the detection unit;
a second step of operating at least one of the robot and the automatic guided vehicle to move the tip of the robot in a predetermined direction by a predetermined target distance and position it at a second position;
a third step of moving the moving body of the machine tool in the moving direction of the robot tip to a position where the detection unit detects the detected portion, and detecting the moving distance of the moving body; and
a fourth step of calculating the difference between the moving distance of the robot tip and the detected moving distance of the moving body, and taking it as the positioning accuracy of the robot tip in the moving direction.
 この態様(第3の態様)の測定方法によれば、まず、ロボットシステムは、検知部が前記工作機械の移動体又は前記ロボットの先端部の一方に配設され、前記検知部によって検出される被検知部が前記工作機械の移動体又は前記ロボットの先端部の他方に配設された状態に設定される。 According to the measuring method of this aspect (third aspect), first, in the robot system, a detection unit is disposed on either the moving body of the machine tool or the tip of the robot, and the detection unit detects A state is set in which the detected portion is disposed on the other of the moving body of the machine tool and the tip portion of the robot.
 次に、前記検知部によって前記被検知部を検出可能な第1位置に前記工作機械の移動体と前記ロボットの先端部とを位置決めして、前記検知部により前記被検知部を検出する(第1工程)。 Next, the moving body of the machine tool and the distal end portion of the robot are positioned at a first position where the detected portion can be detected by the detection portion, and the detected portion is detected by the detection portion (first position). 1 step).
 次に、前記ロボット又は無人搬送車の少なくとも一方を動作させることにより、前記ロボットの先端部を予め定められた方向に、予め定められた目標とする距離だけ移動させて第2位置に位置決めする(第2工程)。 Next, by operating at least one of the robot and the automatic guided vehicle, the tip of the robot is moved in a predetermined direction by a predetermined target distance and positioned at a second position ( second step).
 この後、前記工作機械の移動体を前記ロボットの先端部の移動方向に移動させて、前記検知部が前記被検知部を検知する位置に位置決めするとともに、前記移動体の移動距離を検出する(第3工程)。移動体の移動距離は、上述と同様に、例えば、リニアスケールと称される汎用の測定器を用いて直接的に測定することができ、或いは、工作機械が数値制御装置を備え、この数値制御装置によって移動体の動作が制御される場合には、移動体の移動距離は、数値制御装置から得られる移動体の位置情報に基づいて算出することができる。斯くして、このようにすることで、移動体の移動距離を高精度に認識することができる。 After that, the moving body of the machine tool is moved in the moving direction of the tip of the robot, and the detection unit positions the detected part at a position, and detects the moving distance of the moving body ( 3rd step). The moving distance of the moving body can be directly measured by using a general-purpose measuring device called a linear scale, for example, in the same manner as described above, or alternatively, if the machine tool is equipped with a numerical controller and this numerical control is When the motion of the mobile body is controlled by the device, the moving distance of the mobile body can be calculated based on the position information of the mobile body obtained from the numerical control device. Thus, by doing so, it is possible to recognize the movement distance of the moving object with high accuracy.
 Next, the difference between the moving distance of the robot tip and the detected moving distance of the moving body is calculated and taken as the positioning accuracy of the robot tip in the moving direction (fourth step).
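The four steps above reduce to a simple subtraction once the two distances are known. A minimal sketch in Python (the function name and units are illustrative; the patent does not prescribe any implementation):

```python
def positioning_error(robot_target_distance_mm: float,
                      moving_body_distance_mm: float) -> float:
    """Fourth step: the positioning accuracy of the robot tip in the
    moving direction is the difference between the distance the robot
    was commanded to move and the distance the machine-tool moving
    body actually travelled before the detection unit re-detected the
    detected part."""
    return robot_target_distance_mm - moving_body_distance_mm

# Example: the robot was commanded to move 100.0 mm; the moving body,
# read from the numerical controller, travelled 100.12 mm before the
# detection unit re-detected the detected part.
error = positioning_error(100.0, 100.12)
print(round(error, 3))  # -0.12
```

A negative value here would indicate that the robot tip undershot its commanded distance relative to the machine tool's (much more accurate) feed axis.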
 As described above, according to the measuring method of the third aspect of the present invention, as with the first and second aspects, the positioning accuracy of the robot tip can be measured without a conventional expensive measuring device, simply by using the machine tool to which the robot system is applied; even when the moving distance of the moving body is measured directly, a general-purpose, inexpensive measuring instrument suffices. This measuring method therefore allows the positioning accuracy of the robot tip to be measured inexpensively and easily.
 In summary, according to the measuring method of the present invention, the positioning accuracy of the robot tip can be measured by using the machine tool to which the robot system is applied, without a conventional expensive measuring device. Even when the moving distance of the moving body is measured directly, a general-purpose, inexpensive measuring instrument can be used. The positioning accuracy of the robot tip can therefore be measured inexpensively and easily.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a plan view showing the schematic configuration of a production system according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of a robot system according to the first embodiment.
FIG. 3 is a perspective view showing the main configuration of a machine tool according to the first embodiment.
FIG. 4 is a perspective view showing an automatic guided vehicle and a robot according to the first embodiment.
FIG. 5 is an explanatory diagram showing an identification figure according to the first embodiment.
FIG. 6 is an explanatory diagram for explaining a positioning accuracy measuring method according to the first embodiment.
FIG. 7 is an explanatory diagram for explaining the positioning accuracy measuring method according to the first embodiment.
FIG. 8 is a perspective view showing the main configuration of a machine tool according to a modification of the first and second embodiments.
FIG. 9 is a perspective view showing an automatic guided vehicle and a robot according to the modification of the first and second embodiments.
FIG. 10 is a perspective view showing the main configuration of a machine tool according to a third embodiment of the present invention.
FIG. 11 is an explanatory diagram for explaining a positioning accuracy measuring method according to the third embodiment.
FIG. 12 is a perspective view showing an automatic guided vehicle and a robot according to a modification of the third embodiment.
FIG. 13 is a perspective view showing the main configuration of a machine tool according to the modification of the third embodiment.
 Specific embodiments of the present invention will be described below with reference to the drawings.
1. First Embodiment
[System Configuration]
 First, the system configuration according to the first embodiment of the present invention will be described. As shown in FIGS. 1 and 2, the production system 1 of this example comprises a robot system 20, a machine tool 100, and a material stocker 120 and a product stocker 121 as peripheral devices. The robot system 20 comprises an automatic guided vehicle 35, a robot 25 mounted on the automatic guided vehicle 35, and a control device 40 that controls the robot 25 and the automatic guided vehicle 35.
 The machine tool 100 is a multi-tasking NC machine tool controlled by a numerical controller (not shown). As shown in FIG. 3, it comprises a tool spindle 105 that holds and rotates a tool, a first spindle 101 and a second spindle 103 arranged coaxially and facing each other, a turret 108 on which a plurality of tools can be mounted, and a tool rest 107 that rotatably holds the turret 108.
 Under the control of the numerical controller (not shown), the tool spindle 105 is moved in the mutually orthogonal Xm-, Ym- and Zm-axis directions by a first Xm-axis feed mechanism, a first Ym-axis feed mechanism and a first Zm-axis feed mechanism (likewise not shown); similarly, the tool rest 107 is moved in the Xm- and Zm-axis directions by a second Xm-axis feed mechanism and a second Zm-axis feed mechanism (not shown). The Zm-axis is a feed axis parallel to the axes of the first spindle 101 and the second spindle 103, the Xm-axis is a feed axis orthogonal to the Zm-axis in the horizontal plane, and the Ym-axis is a vertical feed axis.
 A first chuck 102 is mounted on the first spindle 101, and a second chuck 104 is mounted on the second spindle 103; the two ends of an unmachined workpiece W or a machined workpiece W' are gripped by the first chuck 102 and the second chuck 104. In this state, under the control of the numerical controller (not shown), the first spindle 101 and the second spindle 103 rotate about their axes, and the tool spindle 105 and the tool rest 107 move appropriately in the Xm-, Ym- and Zm-axis directions, so that the unmachined workpiece W is machined by the tool mounted on the tool spindle 105 and the tool mounted on the turret 108.
 In this example, as shown in FIG. 3, a support 109 for supporting the machined workpiece W' is provided on the outer peripheral surface of the turret 108; when the robot system 20 removes the machined workpiece W', it is temporarily placed on this support 109. FIG. 3 also shows a holder 110, to which an identification figure 111 is affixed, mounted on the tool spindle 105. While the machine tool 100 is machining, this holder 110 is stored in a tool magazine (not shown) serving as a tool storage section; when the positioning accuracy of the robot system 20 is measured, it is taken out of the tool magazine and mounted on the tool spindle 105.
 The material stocker 120 is arranged to the left of the machine tool 100 in FIG. 1 and stocks unmachined workpieces W (material) to be machined by the machine tool 100. The product stocker 121 is arranged to the right of the machine tool 100 in FIG. 1 and stocks workpieces W' (products or semi-finished products) machined by the machine tool 100.
 As shown in FIGS. 1 and 4, the automatic guided vehicle 35 carries the robot 25 on a mounting surface 36, which is its upper surface, and is provided with an operation panel 37 that an operator can carry. The operation panel 37 includes an input/output unit for inputting and outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of showing a screen.
 The automatic guided vehicle 35 is also equipped with a sensor capable of recognizing its own position within the factory (for example, a distance measurement sensor using laser light). Under the control of the control device 40, it travels tracklessly through the factory, including the area where the machine tool 100, the material stocker 120 and the product stocker 121 are arranged; in this example it visits the work positions set for each of the machine tool 100, the material stocker 120 and the product stocker 121. In this example, the control device 40 is housed inside the automatic guided vehicle 35.
 The robot 25 is an articulated robot having three arms as its manipulator section: a first arm 26, a second arm 27 and a third arm 28. A support shaft 29 is attached to the tip of the third arm 28, and a hand 30 as an end effector is mounted on the support shaft 29. A support bar 31 is further attached to the support shaft 29, and a camera 32, likewise an end effector, is mounted on the support bar 31. Under the control of the control device 40, the robot 25 moves the hand 30 and the camera 32 within a three-dimensional space defined by the three orthogonal axes xr, yr and zr. In this example, the xr-axis is set substantially parallel to the front surface of the automatic guided vehicle 35.
 As shown in FIG. 2, the control device 40 comprises an operation program storage unit 41, a movement position storage unit 42, a motion posture storage unit 43, a map information storage unit 44, a manual operation control unit 45, an automatic operation control unit 46, a map information generation unit 47, a position recognition unit 48 and an input/output interface 49. Through the input/output interface 49, the control device 40 is connected to the machine tool 100, the material stocker 120, the product stocker 121, the robot 25, the automatic guided vehicle 35 and the operation panel 37.
 The control device 40 is composed of a computer including a CPU, RAM and ROM. The manual operation control unit 45, the automatic operation control unit 46, the map information generation unit 47, the position recognition unit 48 and the input/output interface 49 are functions realized by computer programs and execute the processing described later, while the operation program storage unit 41, the movement position storage unit 42, the motion posture storage unit 43 and the map information storage unit 44 are composed of suitable storage media such as RAM.
 The manual operation control unit 45 is a functional unit that operates the automatic guided vehicle 35 and the robot 25 according to operation signals input by the operator from the operation panel 37. That is, under the control of the manual operation control unit 45, the operator can manually operate the automatic guided vehicle 35 and the robot 25 using the operation panel 37.
 Specifically, when a signal for moving the automatic guided vehicle 35 in either direction of the two orthogonal axes (the xr-axis and the yr-axis) set for the vehicle in the horizontal plane is input from the operation panel 37, the manual operation control unit 45 moves the vehicle in the direction corresponding to the input signal by the corresponding distance; when a signal for turning the vehicle about the zr-axis (the vertical axis orthogonal to the xr- and yr-axes) is input, it turns the vehicle according to the input signal.
 Likewise, when a signal for moving the tip of the robot 25 in the xr-, yr- or zr-axis direction is input from the operation panel 37, the manual operation control unit 45 moves the tip of the robot 25 in the direction corresponding to the input signal by the corresponding distance. When a signal for opening or closing the hand 30 is input from the operation panel 37, the manual operation control unit 45 opens or closes the hand 30 accordingly, and when a signal for operating the camera 32 is input, it operates the camera 32 accordingly.
 The operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during automatic production, and a map generation program for operating the automatic guided vehicle 35 when generating the factory map information described later. The automatic operation program and the map generation program are input, for example, from the input/output unit provided on the operation panel 37 and stored in the operation program storage unit 41.
 The automatic operation program contains command codes relating to the movement positions serving as target positions to which the automatic guided vehicle 35 moves, its movement speed and its orientation, as well as command codes relating to the sequential motions of the robot 25 and to the operation of the camera 32. The map generation program contains command codes for driving the automatic guided vehicle 35 tracklessly throughout the entire factory so that the map information generation unit 47 can generate the map information.
 The map information storage unit 44 is a functional unit that stores map information, including arrangement information of the machines, devices, equipment and the like (devices, etc.) placed in the factory in which the automatic guided vehicle 35 travels; this map information is generated by the map information generation unit 47.
 When the automatic guided vehicle 35 is driven according to the map generation program stored in the operation program storage unit 41 under the control of the automatic operation control unit 46 of the control device 40, the map information generation unit 47 acquires spatial information of the factory from the distance data detected by the sensor and recognizes the planar shapes of the devices and other objects arranged in the factory. Based on, for example, pre-registered planar shapes, it identifies the positions, planar shapes and other attributes (arrangement information) of the specific devices arranged in the factory, in this example the machine tool 100, the material stocker 120 and the product stocker 121. The map information generation unit 47 then stores the obtained spatial information and device arrangement information in the map information storage unit 44 as the factory map information.
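The core of this map generation — turning laser distance data taken at known vehicle poses into spatial information whose occupied outline can be matched against registered planar shapes — can be sketched roughly as follows. This is an illustrative simplification (the function names, scan format and 0.1 m cell size are assumptions, not part of the patent):

```python
import math

def scan_to_points(pose_x, pose_y, pose_theta, scan):
    """Convert one laser scan taken at the given vehicle pose into
    obstacle points in the factory coordinate system.  `scan` is a
    list of (bearing_rad, distance_m) pairs relative to the vehicle."""
    pts = []
    for bearing, dist in scan:
        a = pose_theta + bearing
        pts.append((pose_x + dist * math.cos(a),
                    pose_y + dist * math.sin(a)))
    return pts

def occupied_cells(points, cell=0.1):
    """Quantize obstacle points into grid cells; the set of occupied
    cells outlines the planar shapes of the machines, which could then
    be matched against pre-registered shapes to identify them."""
    return {(round(x / cell), round(y / cell)) for x, y in points}

# Example: one scan taken at the origin with the vehicle facing +x,
# seeing obstacles 2 m ahead and 1 m to the left.
pts = scan_to_points(0.0, 0.0, 0.0, [(0.0, 2.0), (math.pi / 2, 1.0)])
cells = occupied_cells(pts)
```

A real implementation would accumulate many such scans while the vehicle drives the map generation program, and would use a proper SLAM method rather than trusting the pose directly.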
 The position recognition unit 48 is a functional unit that recognizes the position and posture of the automatic guided vehicle 35 within the factory based on the distance data detected by the sensor and the factory map information stored in the map information storage unit 44; based on the position and posture recognized by the position recognition unit 48, the operation of the automatic guided vehicle 35 is controlled by the automatic operation control unit 46.
 The movement position storage unit 42 is a functional unit that stores the specific movement positions, i.e. the concrete target positions to which the automatic guided vehicle 35 moves, corresponding to the command codes in the operation program; these movement positions include the work positions set for the machine tool 100, the material stocker 120 and the product stocker 121 described above. Each movement position is set, for example, by manually driving the automatic guided vehicle 35 with the operation panel 37 under the control of the manual operation control unit 45 to the target position, and then storing the position data recognized by the position recognition unit 48 in the movement position storage unit 42. This procedure is the so-called teaching operation.
 The motion posture storage unit 43 is a functional unit that stores data on the postures (motion postures) of the robot 25, which change sequentially as the robot 25 moves in a predetermined order, corresponding to the command codes in the operation program. The data on these motion postures are the rotation angles of each joint (motor) of the robot 25 in each target posture, obtained by manually driving the robot 25 into each posture through a teaching operation using the operation panel 37 under the control of the manual operation control unit 45; these rotation angle data are stored in the motion posture storage unit 43 as the motion posture data.
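The teaching operation described in the two preceding paragraphs amounts to recording, under names the operation program's command codes can refer to, either a recognized vehicle pose or a set of joint angles. A minimal sketch (data format and all names are hypothetical; the patent does not define them):

```python
# Illustrative stand-ins for the movement position storage unit (42)
# and the motion posture storage unit (43).
movement_positions = {}
motion_postures = {}

def teach_position(name, recognized_pose):
    """Store the pose (x, y, heading) reported by the position
    recognition unit for the current, manually driven vehicle location."""
    movement_positions[name] = recognized_pose

def teach_posture(name, joint_angles_deg):
    """Store the rotation angle of every robot joint for the posture
    the robot has been manually driven into."""
    motion_postures[name] = list(joint_angles_deg)

# Example teaching session (values are made up):
teach_position("machine_tool_work_position", (12.4, 3.8, 90.0))
teach_posture("extraction_start_posture",
              [0.0, -45.0, 90.0, 0.0, 45.0, 0.0])
```

During automatic operation, the command codes in the program would then be resolved against these stored entries.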
 Specific motion postures of the robot 25 are set for each of the material stocker 120, the machine tool 100 and the product stocker 121. For example, for the material stocker 120, the following are set as the extraction motion postures: a work start posture (extraction start posture) assumed when work begins at the material stocker 120, the work postures (extraction postures) for gripping an unmachined workpiece W stored in the material stocker 120 with the hand 30 and taking it out, and the posture assumed when extraction is complete (the extraction completion posture, which in this example is the same as the extraction start posture). For the machine tool 100, a workpiece removal motion posture for taking the machined workpiece W' out of the machine tool 100 and a workpiece mounting motion posture for attaching the unmachined workpiece W to the machine tool 100 are set.
 The automatic operation control unit 46 is a functional unit that uses either the automatic operation program or the map generation program stored in the operation program storage unit 41 and operates the automatic guided vehicle 35 and the robot 25 according to that program, using the data stored in the movement position storage unit 42 and the motion posture storage unit 43 as necessary.
 With the production system 1 of this example configured as described above, the automatic operation program stored in the operation program storage unit 41 is executed under the control of the automatic operation control unit 46 of the control device 40, and the automatic guided vehicle 35 and the robot 25 operate according to this program, so that unmanned automatic production is carried out.
[Positioning Accuracy Measurement Method]
 Next, a method for measuring the positioning accuracy of the robot system 20 in the production system 1 described above will be described. In this method, the machine tool 100 and the robot system 20 are operated manually under the control of the numerical controller (not shown) and the control device 40, respectively. In the following, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is measured.
 First, the holder 110 to which the identification figure 111 is affixed is taken out of the tool magazine (not shown) and mounted on the tool spindle 105, the moving body of the machine tool 100. The automatic guided vehicle 35 and the robot 25 are then operated to position the tool spindle 105, the automatic guided vehicle 35 and the robot 25 at a position where the identification figure 111 can be imaged by the camera 32 mounted on the robot 25 (this position is referred to as the "first position"). At this time, the automatic guided vehicle 35 is positioned at the first position such that its xr-axis is parallel to the Zm-axis of the machine tool 100. The positional relationship between the holder 110 and the tip of the robot 25 positioned in this way is shown by the solid lines in FIG. 6(a).
 The positioning of the automatic guided vehicle 35 is performed, as described above, based on the information on its position and posture recognized by the position recognition unit 48. As shown in FIGS. 3 and 5, the identification figure 111 has a matrix structure in which a plurality of square pixels are arranged two-dimensionally, each pixel being displayed in white or black; in FIGS. 3 and 5, the black pixels are hatched.
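How the center position of such a matrix-structured figure might be extracted from a camera image can be shown with a deliberately simple sketch. A production system would more likely use a fiducial-marker library (e.g. ArUco-style detection) with sub-pixel corner refinement; everything below, including the bounding-box approach, is an assumption for illustration only:

```python
def figure_center(image):
    """Return the pixel center of the identification figure in a
    binarized camera image (a list of rows of 0/1, where 1 marks a
    dark marker pixel).  The center is taken as the midpoint of the
    bounding box of all dark pixels -- adequate for a square matrix
    figure whose outline is fully visible in the frame."""
    xs = [x for y, row in enumerate(image) for x, v in enumerate(row) if v]
    ys = [y for y, row in enumerate(image) for v in row if v]
    if not xs:
        raise ValueError("no marker pixels found")
    return ((min(xs) + max(xs)) / 2.0, (min(ys) + max(ys)) / 2.0)

# Example: a 5x5 binary image with a 3x3 dark block centered at (2, 2).
demo = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 0, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(figure_center(demo))  # (2.0, 2.0)
```

The returned pixel coordinates play the role of the center positions (xr1, zr1) and (xr2, zr2) discussed below.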
 Next, at the first position, the identification figure 111 is imaged by the camera 32, and the relative positional relationship between the camera 32 and the identification figure 111 at the first position is calculated from the obtained image. FIG. 7(a) shows the image of the identification figure 111 captured by the camera 32 at the first position; for example, by calculating the center position (xr1, zr1) of the identification figure 111 on the captured image, with the xr- and zr-axes as reference axes, the relative positional relationship between the camera 32 and the identification figure 111 at the first position is obtained.
 Next, the robot 25 is driven to position its tip at the position reached by moving it in the plus xr-axis direction by a predetermined target distance (this position is referred to as the "second position"), and the tool spindle 105 is moved by the same target distance in the plus Zm-axis direction, i.e. the same direction as the movement of the robot tip. This state is shown by the dash-dot lines in FIG. 6(b). The moving distance of the tip of the robot 25 is recognized from the control position information in the manual operation control unit 45; similarly, the moving distance of the tool spindle 105 is recognized from the control position information in the numerical controller (not shown).
 After that, the identification figure 111 is imaged by the camera 32, and the relative positional relationship between the camera 32 and the identification figure 111 at the second position is calculated from the obtained image. FIG. 7(b) shows the image of the identification figure 111 captured by the camera 32 at the second position; in the same manner as above, by calculating the center position (xr2, zr2) of the identification figure 111 on the captured image, with the xr- and zr-axes as reference axes, the relative positional relationship between the camera 32 and the identification figure 111 at the second position is obtained.
 Then, from the relative positional relationship (xr1, zr1) between the camera 32 and the identification figure 111 at the first position and the relative positional relationship (xr2, zr2) at the second position, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated. Specifically, the position error Δxr, which corresponds to the positioning accuracy in the xr-axis direction, is calculated by the following equation:

Δxr = xr2 − xr1

Since the camera 32 and the identification figure 111 were moved by the same target distance, their relative position would be unchanged if the robot tip were positioned perfectly; any shift of the figure's center between the two images is therefore the positioning error of the robot tip.
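Converting the shift of the figure's center between the two images into a millimetre error can be sketched as follows. The mm-per-pixel scale would come from camera calibration, which the patent does not detail; all names and values here are illustrative:

```python
def position_error_mm(center1_px, center2_px, mm_per_px):
    """First-embodiment evaluation: both the camera (on the robot) and
    the identification figure (on the tool spindle) were moved the same
    target distance, so with perfect robot positioning the figure would
    appear at the same place in both frames.  Any shift of the center
    between the two images, scaled to millimetres, is the positioning
    error of the robot tip in that direction."""
    dx_px = center2_px[0] - center1_px[0]
    return dx_px * mm_per_px

# Example: the figure center shifted by 4 pixels between the two images
# at an assumed scale of 0.05 mm/pixel -> 0.2 mm positioning error.
print(position_error_mm((312.0, 240.0), (316.0, 240.0), 0.05))
```

The same subtraction applied to the second coordinate would give the error component in the zr direction.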
 As described above, the measuring method according to the first embodiment makes it possible to measure the positioning accuracy of the tip of the robot 25 inexpensively and easily, without a conventional expensive measuring device, by using the machine tool 100 to which the robot system 20 is applied. Although the positioning accuracy depends on the resolution of the camera 32, using a high-resolution camera allows the positioning accuracy to be measured with high precision.
 In the measuring method of the first embodiment, the camera 32 is mounted on the robot 25 and the identification figure 111 on the tool spindle 105 of the machine tool 100; however, the arrangement of the camera and the identification figure is relative, and, as shown in FIG. 8, a camera 112 may instead be mounted on the tool spindle 105 while, as shown in FIG. 9, the identification figure is mounted on the robot 25. In this state, the positioning accuracy of the tip of the robot 25 can be measured by the same operations as above. In the example shown in FIG. 8, a holder 110 holding the camera 112 is mounted on the tool spindle 105; in the example shown in FIG. 9, the identification figure 111 is affixed to the front surface of the support bar 31 of the robot 25.
2. Second Embodiment
 Next, a measuring method according to a second embodiment of the present invention, using the production system 1 described above, will be described. In this method, too, the machine tool 100 and the robot system 20 are operated manually under the control of the numerical controller (not shown) and the control device 40, respectively, and the positioning accuracy of the tip of the robot 25 in the xr-axis direction is measured.
 First, as in the first embodiment, the holder 110 to which the identification figure 111 is affixed is taken out of the tool magazine (not shown) and mounted on the tool spindle 105, the moving body of the machine tool 100 (see FIG. 3), and the automatic guided vehicle 35 and the robot 25 are operated to position the tool spindle 105, the automatic guided vehicle 35 and the robot 25 at a position (first position) where the identification figure 111 can be imaged by the camera 32 mounted on the robot 25. At this time, the automatic guided vehicle 35 is positioned at the first position such that its xr-axis is parallel to the Zm-axis of the machine tool 100 (see FIG. 6). The identification figure 111 is then imaged by the camera 32 at this first position.
Next, the robot 25 is driven to position its tip at a position (second position) moved in the xr-axis plus direction by a predetermined target distance. The tool spindle 105 is then moved in the Zm-axis plus direction, which is the same direction as the movement of the robot tip, until the position of the identification figure 111 on the camera frame captured by the camera 32 coincides with the position of the identification figure 111 on the camera frame as imaged at the first position, and the movement distance of the tool spindle 105 is detected.
The position of the identification figure 111 on the camera frame is obtained, for example, as shown in FIG. 7, by calculating the center position ((xr1, zr1), (xr2, zr2)) of the identification figure 111 on the captured image, with the x axis and z axis as reference axes. For instance, the center position (xr1, zr1) of the identification figure 111 is calculated from the image captured at the first position, and the tool spindle 105 is then positioned such that the center position (xr2, zr2) of the identification figure 111 imaged by the camera 32 at the second position coincides with the center position (xr1, zr1) at the first position.
Then, after the movement distance of the tool spindle 105 is detected on the basis of the control position information in the numerical control device (not shown), the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated as the difference between the detected movement distance of the tool spindle 105 and the movement distance of the tip of the robot 25.
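The accuracy figure produced in this step reduces to a simple difference between two travels. As a hedged sketch (the function name, units and sign convention are assumptions; the patent only speaks of a difference value), the computation is:

```python
def positioning_error(spindle_travel_mm, robot_target_mm):
    """Positioning error of the robot tip along the measured axis:
    the NC-reported travel of the tool spindle 105 needed to re-centre
    the identification figure, minus the robot's commanded travel.
    Units (mm) and sign convention are assumed for illustration."""
    return spindle_travel_mm - robot_target_mm

# Robot tip commanded to move 100.000 mm along the xr axis; the NC
# position data show the spindle travelled 100.042 mm to re-centre
# the figure on the camera frame.
print(round(positioning_error(100.042, 100.000), 3))  # → 0.042
```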
As described above, according to the measurement method of the second embodiment, as with the measurement method of the first embodiment, the positioning accuracy of the tip of the robot 25 can be measured inexpensively and easily by using the machine tool 100 to which the robot system 20 is applied, without using a conventional expensive measuring device. In this aspect as well, using a high-resolution camera as the camera 32 allows the positioning accuracy to be measured with high precision.
Also in this aspect, the camera 32 is arranged on the robot 25 and the identification figure 111 on the tool spindle 105 of the machine tool 100; however, since the arrangement of the camera 32 and the identification figure 111 is relative, a camera 112 may be arranged on the tool spindle 105 as shown in FIG. 8, and the identification figure may be arranged on the robot 25 as shown in FIG. 9. In this case as well, the positioning accuracy of the tip of the robot 25 can be measured by performing the same operations as described above.
3. Third Embodiment
Next, a measurement method according to a third embodiment of the present invention, using the production system 1 described above, will be described. This measurement method is likewise performed by manually operating the machine tool 100 and the robot system 20 under the control of a numerical control device (not shown) and the control device 40, respectively. In the following, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is measured.
First, as shown in FIG. 10, the holder 110 holding a touch probe 114 serving as a detection unit is taken out of a tool magazine (not shown) and mounted on the tool spindle 105, which is a moving body of the machine tool 100. Next, the automatic guided vehicle 35 and the robot 25 are operated so that, of the end faces of the support bar 31 of the robot 25, the end face on the right as viewed is positioned at a position (first position) in the vicinity of, and on the plus side of, the touch probe 114 in the Zm-axis direction. At this time, the automatic guided vehicle 35 is positioned at the first position such that the xr axis set for the vehicle is parallel to the Zm axis of the machine tool 100 (see FIG. 11).
Next, the tool spindle 105 is moved in the Zm-axis plus direction until the touch probe 114 contacts the end face of the support bar 31 and thereby detects the support bar 31 (see FIG. 11(a)). The support bar 31 functions as a detected part for the touch probe 114.
Then, as shown in FIG. 11, the robot 25 is driven to position its tip at a second position moved in the xr-axis plus direction by a predetermined target distance xra, after which the tool spindle 105 is moved in the Zm-axis plus direction, which is the same direction as the movement of the robot tip, until the touch probe 114 contacts the end face of the support bar 31 and detects the support bar 31 (see FIG. 11(b)). After the movement distance Zma of the tool spindle 105 is detected on the basis of the control position information in the numerical control device (not shown), the positioning accuracy of the tip of the robot 25 in the xr-axis direction is calculated as the difference between the detected movement distance Zma of the tool spindle 105 and the movement distance xra of the tip of the robot 25.
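This embodiment replaces the image comparison with a touch-probe trigger: the spindle is advanced until contact, and the NC-reported travel Zma is compared with the commanded robot travel xra. The toy simulation below sketches only this logic; the contact model, the one-micrometre step and the integer units are assumptions made to keep the arithmetic exact, not details from the patent.

```python
def travel_to_contact_um(bar_edge_um, step_um=1):
    """Advance the simulated tool spindle in the Zm-plus direction,
    one micrometre at a time, until the simulated touch probe 114
    reports contact with the support-bar end face; return the travel
    Zma in micrometres (integer units avoid floating-point drift)."""
    pos_um = 0
    while pos_um < bar_edge_um:  # probe not yet touching the bar
        pos_um += step_um
    return pos_um

# Robot tip commanded to move xra = 50.000 mm; the bar end face is
# actually 50.018 mm away, so the probing move reveals a +0.018 mm
# positioning error in the xr-axis direction.
x_ra_mm = 50.000
z_ma_mm = travel_to_contact_um(50018) / 1000
print(round(z_ma_mm - x_ra_mm, 3))  # → 0.018
```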
As described above, according to the measurement method of the third embodiment, as with the measurement methods of the first and second embodiments, the positioning accuracy of the tip of the robot 25 can be measured inexpensively and easily by using the machine tool 100 to which the robot system 20 is applied, without using a conventional expensive measuring device.
Also in this aspect, the detected part is set on (that is, arranged on) the robot 25 and the touch probe 114 serving as the detection unit is arranged on the tool spindle 105 of the machine tool 100; however, since the arrangement of the touch probe 114 and the detected part is relative, the touch probe 114 may be gripped by the hand 30 of the robot 25 as shown in FIG. 12, and a holder 115 on which a block-shaped detected part 116 is formed may be mounted on the tool spindle 105 as shown in FIG. 13. In this aspect as well, the positioning accuracy of the tip of the robot 25 can be measured by performing the same operations as described above.
Although specific embodiments of the present invention have been described above, the aspects that the present invention can take are not limited to these.
For example, in the first, second and third embodiments described above, the positioning accuracy of the tip of the robot 25 in the xr-axis direction is measured; however, the invention is not limited to this, and the positioning accuracy of the tip of the robot 25 in the zr-axis direction can be measured by the same operations. Nor is the measurement limited to the robot 25: the positioning accuracy of the automatic guided vehicle 35 in the xr-axis direction can be measured by similar operations. Furthermore, the positioning accuracy of the tip of the robot 25 in the xr-axis or zr-axis direction under the combined operation of the automatic guided vehicle 35 and the robot 25, that is, in a mode in which both are moved, can likewise be measured by the same operations.
The movement distance of the tool spindle 105 as the moving body can also be measured directly using, for example, an inexpensive general-purpose measuring instrument called a linear scale.
The identification figure is not limited to the above example, and may have any shape as long as the relative positional relationship between the camera and the identification figure can be obtained from the captured image.
To reiterate, the above description of the embodiments is illustrative in all respects and is not restrictive. Modifications and variations can be made as appropriate by those skilled in the art. The scope of the present invention is indicated not by the above embodiments but by the claims, and further includes modifications of the embodiments within the scope of the claims and their equivalents.
1 Production system
20 Robot system
25 Robot
26 First arm
27 Second arm
28 Third arm
30 Hand
31 Support bar
32 Camera
35 Automatic guided vehicle
37 Operation panel
40 Control device
41 Operation program storage unit
42 Movement position storage unit
43 Operation posture storage unit
44 Map information storage unit
45 Manual operation control unit
46 Automatic operation control unit
47 Map information generation unit
48 Position recognition unit
49 Input/output interface
100 Machine tool
101 First spindle
103 Second spindle
105 Tool spindle
110 Holder
111 Identification figure
120 Material stocker
121 Product stocker

Claims (4)

  1.  A method of measuring positioning accuracy of a robot system, the robot system comprising:
     a robot having an end effector at a tip thereof, the robot performing work on a machine tool using the end effector;
     an automatic guided vehicle on which the robot is mounted and which travels to a work position set for the machine tool; and
     a control device that controls the automatic guided vehicle and the robot,
     wherein an identification figure is arranged on one of a moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other of the moving body of the machine tool and the tip of the robot,
     the method measuring, using the machine tool, the positioning accuracy of the tip of the robot as positioned by the operation of at least one of the robot and the automatic guided vehicle, the method sequentially performing:
     a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the identification figure can be imaged by the camera, imaging the identification figure with the camera, and calculating, on the basis of the obtained image, a relative positional relationship between the camera and the identification figure at the first position;
     a second step of operating at least one of the robot and the automatic guided vehicle to position the tip of the robot at a second position moved in a predetermined direction by a predetermined target distance, and moving the moving body of the machine tool by the same target distance in the movement direction of the tip of the robot;
     a third step of thereafter imaging the identification figure with the camera and calculating, on the basis of the obtained image, a relative positional relationship between the camera and the identification figure at the second position; and
     a fourth step of calculating the positioning accuracy of the tip of the robot in the movement direction on the basis of the relative positional relationships between the camera and the identification figure at the first position and the second position.
  2.  A method of measuring positioning accuracy of a robot system, the robot system comprising:
     a robot having an end effector at a tip thereof, the robot performing work on a machine tool using the end effector;
     an automatic guided vehicle on which the robot is mounted and which travels to a work position set for the machine tool; and
     a control device that controls the automatic guided vehicle and the robot,
     wherein an identification figure is arranged on one of a moving body of the machine tool and the tip of the robot, and a camera for imaging the identification figure is arranged on the other of the moving body of the machine tool and the tip of the robot,
     the method measuring, using the machine tool, the positioning accuracy of the tip of the robot as positioned by the operation of at least one of the robot and the automatic guided vehicle, the method sequentially performing:
     a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the identification figure can be imaged by the camera, and imaging the identification figure with the camera;
     a second step of operating at least one of the robot and the automatic guided vehicle to move the tip of the robot in a predetermined direction by a predetermined target distance and position it at a second position;
     a third step of moving the moving body of the machine tool in the movement direction of the tip of the robot to a position where the position of the identification figure on the camera frame captured by the camera coincides with the position of the identification figure on the camera frame as imaged at the first position, and detecting the movement distance of the moving body; and
     a fourth step of calculating a difference between the movement distance of the robot tip and the detected movement distance of the moving body as the positioning accuracy of the robot tip in the movement direction.
  3.  A method of measuring positioning accuracy of a robot system, the robot system comprising:
     a robot having an end effector at a tip thereof, the robot performing work on a machine tool using the end effector;
     an automatic guided vehicle on which the robot is mounted and which travels to a work position set for the machine tool; and
     a control device that controls the automatic guided vehicle and the robot,
     wherein a detection unit is arranged on one of a moving body of the machine tool and the tip of the robot, and a detected part to be detected by the detection unit is set on the other of the moving body of the machine tool and the tip of the robot,
     the method measuring, using the machine tool, the positioning accuracy of the tip of the robot as positioned by the operation of at least one of the robot and the automatic guided vehicle, the method sequentially performing:
     a first step of positioning the moving body of the machine tool and the tip of the robot at a first position where the detected part can be detected by the detection unit, and detecting the detected part with the detection unit;
     a second step of operating at least one of the robot and the automatic guided vehicle to move the tip of the robot in a predetermined direction by a predetermined target distance and position it at a second position;
     a third step of moving the moving body of the machine tool in the movement direction of the tip of the robot to a position where the detection unit detects the detected part, and detecting the movement distance of the moving body; and
     a fourth step of calculating a difference between the movement distance of the robot tip and the detected movement distance of the moving body as the positioning accuracy of the tip of the robot in the movement direction.
  4.  The method of measuring positioning accuracy of a robot system according to claim 1 or 2, wherein the machine tool comprises a numerical control device, and in the third step the movement distance of the moving body is calculated on the basis of position information of the moving body obtained from the numerical control device.

PCT/JP2021/037694 2021-03-15 2021-10-12 Robot system positioning accuracy measurement method WO2022195938A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021041392A JP6937444B1 (en) 2021-03-15 2021-03-15 Robot system positioning accuracy measurement method
JP2021-041392 2021-03-15

Publications (1)

Publication Number Publication Date
WO2022195938A1 true WO2022195938A1 (en) 2022-09-22

Family

ID=78028228

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/037694 WO2022195938A1 (en) 2021-03-15 2021-10-12 Robot system positioning accuracy measurement method

Country Status (2)

Country Link
JP (1) JP6937444B1 (en)
WO (1) WO2022195938A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (en) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 Automatic positioning method for loading and unloading of trolley type robot

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001300875A (en) * 2000-04-19 2001-10-30 Denso Corp Robot system
JP2017056546A (en) * 2015-09-14 2017-03-23 ファナック株式会社 Measurement system used for calibrating mechanical parameters of robot
JP2019093481A (en) * 2017-11-24 2019-06-20 株式会社安川電機 Robot system and robot system control method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS60135409A (en) * 1983-12-23 1985-07-18 Chisso Corp Olefin polymerization catalyst and its production
US11273530B2 (en) * 2016-11-16 2022-03-15 Makino Milling Machine Co., Ltd. Machine tool system
JP7131162B2 (en) * 2018-07-20 2022-09-06 オムロン株式会社 Control system, control system control method, and control system program
JP6832408B1 (en) * 2019-10-09 2021-02-24 Dmg森精機株式会社 Production system

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (en) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 Automatic positioning method for loading and unloading of trolley type robot
CN116175256B (en) * 2023-04-04 2024-04-30 杭州纳志机器人科技有限公司 Automatic positioning method for loading and unloading of trolley type robot

Also Published As

Publication number Publication date
JP6937444B1 (en) 2021-09-22
JP2022141188A (en) 2022-09-29

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21931681

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21931681

Country of ref document: EP

Kind code of ref document: A1