WO2023032400A1 - Automatic transport device, and system - Google Patents

Automatic transport device, and system

Info

Publication number
WO2023032400A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
posture
coordinate system
stereo camera
camera
Prior art date
Application number
PCT/JP2022/023266
Other languages
French (fr)
Japanese (ja)
Inventor
Yuta Oba
Original Assignee
DMG Mori Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DMG Mori Co., Ltd.
Publication of WO2023032400A1 publication Critical patent/WO2023032400A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • The present disclosure relates to a system comprising an industrial machine such as a machine tool that machines a workpiece, a robot that performs work on the industrial machine, a transport device that carries the robot and can move to a work position set with respect to the industrial machine, and a control device that controls the robot.
  • Conventionally, the system disclosed in JP-A-2017-132002 (Patent Document 1 below) is known as an example of such a system.
  • In this system, an unmanned guided vehicle carrying a robot moves to a working position set for a machine tool, which is one type of industrial machine, and at that working position the robot performs work such as attaching and detaching workpieces to and from the machine tool.
  • In such a system, a single robot moved by an unmanned guided vehicle can attach and detach workpieces to and from a plurality of machine tools. Because the degree of freedom in machine-tool layout is greater than when a robot is fixed to each machine tool, the layout of the machine tools can be set to one that further improves production efficiency. In addition, compared with conventional systems in which robots are fixed, one robot can serve a larger number of machine tools, so equipment costs can be reduced.
  • On the other hand, because the unmanned guided vehicle is self-propelled on wheels, its positioning accuracy when stopping at the working position is not necessarily high. For the robot to work accurately on the machine tool, it is therefore necessary to compare the robot posture taken when the automatic guided vehicle is positioned at the working position with the reference posture set during teaching, which serves as the reference for control, detect the amount of error, and correct the working posture of the robot according to that error.
  • As a technique for correcting the posture of such a robot, the position correction method disclosed in Japanese Patent Application Laid-Open No. 2016-221622 (Patent Document 2 below) is conventionally known.
  • In this method, a visual target consisting of two calibration markers is arranged on the outer surface of the machine tool, the visual target is imaged by a camera provided on a movable part of the robot, the relative positional relationship between the robot and the machine tool is measured based on the captured image and the position and orientation of the camera, and the working posture of the robot is corrected based on the measured positional relationship.
  • As such a camera, a so-called stereo camera having two cameras is known, and it would be convenient if the posture of the robot could be corrected using such a known stereo camera.
  • the present invention provides an automatic transport device and system described in the claims.
  • According to the present invention, a structure that constitutes the industrial machine on which the robot actually works, or that is provided on this industrial machine, is imaged by a stereo camera, and the posture of the robot is corrected based on the obtained image. There is therefore no need to prepare a special component such as a calibration marker for correcting the posture of the robot as in the conventional art, nor to perform the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool. A system for correcting the posture of the robot can thus be constructed easily.
  • Furthermore, since the posture of the robot can be corrected by imaging a single structure, the operation time of the robot needed for imaging can be shortened compared with the conventional art, and the production efficiency of the system can be increased accordingly.
  • FIG. 1 is a plan view showing the schematic configuration of a system according to one embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the system according to this embodiment.
  • FIG. 3 is a perspective view showing the automatic guided vehicle and the robot according to this embodiment.
  • FIG. 4 is a perspective view showing a spindle and a chuck, which are structures constituting the machine tool according to this embodiment.
  • FIG. 5 is an explanatory diagram for explaining the imaging posture of the robot.
  • FIG. 6 is an explanatory diagram for explaining the processing for calculating a correction amount for correcting the posture of the robot.
  • As shown in FIGS. 1 and 2, this system 1 comprises a machine tool 10 as an industrial machine, a material stocker 20 and a product stocker 21, an automatic guided vehicle 35 as a transport device, a robot 25 mounted on the automatic guided vehicle 35, and a control device 40 that controls the robot 25 and the automatic guided vehicle 35.
  • Of these, the automatic guided vehicle 35, the robot 25, and the control device 40 constitute an automatic transport device.
  • the robot 25 is provided with a stereo camera 31 .
  • In this example, the machine tool 10 and the material stocker 20 and product stocker 21 provided around it are illustrated as the industrial machine, but the industrial machines to which the present invention is applicable are not limited to these.
  • Likewise, a known horizontal NC (numerical control) lathe is taken as an example of the machine tool 10, but the machine tool is, of course, not limited to this; every conventionally known machine tool is included, such as vertical lathes, vertical and horizontal machining centers, and multi-tasking machines equipped with a tool spindle and a work spindle.
  • The machine tool 10 of this example is, as described above, a conventionally known horizontal NC (numerical control) lathe and, as shown in FIG. 4, includes a spindle 11 on which a chuck 12 for gripping a workpiece W is mounted, so that the workpiece W can be turned with an appropriate tool.
  • FIG. 4 shows only the spindle 11 and the chuck 12 as components (structures) of the machine tool 10, but it goes without saying that the machine tool 10 can also include other components known in the field, such as a carriage, a tool post, and tools arranged within the machining area, as well as a door arranged outside the machining area, a handle attached to the door, an operation panel, and an indicator light.
  • The material stocker 20 is arranged on the left of the machine tool 10 in FIG. 1 and stocks a plurality of materials (pre-machining workpieces W) to be machined by the machine tool 10.
  • The product stocker 21 is arranged on the right of the machine tool 10 in FIG. 1 and stocks a plurality of products or semi-finished products (machined workpieces W') machined by the machine tool 10.
  • the automatic guided vehicle 35 has the robot 25 mounted on its upper surface, that is, a mounting surface 36, and is provided with an operation panel 37 that can be carried by an operator.
  • the operation panel 37 includes an input/output unit for inputting/outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of displaying a screen.
  • The automatic guided vehicle 35 is equipped with a sensor capable of recognizing its own position in the factory (for example, a distance measurement sensor using laser light) and, under the control of the control device 40, travels tracklessly through the factory, including the area where the machine tool 10, the material stocker 20, and the product stocker 21 are arranged; in this example it moves to the working positions set for the machine tool 10, the material stocker 20, and the product stocker 21.
  • As shown in FIGS. 1 and 3, the robot 25 of this example is an articulated robot having three arms: a first arm 26, a second arm 27, and a third arm 28.
  • A hand 29 as an end effector (action part) is attached to the tip of the third arm 28, and two cameras 31a and 31b are attached via a support bar 30.
  • These two cameras 31a and 31b are arranged so that their imaging optical axes intersect and constitute a so-called stereo camera 31.
  • Note that the configuration of the robot applicable to the present invention is not limited to that of the robot 25 of this example; the robot only needs to have (i) a camera, (ii) a hand portion for gripping a workpiece or a tool, (iii) a second arm portion movably connected to the hand portion, and (iv) a first arm portion movably connected to the second arm portion.
  • Compared with the robot 25 of this embodiment, the hand portion corresponds to the hand 29,
  • the second arm portion corresponds to the second arm 27 and the joint portion rotatably (movably) coupled to it,
  • and the first arm portion corresponds to the first arm 26 and the joint portion rotatably (movably) coupled to it.
  • The third arm 28 of the robot of this embodiment, together with its joint portion that is rotatable or extendable/retractable (movable), may also be understood to correspond to the second arm portion. In other words, although the robot 25 of this example has three arms, the robot need only have at least two arms.
  • As shown in FIG. 2, the control device 40 is composed of an operation program storage unit 41, a movement position storage unit 42, a motion posture storage unit 43, a map information storage unit 44, a reference data storage unit 45, a manual operation control unit 46, an automatic operation control unit 47, a map information generation unit 48, a position recognition unit 49, a point cloud data calculation unit 50, a correction amount calculation unit 51, and an input/output interface 52.
  • the control device 40 is connected to the machine tool 10 , material stocker 20 , product stocker 21 , robot 25 , stereo camera 31 , automatic guided vehicle 35 and operation panel 37 via this input/output interface 52 .
  • the control device 40 is not limited to this aspect.
  • The control device 40 may have at least a control section for controlling the position of the hand 29 of the robot 25, and the other storage units and the like may be provided in another device.
  • The control device 40 of this example is composed of a computer including a CPU, RAM, ROM, and the like; the functions of the manual operation control unit 46, the automatic operation control unit 47, the map information generation unit 48, the position recognition unit 49, the point cloud data calculation unit 50, the correction amount calculation unit 51, and the input/output interface 52 are realized by computer programs, and these units execute the processing described later.
  • the motion program storage unit 41, movement position storage unit 42, motion posture storage unit 43, map information storage unit 44, and reference data storage unit 45 are configured from appropriate storage media such as RAM.
  • In this example, the control device 40 is attached to the automatic guided vehicle 35 and is connected to the machine tool 10, the material stocker 20, and the product stocker 21 by suitable communication means, and to the robot 25, the stereo camera 31, the automatic guided vehicle 35, and the operation panel 37 by wire or wirelessly.
  • the control device 40 is not limited to such a mode, and the control device 40 may be arranged at an appropriate position other than the automatic guided vehicle 35 . In this case, the control device 40 is appropriately connected to each section by communication means.
  • the manual operation control unit 46 is a functional unit that operates the unmanned guided vehicle 35, the robot 25, and the stereo camera 31 according to operation signals input from the operation panel 37 by the operator. That is, the operator can manually operate the automatic guided vehicle 35 , the robot 25 and the stereo camera 31 using the operation panel 37 under the control of the manual operation control section 46 .
  • The operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during production, and a map generation program for operating the automatic guided vehicle 35 when generating the in-factory map information described later.
  • The automatic operation program and the map generation program are input, for example, from the input/output unit provided on the operation panel 37 and stored in the operation program storage unit 41.
  • The automatic operation program includes command codes relating to the movement position as the target position to which the automatic guided vehicle 35 moves, its movement speed, and its orientation, as well as a command code for operating the stereo camera 31.
  • the map generation program includes command codes for causing the automatic guided vehicle 35 to travel all over the factory without a track so that the map information generation unit 48 can generate map information.
  • The map information storage unit 44 is a functional unit that stores map information including arrangement information of the machines, devices, equipment, and the like (devices, etc.) arranged in the factory where the automatic guided vehicle 35 travels; this map information is generated by the map information generation unit 48.
  • When the automatic guided vehicle 35 is made to travel according to the map generation program stored in the operation program storage unit 41 under the control of the automatic operation control unit 47 of the control device 40 described in detail later, the map information generation unit 48 acquires spatial information of the factory from the distance data detected by the sensor and recognizes the planar shapes of the equipment and the like arranged in the factory; based on, for example, planar shapes of equipment registered in advance, it recognizes the positions, planar shapes, and so on (arrangement information) of the specific devices arranged in the factory, in this example the machine tool 10, the material stocker 20, and the product stocker 21.
  • The map information generation unit 48 then stores the obtained spatial information and device arrangement information in the map information storage unit 44 as map information of the factory.
  • the position recognition unit 49 is a function unit that recognizes the position of the automatic guided vehicle 35 in the factory based on the distance data detected by the sensor and the map information in the factory stored in the map information storage unit 44. Based on the position of the automatic guided vehicle 35 recognized by the position recognition section 49 , the operation of the automatic guided vehicle 35 is controlled by the automatic operation control section 47 .
  • The movement position storage unit 42 is a functional unit that stores specific movement positions, i.e., the target positions to which the automatic guided vehicle 35 moves, corresponding to the command codes in the operation program. These movement positions include the work positions set for the machine tool 10, the material stocker 20, and the product stocker 21 described above. The movement positions are set, for example, by manually operating the automatic guided vehicle 35 using the operation panel 37 under the control of the manual operation control unit 46 to move it to each target position, and then storing the position data recognized by the position recognition unit 49 in the movement position storage unit 42; this is the so-called teaching operation.
  • The motion posture storage unit 43 is a functional unit that stores data relating to the motion postures of the robot 25 that change sequentially as the robot 25 operates in a predetermined order, corresponding to the command codes in the operation program. The data relating to the motion postures are obtained by a teaching operation in which the robot 25 is manually operated using the operation panel 37 under the control of the manual operation control unit 46 so as to take each target posture; the rotation angle data of each joint (motor) of the robot 25 in each posture are then stored in the motion posture storage unit 43 as the data relating to that motion posture.
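  • As a rough illustration only (the structure and field names below are assumptions, not the patent's data format), taught motion postures of this kind are often organized as named sets of joint angles:

```python
# Hypothetical layout for taught posture data; names are illustrative only.
taught_postures = {
    "takeout_start":   {"joint_angles_deg": [0.0, -35.0, 90.0, 0.0, 55.0, 0.0]},
    "imaging":         {"joint_angles_deg": [12.5, -20.0, 75.0, 5.0, 60.0, -3.0]},
    "takeout_prepare": {"joint_angles_deg": [15.0, -10.0, 70.0, 5.0, 62.0, -3.0]},
}

def joint_angles_for(posture_name):
    """Look up the taught joint angles (one value per joint motor) for a posture."""
    return taught_postures[posture_name]["joint_angles_deg"]
```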
  • Specific motion postures of the robot 25 are set for the material stocker 20, the machine tool 10, and the product stocker 21, respectively.
  • For the material stocker 20, the following take-out motion postures are set: a work start posture (take-out start posture) taken when work is started in the material stocker 20; each working posture (take-out posture) for gripping an unmachined workpiece W stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20; and a posture taken when take-out is completed (take-out completion posture, which in this example is the same posture as the take-out start posture).
  • a work take-out motion posture for taking out the machined work W' from the machine tool 10 and a work mounting motion posture for attaching the pre-machined work W to the machine tool 10 are set.
  • As the workpiece take-out motion postures, the following are set: an imaging posture in which the hand 29 and the stereo camera 31 are moved into the machining area of the machine tool 10 and the stereo camera 31 images the chuck 12 (see FIG. 5); a posture in which the hand 29 faces the machined workpiece W' gripped by the chuck 12 (take-out preparation posture); a posture in which the hand 29 is moved toward the chuck 12 so as to grip the machined workpiece W' (gripping posture); a posture in which the hand 29 holding the workpiece is moved away from the chuck 12 (removal posture); and a posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10 (work completion posture).
  • As the workpiece mounting motion postures, the following are set: an imaging posture in which the hand 29 and the stereo camera 31 are moved into the machining area of the machine tool 10 and the stereo camera 31 images the chuck 12 (see FIG. 5); a posture in which the unmachined workpiece W gripped by the hand 29 faces the chuck 12 of the machine tool 10 (mounting preparation posture); a posture in which the hand 29 is moved toward the chuck 12 so that the chuck 12 can grip the pre-machining workpiece W (mounting posture); a posture in which the hand 29 is separated from the chuck 12 (separation posture); and a posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10 (work completion posture).
  • Although FIG. 5 illustrates a state in which the stereo camera 31 directly faces the chuck 12, the imaging posture is not limited to this; it may be any posture in which the imaging target falls within the field of view of the stereo camera 31.
  • The two-dot chain lines shown in FIG. 5 indicate the field of view of the stereo camera 31.
  • For the product stocker 21, the following storage motion postures are set: a work start posture (storage start posture) taken when work is started in the product stocker 21; each working posture (storage posture) for storing the machined workpiece W' gripped by the hand 29 in the product stocker 21; and a storage completion posture taken when storage is completed (which in this example is the same posture as the storage start posture).
  • The automatic operation control unit 47 is a functional unit that uses either the automatic operation program or the map generation program stored in the operation program storage unit 41 and operates the automatic guided vehicle 35, the robot 25, and the stereo camera 31 according to that program. At that time, the data stored in the movement position storage unit 42 and the motion posture storage unit 43 are used as necessary.
  • The point cloud data calculation unit 50 calculates, from the focal length of each of the cameras 31a and 31b, the distance between the cameras 31a and 31b, and the parallax between the two images captured by the cameras 31a and 31b, the position in the three-dimensional space of the camera coordinate system set for the stereo camera 31 for each element obtained by dividing the imaged object into elements of a predetermined size.
  • Each of these elements is treated as a point of the point cloud representing the imaged object, and its position data constitutes the position data of the point cloud in the three-dimensional space of the camera coordinate system (three-dimensional point cloud position data).
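  • As a minimal sketch of this kind of parallax-based calculation (not the patent's implementation; a rectified camera pair and the variable names below are assumptions), the depth of each matched image element can be recovered from the focal length, the camera baseline, and the disparity:

```python
import numpy as np

def triangulate_points(f, b, cx, cy, pts_left, disparities):
    """Convert pixel coordinates plus disparity into 3D points in the camera frame.

    f, b        : focal length [px] and baseline [same unit as the output Z]
    cx, cy      : principal point of the (rectified) left camera [px]
    pts_left    : (N, 2) array of (u, v) pixel coordinates in the left image
    disparities : (N,) array of disparities d = u_left - u_right [px]
    """
    pts = np.asarray(pts_left, dtype=float)
    d = np.asarray(disparities, dtype=float)
    z = f * b / d                      # depth from the standard stereo relation Z = f*B/d
    x = (pts[:, 0] - cx) * z / f       # back-project through the pinhole model
    y = (pts[:, 1] - cy) * z / f
    return np.column_stack([x, y, z])  # (N, 3) points in the camera coordinate system

# Example: three matched pixels with their disparities
pts3d = triangulate_points(f=1200.0, b=0.06, cx=640.0, cy=360.0,
                           pts_left=[(700, 400), (650, 380), (620, 350)],
                           disparities=[24.0, 30.5, 18.2])
print(pts3d)
```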
  • During automatic operation, when the automatic guided vehicle 35 is at the working position set with respect to the machine tool 10 and the robot 25 takes the above-described imaging posture, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system from the two images of the chuck 12 captured by the stereo camera 31.
  • The reference data storage unit 45 is a functional unit that stores, as reference data, the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 based on the images of the chuck 12 captured by the stereo camera 31 when, during the teaching operation, the unmanned guided vehicle 35 is at the working position set with respect to the machine tool 10 and the robot 25 is in the imaging posture.
  • The reference data storage unit 45 also stores three-dimensional point cloud position data obtained from CAD data of the chuck 12, expressed in an object coordinate system set for the chuck 12.
  • This object coordinate system can be defined, for example, by three orthogonal axes: an x-axis and a y-axis set orthogonally to each other in a plane orthogonal to the axis of the chuck 12, and a z-axis orthogonal to both of them.
  • When, during automatic operation, the chuck 12 is imaged by the stereo camera 31 with the robot 25 in the imaging posture and the current three-dimensional point cloud position data of the chuck 12 is calculated by the point cloud data calculation unit 50, the correction amount calculation unit 51 estimates the amount of positional error and the amount of rotational error of the stereo camera 31 between the current posture of the robot 25 and the posture during the teaching operation, based on this current three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45 (the three-dimensional point cloud position data of the chuck 12 calculated during the teaching operation), and calculates a correction amount for the action portion in each working posture based on the estimated error amounts.
  • In other words, based on the current three-dimensional point cloud position data of the chuck 12 obtained during automatic operation and the reference data stored in the reference data storage unit 45, the correction amount calculation unit 51 estimates the amount of positional error and the amount of rotational error of the stereo camera 31 between the current posture of the robot 25 and the posture during the teaching operation with respect to the coordinate system set in a plane orthogonal to the axis of the chuck 12, and each working posture of the robot 25 is corrected based on the estimated position error amount and rotation error amount.
  • Specifically, the correction amount calculation unit 51 first obtains, based on the reference data obtained during the teaching operation, a coordinate transformation matrix for transforming from the camera coordinate system corresponding to the stereo camera 31 to the object coordinate system set for the chuck 12.
  • The camera coordinate system can be defined, for example, by three orthogonal axes centered on an intermediate position between the imaging elements (for example, CMOS sensors) of the cameras 31a and 31b, in a plane that includes those imaging elements.
  • This coordinate transformation matrix can be calculated by the following procedure: first, a coordinate transformation matrix from the camera coordinate system to the object coordinate system related to the CAD data is obtained based on the three-dimensional point cloud position data in the object coordinate system of the chuck 12 obtained from the CAD data stored in the reference data storage unit 45, the internal parameters of the stereo camera 31 (for example, the focal length of each camera), and the external parameters (the distance between the cameras 31a and 31b, the parallax, and so on).
  • Next, the correction amount calculation unit 51 applies the three-dimensional point cloud position data in the object coordinate system of the chuck 12 obtained from the CAD data stored in the reference data storage unit 45 and the reference data, which is the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system at the time of the teaching operation, to, for example, the RANSAC algorithm (global alignment) and the ICP algorithm (local alignment), thereby obtaining a coordinate transformation matrix from the object coordinate system related to the CAD data to the object coordinate system related to the chuck 12 at the time of the teaching operation, as conceptually shown in FIG. 6.
  • In FIG. 6, the figure drawn with the dashed line corresponds to the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data, and the figure drawn with the solid line corresponds to the three-dimensional point cloud position data obtained during the teaching operation; by superimposing the former on the latter, the above coordinate transformation matrix is obtained.
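  • The patent does not provide code for this alignment; purely as an illustration, the following sketch shows one common way to chain RANSAC-based global registration and ICP refinement using the open-source Open3D library (the library choice, its API calls, and the parameter values are assumptions, not part of the disclosure):

```python
import numpy as np
import open3d as o3d

def align_point_clouds(source_pts, target_pts, voxel=2.0):
    """Estimate the rigid transform that maps `source_pts` onto `target_pts`.

    source_pts, target_pts : (N, 3) numpy arrays (e.g. CAD point cloud and measured cloud).
    Returns a 4x4 homogeneous transformation matrix.
    """
    def to_cloud(pts):
        pc = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(np.asarray(pts, float)))
        pc = pc.voxel_down_sample(voxel)
        pc.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 4, max_nn=30))
        return pc

    src, tgt = to_cloud(source_pts), to_cloud(target_pts)
    fsrc = o3d.pipelines.registration.compute_fpfh_feature(
        src, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 10, max_nn=100))
    ftgt = o3d.pipelines.registration.compute_fpfh_feature(
        tgt, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 10, max_nn=100))

    # Global alignment: RANSAC on FPFH feature correspondences
    ransac = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src, tgt, fsrc, ftgt, True, voxel * 3,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 3)],
        o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Local refinement: point-to-plane ICP started from the RANSAC result
    icp = o3d.pipelines.registration.registration_icp(
        src, tgt, voxel * 1.5, ransac.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return icp.transformation  # 4x4 matrix: source frame -> target frame
```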
  • After calculating this coordinate transformation matrix, the correction amount calculation unit 51 calculates the camera position during the teaching operation in the object coordinate system from the calculated coordinate transformation matrix and the camera position in the camera coordinate system, i.e., the camera position at the time of imaging during the teaching operation, according to Equation 2 below. (Equation 2)
  • Similarly, the correction amount calculation unit 51 calculates the camera position in the robot coordinate system during the teaching operation according to Equation 3 below.
  • The robot coordinate system is a three-dimensional coordinate system set for the control device 40 to control the robot 25, and is defined by three orthogonal axes with an origin set at an appropriate position. (Equation 3)
  • Next, the correction amount calculation unit 51 calculates the coordinate transformation matrix for converting from the camera coordinate system during the teaching operation to the robot coordinate system during the teaching operation, according to Equation 4 below. (Equation 4) Here, the rotation angles around the x-, y-, and z-axes are calculated from the rotation matrix elements of this coordinate transformation matrix.
  • The coordinate transformation matrix for transforming from the object coordinate system to the robot coordinate system at the time of the teaching operation can be obtained, from the coordinate transformation matrix for converting from the object coordinate system to the camera coordinate system during the teaching operation and the coordinate transformation matrix for transforming from the camera coordinate system to the robot coordinate system during the teaching operation, by Equation 5 below. (Equation 5)
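  • The numbered formulas themselves were published as figures and are not reproduced here; the following LaTeX only restates, in generic homogeneous-transform notation of our own choosing (the symbols are not the patent's), the relationships that the surrounding text describes for Equations 2 to 5:

```latex
% Notation (ours): {}^{B}T_{A} maps coordinates expressed in frame A into frame B.
% Frames: cam = camera frame at teaching, obj = object (chuck) frame, rob = robot frame.
\begin{align}
  {}^{obj}p_{cam}^{teach} &= {}^{obj}T_{cam}^{teach}\,{}^{cam}p_{cam}
      && \text{(cf. Equation 2: camera position in the object frame)}\\
  {}^{rob}p_{cam}^{teach} &= {}^{rob}T_{cam}^{teach}\,{}^{cam}p_{cam}
      && \text{(cf. Equation 3: camera position in the robot frame)}\\
  {}^{rob}T_{cam}^{teach} &=
      \begin{pmatrix} R & t \\ 0 & 1 \end{pmatrix},\qquad
      (r_x,\, r_y,\, r_z) \leftarrow R
      && \text{(cf. Equation 4: rotation angles from the rotation part)}\\
  {}^{rob}T_{obj}^{teach} &= {}^{rob}T_{cam}^{teach}\;{}^{cam}T_{obj}^{teach}
      && \text{(cf. Equation 5: chaining object$\to$camera$\to$robot)}
\end{align}
```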
  • During automatic operation (during actual operation), the correction amount calculation unit 51 performs the calculations described above based on the current three-dimensional point cloud position data of the chuck 12 in the camera coordinate system calculated by the point cloud data calculation unit 50.
  • The coordinate transformation matrix from the camera coordinate system to the object coordinate system relating to the teaching operation is obtained based on the three-dimensional point cloud position data in the object coordinate system of the chuck 12 at the time of the teaching operation and the internal parameters of the stereo camera 31 (for example, the focal length of each camera), which are stored in the reference data storage unit 45 as described above, and the external parameters (the distance between the cameras 31a and 31b, the parallax, and so on).
  • The coordinate transformation matrix from the object coordinate system during the teaching operation to the object coordinate system during automatic operation can be calculated by applying the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system during the teaching operation, stored in the reference data storage unit 45, and the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 during automatic operation to, for example, the RANSAC algorithm (global alignment) and the ICP algorithm (local alignment), that is, by superimposing the point cloud position data obtained during the teaching operation on the point cloud position data obtained during automatic operation.
  • In this case, in FIG. 6, the figure drawn with the dashed line corresponds to the three-dimensional point cloud position data of the chuck 12 in the object coordinate system during the teaching operation, and the figure drawn with the solid line corresponds to the three-dimensional point cloud position data in the object coordinate system during automatic operation; by superimposing the former on the latter, the coordinate transformation matrix is obtained.
  • After calculating this coordinate transformation matrix, the correction amount calculation unit 51 calculates the camera position during automatic operation in the object coordinate system from the calculated coordinate transformation matrix and the camera position in the camera coordinate system, i.e., the camera position at the time of imaging during automatic operation, according to Equation 7 below. (Equation 7)
  • The correction amount calculation unit 51 then calculates the current camera position in the object coordinate system during automatic operation according to Equation 8 below, and the current camera position in the robot coordinate system during automatic operation according to Equation 9 below. (Equation 8) (Equation 9)
  • Next, the correction amount calculation unit 51 calculates the coordinate transformation matrix for transforming from the current camera coordinate system to the robot coordinate system at the time of the teaching operation, according to Equation 10 below. (Equation 10) Here, the current rotation angles around the x-, y-, and z-axes are calculated from the rotation matrix elements of this coordinate transformation matrix.
  • The correction amount calculation unit 51 then calculates the rotational errors Δrx, Δry, and Δrz about the x-axis, y-axis, and z-axis by taking the differences between the camera angles during the teaching operation and the current camera angles, both expressed in the coordinate system at the time of the teaching operation and calculated as described above.
  • The correction amount calculation unit 51 then calculates the rotation matrix between the robot coordinate system at the time of the teaching operation and the current robot coordinate system, that is, the rotation error amount, according to Equation 11 below, and the translation matrix from the robot coordinate system at the time of the teaching operation to the current robot coordinate system, that is, the position error amount, according to Equation 12 below. (Equation 11) (Equation 12)
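  • As an illustrative sketch only (the patent's formulas are not reproduced, and the variable names and use of SciPy here are assumptions), position and rotation error amounts between the teaching-time camera pose and the current camera pose, both expressed in the robot coordinate system, could be computed from homogeneous transforms like this:

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_error(T_rob_cam_teach, T_rob_cam_now):
    """Return the translation error (dx, dy, dz) and rotation error (drx, dry, drz) [rad]
    between the camera pose recorded at teaching and the current camera pose,
    both given as 4x4 homogeneous transforms in the robot coordinate system."""
    # Relative transform that maps the teaching pose onto the current pose
    T_err = T_rob_cam_now @ np.linalg.inv(T_rob_cam_teach)
    d_trans = T_err[:3, 3]                                  # position error amount
    d_rot = R.from_matrix(T_err[:3, :3]).as_euler("xyz")    # rotation error about x, y, z
    return d_trans, d_rot

# Toy example: the current pose is shifted 2 mm in x and rotated 1 degree about z
T_teach = np.eye(4)
T_now = np.eye(4)
T_now[:3, :3] = R.from_euler("z", 1.0, degrees=True).as_matrix()
T_now[:3, 3] = [0.002, 0.0, 0.0]
print(pose_error(T_teach, T_now))
```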
  • The automatic operation control unit 47 then corrects the position of the hand 29 in the subsequent motion postures of the robot 25 according to Equation 14 below, based on the correction amount calculated by the correction amount calculation unit 51. (Equation 14)
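  • Again as a hedged sketch rather than the patent's Equation 13 or 14, applying an estimated rigid correction to the taught hand positions might look like this (the function and variable names are illustrative only):

```python
import numpy as np

def correct_hand_positions(taught_points, R_err, t_err):
    """Apply an estimated rigid correction to taught hand target positions.

    taught_points : (N, 3) hand positions recorded at teaching, in the robot frame
    R_err         : 3x3 rotation error matrix (teaching frame -> current frame)
    t_err         : (3,) position error amount
    Returns the corrected target positions for the current situation.
    """
    pts = np.asarray(taught_points, dtype=float)
    return (R_err @ pts.T).T + t_err

# Example: shift all taught take-out postures by the estimated error
R_err = np.eye(3)
t_err = np.array([0.002, -0.001, 0.0])
corrected = correct_hand_positions([[0.5, 0.1, 0.3], [0.5, 0.1, 0.25]], R_err, t_err)
print(corrected)
```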
  • unmanned automatic production is executed as follows.
  • The automatic operation program stored in the operation program storage unit 41 is executed, and according to this automatic operation program the automatic guided vehicle 35 and the robot 25 operate, for example, as follows.
  • the unmanned guided vehicle 35 moves to the work position set with respect to the machine tool 10, and the robot 25 assumes the work start posture for the above-described work take-out operation.
  • the machine tool 10 has completed the predetermined machining and the door cover is open so that the robot 25 can enter the machining area.
  • the robot 25 shifts to the imaging posture and images the chuck 12 with the stereo camera 31 .
  • Based on the captured images, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12.
  • The correction amount calculation unit 51 then calculates, based on this three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the rotation error amount and the position error amount between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture according to Equations 11 and 12 above, and calculates the correction amount for the subsequent work take-out motion postures of the robot 25 according to Equation 13 described above.
  • Using the calculated correction amount, the automatic operation control unit 47 corrects the position of the hand 29 in the subsequent work take-out motion postures, that is, the above-described take-out preparation posture, gripping posture, removal posture, and work completion posture, according to Equation 14, and the machined workpiece W' gripped by the chuck 12 of the machine tool 10 is gripped by the hand 29 and taken out of the machine tool 10.
  • the chuck 12 is opened by transmitting a chuck opening command from the automatic operation control unit 47 to the machine tool 10 after causing the robot 25 to take the gripping posture.
  • Next, the automatic operation control unit 47 moves the unmanned guided vehicle 35 to the work position set with respect to the product stocker 21 and causes the robot 25 to sequentially take the storage start posture for starting work in the product stocker 21, each storage posture for storing the machined workpiece gripped by the hand 29 in the product stocker 21, and the storage completion posture taken when storage is completed.
  • The automatic operation control unit 47 then moves the unmanned guided vehicle 35 to the work position set with respect to the material stocker 20 and causes the robot 25 to sequentially take the take-out start posture for starting work in the material stocker 20, each take-out posture for gripping an unmachined workpiece stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20, and the take-out completion posture taken when the take-out is completed, so that the hand 29 grips a pre-machining workpiece.
  • the automatic operation control unit 47 again moves the unmanned guided vehicle 35 to the work position set with respect to the machine tool 10, and causes the robot 25 to assume the work start posture for the work mounting operation described above.
  • the robot 25 is shifted to the imaging posture, and the chuck 12 is imaged by the stereo camera 31 .
  • Based on the captured images, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12, and the correction amount calculation unit 51 calculates, based on this three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the rotation error amount and the position error amount between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture according to Equations 11 and 12 above, and then calculates the correction amount for the subsequent work mounting motion postures of the robot 25 according to Equation 13 described above.
  • Using the calculated correction amount, the automatic operation control unit 47 corrects the position of the hand 29 in the subsequent work mounting motion postures of the robot 25, that is, the above-described mounting preparation posture, mounting posture, separation posture, and work completion posture, according to Equation 14, and then causes the robot 25 to attach the unmachined workpiece W gripped by the hand 29 to the chuck 12 of the machine tool 10 and to move out of the machine. After that, the automatic operation control unit 47 transmits a machining start command to the machine tool 10 to cause it to perform the machining operation. Note that after the robot 25 takes the mounting posture, the automatic operation control unit 47 transmits a chuck close command to the machine tool 10, whereby the chuck 12 is closed and the pre-machining workpiece W is gripped by the chuck 12.
  • system 1 of this example continuously executes unmanned automatic production.
  • As described above, in the system 1 of this example, the stereo camera 31 images the chuck 12, which is a structure constituting the machine tool 10 on which the robot 25 actually works, and the posture of the robot 25 is corrected based on the obtained images. There is therefore no need to prepare a special component such as a calibration marker in order to correct the posture of the robot 25 as in the conventional art, nor to perform the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool 10. A system for correcting the posture of the robot 25 can thus be constructed easily. In addition, in the system 1 of this example, the posture of the robot 25 can be corrected by capturing an image of the chuck 12, which is a single structure, so the imaging can be completed in a single operation; as a result, the operation time of the robot 25 needed for imaging can be shortened compared with the conventional art, and the production efficiency of the system 1 can be increased accordingly.
  • Among the operations performed by the robot 25, the operation on the chuck 12 requires the highest accuracy. Since the working posture of the robot 25 is corrected based on images of this very chuck 12, the working posture can be corrected accurately, and consequently the work on the chuck 12 can be performed accurately.
  • Incidentally, since the unmanned guided vehicle 35 moves on wheels with a relatively high degree of freedom, the mounting surface 36 on which the robot 25 is mounted tends to tilt with respect to the floor surface, and this tilt tends to change according to the posture of the mounted robot 25, in other words, according to changes in the position of the center of gravity of the robot 25. When the workpieces W and W' described above are attached and detached, the robot 25 moves the hand 29 and the stereo camera 31 into the machining area of the machine tool 10, so that the hand 29 overhangs far from the automatic guided vehicle 35; the tilt of the mounting surface in this state is therefore larger than the tilt when the hand 29 is outside the machining area of the machine tool 10, where the hand 29 does not overhang the automatic guided vehicle 35, or overhangs it only slightly.
  • For this reason, if the correction amount were calculated from an image captured while the hand 29 is outside the machining area, the posture correction amount would not accurately reflect the posture error amount of the robot 25 when the hand 29 is inside the machining area of the machine tool 10, and the posture of the robot 25 could not be corrected accurately.
  • As a result, the hand 29 of the robot 25 could not be accurately positioned with respect to the chuck 12.
  • In particular, when the chuck 12 is a collet chuck or the like and the movement allowance (stroke) of its gripping portion is extremely small, that is, when the clearance between the workpieces W, W' and the collet chuck is extremely small, the collet chuck may be unable to grip the workpieces W, W' reliably. If the workpieces W and W' cannot be attached and detached reliably, the operating rate of the system 1 decreases, and unmanned production with good production efficiency cannot be realized.
  • In the above example, the chuck 12, which requires the most accurate work among the operations of the robot 25, is set as the structure to be imaged by the stereo camera 31, but the invention is not limited to such a mode.
  • Other structures constituting the machine tool 10, such as a tool post, a tool, or a carriage likewise arranged in the machining area, may be imaged according to the part on which the robot 25 works.
  • Alternatively, a structure arranged outside the machining area, such as a door, a handle attached to the door, an operation panel, or an indicator light, may be imaged.
  • This aspect also has the effect that a system for correcting the posture of the robot 25 can be constructed easily without preparing special components or performing troublesome preparatory work.
  • Also in this aspect, since the posture of the robot 25 can be corrected by capturing an image of a single structure, the imaging with the stereo camera 31 can be completed in a single operation.
  • Accordingly, the operation time of the robot 25 needed for imaging can be shortened compared with the conventional art, and as a result the production efficiency of the system 1 can be improved compared with the conventional art.
  • The imaging posture may also be a posture taken before the hand 29 and the stereo camera 31 enter the machining area of the machine tool 10, that is, a posture in which the imaging target is imaged while the hand 29 and the stereo camera 31 are outside the machining area of the machine tool 10, and the working posture of the robot 25 may be corrected in the same manner as described above.
  • The industrial machine according to the present invention includes all machines used industrially, such as measuring devices, cleaning devices, washing devices, and robots other than the robot 25 of this example. When the robot 25 works on such an industrial machine, its working posture can be corrected in the manner described above.
  • In the above example, an aspect using the automatic guided vehicle 35 was described, but the present invention is not limited to this; the transport device may be one that is moved manually by an operator. In that case, the robot 25 is mounted on the transport device, the transport device is manually moved to the working position of the machine tool 10, and the robot 25 attaches and detaches the workpiece to and from the machine tool 10.

Abstract

This system comprises: an automatic transport device comprising a robot (25) with a stereo camera (31), a transport device (35) for moving the robot (25) to a work position, and a control device (40) for controlling the robot (25); and a machine tool (10) as an industrial machine. The control device (40), when performing a teaching operation, captures an image of a structure of the machine tool (10) with the stereo camera (31), and generates reference data composed of three-dimensional point cloud position data of the structure; and, when actually operating the robot (25), captures an image of the structure with the stereo camera (31), generates current data composed of three-dimensional point cloud position data of the structure, estimates, on the basis of the obtained current data and reference data, a position error amount and a rotation error amount of the stereo camera (31) between a current posture of the robot (25) and a posture during the teaching operation, and corrects the posture of the robot (25) on the basis of the estimated position error amount and rotation error amount.

Description

Automatic transport device and system
 The present disclosure relates to a system comprising an industrial machine such as a machine tool that machines a workpiece, a robot that performs work on the industrial machine, a transport device that carries the robot and can move to a work position set with respect to the industrial machine, and a control device that controls the robot.
 Conventionally, the system disclosed in JP-A-2017-132002 (Patent Document 1 below) is known as an example of such a system. In this system, an unmanned guided vehicle carrying a robot moves to a working position set for a machine tool, which is one type of industrial machine, and at that working position the robot performs work such as attaching and detaching workpieces to and from the machine tool.
 In such a system, a single robot moved by an unmanned guided vehicle can attach and detach workpieces to and from a plurality of machine tools. Because the degree of freedom in machine-tool layout is greater than when a robot is fixed to each machine tool, the layout of the machine tools can be set to one that further improves production efficiency. In addition, compared with conventional systems in which robots are fixed, one robot can serve a larger number of machine tools, so equipment costs can be reduced.
 On the other hand, because the unmanned guided vehicle is self-propelled on wheels, its positioning accuracy when stopping at the working position is not necessarily high. For the robot to work accurately on the machine tool, it is therefore necessary to compare the robot posture taken when the automatic guided vehicle is positioned at the working position with the reference posture set during teaching, which serves as the reference for control, detect the amount of error, and correct the working posture of the robot according to that error.
 As a technique for correcting the posture of such a robot, the position correction method disclosed in Japanese Patent Application Laid-Open No. 2016-221622 (Patent Document 2 below) is conventionally known. Specifically, in this method, a visual target consisting of two calibration markers is arranged on the outer surface of the machine tool, the visual target is imaged by a camera provided on a movable part of the robot, the relative positional relationship between the robot and the machine tool is measured based on the captured image and the position and orientation of the camera, and the working posture of the robot is corrected based on the measured positional relationship.
Patent Document 1: Japanese Patent Application Laid-Open No. 2017-132002; Patent Document 2: Japanese Patent Application Laid-Open No. 2016-221622
 However, in the conventional position correction method described above, two calibration markers, which are special components, must be prepared for the correction, and the troublesome preparatory work of arranging them on the outer surface of the machine tool must be performed; there is therefore a problem that a system for correcting the posture of the robot cannot be constructed easily.
 Furthermore, in the position correction method disclosed in Patent Document 2, each of the two calibration markers is imaged by the camera, so the operation time of the robot needed for imaging the calibration markers is long, and production efficiency in the system is therefore low.
 Also, a so-called stereo camera having two cameras is known as such a camera, and it would be convenient if the posture of the robot could be corrected using such a known stereo camera.
 Therefore, the present invention provides the automatic transport device and the system described in the claims.
 According to the present invention, a structure that constitutes the industrial machine on which the robot actually works, or that is provided on this industrial machine, is imaged by a stereo camera, and the posture of the robot is corrected based on the obtained image. There is therefore no need to prepare a special component such as a calibration marker for correcting the posture of the robot as in the conventional art, nor to perform the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool. A system for correcting the posture of the robot can thus be constructed easily. Furthermore, since the posture of the robot can be corrected by imaging a single structure, the operation time of the robot needed for imaging can be shortened compared with the conventional art, and the production efficiency of the system can be increased accordingly.
FIG. 1 is a plan view showing the schematic configuration of a system according to one embodiment of the present invention. FIG. 2 is a block diagram showing the configuration of the system according to this embodiment. FIG. 3 is a perspective view showing the automatic guided vehicle and the robot according to this embodiment. FIG. 4 is a perspective view showing a spindle and a chuck, which are structures constituting the machine tool according to this embodiment. FIG. 5 is an explanatory diagram for explaining the imaging posture of the robot. FIG. 6 is an explanatory diagram for explaining the processing for calculating a correction amount for correcting the posture of the robot.
 Specific embodiments of the present invention will be described below with reference to the drawings.
 As shown in FIGS. 1 and 2, this system 1 comprises a machine tool 10 as an industrial machine, a material stocker 20 and a product stocker 21, an automatic guided vehicle 35 as a transport device, a robot 25 mounted on the automatic guided vehicle 35, and a control device 40 that controls the robot 25 and the automatic guided vehicle 35. Of these, the automatic guided vehicle 35, the robot 25, and the control device 40 constitute an automatic transport device. The robot 25 is provided with a stereo camera 31. In this example, the machine tool 10 and the material stocker 20 and product stocker 21 provided around it are illustrated as the industrial machine, but the industrial machines to which the present invention is applicable are not limited to these; they include all machines used industrially, such as measuring devices, cleaning devices, washing devices, and robots other than the robot 25 of this example. Also, a known horizontal NC (numerical control) lathe is taken as an example of the machine tool 10, but the machine tool is, of course, not limited to this; every conventionally known machine tool is included, such as vertical lathes, vertical and horizontal machining centers, and multi-tasking machines equipped with a tool spindle and a work spindle.
 As described above, the machine tool 10 of this example is a conventionally known horizontal NC (numerical control) lathe and, as shown in FIG. 4, includes a spindle 11 on which a chuck 12 for gripping a workpiece W is mounted, so that the workpiece W can be turned with an appropriate tool. FIG. 4 shows only the spindle 11 and the chuck 12 as components (structures) of the machine tool 10, but it goes without saying that the machine tool 10 can also include other components known in the field, such as a carriage, a tool post, and tools arranged within the machining area, as well as a door arranged outside the machining area, a handle attached to the door, an operation panel, and an indicator light.
 The material stocker 20 is arranged on the left of the machine tool 10 in FIG. 1 and stocks a plurality of materials (pre-machining workpieces W) to be machined by the machine tool 10. The product stocker 21 is arranged on the right of the machine tool 10 in FIG. 1 and stocks a plurality of products or semi-finished products (machined workpieces W') machined by the machine tool 10.
 As shown in FIG. 1, the automatic guided vehicle 35 has the robot 25 mounted on a mounting surface 36, which is its upper surface, and is provided with an operation panel 37 that can be carried by an operator. The operation panel 37 includes an input/output unit for inputting and outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of showing a screen.
 The automatic guided vehicle 35 is equipped with a sensor capable of recognizing its own position in the factory (for example, a distance measurement sensor using laser light) and, under the control of the control device 40, travels tracklessly through the factory, including the area where the machine tool 10, the material stocker 20, and the product stocker 21 are arranged; in this example it moves to the working positions set for the machine tool 10, the material stocker 20, and the product stocker 21.
 As shown in FIGS. 1 and 3, the robot 25 of this example is an articulated robot having three arms: a first arm 26, a second arm 27, and a third arm 28. A hand 29 as an end effector (action part) is attached to the tip of the third arm 28, and two cameras 31a and 31b are attached via a support bar 30. These two cameras 31a and 31b are arranged so that their imaging optical axes intersect and constitute a so-called stereo camera 31.
 Note that the configuration of the robot applicable to the present invention is not limited to that of the robot 25 of this example; the robot only needs to have (i) a camera, (ii) a hand portion for gripping a workpiece or a tool, (iii) a second arm portion movably connected to the hand portion, and (iv) a first arm portion movably connected to the second arm portion. Compared with the robot 25 of this embodiment, the hand portion corresponds to the hand 29, the second arm portion corresponds to the second arm 27 and the joint portion rotatably (movably) coupled to it, and the first arm portion corresponds to the first arm 26 and the joint portion rotatably (movably) coupled to it. The third arm 28 of the robot of this embodiment, together with its joint portion that is rotatable or extendable/retractable (movable), may also be understood to correspond to the second arm portion. In other words, although the robot 25 of this example has three arms, the robot need only have at least two arms.
As shown in FIG. 2, the control device 40 comprises an operation program storage unit 41, a movement position storage unit 42, an operation posture storage unit 43, a map information storage unit 44, a reference data storage unit 45, a manual operation control unit 46, an automatic operation control unit 47, a map information generation unit 48, a position recognition unit 49, a point cloud data calculation unit 50, a correction amount calculation unit 51 and an input/output interface 52. Via the input/output interface 52, the control device 40 is connected to the machine tool 10, the material stocker 20, the product stocker 21, the robot 25, the stereo camera 31, the automatic guided vehicle 35 and the operation panel 37. The control device 40 is not limited to this configuration; it need only have at least a control unit for controlling the position of the hand 29 of the robot 25, and the other storage units and the like may be provided in a separate device.
The control device 40 of this example is constituted by a computer including a CPU, RAM, ROM and the like. The functions of the manual operation control unit 46, the automatic operation control unit 47, the map information generation unit 48, the position recognition unit 49, the point cloud data calculation unit 50, the correction amount calculation unit 51 and the input/output interface 52 are realized by computer programs, and these units execute the processing described later. The operation program storage unit 41, the movement position storage unit 42, the operation posture storage unit 43, the map information storage unit 44 and the reference data storage unit 45 are constituted by an appropriate storage medium such as RAM. In this example, the control device 40 is attached to the automatic guided vehicle 35 and is connected to the machine tool 10, the material stocker 20 and the product stocker 21 by appropriate communication means, and to the robot 25, the stereo camera 31, the automatic guided vehicle 35 and the operation panel 37 by wire or wirelessly. The control device 40 is not limited to such a configuration, however, and may be disposed at an appropriate position other than the automatic guided vehicle 35, in which case it is connected to each unit by appropriate communication means.
The manual operation control unit 46 is a functional unit that operates the automatic guided vehicle 35, the robot 25 and the stereo camera 31 in accordance with operation signals input by the operator from the operation panel 37. That is, under the control of the manual operation control unit 46, the operator can manually operate the automatic guided vehicle 35, the robot 25 and the stereo camera 31 using the operation panel 37.
The operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during production, and a map generation program for operating the automatic guided vehicle 35 when generating map information of the factory, described later. The automatic operation program and the map generation program are input, for example, from the input/output unit provided on the operation panel 37 and are stored in the operation program storage unit 41.
The automatic operation program includes command codes relating to the movement position as a target position to which the automatic guided vehicle 35 moves, the movement speed and the orientation of the automatic guided vehicle 35, as well as command codes relating to the operations which the robot 25 performs in sequence and command codes relating to the operation of the stereo camera 31. The map generation program includes command codes for causing the automatic guided vehicle 35 to travel tracklessly throughout the factory so that the map information generation unit 48 can generate the map information.
The map information storage unit 44 is a functional unit that stores map information including arrangement information of the machines, devices, equipment and the like (devices etc.) arranged in the factory in which the automatic guided vehicle 35 travels; this map information is generated by the map information generation unit 48.
The map information generation unit 48 acquires spatial information of the interior of the factory from the distance data detected by the sensor when the automatic guided vehicle 35 is caused to travel in accordance with the map generation program stored in the operation program storage unit 41, under the control of the automatic operation control unit 47 of the control device 40 described in detail later. It also recognizes the planar shapes of the devices and the like disposed in the factory and, based for example on planar shapes registered in advance, recognizes the positions, planar shapes and the like (arrangement information) of the specific devices disposed in the factory, in this example the machine tool 10, the material stocker 20 and the product stocker 21. The map information generation unit 48 then stores the obtained spatial information and arrangement information of the devices and the like in the map information storage unit 44 as map information of the factory.
The position recognition unit 49 is a functional unit that recognizes the position of the automatic guided vehicle 35 within the factory based on the distance data detected by the sensor and the map information of the factory stored in the map information storage unit 44; based on the position of the automatic guided vehicle 35 recognized by the position recognition unit 49, the operation of the automatic guided vehicle 35 is controlled by the automatic operation control unit 47.
The movement position storage unit 42 is a functional unit that stores specific movement positions, namely target positions to which the automatic guided vehicle 35 moves, corresponding to the command codes in the operation program; these movement positions include the work positions set for the machine tool 10, the material stocker 20 and the product stocker 21 described above. A movement position is set, for example, by manually operating the automatic guided vehicle 35 with the operation panel 37 under the control of the manual operation control unit 46 to move it to each target position, and then storing the position data recognized by the position recognition unit 49 in the movement position storage unit 42. This operation is a so-called teaching operation.
The operation posture storage unit 43 is a functional unit that stores data relating to the postures (operation postures) of the robot 25 that change in sequence as the robot 25 operates in a predetermined order, corresponding to the command codes in the operation program. The data relating to the operation postures are the rotation angle data of each joint (motor) of the robot 25 in each target posture obtained when the robot 25 is manually operated by a teaching operation using the operation panel 37 under the control of the manual operation control unit 46; these rotation angle data are stored in the operation posture storage unit 43 as the data relating to the operation postures.
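To make the stored content concrete, the following is a minimal sketch of how taught operation postures could be held as per-joint rotation angles; the class name, field names and values are illustrative assumptions and are not part of the disclosure.

```python
from dataclasses import dataclass

@dataclass
class TaughtPosture:
    """One taught operation posture: a name and the rotation angle of each
    robot joint (motor), in degrees, recorded during the teaching operation."""
    name: str
    joint_angles_deg: tuple[float, ...]

# Illustrative entries such as an operation posture storage unit might hold.
taught_postures = [
    TaughtPosture("take-out start", (0.0, -35.0, 90.0, 0.0, 55.0, 0.0)),
    TaughtPosture("imaging", (12.5, -20.0, 75.0, 5.0, 60.0, -3.0)),
]
```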
Specific operation postures of the robot 25 are set for the material stocker 20, the machine tool 10 and the product stocker 21, respectively. For example, for the material stocker 20, a work start posture (take-out start posture) adopted when starting work at the material stocker 20, each work posture (each take-out posture) for gripping an unmachined workpiece W stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20, and a posture adopted when the take-out is completed (take-out completion posture, which in this example is the same as the take-out start posture) are set as take-out operation postures.
For the machine tool 10, a workpiece removal operation posture for taking the machined workpiece W' out of the machine tool 10 and a workpiece attachment operation posture for attaching the unmachined workpiece W to the machine tool 10 are set.
Specifically, as the workpiece removal operation postures there are set, for example: a work start posture before entering the machine tool 10; a posture in which the hand 29 and the stereo camera 31 are advanced into the machining area of the machine tool 10 and the chuck 12 is imaged by the stereo camera 31 (imaging posture) (see FIG. 5); a posture in which the hand 29 faces the machined workpiece W' gripped by the chuck 12 (removal preparation posture); a posture in which the hand 29 is moved toward the chuck 12 and grips the machined workpiece W' held by the chuck 12 (gripping posture); a posture in which the hand 29 is moved away from the chuck 12 and the machined workpiece W' is detached from the chuck 12 (detachment posture); and a posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10 (work completion posture).
As the workpiece attachment operation postures there are set, for example: a work start posture before entering the machine tool 10; a posture in which the hand 29 and the stereo camera 31 are advanced into the machining area of the machine tool 10 and the chuck 12 is imaged by the stereo camera 31 (imaging posture) (see FIG. 5); a posture in which the unmachined workpiece W gripped by the hand 29 faces the chuck 12 of the machine tool 10 (attachment preparation posture); a posture in which the hand 29 is moved toward the chuck 12 so that the unmachined workpiece W can be gripped by the chuck 12 (attachment posture); a posture in which the hand 29 is separated from the chuck 12 (separation posture); and a posture in which the hand 29 and the camera 31 are withdrawn from the machine tool 10 (work completion posture). Although FIG. 5 shows the stereo camera 31 facing the chuck 12 squarely, the imaging posture is not limited to this, and the stereo camera 31 may be oblique to the chuck 12. The two-dot chain lines in FIG. 5 indicate the imaging field of view of the stereo camera 31.
For the product stocker 21, a work start posture (storage start posture) adopted when starting work at the product stocker 21, each work posture (storage posture) for storing the machined workpiece W' gripped by the hand 29 in the product stocker 21, and a posture adopted when storage is completed (storage completion posture, which in this example is the same as the storage start posture) are set as storage operation postures.
The automatic operation control unit 47 is a functional unit that operates the automatic guided vehicle 35, the robot 25 and the camera 31 using either the automatic operation program or the map generation program stored in the operation program storage unit 41, in accordance with that program. At that time, the data stored in the movement position storage unit 42 and the operation posture storage unit 43 are used as necessary.
Based on the two images captured by the cameras 31a and 31b, the point cloud data calculation unit 50 calculates, from the focal length of each camera 31a, 31b, the distance between the cameras 31a and 31b, and the parallax between the two cameras 31a and 31b, the position in the three-dimensional space of the camera coordinate system set for the stereo camera 31 of each element obtained by dividing the imaged object into elements of a predetermined size. These elements are recognized as a point cloud forming the imaged object, and their position data are the position data of the point cloud in the three-dimensional space of the camera coordinate system (three-dimensional point cloud position data).
In this example, when the automatic guided vehicle 35 is at the work position set for the machine tool 10 and the robot 25 adopts the above-described imaging posture, the point cloud data calculation unit 50 calculates, from the two images of the chuck 12 captured by the stereo camera 31, the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system.
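For reference only, the following is a minimal sketch, not part of the disclosed embodiment, of how a 3D position can be recovered from a rectified stereo pair using the focal length, baseline and parallax mentioned above; the variable names are illustrative assumptions.

```python
import numpy as np

def triangulate_rectified(u_left: float, u_right: float, v: float,
                          f_px: float, baseline_mm: float,
                          cx: float, cy: float) -> np.ndarray:
    """Recover a 3D point (in the camera coordinate system, mm) from one
    pixel correspondence in a rectified stereo image pair.

    u_left, u_right : horizontal pixel coordinates of the same feature
    v               : vertical pixel coordinate (same row after rectification)
    f_px            : focal length in pixels
    baseline_mm     : distance between the two camera centers
    cx, cy          : principal point of the reference (left) camera
    """
    disparity = u_left - u_right            # parallax between the two views
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    z = f_px * baseline_mm / disparity      # depth from similar triangles
    x = (u_left - cx) * z / f_px            # back-projection through the pinhole model
    y = (v - cy) * z / f_px
    return np.array([x, y, z])

# Repeating this for every matched pixel yields the 3D point cloud of the
# imaged structure (e.g., the chuck) in the camera coordinate system.
```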
The reference data storage unit 45 is a functional unit that stores, as reference data, the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 from the images of the chuck 12 captured by the stereo camera 31 during the teaching operation, with the automatic guided vehicle 35 at the work position set for the machine tool 10 and the robot 25 in the imaging posture.
In addition to the reference data, the reference data storage unit 45 stores three-dimensional point cloud position data obtained from CAD data of the chuck 12, expressed in an object coordinate system set for the chuck 12. This object coordinate system can be defined, for example, by three orthogonal axes: an x-axis and a y-axis set orthogonally to each other in a plane perpendicular to the axis of the chuck 12, and a z-axis orthogonal to the x-axis and the y-axis.
When the robot 25 is operated automatically in accordance with the automatic operation program stored in the operation program storage unit 41 under the control of the automatic operation control unit 47, the chuck 12 is imaged by the stereo camera 31 with the robot 25 in the imaging posture and the current three-dimensional point cloud position data of the chuck 12 are calculated by the point cloud data calculation unit 50. The correction amount calculation unit 51 then estimates, based on these current three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45 (the three-dimensional point cloud position data of the chuck 12 calculated during the teaching operation), the amounts of positional error and rotational error of the stereo camera 31 between the current posture of the robot 25 and its posture during the teaching operation, and calculates, based on the estimated error amounts, a correction amount for the action portion in each work posture.
Specifically, based on the current three-dimensional point cloud position data of the chuck 12 obtained during automatic operation and the reference data and other data stored in the reference data storage unit 45, the correction amount calculation unit 51 executes the following processing to estimate the positional error amounts of the stereo camera 31 between the current posture of the robot 25 and its posture during the teaching operation, namely in the directions of the mutually orthogonal x-axis and y-axis set in a plane perpendicular to the axis of the chuck 12 and of the z-axis orthogonal to these axes, and the rotational error amounts of the stereo camera 31 about the x-axis, y-axis and z-axis, and corrects each work posture of the robot 25 based on the estimated positional error amounts and rotational error amounts.
(Pre-processing)
As pre-processing, the correction amount calculation unit 51 first obtains, based on the reference data acquired during the teaching operation, a coordinate transformation matrix for converting from the camera coordinate system, i.e. the coordinate system corresponding to the stereo camera 31, to the object coordinate system set for the chuck 12; this matrix is denoted here $T_{cam \to objT}$ (in the original publication the matrices and formulas of this section appear as equation images, so symbolic names, and reconstructions of the formulas where the text makes them unambiguous, are used in their place). The camera coordinate system can be defined, for example, by three orthogonal axes centered on the midpoint between the imaging elements (for example, CMOS sensors) of the cameras 31a and 31b, in the plane containing those imaging elements.
Specifically, the coordinate transformation matrix $T_{cam \to objT}$ can be calculated by the following procedure. First, based on the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data and stored in the reference data storage unit 45, the internal parameters of the stereo camera 31 (for example, the focal length of each camera) and the external parameters (the distance between the cameras 31a and 31b, the parallax and so on), a coordinate transformation matrix $T_{cam \to objCAD}$ from the camera coordinate system to the object coordinate system of the CAD data is obtained.
Next, using the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data, and the reference data likewise stored in the reference data storage unit 45, i.e. the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system at the time of the teaching operation, the correction amount calculation unit 51 superimposes the point cloud derived from the CAD data on the reference data, for example in accordance with a RANSAC algorithm (global registration) and an ICP algorithm (local registration), thereby obtaining a coordinate transformation matrix $T_{objCAD \to objT}$ from the object coordinate system of the CAD data to the object coordinate system of the chuck 12 at the time of the teaching operation. As conceptually shown in FIG. 6, if the figure drawn with a one-dot chain line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data and the figure drawn with a solid line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained during the teaching operation, the coordinate transformation matrix $T_{objCAD \to objT}$ is obtained by processing the data so that the one-dot chain line figure is superimposed on the solid line figure.
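The registration step just described can be illustrated, purely as a hedged sketch and not as the implementation of the disclosure, with the Open3D library, which provides ICP-based local registration; the global pre-alignment that the embodiment mentions (e.g., RANSAC on feature correspondences) would normally be run first to supply the initial guess.

```python
from typing import Optional
import numpy as np
import open3d as o3d

def register_point_clouds(source_xyz: np.ndarray,
                          target_xyz: np.ndarray,
                          init_guess: Optional[np.ndarray] = None,
                          max_corr_dist_mm: float = 2.0) -> np.ndarray:
    """Return a 4x4 homogeneous matrix mapping the source cloud (e.g., the CAD
    point cloud of the chuck) onto the target cloud (e.g., the measured
    reference cloud), refined with point-to-point ICP."""
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_xyz))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(target_xyz))
    if init_guess is None:
        init_guess = np.eye(4)  # in practice, the result of a RANSAC-based global alignment
    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist_mm, init_guess,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return np.asarray(result.transformation)
```

Running such a routine with the CAD-derived cloud as the source and the teaching-time cloud as the target would yield a matrix corresponding to $T_{objCAD \to objT}$; the same routine applied to the teaching-time and current clouds yields the teaching-to-current transform used later in this section.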
Then, from the coordinate transformation matrices $T_{cam \to objCAD}$ and $T_{objCAD \to objT}$ obtained as described above, the coordinate transformation matrix $T_{cam \to objT}$ is calculated in accordance with Formula 1 below.
(Formula 1)
$T_{cam \to objT} = T_{objCAD \to objT} \cdot T_{cam \to objCAD}$
After calculating the coordinate transformation matrix $T_{cam \to objT}$ in this way, the correction amount calculation unit 51 calculates, based on the calculated coordinate transformation matrix $T_{cam \to objT}$ and the camera position in the camera coordinate system at the time of imaging in the teaching operation, $p^{cam}_{camT}$, the camera position at the time of the teaching operation in the object coordinate system, $p^{objT}_{camT}$, in accordance with Formula 2 below.
(Formula 2)
$p^{objT}_{camT} = T_{cam \to objT} \cdot p^{cam}_{camT}$
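A minimal numerical sketch of the two operations above (composing homogeneous transformation matrices and mapping a position through them) is given below; the concrete matrices and values are illustrative assumptions only.

```python
import numpy as np

def make_transform(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Assemble a 4x4 homogeneous coordinate transformation matrix."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Assumed example transforms: camera -> CAD object, CAD object -> teaching object.
T_cam_to_objCAD = make_transform(np.eye(3), np.array([10.0, 0.0, 250.0]))
T_objCAD_to_objT = make_transform(np.eye(3), np.array([0.5, -0.2, 1.0]))

# Formula 1 analogue: chain the transforms (applied right to left).
T_cam_to_objT = T_objCAD_to_objT @ T_cam_to_objCAD

# Formula 2 analogue: map the camera position (here assumed to be the origin of
# the camera coordinate system) into the object coordinate system.
p_cam_in_cam = np.array([0.0, 0.0, 0.0, 1.0])   # homogeneous coordinates
p_cam_in_objT = T_cam_to_objT @ p_cam_in_cam
print(p_cam_in_objT[:3])
```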
(Camera position calculation processing at the time of the teaching operation)
Next, the correction amount calculation unit 51 calculates the camera position in the robot coordinate system at the time of the teaching operation, $p^{robT}_{camT}$, in accordance with Formula 3. The robot coordinate system is a three-dimensional coordinate system set so that the control device 40 can control the robot 25, and is defined by three orthogonal axes whose origin is set at an appropriate position.
Subsequently, the correction amount calculation unit 51 calculates a coordinate transformation matrix $T_{camT \to robT}$ for converting from the camera coordinate system at the time of the teaching operation to the robot coordinate system at the time of the teaching operation, in accordance with Formula 4. From the rotation matrix component $R_{camT \to robT}$ of $T_{camT \to robT}$, the rotation angles about the x-axis, y-axis and z-axis, $rx_t$, $ry_t$ and $rz_t$, are then calculated.
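How rotation angles about the x-, y- and z-axes can be read out of a rotation matrix is shown in the following hedged sketch; it assumes the common Z-Y-X (roll-pitch-yaw) composition, which the publication does not specify.

```python
import numpy as np

def rotation_to_xyz_angles(R: np.ndarray) -> tuple[float, float, float]:
    """Extract angles (rx, ry, rz) from a 3x3 rotation matrix assumed to be
    composed as R = Rz(rz) @ Ry(ry) @ Rx(rx). Angles are in radians."""
    ry = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    if np.isclose(np.cos(ry), 0.0):           # gimbal-lock fallback: rx and rz are coupled
        rx = 0.0
        rz = np.arctan2(-R[0, 1], R[1, 1])
    else:
        rx = np.arctan2(R[2, 1], R[2, 2])
        rz = np.arctan2(R[1, 0], R[0, 0])
    return rx, ry, rz
```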
Here, $T_{objT \to robT}$ is a coordinate transformation matrix for converting from the object coordinate system to the robot coordinate system at the time of the teaching operation. It can be obtained, for example, from the coordinate transformation matrix $T_{objT \to camT}$ for converting from the object coordinate system to the camera coordinate system at the time of the teaching operation and the coordinate transformation matrix $T_{camT \to robT}$ for converting from that camera coordinate system to the robot coordinate system at the time of the teaching operation, in accordance with Formula 5 below.
(Formula 5)
$T_{objT \to robT} = T_{camT \to robT} \cdot T_{objT \to camT}$
(Camera position calculation processing during automatic operation)
Next, based on the current three-dimensional point cloud position data of the chuck 12 in the camera coordinate system calculated by the point cloud data calculation unit 50 during automatic operation (actual operation), the correction amount calculation unit 51 calculates, in the same manner as described above, a coordinate transformation matrix $T_{camC \to objC}$ for converting from the camera coordinate system to the object coordinate system, in accordance with Formula 6 below.
(Formula 6)
$T_{camC \to objC} = T_{objT \to objC} \cdot T_{camC \to objT}$
The coordinate transformation matrix $T_{camC \to objT}$ from the camera coordinate system to the object coordinate system at the time of the teaching operation can, in the same manner as described above, be calculated based on the three-dimensional point cloud position data of the chuck 12 in the object coordinate system at the time of the teaching operation stored in the reference data storage unit 45, the internal parameters of the stereo camera 31 (for example, the focal length of each camera) and the external parameters (the distance between the cameras 31a and 31b, the parallax and so on).
The coordinate transformation matrix $T_{objT \to objC}$ from the object coordinate system at the time of the teaching operation to the object coordinate system during automatic operation can be calculated by taking the reference data stored in the reference data storage unit 45, i.e. the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system at the time of the teaching operation, and the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 during automatic operation, and superimposing the teaching-time three-dimensional point cloud position data on the automatic-operation three-dimensional point cloud position data, for example in accordance with the RANSAC algorithm (global registration) and the ICP algorithm (local registration). Referring again to FIG. 6, if the figure drawn with a one-dot chain line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system at the time of the teaching operation and the figure drawn with a solid line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system during automatic operation, the coordinate transformation matrix $T_{objT \to objC}$ is obtained by processing the data so that the one-dot chain line figure is superimposed on the solid line figure.
After calculating the coordinate transformation matrix $T_{camC \to objC}$ in this way, the correction amount calculation unit 51 calculates, based on the calculated coordinate transformation matrix $T_{camC \to objC}$ and the camera position in the camera coordinate system at the time of imaging during automatic operation, $p^{cam}_{camC}$, the camera position during automatic operation in the object coordinate system, $p^{obj}_{camC}$, in accordance with Formula 7 below.
(Formula 7)
$p^{obj}_{camC} = T_{camC \to objC} \cdot p^{cam}_{camC}$
Next, the correction amount calculation unit 51 calculates the current camera position in the object coordinate system during automatic operation in accordance with Formula 8, and the current camera position in the robot coordinate system during automatic operation in accordance with Formula 9.
Subsequently, the correction amount calculation unit 51 calculates a coordinate transformation matrix $T_{camC \to robT}$ for converting from the current camera coordinate system to the robot coordinate system at the time of the teaching operation, in accordance with Formula 10. From the rotation matrix component $R_{camC \to robT}$ of $T_{camC \to robT}$, the rotation angles about the x-axis, y-axis and z-axis, $rx_c$, $ry_c$ and $rz_c$, are then calculated.
(Error amount calculation processing)
Next, based on the camera angles at the time of the teaching operation in the teaching-operation coordinate system, $rx_t$, $ry_t$ and $rz_t$, calculated as described above, and the current camera angles in the teaching-operation coordinate system, $rx_c$, $ry_c$ and $rz_c$, the correction amount calculation unit 51 calculates the differences between them, thereby obtaining the rotational errors Δrx, Δry and Δrz about the x-axis, y-axis and z-axis (for example, Δrx = rx_c − rx_t, and likewise for Δry and Δrz).
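Purely as an illustrative aside (the publication only states that the differences are taken), angle differences are usually wrapped into a half-open interval around zero so that, for instance, a change from 179° to −179° is treated as a 2° error rather than a 358° one; a minimal sketch:

```python
import numpy as np

def angular_error(current_rad: float, teach_rad: float) -> float:
    """Difference between two angles, wrapped to the range [-pi, pi)."""
    d = current_rad - teach_rad
    return (d + np.pi) % (2.0 * np.pi) - np.pi

d_rx = angular_error(np.deg2rad(-179.0), np.deg2rad(179.0))
print(np.rad2deg(d_rx))  # ~2.0 degrees, not -358
```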
Next, based on the rotational errors Δrx, Δry and Δrz calculated as described above, the correction amount calculation unit 51 calculates the rotation matrix between the robot coordinate system at the time of the teaching operation and the current robot coordinate system, i.e. the rotational error amount, in accordance with Formula 11, and calculates the translation matrix from the robot coordinate system at the time of the teaching operation to the current robot coordinate system, i.e. the positional error amount, in accordance with Formula 12.
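A hedged sketch of one conventional way to assemble such an error transform is given below; the rotation order (here Rz·Ry·Rx) and the use of the camera-position difference as the translation are assumptions, since the publication gives the corresponding formulas only as equation images.

```python
import numpy as np

def rot_x(a): return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
def rot_y(a): return np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])
def rot_z(a): return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])

def error_transform(d_rx: float, d_ry: float, d_rz: float,
                    p_cam_teach: np.ndarray, p_cam_now: np.ndarray) -> np.ndarray:
    """Assemble a 4x4 transform describing the pose error between the
    teaching-time robot coordinate system and the current one."""
    R_err = rot_z(d_rz) @ rot_y(d_ry) @ rot_x(d_rx)   # Formula 11 analogue (order assumed)
    t_err = p_cam_now - p_cam_teach                   # Formula 12 analogue (assumption)
    T = np.eye(4)
    T[:3, :3] = R_err
    T[:3, 3] = t_err
    return T
```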
(Correction amount calculation processing)
Next, based on the error amounts calculated as described above, the correction amount calculation unit 51 calculates a correction amount for compensating these error amounts, in accordance with Formula 13.
Then, based on the correction amount calculated by the correction amount calculation unit 51, the automatic operation control unit 47 corrects the position of the hand 29 in the subsequent operation postures of the robot 25 in accordance with Formula 14.
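As an illustrative sketch only (the disclosed Formula 14 itself is given as an equation image), one common way to apply such a correction is to map each taught hand position through the estimated correction transform before commanding the robot:

```python
import numpy as np

def correct_hand_positions(T_correction: np.ndarray,
                           taught_positions: list) -> list:
    """Map each taught hand position (3-vector in the robot coordinate system)
    through a 4x4 correction transform; orientation handling is omitted for brevity."""
    corrected = []
    for p in taught_positions:
        p_h = np.append(p, 1.0)               # homogeneous coordinates
        corrected.append((T_correction @ p_h)[:3])
    return corrected

# Example with a purely illustrative 1 mm shift along x as the "correction".
T_corr = np.eye(4)
T_corr[0, 3] = 1.0
print(correct_hand_positions(T_corr, [np.array([100.0, 0.0, 50.0])]))
```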
According to the system 1 of this example having the above configuration, unmanned automatic production is carried out as follows.
That is, under the control of the automatic operation control unit 47 of the control device 40, the automatic operation program stored in the operation program storage unit 41 is executed, and in accordance with this program the automatic guided vehicle 35 and the robot 25 operate, for example, as follows.
First, the automatic guided vehicle 35 moves to the work position set for the machine tool 10, and the robot 25 adopts the work start posture of the above-described workpiece removal operation. At this time it is assumed that the machine tool 10 has completed the predetermined machining and that the door cover is open so that the robot 25 can enter the machining area.
Next, the robot 25 shifts to the imaging posture and images the chuck 12 with the stereo camera 31. When the chuck 12 has been imaged by the stereo camera 31 in this way, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12, and the correction amount calculation unit 51 estimates, based on these three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the error amounts between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture in accordance with Formulas 11 and 12, and calculates, based on the estimated error amounts, the correction amounts for the subsequent workpiece removal operation postures of the robot 25 in accordance with Formula 13.
The automatic operation control unit 47 then corrects, based on the correction amounts calculated by the correction amount calculation unit 51, the position of the hand 29 in the subsequent workpiece removal operation postures, i.e. the above-described removal preparation posture, gripping posture, detachment posture and work completion posture, in accordance with Formula 14, and the machined workpiece W' gripped by the chuck 12 of the machine tool 10 is gripped by the hand 29 and taken out of the machine tool 10. After causing the robot 25 to adopt the gripping posture, the automatic operation control unit 47 transmits a chuck open command to the machine tool 10, whereby the chuck 12 is opened.
Next, the automatic operation control unit 47 moves the automatic guided vehicle 35 to the work position set for the product stocker 21 and causes the robot 25 to adopt, in sequence, the storage start posture for starting work at the product stocker 21, each storage posture for storing the machined workpiece gripped by the hand 29 in the product stocker 21, and the storage completion posture, so that the machined workpiece gripped by the hand 29 is stored in the product stocker 21.
The automatic operation control unit 47 then moves the automatic guided vehicle 35 to the work position set for the material stocker 20 and causes the robot 25 to adopt, in sequence, the take-out start posture for starting work at the material stocker 20, each take-out posture for gripping an unmachined workpiece stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20, and the take-out completion posture, so that an unmachined workpiece is gripped by the hand 29.
Next, the automatic operation control unit 47 again moves the automatic guided vehicle 35 to the work position set for the machine tool 10 and causes the robot 25 to adopt the work start posture of the above-described workpiece attachment operation. The robot 25 is then shifted to the imaging posture and the chuck 12 is imaged by the stereo camera 31. When the chuck 12 has been imaged by the stereo camera 31 in this way, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12, and the correction amount calculation unit 51 estimates, based on these three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the error amounts between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture in accordance with Formulas 11 and 12, and calculates, based on the estimated error amounts, the correction amounts for the subsequent workpiece attachment operation postures of the robot 25 in accordance with Formula 13.
Thereafter, the automatic operation control unit 47 corrects, based on the correction amounts calculated by the correction amount calculation unit 51, the position of the hand 29 in the subsequent workpiece attachment operation postures of the robot 25, i.e. the above-described attachment preparation posture, attachment posture, separation posture and work completion posture, in accordance with Formula 14, causes the robot 25 to attach the unmachined workpiece W gripped by the hand 29 to the chuck 12 of the machine tool 10 and then to withdraw from the machine. The automatic operation control unit 47 then transmits a machining start command to the machine tool 10 to cause the machine tool 10 to perform the machining operation. After causing the robot 25 to adopt the attachment posture, the automatic operation control unit 47 transmits a chuck close command to the machine tool 10, whereby the chuck 12 is closed and the unmachined workpiece W is gripped by the chuck 12.
By repeating the above, the system 1 of this example carries out unmanned automatic production continuously.
As described above, in the system 1 of this example the chuck 12, which is a structure constituting the machine tool 10 on which the robot 25 actually works, is imaged by the stereo camera 31, and the working posture of the robot 25 is corrected based on the obtained images. Unlike the prior art, therefore, there is no need to prepare a special component such as a calibration marker in order to correct the posture of the robot 25, nor to carry out the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool 10. A scheme for correcting the posture of the robot 25 can thus be set up easily. Furthermore, in the system 1 of this example the posture of the robot 25 can be corrected by imaging a single structure, the chuck 12, so the imaging of the chuck 12 by the stereo camera 31 can be carried out in a single motion; as a result, the operating time of the robot 25 required for imaging can be made shorter than before, and the production efficiency of the system 1 can be increased accordingly.
Among the operations performed by the robot 25 of this example, the operation on the chuck 12 demands the highest accuracy. In the system 1 of this example, the working posture of the robot 25 is corrected based on images obtained by imaging, with the stereo camera 31, the very chuck 12 on which accurate work is required, so the working posture can be corrected accurately; consequently, the robot 25 can carry out even operations requiring high motion accuracy with good precision.
Since the automatic guided vehicle 35 is configured to move by means of wheels having a comparatively high degree of freedom, the mounting surface on which the robot 25 is mounted tilts easily with respect to the floor, and this tilt tends to vary with changes in the posture of the robot 25, in other words with changes in the position of the center of gravity of the robot 25. For this reason, when the workpieces W and W' are attached and detached as described above and the robot 25 advances its hand 29 and the stereo camera 31 into the machining area of the machine tool 10 so that the hand 29 overhangs the automatic guided vehicle 35 considerably, the tilt of the mounting surface is larger than when the hand 29 is outside the machining area of the machine tool 10 and does not overhang the automatic guided vehicle 35, or overhangs it only slightly.
Therefore, in an arrangement in which, as in the conventional position correction method described above, a visual target serving as a calibration marker is disposed on the outer surface of the machine tool 10 and the posture correction amount of the robot 25 is acquired with the robot 25 outside the machining area of the machine tool 10, this posture correction amount does not accurately reflect the posture error of the robot 25 when the hand 29 of the robot 25 is inside the machining area of the machine tool 10, and the posture of the robot 25 therefore cannot be corrected accurately.
If the posture of the robot 25 when attaching and detaching the workpieces W and W' cannot be corrected accurately, the hand 29 of the robot 25 cannot be positioned accurately with respect to the chuck 12. For example, when the chuck 12 is a collet chuck or the like in which the stroke of the gripping portion is extremely small, i.e. the clearance between the workpieces W, W' and the collet chuck is extremely small, the collet chuck may fail to grip the workpieces W, W' reliably. If the attachment and detachment of the workpieces W and W' cannot be carried out reliably, the operating rate of the system 1 falls, and unmanned operation with good production efficiency cannot be achieved.
In the system 1 of this example, as described above, the working posture of the robot 25 is corrected based on images obtained by imaging the chuck 12, on which accurate work is required, with the stereo camera 31, in other words based on images obtained when the robot adopts a posture similar to the posture in which accurate work is required, so the working posture can be corrected accurately. The robot 25 can therefore carry out even operations requiring high motion accuracy with good precision. Moreover, because the robot 25 performs its work accurately, the system 1 operates at a high operating rate without unnecessary interruptions, and as a result the system 1 makes it possible to achieve highly reliable, highly productive unmanned operation.
Although one embodiment of the present invention has been described above, the specific forms that the present invention can take are in no way limited to it.
For example, in the above example, as the most preferable form, the chuck 12, which demands the most accurate work among the operations of the robot 25, was set as the structure to be imaged by the stereo camera 31, but the invention is not limited to such a form. Depending on the part on which the robot 25 works, other structures constituting the machine tool 10, for example a tool post, a tool or a carriage likewise disposed within the machining area, may be set as the imaging target. Alternatively, when such high correction accuracy is not required, structures such as a door disposed outside the machining area, a handle attached to the door, an operation panel or an indicator light may be set as the imaging target. Such forms also provide the effect that a scheme for correcting the posture of the robot 25 can be set up easily, without preparing special components and without troublesome preparatory work. In addition, since the posture of the robot 25 can be corrected by imaging a single structure, the imaging of that structure with the stereo camera 31 can be carried out in a single motion; as a result, the operating time of the robot 25 required for imaging can be made shorter than before, and the production efficiency of the system 1 can be increased accordingly. When the imaging target is a structure provided outside the machining area, the imaging posture is a posture adopted before the hand 29 and the stereo camera 31 enter the machining area of the machine tool 10, that is, a posture in which the imaging target is imaged while the hand 29 and the stereo camera 31 are outside the machining area of the machine tool 10.
In the embodiment described above, the working posture of the robot 25 is corrected in its work on the machine tool 10, but the working posture of the robot 25 may be corrected in the same manner in work on industrial machines other than the machine tool 10, for example, in this example, the material stocker 20 and the product stocker 21. As noted above, the industrial machine according to the present invention includes any machine used industrially, such as a measuring device, a cleaning device, a washing device, or a robot other than the robot 25 of this example, and when the robot 25 works on such an industrial machine its working posture can be corrected in the manner described above.
Further, although the embodiment described above uses the automatic guided vehicle 35, the invention is not limited to this, and the transport device may be one that is moved by being pushed by a person, like a conventionally known general cart. The robot 25 may be mounted on such a transport device, the transport device may be manually carried to the work position of the machine tool 10, and the robot 25 may be caused to attach and detach workpieces to and from the machine tool 10.
To reiterate, the above description of the embodiment is illustrative in all respects and is not restrictive. Modifications and variations can be made as appropriate by those skilled in the art. The scope of the present invention is indicated not by the embodiment described above but by the claims, and further includes modifications of the embodiment within the scope of the claims and their equivalents.
1 System
10 Machine tool
11 Spindle
12 Chuck
20 Material stocker
21 Product stocker
25 Robot
29 Hand
31 Stereo camera
35 Automatic guided vehicle
37 Operation panel
40 Control device
41 Operation program storage unit
42 Movement position storage unit
43 Operation posture storage unit
44 Map information storage unit
45 Reference data storage unit
46 Manual operation control unit
47 Automatic operation control unit
48 Map information generation unit
49 Position recognition unit
50 Point cloud data calculation unit
51 Correction amount calculation unit
52 Input/output interface
W Unmachined workpiece
W' Machined workpiece

Claims (5)

  1.  An automatic transport device comprising:
     a robot that performs work on an industrial machine, the robot comprising a stereo camera having two cameras for capturing images and an acting portion that acts on an object;
     a transport device on which the robot is mounted and which is configured to be movable to a working position set with respect to the industrial machine; and
     a control device configured to control the robot in accordance with an operation program including preset operation commands so as to cause the robot to sequentially take a work start posture, an imaging posture in which a structure constituting the industrial machine or provided on the industrial machine is imaged by the stereo camera, and one or more working postures for causing the acting portion to act on the object, and to control the transport device,
     wherein the work start posture, the imaging posture and the working postures are set in advance by a teaching operation on the robot, and
     wherein the control device is configured to:
     at the time of the teaching operation, with the robot shifted to the imaging posture, image the structure with the stereo camera and execute a process of generating, based on the obtained images, reference data consisting of three-dimensional point cloud position data of the structure in a camera coordinate system set for the stereo camera; and
     when the robot is actually operated in accordance with the operation program, with the transport device moved to the working position, shift the robot from the work start posture to the imaging posture, then image the structure with the stereo camera and execute a process of generating, based on the obtained images, current data consisting of three-dimensional point cloud position data of the structure in the camera coordinate system, estimate, based on the obtained current data and the reference data, error amounts of the position of the stereo camera between the current posture of the robot and the posture at the time of the teaching operation, namely, position error amounts of the stereo camera in directions of a predetermined first axis and a second axis orthogonal to each other and in a direction of a third axis orthogonal to the first axis and the second axis, and rotation error amounts of the stereo camera about the first axis, the second axis and the third axis, and correct the position of the acting portion in the working posture based on the estimated position error amounts and rotation error amounts.
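     As a non-authoritative illustration of the error estimation recited in claim 1: the rigid motion between the reference point cloud and the current point cloud can be recovered with a standard SVD-based (Kabsch) fit. The sketch below assumes that point-to-point correspondences between the two clouds are already established; all function and variable names are hypothetical, since the patent does not disclose a specific registration algorithm.

```python
import numpy as np

def estimate_pose_error(reference_pts, current_pts):
    """Estimate the rigid transform mapping the reference point cloud onto the
    current point cloud (corresponding (N, 3) arrays in the camera coordinate
    system). Returns the rotation matrix, translation vector, and the rotation
    error amounts about the x, y and z axes (radians)."""
    ref_c = reference_pts.mean(axis=0)
    cur_c = current_pts.mean(axis=0)
    # Cross-covariance of the centered clouds
    H = (reference_pts - ref_c).T @ (current_pts - cur_c)
    U, _, Vt = np.linalg.svd(H)
    # Kabsch rotation, with a guard against reflections
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cur_c - R @ ref_c
    # Per-axis rotation error amounts (R = Rz @ Ry @ Rx decomposition)
    drx = np.arctan2(R[2, 1], R[2, 2])
    dry = np.arcsin(-R[2, 0])
    drz = np.arctan2(R[1, 0], R[0, 0])
    return R, t, (drx, dry, drz)
```

     In this reading, the three components of t correspond to the position error amounts along the first, second and third axes, and the decomposition of R gives the rotation error amounts about those axes.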
  2.  A system comprising:
     an industrial machine;
     a robot that performs work on the industrial machine, the robot comprising a stereo camera having two cameras for capturing images and an acting portion that acts on an object;
     a transport device on which the robot is mounted and which is configured to be movable to a working position set with respect to the industrial machine; and
     a control device configured to cause the robot, in accordance with an operation program including preset operation commands, to sequentially take a work start posture, an imaging posture in which a structure constituting the industrial machine or provided on the industrial machine is imaged by the stereo camera, and one or more working postures for causing the acting portion to act on the object,
     wherein the work start posture, the imaging posture and the working postures are set in advance by a teaching operation on the robot, and
     wherein the control device is configured to:
     at the time of the teaching operation, with the robot shifted to the imaging posture, image the structure with the stereo camera and execute a process of generating, based on the obtained images, reference data consisting of three-dimensional point cloud position data of the structure in a camera coordinate system set for the stereo camera; and
     when the robot is actually operated in accordance with the operation program, with the transport device moved to the working position, shift the robot from the work start posture to the imaging posture, then image the structure with the stereo camera and execute a process of generating, based on the obtained images, current data consisting of three-dimensional point cloud position data of the structure in the camera coordinate system, estimate, based on the obtained current data and the reference data, error amounts of the position of the stereo camera between the current posture of the robot and the posture at the time of the teaching operation, namely, position error amounts of the stereo camera in directions of a predetermined first axis and a second axis orthogonal to each other and in a direction of a third axis orthogonal to the first axis and the second axis, and rotation error amounts of the stereo camera about the first axis, the second axis and the third axis, and correct the position of the acting portion in the working posture based on the estimated position error amounts and rotation error amounts.
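     For context only, the following is a minimal, hedged sketch of how the three-dimensional point cloud position data referred to above could be produced from a rectified image pair of the two cameras, using OpenCV block matching and reprojection. The matcher parameters and the reprojection matrix Q (obtained from stereo calibration/rectification) are assumptions, not values taken from the patent.

```python
import cv2
import numpy as np

def point_cloud_from_stereo(left_gray, right_gray, Q):
    """Compute a 3D point cloud (in the camera coordinate system) from a
    rectified grayscale stereo pair. Q is the 4x4 disparity-to-depth
    reprojection matrix, e.g. from cv2.stereoRectify."""
    stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    # SGBM returns 16-bit fixed-point disparities scaled by 16
    disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
    points = cv2.reprojectImageTo3D(disparity, Q)   # H x W x 3 array
    valid = disparity > 0                           # keep pixels with a match
    return points[valid].reshape(-1, 3)
```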
  3.  The system according to claim 2, wherein the control device is configured to execute:
     a process of calculating, based on the reference data, a camera position [formula I000001] at the time of imaging by the teaching operation in a robot coordinate system set for the robot, namely, in the robot coordinate system at the time of the teaching operation;
     a process of calculating, based on the current data obtained during the actual operation, a camera position [formula I000002] at the time of imaging during the actual operation in the robot coordinate system at the time of the teaching operation;
     a process of estimating, based on the camera position [formula I000003] and the camera position [formula I000004] in the robot coordinate system at the time of the teaching operation, position error amounts of the stereo camera in the first-axis, second-axis and third-axis directions in the robot coordinate system during the actual operation, and rotation error amounts about the first axis, the second axis and the third axis; and
     a process of correcting the position of the acting portion in the working posture based on the estimated position error amounts and rotation error amounts.
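     A hedged sketch of the comparison described in claim 3, using 4x4 homogeneous transforms: the camera pose in the teaching-time robot coordinate system can be obtained by combining the (fixed) pose of the imaged structure in that coordinate system with the structure pose estimated from each point cloud in the camera coordinate system. The per-axis error decomposition assumes an x-y-z Euler convention; all names are hypothetical, and this is not the patent's own formulation.

```python
import numpy as np

def camera_pose_in_robot_frame(T_robot_struct, T_cam_struct):
    """Camera pose in the teaching-time robot coordinate system.

    T_robot_struct: 4x4 pose of the imaged structure in the teaching-time
                    robot coordinate system (fixed, as the structure does not move).
    T_cam_struct:   4x4 pose of the structure in the camera coordinate system,
                    fitted to the reference or current point cloud.
    """
    return T_robot_struct @ np.linalg.inv(T_cam_struct)

def pose_error(T_teach, T_now):
    """Position and rotation error amounts of the camera between the teaching
    operation and the actual operation, both poses expressed in the
    teaching-time robot coordinate system."""
    dt = T_now[:3, 3] - T_teach[:3, 3]            # position errors along x, y, z
    R_err = T_now[:3, :3] @ T_teach[:3, :3].T     # rotation error in the robot frame
    drx = np.arctan2(R_err[2, 1], R_err[2, 2])    # rotation error about x
    dry = np.arcsin(-R_err[2, 0])                 # rotation error about y
    drz = np.arctan2(R_err[1, 0], R_err[0, 0])    # rotation error about z
    return dt, (drx, dry, drz)
```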
  4.  The system according to claim 3, wherein the control device is configured to execute:
     after calculating the camera position [formula I000005] and the camera position [formula I000006] in the robot coordinate system at the time of the teaching operation,
     a process of calculating rotation error amounts Δrx, Δry and Δrz about an x-axis, a y-axis and a z-axis, with the first axis taken as the x-axis, the second axis as the y-axis and the third axis as the z-axis, by calculating the differences between camera angles [formulas I000007, I000008, I000009] at the time of the teaching operation in the coordinate system at the time of the teaching operation and current camera angles [formulas I000010, I000011, I000012] in the coordinate system at the time of the teaching operation, and of calculating, based on the calculated rotation error amounts Δrx, Δry and Δrz, a rotation matrix [formula I000013] between the robot coordinate system at the time of the teaching operation and the robot coordinate system during the actual operation as the rotation error amount of the stereo camera in the robot coordinate system during the actual operation;
     a process of calculating, as the position error amount of the stereo camera in the robot coordinate system during the actual operation, a translation matrix [formula I000014] from the robot coordinate system at the time of the teaching operation to the robot coordinate system during the actual operation;
     a process of calculating a correction amount [formula I000017] based on the rotation matrix [formula I000015] and the translation matrix [formula I000016]; and
     a process of calculating a corrected position [formula I000019] of the acting portion in the working posture based on the correction amount [formula I000018].
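     To make the flow of claim 4 concrete, the following hedged sketch assembles a rotation matrix from the rotation error amounts Δrx, Δry and Δrz and applies it, together with a translation, to the taught position of the acting portion. The composition order of the elementary rotations and the form of the correction are assumptions; the patent gives its exact formulas only as the formula images referenced in claim 5.

```python
import numpy as np

def rotation_from_errors(drx, dry, drz):
    """Rotation matrix assembled from the rotation error amounts about x, y, z.
    The composition order (here Rz @ Ry @ Rx) is an assumption, not taken from
    the patent's formula images."""
    cx, sx = np.cos(drx), np.sin(drx)
    cy, sy = np.cos(dry), np.sin(dry)
    cz, sz = np.cos(drz), np.sin(drz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def corrected_position(p_teach, R_err, t_err):
    """Corrected target position of the acting portion: the taught position is
    mapped through the estimated rotation and translation between the
    teaching-time and actual-operation robot coordinate systems (assumed form)."""
    return R_err @ p_teach + t_err
```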
  5.  The system according to claim 4, wherein the control device is configured to calculate the rotation matrix [formula I000020], the translation matrix [formula I000021], the correction amount [formula I000022] and the corrected position [formula I000023] by the following calculation formulas:
     [formula I000024]
     [formula I000025]
     [formula I000026]
     [formula I000027]
     where [formula I000028] is the position of the acting portion at the time of the teaching operation in the robot coordinate system at the time of the teaching operation.
PCT/JP2022/023266 2021-09-06 2022-06-09 Automatic transport device, and system WO2023032400A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-144538 2021-09-06
JP2021144538A JP7093881B1 (en) 2021-09-06 2021-09-06 System and automatic guided vehicle

Publications (1)

Publication Number Publication Date
WO2023032400A1 true WO2023032400A1 (en) 2023-03-09

Family

ID=82217736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/023266 WO2023032400A1 (en) 2021-09-06 2022-06-09 Automatic transport device, and system

Country Status (2)

Country Link
JP (1) JP7093881B1 (en)
WO (1) WO2023032400A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (en) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 Automatic positioning method for loading and unloading of trolley type robot
CN116175256B (en) * 2023-04-04 2024-04-30 杭州纳志机器人科技有限公司 Automatic positioning method for loading and unloading of trolley type robot

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07146121A (en) * 1993-10-01 1995-06-06 Nippondenso Co Ltd Recognition method and device for three dimensional position and attitude based on vision
JPH0970781A (en) * 1995-09-07 1997-03-18 Shinko Electric Co Ltd Method for calibrating three dimensional position of self sustaining traveling robot
JP2010172986A (en) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd Robot vision system and automatic calibration method
JP2016170050A (en) * 2015-03-12 2016-09-23 キヤノン株式会社 Position attitude measurement device, position attitude measurement method and computer program
JP2020015102A (en) * 2018-07-23 2020-01-30 オムロン株式会社 Control system, control method, and program
JP2021035708A (en) * 2019-08-30 2021-03-04 Dmg森精機株式会社 Production system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4013578A4 (en) * 2019-09-11 2024-01-17 Dmg Mori Co Ltd Robot-mounted moving device, system, and machine tool

Also Published As

Publication number Publication date
JP7093881B1 (en) 2022-06-30
JP2023037769A (en) 2023-03-16

Similar Documents

Publication Publication Date Title
JP6785931B1 (en) Production system
US7200260B1 (en) Teaching model generating device
EP3222393B1 (en) Automated guidance system and method for a coordinated movement machine
US20220331970A1 (en) Robot-mounted moving device, system, and machine tool
JP5365379B2 (en) Robot system and robot system calibration method
US20160375532A1 (en) Fastening device, robot system, and fastening method for fastening plurality of fastening members
CN111470309A (en) Following robot and operation robot system
WO2022091767A1 (en) Image processing method, image processing device, robot mounted-type conveyance device, and system
WO2023032400A1 (en) Automatic transport device, and system
JP2022126768A (en) Robot-equipped moving device
JP2016203282A (en) Robot with mechanism for changing end effector attitude
JP6832408B1 (en) Production system
WO2022075303A1 (en) Robot system
JP7012181B1 (en) Robot system
EP3224004B1 (en) Robotic system comprising a telemetric device with a laser measuring device and a passive video camera
JP6851535B1 (en) Setting method using teaching operation
JP2022530589A (en) Robot-mounted mobile devices, systems and machine tools
JP7015949B1 (en) Sticking position measuring device and machine tool equipped with it
JP6937444B1 (en) Robot system positioning accuracy measurement method
US20230381969A1 (en) Calibration Method And Robot System
Callegari et al. Cartesian space visual control of a translating parallel manipulator

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22863988

Country of ref document: EP

Kind code of ref document: A1