WO2023032400A1 - Automatic transport device and system - Google Patents

Automatic transport device and system

Info

Publication number
WO2023032400A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
posture
coordinate system
stereo camera
camera
Prior art date
Application number
PCT/JP2022/023266
Other languages
English (en)
Japanese (ja)
Inventor
勇太 大場
Original Assignee
Dmg森精機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dmg森精機株式会社
Publication of WO2023032400A1 publication Critical patent/WO2023032400A1/fr

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Definitions

  • The present disclosure relates to a system comprising an industrial machine, such as a machine tool that machines a workpiece, a robot that performs work on the industrial machine, a transport device on which the robot is mounted and which can move to a work position set with respect to the industrial machine, and a control device that controls the robot.
  • The system disclosed in JP-A-2017-132002 (Patent Document 1) is known as an example of such a system.
  • In this system, an automatic guided vehicle equipped with a robot moves to a working position set for a machine tool, which is one type of industrial machine, and at that position the robot performs work such as attaching and detaching workpieces to and from the machine tool.
  • With such a system, a single robot moved by an automatic guided vehicle can attach and detach workpieces to and from a plurality of machine tools. Compared with arranging a fixed robot for each machine tool, the degree of freedom in the layout of the machine tools is increased, so the layout can be set to one that further improves production efficiency. In addition, since one robot can serve a larger number of machine tools than in conventional systems with fixed robots, equipment costs can be reduced.
  • On the other hand, because the positioning accuracy of the automatic guided vehicle is limited, the posture of the robot when the automatic guided vehicle is positioned at the working position generally deviates from the reference posture set at the time of teaching, which serves as the reference for control. It is therefore necessary to compare the current posture with the reference posture, detect the amount of error, and correct the working posture of the robot according to the amount of error.
  • As a technique for such correction, the position correction method disclosed in JP-A-2016-221622 (Patent Document 2) is conventionally known.
  • In this position correction method, a visual target consisting of two calibration markers is arranged on the outer surface of the machine tool, the visual target is imaged by a camera provided on a movable part of the robot, the relative positional relationship between the robot and the machine tool is measured based on the captured images and the position and orientation of the camera, and the working posture of the robot is corrected based on the measured positional relationship.
  • a so-called stereo camera having two cameras is known, and it would be convenient if the posture of the robot could be corrected using such a known stereo camera.
  • the present invention provides an automatic transport device and system described in the claims.
  • According to the present invention, the industrial machine on which the robot actually works, or a structure provided in this industrial machine, is imaged by the stereo camera, and the posture of the robot is corrected based on the obtained images. Therefore, there is no need to prepare a special component such as a calibration marker for correcting the posture of the robot as in the conventional art, and there is no need to perform the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool. A system for correcting the posture of the robot can therefore be constructed easily.
  • Moreover, since the posture of the robot can be corrected by imaging a single structure, the operating time of the robot required for imaging can be shortened compared with the conventional art, and the production efficiency of the system can be increased accordingly.
  • FIG. 1 is a plan view showing a schematic configuration of a system according to one embodiment of the present invention
  • FIG. 2 is a block diagram showing the configuration of the system according to this embodiment
  • FIG. 3 is a perspective view showing the automatic guided vehicle and the robot according to this embodiment
  • FIG. 4 is a perspective view showing a spindle and a chuck, which are structures constituting the machine tool according to this embodiment
  • FIG. 5 is an explanatory diagram for explaining an imaging posture of the robot
  • FIG. 6 is an explanatory diagram for explaining the processing for calculating a correction amount for correcting the posture of the robot
  • The system 1 of this example includes a machine tool 10 as an industrial machine, a material stocker 20 and a product stocker 21, an automatic guided vehicle 35 as a transport device, a robot 25 mounted on the automatic guided vehicle 35, and a control device 40 that controls the robot 25 and the automatic guided vehicle 35.
  • The automatic guided vehicle 35, the robot 25 and the control device 40 constitute the automatic transport device.
  • the robot 25 is provided with a stereo camera 31 .
  • In this example, the machine tool 10, together with the material stocker 20 and the product stocker 21 provided around it, is exemplified as the industrial machine, but the industrial machine to which the present invention is applicable is not limited to these.
  • a known horizontal NC (numerical control) lathe is exemplified as the machine tool 10, but the machine tool is, of course, not limited to this.
  • machine tools such as multi-tasking machines equipped with a tool spindle and a work spindle are included.
  • The machine tool 10 of this example is a conventionally known horizontal NC (numerically controlled) lathe and, as shown in FIG. 4, includes a main spindle 11 to which a chuck 12 for gripping the workpiece W is attached; the workpiece W gripped by the chuck 12 can be turned using an appropriate tool.
  • FIG. 4 shows only the main spindle 11 and the chuck 12 as components (structures) of the machine tool 10, but it goes without saying that the machine tool 10 also includes other components known in the field, such as a carriage disposed within the machining area, a door arranged outside the machining area, a handle attached to the door, a control panel and an indicator light.
  • The material stocker 20 is arranged on the left side of the machine tool 10 in FIG. 1 and stores unmachined workpieces W to be machined by the machine tool 10.
  • The product stocker 21 is arranged on the right side of the machine tool 10 in FIG. 1 and stores machined workpieces W' machined by the machine tool 10.
  • The automatic guided vehicle 35 has the robot 25 mounted on its upper surface, that is, on a mounting surface 36, and is provided with an operation panel 37 that can be carried by an operator.
  • the operation panel 37 includes an input/output unit for inputting/outputting data, an operation unit for manually operating the automatic guided vehicle 35 and the robot 25, and a display capable of displaying a screen.
  • The automatic guided vehicle 35 is equipped with a sensor capable of recognizing its own position within the factory (for example, a distance-measuring sensor using laser light) and, under the control of the control device 40, travels tracklessly through the factory, including the working positions set with respect to the machine tool 10, the material stocker 20 and the product stocker 21.
  • the robot 25 of this example is an articulated robot having three arms, a first arm 26, a second arm 27 and a third arm 28.
  • A hand 29 as an end effector (acting part) is attached to the tip of the third arm 28, and two cameras 31a and 31b are also attached to it via a support bar 30. These two cameras 31a and 31b constitute a so-called stereo camera 31 and are arranged so that their imaging optical axes intersect.
  • The aspect of the robot applicable to the present invention is not limited to that of the robot 25 of this example; it is sufficient that the robot includes (i) a camera, (ii) a hand portion for gripping a workpiece or a tool, (iii) a second arm portion movably connected to the hand portion, and (iv) a first arm portion movably connected to the second arm portion.
  • In the robot 25 of this example, the hand portion corresponds to the hand 29, the second arm portion corresponds to the second arm 27 and the joint portion rotatably (movably) coupled to it, and the first arm portion corresponds to the first arm 26 and the joint portion rotatably (movably) coupled to it.
  • Alternatively, the third arm 28 of the robot of the present embodiment, together with its joint portion that is rotatable or extendable and retractable (movable), may be understood to correspond to the second arm portion. That is, although the robot 25 of this example has three arms, the robot need only have at least two arms.
  • The control device 40 is composed of an operation program storage unit 41, a movement position storage unit 42, a motion posture storage unit 43, a map information storage unit 44, a reference data storage unit 45, a manual operation control unit 46, an automatic operation control unit 47, a map information generation unit 48, a position recognition unit 49, a point cloud data calculation unit 50, a correction amount calculation unit 51 and an input/output interface 52.
  • the control device 40 is connected to the machine tool 10 , material stocker 20 , product stocker 21 , robot 25 , stereo camera 31 , automatic guided vehicle 35 and operation panel 37 via this input/output interface 52 .
  • However, the control device 40 is not limited to this aspect; it may include at least a control section for controlling the position of the hand 29 of the robot 25, while the other storage sections and the like are provided in another device.
  • The control device 40 of this example is composed of a computer including a CPU, a RAM, a ROM and the like; the manual operation control unit 46, the automatic operation control unit 47, the map information generation unit 48, the position recognition unit 49, the point cloud data calculation unit 50, the correction amount calculation unit 51 and the input/output interface 52 are realized by a computer program and execute the processing described later.
  • the motion program storage unit 41, movement position storage unit 42, motion posture storage unit 43, map information storage unit 44, and reference data storage unit 45 are configured from appropriate storage media such as RAM.
  • The control device 40 is attached to the automatic guided vehicle 35 and is connected to the machine tool 10, the material stocker 20 and the product stocker 21 by suitable communication means, and to the robot 25, the stereo camera 31, the automatic guided vehicle 35 and the operation panel 37 by wire or wirelessly.
  • However, the control device 40 is not limited to such a mode and may be arranged at an appropriate position other than on the automatic guided vehicle 35; in that case, the control device 40 is connected to each section by appropriate communication means.
  • The manual operation control unit 46 is a functional unit that operates the automatic guided vehicle 35, the robot 25 and the stereo camera 31 according to operation signals input by the operator from the operation panel 37. That is, under the control of the manual operation control unit 46, the operator can manually operate the automatic guided vehicle 35, the robot 25 and the stereo camera 31 using the operation panel 37.
  • The operation program storage unit 41 is a functional unit that stores an automatic operation program for automatically operating the automatic guided vehicle 35 and the robot 25 during production, and a map generation program for operating the automatic guided vehicle 35 when generating the in-factory map information described later. The automatic operation program and the map generation program are input, for example, from the input/output unit provided on the operation panel 37 and stored in the operation program storage unit 41.
  • The automatic operation program includes command codes relating to the movement positions as target positions to which the automatic guided vehicle 35 moves, the movement speed and the orientation of the automatic guided vehicle 35, as well as command codes for operating the robot 25 and the stereo camera 31.
  • the map generation program includes command codes for causing the automatic guided vehicle 35 to travel all over the factory without a track so that the map information generation unit 48 can generate map information.
  • The map information storage unit 44 is a functional unit that stores map information including arrangement information of the machines, devices, equipment and the like (hereinafter, devices and the like) arranged in the factory in which the automatic guided vehicle 35 travels; this map information is generated by the map information generation unit 48.
  • Specifically, the map information generation unit 48 causes the automatic guided vehicle 35 to travel according to the map generation program stored in the operation program storage unit 41 under the control of the automatic operation control unit 47 of the control device 40, which will be described later in detail. At that time, spatial information of the interior of the factory is acquired from the distance data detected by the sensor, and the planar shapes of the devices and the like arranged in the factory are recognized; based on, for example, the planar shapes of devices and the like registered in advance, the positions, planar shapes and the like (arrangement information) of specific devices arranged in the factory, in this example the machine tool 10, the material stocker 20 and the product stocker 21, are recognized. The map information generation unit 48 then stores the obtained spatial information and device arrangement information in the map information storage unit 44 as map information of the factory.
  • The position recognition unit 49 is a functional unit that recognizes the position of the automatic guided vehicle 35 within the factory based on the distance data detected by the sensor and the factory map information stored in the map information storage unit 44. The operation of the automatic guided vehicle 35 is controlled by the automatic operation control unit 47 based on the position recognized by the position recognition unit 49.
  • The movement position storage unit 42 is a functional unit that stores specific movement positions, which are the target positions to which the automatic guided vehicle 35 moves and which correspond to the command codes in the operation program. These movement positions include the working positions set with respect to the machine tool 10, the material stocker 20 and the product stocker 21 described above. Each movement position is set, for example, by manually operating the automatic guided vehicle 35 with the operation panel 37 under the control of the manual operation control unit 46 to move it to each target position, and then storing the position data recognized by the position recognition unit 49 in the movement position storage unit 42. This operation is a so-called teaching operation.
  • The motion posture storage unit 43 is a functional unit that stores data relating to the motion postures of the robot 25, which change sequentially as the robot 25 operates in a predetermined order, in correspondence with the command codes in the operation program. The data relating to the motion postures are the rotation angle data of each joint (motor) of the robot 25 in each posture, obtained when the robot 25 is manually operated in a teaching operation using the operation panel 37 under the control of the manual operation control unit 46 so as to take each target posture; these rotation angle data are stored in the motion posture storage unit 43 as the data relating to the motion postures.
  • Specific motion postures of the robot 25 are set for the material stocker 20, the machine tool 10 and the product stocker 21, respectively.
  • For the material stocker 20, a work start posture (take-out start posture) when work is started, each work posture (take-out posture) for gripping an unmachined workpiece W stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20, and a posture when the take-out is completed (take-out completion posture, which in this example is the same as the take-out start posture) are set as the take-out motion postures.
  • For the machine tool 10, a workpiece take-out motion posture for taking the machined workpiece W' out of the machine tool 10 and a workpiece mounting motion posture for attaching the unmachined workpiece W to the machine tool 10 are set.
  • As the workpiece take-out motion postures, the following postures are set: a posture in which the hand 29 and the stereo camera 31 are moved into the machining area of the machine tool 10 and the stereo camera 31 images the chuck 12 (imaging posture), a posture in which the hand 29 faces the machined workpiece W' gripped by the chuck 12 (take-out preparation posture), a posture in which the hand 29 is moved toward the chuck 12 so that the hand 29 can grip the machined workpiece W' (gripping posture), a posture in which the hand 29 gripping the workpiece W' is moved away from the chuck 12 (removal posture), and a posture in which the hand 29 and the stereo camera 31 are withdrawn from the machine tool 10 (work completion posture).
  • As the workpiece mounting motion postures, the following postures are set: a posture in which the hand 29 and the stereo camera 31 are moved into the machining area of the machine tool 10 and the stereo camera 31 images the chuck 12 (imaging posture; see FIG. 5), a posture in which the unmachined workpiece W gripped by the hand 29 faces the chuck 12 of the machine tool 10 (mounting preparation posture), a posture in which the hand 29 is moved toward the chuck 12 so that the chuck 12 can grip the unmachined workpiece W (mounting posture), a posture in which the hand 29 is separated from the chuck 12 (separation posture), and a posture in which the hand 29 and the stereo camera 31 are withdrawn from the machine tool 10 (work completion posture).
  • Although FIG. 5 illustrates a state in which the stereo camera 31 directly faces the chuck 12, the imaging posture is not limited to this and may be any posture in which the stereo camera 31 can image the chuck 12.
  • The two-dot chain lines shown in FIG. 5 indicate the field of view of the stereo camera 31.
  • For the product stocker 21, a work start posture (storage start posture) when work is started, each work posture (storage posture) for storing the machined workpiece W' gripped by the hand 29 in the product stocker 21, and a posture when the storage is completed (storage completion posture, which in this example is the same as the storage start posture) are set as the storage motion postures.
  • The automatic operation control unit 47 is a functional unit that operates the automatic guided vehicle 35, the robot 25 and the stereo camera 31 in accordance with either the automatic operation program or the map generation program stored in the operation program storage unit 41. At that time, the data stored in the movement position storage unit 42 and the motion posture storage unit 43 are used as necessary.
  • Based on two images captured by the two cameras 31a and 31b, the point cloud data calculation unit 50 calculates, from the focal length of each camera 31a, 31b, the distance between the cameras 31a and 31b, and the parallax between the two images, the position in the three-dimensional space of the camera coordinate system set for the stereo camera 31 for each element obtained by dividing the imaged object into a predetermined size. Each element is recognized as a point of the point cloud forming the imaged object, and its position data constitutes the point cloud position data in the three-dimensional space of the camera coordinate system (three-dimensional point cloud position data).
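  • As an illustration of the principle described above, the following is a minimal sketch in Python; it is not the implementation of the point cloud data calculation unit 50, and the focal length, baseline and matched pixel coordinates are assumed, illustrative values.

```python
# Minimal sketch (not the publication's implementation) of recovering 3D
# positions in the camera coordinate system from a rectified stereo pair.
# f_px, baseline_m and the matched pixel coordinates are assumed values;
# pixel coordinates are measured from the principal point.
import numpy as np

def triangulate(f_px, baseline_m, pts_left, pts_right):
    """Return an Nx3 array of points in the camera coordinate system.

    f_px       : focal length of cameras 31a/31b in pixels (internal parameter)
    baseline_m : distance between the two cameras (external parameter)
    pts_left   : Nx2 (u, v) coordinates of matched elements in the left image
    pts_right  : Nx2 (u, v) coordinates of the same elements in the right image
    """
    pts_left = np.asarray(pts_left, dtype=float)
    pts_right = np.asarray(pts_right, dtype=float)
    disparity = pts_left[:, 0] - pts_right[:, 0]   # parallax of each element
    z = f_px * baseline_m / disparity               # depth from parallax
    x = pts_left[:, 0] * z / f_px                   # back-projected u coordinate
    y = pts_left[:, 1] * z / f_px                   # back-projected v coordinate
    return np.column_stack([x, y, z])               # 3D point cloud position data

# Illustrative use with three matched image elements:
cloud = triangulate(f_px=1200.0, baseline_m=0.06,
                    pts_left=[[10.0, 4.0], [35.0, -8.0], [-22.0, 17.0]],
                    pts_right=[[2.0, 4.0], [26.0, -8.0], [-31.0, 17.0]])
print(cloud)
```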
  • Specifically, when the automatic guided vehicle 35 is at the working position set with respect to the machine tool 10 and the robot 25 takes the above-described imaging posture, the point cloud data calculation unit 50 calculates, based on the two images of the chuck 12 captured by the stereo camera 31, the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system.
  • The reference data storage unit 45 is a functional unit that stores, as reference data, the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 based on the images of the chuck 12 captured by the stereo camera 31 when, during the teaching operation, the automatic guided vehicle 35 is at the working position set with respect to the machine tool 10 and the robot 25 is in the imaging posture.
  • The reference data storage unit 45 also stores three-dimensional point cloud position data obtained from CAD data of the chuck 12, expressed in an object coordinate system set for the chuck 12. This object coordinate system can be defined, for example, by three orthogonal axes: an x-axis and a y-axis set within a plane orthogonal to the axis of the chuck 12, and a z-axis orthogonal to the x-axis and the y-axis.
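  • A small sketch of how such CAD-derived reference data might be prepared is shown below; it assumes the Open3D library and a hypothetical mesh file name ("chuck_12.stl"), neither of which is specified in this publication.

```python
# Sketch: sampling a point cloud in the object coordinate system from CAD data.
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("chuck_12.stl")    # CAD data of the chuck (assumed file)
mesh.compute_vertex_normals()
cad_cloud = mesh.sample_points_poisson_disk(5000)   # 3D point cloud position data in the object coordinate system
o3d.io.write_point_cloud("chuck_12_reference.ply", cad_cloud)
```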
  • During automatic operation, when the chuck 12 is imaged by the stereo camera 31 with the robot 25 in the imaging posture and the current three-dimensional point cloud position data of the chuck 12 is calculated by the point cloud data calculation unit 50, the correction amount calculation unit 51 estimates, based on this current three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45 (the three-dimensional point cloud position data of the chuck 12 calculated during the teaching operation), the amount of positional error and the amount of rotational error of the stereo camera 31 between the current posture of the robot 25 and the posture during the teaching operation, and calculates, based on each estimated error amount, a correction amount for the acting part in each working posture.
  • In other words, based on the current three-dimensional point cloud position data of the chuck 12 obtained during automatic operation and the reference data stored in the reference data storage unit 45, the correction amount calculation unit 51 estimates the amount of positional error and the amount of rotational error of the stereo camera 31 between the current posture of the robot 25 and the posture during the teaching operation, with reference to the object coordinate system set within a plane orthogonal to the axis of the chuck 12, and each working posture of the robot 25 is corrected based on the estimated position error amount and rotation error amount.
  • First, based on the reference data obtained during the teaching operation, the correction amount calculation unit 51 obtains a coordinate transformation matrix for transforming from the camera coordinate system corresponding to the stereo camera 31 to the object coordinate system set for the chuck 12.
  • The camera coordinate system can be defined, for example, by three orthogonal axes centered on the intermediate position between the imaging elements (for example, CMOS sensors) of the cameras 31a and 31b, within a plane that includes these imaging elements.
  • This coordinate transformation matrix can be calculated by the following procedure. First, a coordinate transformation matrix from the camera coordinate system to the object coordinate system related to the CAD data is obtained based on the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data stored in the reference data storage unit 45, the internal parameters of the stereo camera 31 (for example, the focal length of each camera) and the external parameters (the distance between the cameras 31a and 31b, the parallax, and so on).
  • Next, the correction amount calculation unit 51 applies the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data stored in the reference data storage unit 45 and the reference data, which is the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system during the teaching operation, to, for example, a RANSAC algorithm (global alignment) and an ICP algorithm (local alignment), thereby obtaining a coordinate transformation matrix from the object coordinate system related to the CAD data to the object coordinate system related to the chuck 12 at the time of the teaching operation, as conceptually illustrated in the drawing.
  • In that drawing, the figure indicated by the dashed line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system obtained from the CAD data, and the figure indicated by the solid line represents the three-dimensional point cloud position data obtained during the teaching operation; by superimposing the former on the latter, the above coordinate transformation matrix is obtained.
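  • The following sketch illustrates the kind of global (RANSAC) and local (ICP) alignment described above, assuming the Open3D library; the parameter values are illustrative and the exact function signatures depend on the library version, so this is not the publication's implementation.

```python
# Sketch: superimpose a source point cloud on a target point cloud and return
# the resulting 4x4 homogeneous matrix, which plays the role of the coordinate
# transformation matrix described in the text.
import open3d as o3d

def align(source, target, voxel=2.0):
    """Return a 4x4 transform that maps `source` onto `target`."""
    def preprocess(pcd):
        down = pcd.voxel_down_sample(voxel)
        down.estimate_normals(
            o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
        fpfh = o3d.pipelines.registration.compute_fpfh_feature(
            down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
        return down, fpfh

    src_down, src_fpfh = preprocess(source)
    tgt_down, tgt_fpfh = preprocess(target)

    # Global alignment: RANSAC on FPFH feature correspondences.
    ransac = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        src_down, tgt_down, src_fpfh, tgt_fpfh, True, voxel * 1.5,
        o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
        [], o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    # Local alignment: point-to-point ICP refined from the RANSAC result.
    icp = o3d.pipelines.registration.registration_icp(
        source, target, voxel * 0.8, ransac.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return icp.transformation

# Example (hypothetical variables): transform from the object coordinate system
# of the CAD data to the object coordinate system observed at teaching time.
# T_cad_to_teach = align(cad_cloud, teaching_cloud)
```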
  • After calculating these coordinate transformation matrices, the correction amount calculation unit 51 calculates the camera position during the teaching operation in the object coordinate system, based on the calculated coordinate transformation matrices and the camera position in the camera coordinate system at the time of imaging in the teaching operation, according to Formula 2 below. (Formula 2)
  • The correction amount calculation unit 51 also calculates the camera position in the robot coordinate system during the teaching operation according to Formula 3 below. Here, the robot coordinate system is the three-dimensional coordinate system used by the control device 40 to control the robot 25, and is defined by three orthogonal axes with the origin set at an appropriate position. (Formula 3)
  • Next, the correction amount calculation unit 51 calculates, according to Formula 4 below, the coordinate transformation matrix for transforming from the camera coordinate system during the teaching operation to the robot coordinate system during the teaching operation. (Formula 4) From the elements of the rotation matrix of this coordinate transformation matrix, the rotation angles of the camera about the x-axis, y-axis and z-axis during the teaching operation are calculated.
  • Further, the coordinate transformation matrix for transforming from the object coordinate system to the robot coordinate system at the time of the teaching operation can be obtained, according to Formula 5 below, from the coordinate transformation matrix for transforming from the object coordinate system to the camera coordinate system during the teaching operation and the coordinate transformation matrix for transforming from the camera coordinate system to the robot coordinate system during the teaching operation. (Formula 5)
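  • Since the bodies of Formulas 2 to 5 are not reproduced in this text, the sketch below only illustrates, in our own notation and under our own assumptions, how such 4x4 homogeneous coordinate transformation matrices can be chained and how rotation angles about the x-, y- and z-axes can be read from the rotation part of such a matrix.

```python
# Sketch: composing homogeneous transforms and extracting rotation angles.
import numpy as np

def compose(T_cam_to_robot, T_obj_to_cam):
    """Object -> robot transform obtained from object -> camera and camera -> robot."""
    return T_cam_to_robot @ T_obj_to_cam

def transform_point(T, p):
    """Map a 3D position (e.g. the camera position) through a 4x4 transform."""
    return (T @ np.append(p, 1.0))[:3]

def rotation_angles(T):
    """Rotation angles (rx, ry, rz), Z-Y-X convention, read from the rotation block."""
    R = T[:3, :3]
    ry = np.arcsin(-R[2, 0])
    rx = np.arctan2(R[2, 1], R[2, 2])
    rz = np.arctan2(R[1, 0], R[0, 0])
    return rx, ry, rz
```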
  • On the other hand, based on the current three-dimensional point cloud position data of the chuck 12 in the camera coordinate system calculated by the point cloud data calculation unit 50 during automatic operation (during actual operation), the correction amount calculation unit 51 obtains the corresponding coordinate transformation matrices as follows. The coordinate transformation matrix from the current camera coordinate system to the object coordinate system at the time of the teaching operation is obtained, as described above, based on the three-dimensional point cloud position data of the chuck 12 in the object coordinate system at the time of the teaching operation stored in the reference data storage unit 45, the internal parameters of the stereo camera 31 (for example, the focal length of each camera) and the external parameters (the distance between the cameras 31a and 31b, the parallax, and so on).
  • The coordinate transformation matrix from the object coordinate system during the teaching operation to the object coordinate system during automatic operation can be calculated by applying, for example, the RANSAC algorithm (global alignment) and the ICP algorithm (local alignment) to the three-dimensional point cloud position data of the chuck 12 in the camera coordinate system during the teaching operation stored in the reference data storage unit 45 and the three-dimensional point cloud position data of the chuck 12 calculated by the point cloud data calculation unit 50 during automatic operation, that is, by superimposing the point cloud position data obtained during the teaching operation on the point cloud position data obtained during automatic operation.
  • In the corresponding drawing, the figure indicated by the dashed line represents the three-dimensional point cloud position data of the chuck 12 in the object coordinate system during the teaching operation, and the figure indicated by the solid line represents the three-dimensional point cloud position data in the object coordinate system during automatic operation; by superimposing the former on the latter, the coordinate transformation matrix is obtained.
  • After calculating this coordinate transformation matrix, the correction amount calculation unit 51 calculates the camera position during automatic operation in the object coordinate system, based on the calculated coordinate transformation matrix and the camera position in the camera coordinate system at the time of imaging during automatic operation, according to Formula 7 below. (Formula 7)
  • In addition, the correction amount calculation unit 51 calculates the current camera position in the object coordinate system during automatic operation according to Formula 8 below, and the current camera position in the robot coordinate system during automatic operation according to Formula 9 below. (Formula 8) (Formula 9)
  • Furthermore, the correction amount calculation unit 51 calculates, according to Formula 10 below, the coordinate transformation matrix for transforming from the current camera coordinate system to the robot coordinate system at the time of the teaching operation. (Formula 10) From the elements of the rotation matrix of this coordinate transformation matrix, the current rotation angles of the camera about the x-axis, y-axis and z-axis are calculated.
  • Then, the correction amount calculation unit 51 calculates the rotational errors Δrx, Δry and Δrz about the x-axis, y-axis and z-axis as the differences between the camera rotation angles during the teaching operation and the current camera rotation angles, both expressed in the coordinate system at the time of the teaching operation, calculated as described above.
  • Subsequently, the correction amount calculation unit 51 calculates the rotation matrix between the robot coordinate system at the time of the teaching operation and the current robot coordinate system, that is, the rotation error amount, according to Formula 11 below, and the translation matrix from the robot coordinate system at the time of the teaching operation to the current robot coordinate system, that is, the position error amount, according to Formula 12 below. (Formula 11) (Formula 12)
  • From the estimated rotation error amount and position error amount, the correction amount calculation unit 51 calculates the correction amount for the subsequent working postures of the robot 25 (Formula 13). Then, based on this correction amount, the automatic operation control unit 47 corrects the position of the hand 29 in each subsequent motion posture of the robot 25 according to Formula 14 below. (Formula 14)
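  • The bodies of Formulas 11 to 14 are likewise not reproduced in this text; the sketch below therefore only illustrates, under our own assumptions, the kind of computation the description implies, namely a rotation error and a position error between the teaching-time and current camera poses in the robot coordinate system, applied to a taught hand position.

```python
# Sketch (our own assumptions, not the publication's formulas).
import numpy as np

def pose_error(T_robot_teach, T_robot_now):
    """Rotation-error matrix and position-error vector between the camera pose
    in the robot coordinate system at teaching time and the current one."""
    R_err = T_robot_now[:3, :3] @ T_robot_teach[:3, :3].T      # rotation error amount
    t_err = T_robot_now[:3, 3] - R_err @ T_robot_teach[:3, 3]  # position error amount
    return R_err, t_err

def correct_hand_position(p_taught, R_err, t_err):
    """Apply the estimated error to a hand position taught for a working posture."""
    return R_err @ p_taught + t_err

# Illustrative use with hypothetical pose matrices and a made-up taught position:
# R_err, t_err = pose_error(T_cam_in_robot_teach, T_cam_in_robot_now)
# p_corrected = correct_hand_position(np.array([420.0, -15.0, 260.0]), R_err, t_err)
```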
  • In the system 1, unmanned automatic production is executed as follows.
  • Under the control of the automatic operation control unit 47, the automatic operation program stored in the operation program storage unit 41 is executed, and in accordance with this program the automatic guided vehicle 35 and the robot 25 operate, for example, as follows.
  • The automatic guided vehicle 35 moves to the working position set with respect to the machine tool 10, and the robot 25 takes the work start posture of the workpiece take-out motion described above.
  • At this time, the machine tool 10 has completed the predetermined machining, and the door cover is open so that the robot 25 can enter the machining area.
  • the robot 25 shifts to the imaging posture and images the chuck 12 with the stereo camera 31 .
  • Next, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12, and the correction amount calculation unit 51 calculates, based on this three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the rotation error amount and the position error amount between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture according to Formulas 11 and 12 above, and then calculates the correction amount for the subsequent workpiece take-out motion postures of the robot 25 according to Formula 13 described above.
  • Thereafter, the automatic operation control unit 47 corrects the position of the hand 29 in the subsequent workpiece take-out motion postures, that is, in the take-out preparation posture, the gripping posture, the removal posture and the work completion posture described above, according to Formula 14, and causes the robot 25 to grip with the hand 29 the machined workpiece W' held by the chuck 12 of the machine tool 10 and take it out of the machine tool 10.
  • The chuck 12 is opened by a chuck open command transmitted from the automatic operation control unit 47 to the machine tool 10 after the robot 25 has taken the gripping posture.
  • Next, the automatic operation control unit 47 moves the automatic guided vehicle 35 to the working position set with respect to the product stocker 21 and causes the robot 25 to sequentially take the storage start posture for starting work at the product stocker 21, each storage posture for storing the machined workpiece gripped by the hand 29 in the product stocker 21, and the storage completion posture when the storage is completed.
  • The automatic operation control unit 47 then moves the automatic guided vehicle 35 to the working position set with respect to the material stocker 20 and causes the robot 25 to sequentially take the take-out start posture for starting work at the material stocker 20, each take-out posture for gripping an unmachined workpiece stored in the material stocker 20 with the hand 29 and taking it out of the material stocker 20, and the take-out completion posture when the take-out is completed, so that the hand 29 grips an unmachined workpiece.
  • Next, the automatic operation control unit 47 again moves the automatic guided vehicle 35 to the working position set with respect to the machine tool 10 and causes the robot 25 to take the work start posture of the workpiece mounting motion described above.
  • the robot 25 is shifted to the imaging posture, and the chuck 12 is imaged by the stereo camera 31 .
  • Then, the point cloud data calculation unit 50 calculates the three-dimensional point cloud position data of the chuck 12, and the correction amount calculation unit 51 calculates, based on this three-dimensional point cloud position data and the reference data stored in the reference data storage unit 45, the rotation error amount and the position error amount between the imaging posture of the robot 25 at the time of the teaching operation and the current imaging posture according to Formulas 11 and 12 above, and then calculates the correction amount for the subsequent workpiece mounting motion postures of the robot 25 according to Formula 13 described above.
  • Thereafter, the automatic operation control unit 47 corrects the position of the hand 29 in the subsequent workpiece mounting postures of the robot 25, that is, in the mounting preparation posture, the mounting posture, the separation posture and the work completion posture described above, according to Formula 14, then causes the robot 25 to attach the unmachined workpiece W gripped by the hand 29 to the chuck 12 of the machine tool 10 and withdraw from the machine. The automatic operation control unit 47 then transmits a machining start command to the machine tool 10 to make it carry out the machining operation. After the robot 25 takes the mounting posture, the automatic operation control unit 47 transmits a chuck close command to the machine tool 10, whereby the chuck 12 is closed and grips the unmachined workpiece W.
  • In this way, the system 1 of this example continuously executes unmanned automatic production.
  • As described above, in the system 1 of this example, the stereo camera 31 images the chuck 12, which is a structure constituting the machine tool 10 on which the robot 25 actually works, and the posture of the robot 25 is corrected based on the obtained images. Therefore, there is no need to prepare a special component such as a calibration marker for correcting the posture of the robot 25 as in the conventional art, and there is no need to perform the troublesome preparatory work of arranging such a marker on the outer surface of the machine tool 10. A system for correcting the posture of the robot 25 can therefore be constructed easily. In addition, since the posture of the robot 25 can be corrected by imaging the chuck 12, which is a single structure, the imaging can be completed in a single operation; as a result, the operating time of the robot 25 required for imaging can be shortened compared with the conventional art, and the production efficiency of the system 1 can be increased compared with the conventional art.
  • Moreover, among the operations performed by the robot 25, the operation on the chuck 12 requires the highest accuracy. Since the working posture of the robot 25 is corrected based on images of the chuck 12 itself, the working posture can be corrected accurately, and the work on the chuck 12 can therefore be performed accurately.
  • In particular, since the automatic guided vehicle 35 moves by means of wheels and has a relatively high degree of freedom, the mounting surface 36 on which the robot 25 is mounted tends to tilt with respect to the floor surface, and this tilt tends to change according to the posture of the mounted robot 25, in other words, according to changes in the position of the center of gravity of the robot 25. When attaching and detaching the workpieces W and W' described above, the robot 25 advances the hand 29 and the stereo camera 31 into the machining area of the machine tool 10, so that the hand 29 overhangs far beyond the automatic guided vehicle 35; the tilt of the mounting surface in this state is larger than when the hand 29 is outside the machining area of the machine tool 10 and does not overhang the automatic guided vehicle 35, or overhangs it only slightly.
  • In that case, the posture correction amount would not accurately reflect the posture error amount of the robot 25 when the hand 29 of the robot 25 is inside the machining area of the machine tool 10, and the posture of the robot 25 could not be corrected accurately.
  • As a result, the hand 29 of the robot 25 could not be accurately positioned with respect to the chuck 12.
  • Further, when the chuck 12 is a collet chuck or the like and the movement allowance (stroke) of its gripping portion is extremely small, that is, when the clearance between the workpieces W, W' and the collet chuck is extremely small, the collet chuck cannot reliably grip the workpieces W, W'. If the workpieces W and W' cannot be attached and detached reliably, the operating rate of the system 1 decreases, and unmanned production with good production efficiency cannot be realized.
  • In the above example, the chuck 12, which requires the most accurate work among the operations of the robot 25, is set as the structure to be imaged by the stereo camera 31, but the invention is not limited to such a mode. Depending on the part on which the robot 25 works, other structures constituting the machine tool 10 may be imaged, for example a tool post, a tool or a carriage likewise arranged in the machining area, or structures arranged outside the machining area such as a door, a handle attached to the door, an operation panel or an indicator light.
  • This aspect also provides the effect that a system for correcting the posture of the robot 25 can be constructed easily without preparing special components or performing troublesome preparatory work.
  • Furthermore, since the posture of the robot 25 can be corrected by imaging a single structure, the imaging of the structure with the stereo camera 31 can be completed in a single operation; as a result, the operating time of the robot 25 required for imaging can be shortened compared with the conventional art, and the production efficiency of the system 1 can be improved compared with the conventional art.
  • The imaging posture may also be a posture taken before the hand 29 and the stereo camera 31 enter the machining area of the machine tool 10, that is, a posture in which the imaging target is imaged while the hand 29 and the stereo camera 31 are still outside the machining area of the machine tool 10; in this case as well, the working posture of the robot 25 may be corrected in the same manner as described above.
  • The industrial machine according to the present invention includes all machines used industrially, such as measuring devices, cleaning devices, washing devices and robots other than the robot 25 of this example. When the robot 25 works on such an industrial machine, its working posture can be corrected in the manner described above.
  • In the above embodiment, an aspect using the automatic guided vehicle 35 was exemplified, but the present invention is not limited to this; the transport device may be one that is moved manually. In that case, the robot 25 mounted on the transport device may attach and detach workpieces to and from the machine tool 10 after the transport device has been manually moved to the working position of the machine tool 10.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a system comprising: an automatic transport device comprising a robot (25) equipped with a stereo camera (31), a transport device (35) for moving the robot (25) to a working position, and a control device (40) for controlling the robot (25); and a machine tool (10) as an industrial machine. During a teaching operation, the control device (40) captures an image of a structure of the machine tool (10) with the stereo camera (31) and generates reference data consisting of three-dimensional point cloud position data of the structure; during actual operation of the robot (25), it captures an image of the structure with the stereo camera (31), generates actual data consisting of three-dimensional point cloud position data of the structure, estimates, based on the obtained actual data and the reference data, a position error amount and a rotation error amount of the stereo camera (31) between the actual posture of the robot (25) and the posture during the teaching operation, and corrects the posture of the robot (25) based on the estimated position error amount and rotation error amount.
PCT/JP2022/023266 2021-09-06 2022-06-09 Automatic transport device and system WO2023032400A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-144538 2021-09-06
JP2021144538A JP7093881B1 (ja) 2021-09-06 2021-09-06 System and automatic guided vehicle

Publications (1)

Publication Number Publication Date
WO2023032400A1 (fr)

Family

ID=82217736

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/023266 WO2023032400A1 (fr) Automatic transport device and system

Country Status (2)

Country Link
JP (1) JP7093881B1 (fr)
WO (1) WO2023032400A1 (fr)


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021050646A1 (fr) 2019-09-11 2021-03-18 Dmg Mori Co., Ltd. Dispositif mobile monté sur robot, système et machine-outil

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07146121A (ja) * 1993-10-01 1995-06-06 Nippondenso Co Ltd 視覚に基く三次元位置および姿勢の認識方法ならびに視覚に基く三次元位置および姿勢の認識装置
JPH0970781A (ja) * 1995-09-07 1997-03-18 Shinko Electric Co Ltd 自立走行ロボットの三次元位置姿勢較正方法
JP2010172986A (ja) * 2009-01-28 2010-08-12 Fuji Electric Holdings Co Ltd ロボットビジョンシステムおよび自動キャリブレーション方法
JP2016170050A (ja) * 2015-03-12 2016-09-23 キヤノン株式会社 位置姿勢計測装置、位置姿勢計測方法及びコンピュータプログラム
JP2020015102A (ja) * 2018-07-23 2020-01-30 オムロン株式会社 制御システム、制御方法およびプログラム
JP2021035708A (ja) * 2019-08-30 2021-03-04 Dmg森精機株式会社 生産システム

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116175256A (zh) * 2023-04-04 2023-05-30 杭州纳志机器人科技有限公司 一种推车式机器人上下料自动定位方法
CN116175256B (zh) * 2023-04-04 2024-04-30 杭州纳志机器人科技有限公司 一种推车式机器人上下料自动定位方法

Also Published As

Publication number Publication date
JP7093881B1 (ja) 2022-06-30
JP2023037769A (ja) 2023-03-16


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22863988; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 22863988; Country of ref document: EP; Kind code of ref document: A1)