WO2015109555A1 - Method, apparatus and robot system for moving objects to target position - Google Patents

Method, apparatus and robot system for moving objects to target position

Info

Publication number
WO2015109555A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
motion
deviation
unit
effector
Prior art date
Application number
PCT/CN2014/071452
Other languages
French (fr)
Inventor
Peng KONG
Original Assignee
Abb Technology Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Abb Technology Ltd filed Critical Abb Technology Ltd
Priority to PCT/CN2014/071452 priority Critical patent/WO2015109555A1/en
Priority to CN201480074081.8A priority patent/CN105934313B/en
Publication of WO2015109555A1 publication Critical patent/WO2015109555A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1694Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697Vision controlled systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1679Programme controls characterised by the tasks executed
    • B25J9/1687Assembly, peg and hole, palletising, straight line, weaving pattern movement
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/36Nc in input of data, input key till input tape
    • G05B2219/36195Assembly, mount of electronic parts onto board
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40584Camera, non-contact sensor mounted on wrist, indep from gripper
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40609Camera to monitor end effector as well as object to be handled

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A method of moving an object to a target position by an industrial robot is provided, wherein the robot comprises at least one arm and an effector unit mounted at the end of said at least one arm. The method comprises: causing the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and in response to a deviation between the second position and the target position, causing the robot to correct the deviation by a motion of the effector unit.

Description

METHOD, APPARATUS AND ROBOT SYSTEM FOR MOVING OBJECTS TO TARGET POSITION
FIELD OF THE INVENTION
[0001] Embodiments of the present disclosure relate to an industrial robot system, and specifically to a method, an apparatus and a robot system for moving objects to target positions.
BACKGROUND OF THE INVENTION
[0002] As robot technology develops, the automation process of picking objects and moving them to target positions with an industrial robot is widely used in various scenarios. The robot picks an object at a first site and moves it to a second site for follow-up actions, for example placing it on a conveyor or performing an assembly task. A main problem for this automation process, however, is that the accuracy of the robot may not meet the system's requirement. The problem is especially prominent for an assembly robot: such a system requires very high accuracy for the assembly, for instance <0.05 mm, whereas the absolute accuracy in the robot coordinate system is typically 0.5-1.0 mm over the whole workspace, far worse than the required accuracy.
[0003] Some robot manufacturers provide an additional absolute-accuracy tuning service for individual robots in the factory, which can improve the absolute accuracy in the robot coordinate system. However, the improvement is not enough to meet the accuracy requirement of robot systems such as an assembly system.
[0004] In view of the foregoing, effective methods and apparatuses for a robot system to move objects to a target position with high accuracy have been lacking.
SUMMARY OF THE INVENTION
[0005] In order to address the foregoing and other potential problems, embodiments of the present disclosure propose a method, an apparatus and a robot system for moving objects to target positions with high accuracy.
[0006] According to a first aspect, embodiments of the present disclosure provide a method of moving an object to a target position by an industrial robot. The robot comprises at least one arm and an effector unit mounted at the end of said at least one arm. The method comprises causing the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and, in response to a deviation between the second position and the target position, correcting the deviation by a motion of the effector unit.
[0007] The effector unit may comprise a motion unit movably mounted at the end of said at least one arm, and an end effector arranged at a first part of the motion unit. Besides, the motion of the effector unit may be caused by a motion of the motion unit with respect to the at least one arm.
[0008] The effector unit may further comprise a camera arranged at a second part of the motion unit, and the correcting the deviation by a motion of the motion unit may further comprise causing the robot to move the camera by a motion of the motion unit so that the vision center axis of the camera arrives at a reference axis which coincides with a center axis of the target position; and causing the robot to move the end effector by a motion of the motion unit so that a center axis of the picked object coincides with the reference axis.
[0009] In an implementation, the first part and the second part are movable with respect to each other.
[0010] In an implementation, the first part and the second part are fixed with respect to each other.
[0011] In an implementation, the motion type of the effector unit comprises linear, rotary or their combination.
[0012] In an implementation, the robot may further comprise a camera arranged at said at least one arm, and wherein the causing the robot to correct the deviation by a motion of the effector unit in response to a deviation between the second position and the target position further comprises causing the camera to capture images of the target position; calculating a deviation between the second position and the target position based on the captured images of the target position; and in response to the deviation being smaller than a predefined threshold, causing the robot to move the effector unit to correct the deviation. Moreover, in response to the deviation being larger than or equal to the predefined threshold, the robot may be caused to move to reduce the deviation.
[0013] In an implementation, the robot may be a serial robot, and the effector unit is integrated with the end of an end arm of the at least one arm. In this case, the correcting the deviation by a motion of the effector unit is made by a movement of the end arm during which the other arms of the at least one arm remain motionless.
[0014] In an implementation, the method may further comprise receiving images of the picked object captured by a second camera arranged in a path that the robot moves along and under the end effector of the robot; determining whether an orientation or a center point of the picked object mismatches with that of the end effector based on the images of the picked object; and in response to the determination of mismatch, causing the robot to perform an adjustment regarding the mismatch.
[0015] In an implementation, the industrial robot is an assembly robot.
[0016] In an implementation, the end effector comprises a sucker or a gripper.
[0017] According to a second aspect, embodiments of the present invention provide an apparatus for moving an object to a target position by an industrial robot. The robot comprises at least one arm and an effector unit mounted at the end of said at least one arm. The apparatus comprises means for causing the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position.
[0018] According to a third aspect, embodiments of the present invention provide an industrial robot system comprising an industrial robot and a control unit. The industrial robot comprises at least one arm and an effector unit mounted at the end of said at least one arm. And the control unit is configured to cause the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and in response to a deviation between the second position and the target position, correct the deviation by at least a motion of the effector unit.
[0019] These and other optional embodiments of the present invention can be implemented to realize one or more of the following advantages. In accordance with some embodiments of the present disclosure, the robot pose accuracy related to the target position can be improved greatly and conveniently.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] Through the more detailed description of some preferred embodiments of the present disclosure in the accompanying drawings, the above and other objects, features and advantages of the present disclosure will become more apparent, wherein the same reference numerals generally refer to the same components in the embodiments of the present disclosure.
[0021] Fig. 1 schematically illustrates an example layout of a vision guidance robot system in which embodiments of the present disclosure may be implemented;
[0022] Fig. 2 schematically illustrates an example of a center P0 of the target position 106 calculated in a robot coordinate system;
[0023] Figs. 3A and 3B schematically illustrate an example process using an effector unit to correct a deviation between the position of the object and the target position according to an embodiment of the present disclosure;
[0024] Figs. 4A and 4B schematically illustrate another example process using another effector unit to correct a deviation between the position of the object and the target position according to an embodiment of the present disclosure;
[0025] Figs. 5A and 5B schematically illustrate exemplary paths of the motion of the motion unit 301 in a robot coordinate system according to an embodiment of the present disclosure;
[0026] Figs. 6A and 6B schematically illustrate the case in which the motion of the motion unit is caused by the motion of the robot arm;
[0027] Figs. 7A and 7B schematically illustrate a match and a mismatch, respectively, between the picked object and the end effector regarding their orientation and center point;
[0028] Figs. 8A and 8B schematically illustrate moving the objects shown in Figs. 7A and 7B, respectively, to a target position;
[0029] Fig. 9 schematically illustrates another robot system with a second camera according to a further embodiment of the present disclosure;
[0030] Fig. 10 schematically illustrates a flow chart of a method for correcting a deviation between the position of the object and the target position according to an embodiment of the present disclosure;
[0031] Fig. 11 schematically illustrates a flow chart of a method for implementing a step as presented in Fig. 10 according to an embodiment of the present disclosure; and
[0032] Fig. 12 schematically illustrates a flow chart of a method for implementing a step as presented in Fig. 10 according to another embodiment of the present disclosure.
[0033] The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, program, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
DETAILED DESCRIPTION OF EMBODIMENTS
[0034] Some preferred embodiments will be described in more detail with reference to the accompanying drawings, in which the preferred embodiments of the present disclosure have been illustrated. However, the present disclosure can be implemented in various manners, and thus should not be construed to be limited to the embodiments disclosed herein. On the contrary, those embodiments are provided for thorough and complete understanding of the present disclosure, and completely conveying the scope of the present disclosure to those skilled in the art.
[0035] Reference is first made to Fig. 1, which schematically illustrates an example layout of a vision guidance robot system in which embodiments of the present disclosure may be implemented. As shown in Fig. 1, the robot system may include a robot 100, a control unit 102 and a vision system 104. The robot 100 comprises at least one arm 101 which may (or may not) hold a camera 103 to capture images of a target position 106 on a PCB board 105 where an object 107 should be placed. Those skilled in the art will understand that the arm 101 may be any type of arm, for example, a 6-axis arm. The present disclosure is not limited in this regard.
[0036] Taking the robot 100 as an assembly system, for example, to pick an object 107 (e.g., a part or a component) from an original position and assemble it at a target position 106 on the PCB board 105, the following key steps are normally performed:
[0037] (1) The camera 103 takes pictures of the board 105 and sends the related information to the vision system 104;
[0038] (2) The vision system 104 calculates a target position 106 in the robot coordinate system based on the related information; and
[0039] (3) Based on the calculated target position 106, the control unit 102 causes the robot 100 to pick the object 107 and move it to the target position 106 on the PCB board 105 for assembling.
[0040] It is noted that although the robot system is described as an assembly robot system above and in the following description, other types of robot systems, such as pick-and-place systems, also fall within the scope of the present disclosure as long as an automation process of picking an object and moving it to a target position is applicable to them. The present invention is not limited to any specific robot system in this regard.
[0041] Fig. 2 schematically illustrates an example of a center P0 of the target position 106 calculated in a robot coordinate system. As mentioned above, P0 is calculated by the vision system 104 based on the pictures taken by the camera 103. Depending on the performance of the vision system, P0 can be very precise, e.g., with an accuracy of 0.01 mm. However, due to the limited absolute pose accuracy of the robot 100, the robot usually arrives at a physical position with a center P0' instead of P0 when it considers that it has reached the target pose P0 and stops. The deviation ΔP between P0 and P0' typically varies between 0.5 and 1.0 mm in the robot coordinate system. Even with additional absolute-accuracy tuning, ΔP can only be decreased to about ±0.1-0.2 mm, which is still larger than the system requirement.
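By way of illustration only, the short sketch below makes the magnitudes above concrete; the coordinate values, the 0.05 mm requirement, and the function name are assumptions for this example and are not part of the disclosure.

```python
import math

def deviation_mm(p0, p0_prime):
    """Planar Euclidean deviation between the vision-calculated target
    center P0 and the center P0' the robot physically reached, both in
    the robot coordinate system (millimetres)."""
    return math.hypot(p0_prime[0] - p0[0], p0_prime[1] - p0[1])

# Hypothetical values in the ranges quoted above.
p0 = (250.00, 120.00)        # vision result, ~0.01 mm accuracy
p0_prime = (250.42, 119.65)  # where the robot actually stopped

dp = deviation_mm(p0, p0_prime)
required_accuracy_mm = 0.05  # e.g. an assembly requirement of <0.05 mm
print(f"deltaP = {dp:.2f} mm, meets requirement: {dp < required_accuracy_mm}")
```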
[0042] To solve this problem, the robot 100 may further comprise an effector unit mounted at the end of the at least one arm 101. For example, for a parallel robot the effector unit may be arranged at the common end of all the arms, while for a serial robot (e.g., a 6-axis robot) the effector unit may be arranged at the end of the end arm.
[0043] In one implementation, with reference to Figs. 3A and 3B, which schematically illustrate an example process using an effector unit to correct ΔP according to an embodiment of the present disclosure, the effector unit is mounted at the end of the at least one arm 101 and may comprise a motion unit 301 and an end effector 302. The motion unit 301 is movably mounted at the end of the at least one arm. It is noted, however, that in the case of a serial robot (e.g., 6-axis), the effector unit may even be directly integrated with the end of the end arm (e.g., the 6th axis) and act as an end effector only. The end effector 302 comprises, for example, a gripper or a sucker to pick the object 107. By means of the effector unit, the deviation between the position of the object and the target position may be corrected by the steps shown in Fig. 10, which schematically illustrates a flow chart of a method for correcting the deviation according to an embodiment of the present disclosure.
[0044] In step S1010, the robot is first caused to pick the object at a first position by means of the effector unit and move the picked object to a second position. The method then proceeds to step S1012: in response to a deviation between the second position and the target position, the robot is caused to correct the deviation by a motion of the effector unit. Because an effector unit usually has a higher accuracy than the robot as a whole, a correction performed by a motion of the effector unit is more effective than one performed by a movement of the robot.
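As a rough, non-authoritative illustration of steps S1010 and S1012, the sketch below assumes a hypothetical control-unit interface (the names robot, effector_unit, vision and their methods are invented for this example; the disclosure defines no API):

```python
def move_object_to_target(robot, effector_unit, vision,
                          pick_position, target_position):
    """Overall flow of Fig. 10: coarse placement by the robot (S1010),
    then fine correction by the more accurate effector unit (S1012)."""
    # S1010: pick the object at the first position and move it to a
    # second position; the robot's pose accuracy limits how close the
    # second position gets to the target.
    effector_unit.pick(pick_position)
    second_position = robot.move_to(target_position)

    # S1012: in response to a deviation between the second position and
    # the target, correct it by a motion of the effector unit alone.
    deviation = vision.deviation(second_position, target_position)
    if deviation is not None:   # None means already within tolerance
        effector_unit.correct(deviation)
```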
[0045] For an effector unit as shown in Figs. 3A and 3B, as an example, the motion of the effector unit in step S1012 may be caused by a motion of the motion unit with respect to the at least one arm. In the case that the effector unit is integrated with the end of the end arm of the at least one arm 101, however, the correction in step S1012 may be made by a movement of the end arm (e.g., the 6th axis) during which the other arms (e.g., the 1st-5th axes) remain motionless.
[0046] If a robot system with vision guidance is considered, the robot may further comprise a camera to perform the correction of the deviation. Referring back to Fig. 1, in an implementation according to an embodiment of the present disclosure, the robot 100 comprises a camera 103 arranged at the at least one arm of the robot. In such a robot system, step S1012 of Fig. 10 may further comprise the steps shown in Fig. 12. In other words, Fig. 12 schematically illustrates an implementation of step S1012 according to an embodiment of the present disclosure. As shown in Fig. 12, correcting the deviation by a motion of the effector unit further comprises: causing the camera to capture images of the target position (S1210); calculating a deviation between the second position and the target position based on the captured images (S1212); and, in response to the deviation being smaller than a predefined threshold, causing the robot to move the effector unit to correct the deviation (S1214).
[0047] In other words, through the camera 103, images of the target position 106 can be captured at any time while correcting the deviation. Based on the captured images, the vision system 104 may iteratively re-calculate and then correct the current deviation between the current position of the object 107 and the target position 106 until the required accuracy is achieved. If the deviation is small (e.g., smaller than a predefined threshold), it may be corrected by the motion of the effector unit as described above. However, in response to the deviation being larger than or equal to the predefined threshold, according to another embodiment of the present disclosure, the deviation may first be reduced by moving the robot, after which steps S1210 to S1212 are performed again to check whether the deviation is small enough. This capture-calculate-move process may be repeated until a small deviation is achieved, and the remaining small deviation is then corrected by the motion of the effector unit.
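A minimal sketch of this coarse/fine loop follows, assuming placeholder camera, vision and motion interfaces and an example threshold of 0.2 mm (none of these names or values come from the disclosure):

```python
import math

def correct_deviation(robot, effector_unit, camera, vision,
                      target_position, threshold_mm=0.2):
    """Capture-calculate-move loop of Fig. 12 (S1210-S1214): large
    deviations are reduced by coarse moves of the whole robot; the
    remaining small deviation is corrected by the effector unit alone,
    the arm staying still."""
    while True:
        images = camera.capture(target_position)       # S1210
        dx, dy = vision.calculate_deviation(images)    # S1212
        if math.hypot(dx, dy) < threshold_mm:
            break
        robot.move_by(dx, dy)   # coarse correction by the whole robot

    # S1214: fine correction by a motion of the effector unit only.
    effector_unit.move_by(dx, dy)
```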
[0048] In another implementation according to the present disclosure, the camera may alternatively be arranged at a second part of the motion unit 301, as shown in Figs. 3A and 3B. With such an arrangement, step S1012 may be implemented by the steps shown in Fig. 11, which differ from the implementation of Fig. 12. For example, correcting the deviation by at least a motion of the motion unit further comprises:
[0049] (i) causing the robot to move the camera by a motion of the motion unit so that the vision center axis of the camera arrives at a reference axis which coincides with a center axis of the target position (S1110). In this step, the robot 100 is first caused, e.g., by the control unit 102, to move the camera 303 by a motion of the motion unit 301 so that the vision center axis of the camera 303 coincides with a center axis of the target position 106, as shown in Fig. 3A. After that, the robot 100 knows exactly where the current vision center axis (referred to as the "reference axis" hereinafter) of the camera 303 is. That is, the reference axis has been found by the robot system.
[0050] (ii) causing the robot to move the end effector by at least a motion of the motion unit so that a center axis of the picked object coincides with the reference axis (S1112). In this step, the robot 100 is then caused, by the control unit 102 for example, to move the motion unit 301 again so that a center axis of the picked object 107 coincides with the reference axis, as shown in Fig. 3B.
[0051] Here, the motion of the motion unit 301 in steps (i) and (ii) may be of various types, e.g., rotation about its center axis 304, linear movement, or a combination of the two. In an exemplary implementation, such motion of the unit 301 may even be caused partly or totally by a rotation of the arm 101 of the robot 100 (similar to the case in which the effector unit is integrated with the end arm, as described above). In another exemplary implementation, the motion of the unit 301 is merely a motion with respect to the arm 101; that is, after the robot 100 arrives at P0', the correction is performed only by one or more motions of the unit 301 while the arm 101 stays still. In a further exemplary implementation, the correction may be performed by a combination of the previous implementations. The present disclosure is not limited in this regard.
[0052] In practice, the first part, where the end effector 302 is arranged, and the second part, where the camera 303 is arranged, can be either fixed or movable with respect to each other. In the former case, shown in Figs. 3A and 3B, the motion unit 301 is an integrated component, so the first part, the second part, and the other parts of the motion unit 301 move as one unit. In this scenario, if the end effector 302 and the camera 303 are properly arranged, the control unit 102 may simply cause the motion unit 301 to rotate about its center axis 304 to make the center axis of the picked object coincide with the reference axis in step (ii). An example of a proper arrangement is arranging the end effector 302 and the camera 303 symmetrically with respect to the rotation axis 304.
[0053] Figs. 4A and 4B show the latter scenario, in which the first part and the second part are movable with respect to each other. As illustrated in Figs. 4A and 4B, the motion unit 301 may comprise, for example, a first part 3011, a second part 3012, and another part 3013, and the first part 3011, where the object 107 is held, may be at a lower level than the second part 3012, where the camera 303 is arranged. Thus, the motions of the unit 301 in steps (i) and (ii) may be motions of the second part 3012 and the first part 3011 respectively. In this scenario, after step (i) as illustrated in Fig. 4A, the control unit 102 can merely move the first part 3011 of the motion unit 301 to make the center axis of the object 107 coincide with the vision center axis of the camera 303 (see Fig. 4B); that is, the center axes of the object 107 and the camera 303 both end up on the "reference axis". The motion of the first/second part of the unit 301 in steps (ii)/(i) may be rotary, linear, or a combination of the two. Again, if the end effector 302 and the camera 303 are properly arranged (e.g., symmetrically with respect to the center axis 304), step (ii) may be performed by simply rotating the first part 3011 around the axis 304.
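For the symmetric arrangement just mentioned, step (ii) reduces to a pure rotation about the center axis 304. The following geometry-only sketch (the point names and coordinates are assumptions for illustration) computes the rotation that carries the object's center onto the reference axis:

```python
import math

def rotation_to_reference(center, object_xy, reference_xy):
    """Counterclockwise angle (radians) to rotate the motion unit about
    its center axis so the object's center axis lands on the reference
    axis, assuming both points lie at the same radius from the axis."""
    ang_obj = math.atan2(object_xy[1] - center[1], object_xy[0] - center[0])
    ang_ref = math.atan2(reference_xy[1] - center[1], reference_xy[0] - center[0])
    return (ang_ref - ang_obj) % (2 * math.pi)

# With the effector and camera symmetric about axis 304, the reference
# axis sits diametrically opposite the object: a half-turn (180 deg).
print(math.degrees(rotation_to_reference((0, 0), (30, 0), (-30, 0))))
```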
[0054] Figs. 5A and 5B schematically illustrate exemplary paths of the motion of the motion unit 301 in a robot coordinate system according to an embodiment of the present disclosure. In Fig. 5A, the end effector 302 is arranged symmetrically with the camera 303 with respect to the center axis 304 of the motion unit 301, so the object 107 can be moved to make its center axis coincide with the vision center axis of the camera 303 by rotation alone. In contrast, Fig. 5B shows a linear movement of the object 107 to make its center axis coincide with the vision center axis of the camera 303.
[0055] Figs. 6A and 6B show a case in which the motions in steps (i) and (ii) are caused by the motion of the robot arm 101. Fig. 6A shows the initial status of the robot arm 101 (end arm) and the motion unit 301 before the correction. The robot arm 101 may then tilt to make the vision center axis of the camera 303 coincide with the center axis of the target position 106, after which it tilts in the opposite direction to make the center axis of the object 107 coincide with the reference axis.
[0056] It should be appreciated that the correction process described above with reference to Figs. 1 and 12 and that described with reference to Figs. 3-6 and 11 can be combined. For example, the camera 303 shown in Figs. 3-6 may also capture images of the target position 106, as done by the camera 103. The re-calculation and the correction are then the same as described with respect to Figs. 1 and 12, and are not detailed here.
[0057] With the correction process described above, the object 107 will be directly on top of the target position for follow-up actions such as assembly.
[0058] In view of the foregoing, correcting the deviation between the position of the picked object and the target position by means of a motion unit arranged at the end of the robot's arm, instead of by movement of the robot itself, improves the accuracy of the robot system without increasing complexity or sacrificing cycle time.
[0059] In practice, the object 107 is not always picked in full alignment with the end effector 302 (e.g., gripper or sucker) in terms of orientation and center point. Figs. 7A, 8A and Figs. 7B, 8B respectively show a matched case and a mismatched one. To adjust the mismatch (if any), Fig. 9 schematically illustrates another robot system with a second camera 901 according to a further embodiment of the present disclosure. As shown in Fig. 9, the camera 901 is arranged under the end effector of the robot 100 and in a path along which the robot 100 moves. Referring back to Fig. 10, steps S1014 to S1018 show a workflow for ensuring that the object 107 is fully matched with the end effector.
[0060] The workflow comprises receiving images of the picked object captured by a second camera arranged in a path along which the robot moves and under the end effector of the robot (S1014); determining, based on the images of the picked object, whether the orientation or the center point of the picked object mismatches that of the end effector (S1016); and, in response to a determination of mismatch, causing the robot to perform an adjustment regarding the mismatch (S1018).
[0061] It should be noted that steps S1014-S1018 are supplementary to steps S1010-S1012 and are optional. It should also be noted that the order between steps S1010-S1012 and steps S1014-S1018 is not necessarily as shown in Fig. 10; they can be performed concurrently or in reverse order as needed. For example, S1014-S1018 may be performed before, after, or during the process of S1012.
[0062] Considering an implementation of steps S1014 to S1018 in the system illustrated in Fig. 9, the camera 901 captures images of the picked object 107 and sends them to the control unit 102. Based on these images, the control unit 102 determines whether there is a mismatch between the end effector and the object 107 in terms of their orientation and center point. In response to a determination of mismatch, the control unit 102 causes the robot 100 to make the related adjustment. Such adjustments may be done by various means well known to those skilled in the art and may be made at any time before the object 107 is finally assembled.
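A sketch of the mismatch check of steps S1014-S1018 under stated assumptions: the pose representation, the tolerances, and the way poses are obtained from the camera 901's images are all invented for illustration and are not specified in the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float      # center point, mm, in the robot coordinate system
    y: float
    theta: float  # orientation, degrees

def mismatch(object_pose: Pose2D, effector_pose: Pose2D,
             center_tol_mm: float = 0.1, angle_tol_deg: float = 0.5):
    """S1016: compare the picked object's pose (extracted from the
    second camera's images) with the end effector's pose; return the
    offsets the robot should correct (S1018), or None if object and
    effector already match within tolerance."""
    dx = effector_pose.x - object_pose.x
    dy = effector_pose.y - object_pose.y
    dtheta = effector_pose.theta - object_pose.theta
    if abs(dx) <= center_tol_mm and abs(dy) <= center_tol_mm \
            and abs(dtheta) <= angle_tol_deg:
        return None
    return dx, dy, dtheta
```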
[0063] As will be appreciated by one of skill in the art, the method and apparatus described herein may be embodied as a method, system, or computer program product. Accordingly, that method and apparatus may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system."
[0064] Furthermore, the method and apparatus may take the form of a computer program product on a computer-usable or computer-readable medium having computer-usable program code embodied in the medium. The computer-usable or computer-readable medium may be any medium that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device, and may, by way of example but without limitation, be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, device, or propagation medium, or even paper or another suitable medium upon which the program is printed. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, transmission media such as those supporting the Internet or an intranet, or a magnetic storage device. Computer program code for carrying out operations of the method and apparatus described herein may be written in an object-oriented programming language such as Java, Smalltalk, C++, or C#, or in a conventional procedural programming language such as the "C" programming language. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
[0065] As described above, the method and apparatus herein may be implemented in the control unit 102 for the robot 100, or in a system that includes a computing device connected to the control unit 102. Where a computing device is included in the system, the computing device includes software that performs the calculations used in the method and apparatus described herein to allow, for example, the assembly of a part. The software used by the computing device to perform those calculations is on a suitable medium in a form that can be loaded into the computing device for execution. Alternatively, the software may be downloaded into the control unit 102 or the computing device, as described above, by well-known means, either from the site where the computing device is located or from another, remote site. As another alternative, the software may be resident in the computing device. In another embodiment, not shown in Fig. 7, the system does not include the computing device but only the control unit 102; in that case the software is loaded into the control unit 102 from a suitable medium, downloaded into the control unit 102 as described above, or resident in the control unit 102, and the control unit 102 directly receives the inputs from the camera 103, 303 or 901.
[0066] As can be appreciated by those of ordinary skill in the art, when the method is implemented in software in the computing device or the control unit 102, the computing device or the control unit 102 executes the software and thereby performs the calculations of the method and system described above. The control unit 102 is connected to the robot 100, which performs, for example, the assembly of an object as described above. Thus, if the software is executed by the control unit 102, or if the control unit 102 receives commands from a computing device that executes the software, the robot 100 is controlled to perform the assembly process (comprising a correction process) in accordance with the method and system described herein. It should be appreciated that the technique described herein can be implemented on the robot control unit 102 as a software product, or partly or entirely on the computing device, which communicates with the robot control unit 102 via a communication network such as, but not limited to, the Internet.
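For illustration only, the following is a minimal sketch, in Python, of a correction loop such a control unit might run, reflecting the threshold logic of Claims 7 and 8 below: coarse robot motion while the deviation is at or above a predefined threshold, then a fine motion of the effector unit below it. The StubController class, its numerical values, and all method names are hypothetical stand-ins for the camera input and robot commands, not an actual robot API.

from dataclasses import dataclass
import math

@dataclass
class Deviation:
    dx: float  # mm
    dy: float  # mm
    def magnitude(self) -> float:
        return math.hypot(self.dx, self.dy)

class StubController:
    # Hypothetical stand-in for the control unit 102 driving the robot 100.
    def __init__(self):
        self._deviation = Deviation(5.0, 3.0)
    def estimate_deviation(self, images) -> Deviation:
        # In the real system this would be computed from captured images of
        # the target position; here a stored value is returned.
        return self._deviation
    def move_robot(self, d):
        # Coarse motion: assume the arms remove about 90% of the deviation.
        self._deviation = Deviation(d.dx * 0.1, d.dy * 0.1)
    def move_effector_unit(self, d):
        # Fine motion: the effector unit cancels the remaining deviation.
        self._deviation = Deviation(0.0, 0.0)

def correct_to_target(ctrl, capture, threshold=2.0, max_iter=10):
    # Capture images of the target position, estimate the deviation, and
    # branch on the threshold, as in Claims 7 and 8.
    for _ in range(max_iter):
        d = ctrl.estimate_deviation(capture())
        if d.magnitude() >= threshold:
            ctrl.move_robot(d)           # reduce the deviation
        else:
            ctrl.move_effector_unit(d)   # correct the deviation
            return True
    return False

print(correct_to_target(StubController(), capture=lambda: []))  # prints True

Here the stub converges after one coarse and one fine motion; a real control unit would derive the deviation from the camera 103, 303 or 901 and issue actual motion commands.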
[0067] Several exemplary embodiments of the present invention have been described above for the purpose of illustration only. It should be understood that the present invention is not limited to the disclosed embodiments. On the contrary, the present invention is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims, the scope of which is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims

WHAT IS CLAIMED IS:
1. A method of moving an object to a target position by an industrial robot, wherein the industrial robot comprises at least one arm and an effector unit mounted at the end of said at least one arm, the method comprising:
causing the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and
in response to a deviation between the second position and the target position, causing the robot to correct the deviation by means of a motion of the effector unit.
2. The method of Claim 1, wherein the effector unit comprises a motion unit movably mounted at the end of said at least one arm, and an end effector arranged at a first part of the motion unit,
wherein the motion of the effector unit is caused by a motion of the motion unit with respect to the at least one arm.
3. The method of Claim 2, wherein the effector unit comprises a camera arranged at a second part of the motion unit,
wherein the correcting the deviation by the motion of the effector unit further comprises:
causing the robot to move the camera by the motion of the motion unit so that a vision center axis of the camera arrives at a reference axis which coincides with a center axis of the target position; and
causing the robot to move the end effector by the motion of the motion unit so that a center axis of the picked object coincides with the reference axis.
4. The method of Claim 3, wherein the first part and the second part are movable with respect to each other.
5. The method of Claim 3, wherein the first part and the second part are fixed with respect to each other.
6. The method of any one of the preceding claims, wherein a motion type of the effector unit comprises a linear motion, a rotary motion, or a combination thereof.
7. The method of Claim 1 or 2, wherein the robot further comprises a camera arranged at said at least one arm, and
wherein in response to a deviation between the second position and the target position, causing the robot to correct the deviation by a motion of the effector unit further comprises:
causing the camera to capture images of the target position;
calculating a deviation between the second position and the target position based on the captured images of the target position; and
in response to the deviation being smaller than a predefined threshold, causing the robot to move the effector unit to correct the deviation.
8. The method of Claim 7, further comprising:
in response to the deviation being larger than or equal to the predefined threshold, causing the robot to move to reduce the deviation.
9. The method of Claim 1, wherein the robot is a serial robot, and the effector unit is integrated with an end of an end arm of the at least one arm, and
wherein the correcting the deviation by a motion of the effector unit is made by a movement of the end arm during which the other arms of the at least one arm remain motionless.
10. The method of any of Claims 2 to 9, further comprising:
receiving images of the picked object captured by a second camera arranged in a path along which the robot moves and under the end effector of the robot;
determining whether an orientation or a center point of the picked object mismatches with that of the end effector based on the images of the picked object; and
in response to the determination of mismatch, causing the robot to perform an adjustment regarding the mismatch.
11. The method of any one of the preceding claims, wherein the industrial robot is an assembly robot.
12. The method of any one of the preceding claims, wherein the end effector comprises a sucker or a gripper.
13. An apparatus for moving an object to a target position by an industrial robot, wherein the robot comprises at least one arm and an effector unit mounted at the end of said at least one arm, the apparatus comprising:
means for causing the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and
means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position.
14. The apparatus of Claim 13, wherein the effector unit comprises a motion unit movably mounted at the end of said at least one arm, and an end effector arranged at a first part of the motion unit,
wherein the means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position causes the motion of the effector unit by a motion of the motion unit with respect to the at least one arm.
15. The apparatus of Claim 14, wherein the effector unit comprises a camera arranged at a second part of the motion unit, and wherein the means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position further comprises:
means for causing the robot to move the camera by a motion of the motion unit so that the vision center axis of the camera arrives at a reference axis which coincides with a center axis of the target position; and
means for causing the robot to move the end effector by a motion of the motion unit so that a center axis of the picked object coincides with the reference axis.
16. The apparatus of Claim 15, wherein the first part and the second part are movable with respect to each other.
17. The apparatus of Claim 15, wherein the first part and the second part are fixed with respect to each other.
18. The apparatus of any preceding claim, wherein the motion type of the effector unit comprises a linear motion, a rotary motion, or a combination thereof.
19. The apparatus of Claim 13 or 14, wherein the robot further comprises a camera arranged at said at least one arm, and wherein the means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position further comprises:
means for causing the camera to capture images of the target position;
means for calculating a deviation between the second position and the target position based on the captured images of the target position; and
means for causing the robot to move the effector unit to correct the deviation in response to the deviation being smaller than a predefined threshold.
20. The apparatus of Claim 19, wherein the means for correcting the deviation by a motion of the effector unit in response to a deviation between the second position and the target position further comprises means for causing the robot to move to reduce the deviation in response to the deviation being larger than or equal to the predefined threshold.
21. The apparatus of Claim 13, wherein the robot is a serial robot, and the effector unit is integrated with the end of an end arm of the at least one arm, and
wherein the correcting the deviation by a motion of the effector unit is made by a movement of the end arm during which the other arms of the at least one arm remain motionless.
22. The apparatus of any of Claims 14 to 21, further comprising:
means for receiving images of the picked object captured by a second camera arranged in a path along which the robot moves and under the end effector of the robot;
means for determining whether an orientation or a center point of the picked object mismatches with that of the end effector based on the images of the picked object; and
means for causing the robot to perform an adjustment regarding the mismatch in response to the determination of mismatch.
23. The apparatus of any preceding claim, wherein the industrial robot is an assembly robot.
24. The apparatus of any preceding claim, wherein the end effector comprises a sucker or a gripper.
25. An industrial robot system comprising:
an industrial robot, comprising:
at least one arm; and
an effector unit mounted at the end of said at least one arm; and
a control unit, configured to:
cause the robot to pick the object at a first position by means of the effector unit and move the picked object to a second position; and
in response to a deviation between the second position and the target position, correct the deviation by a motion of the effector unit.
26. The system of Claim 25, wherein the effector unit comprises a motion unit movably mounted at the end of said at least one arm, and an end effector arranged at a first part of the motion unit,
wherein the control unit is configured to correct the deviation by causing a motion of the motion unit with respect to the at least one arm.
27. The system of Claim 26, wherein the effector unit further comprises a camera arranged at a second part of the motion unit, and wherein the control unit is configured to correct the deviation by:
causing the robot to move the camera by a motion of the motion unit so that the vision center axis of the camera arrives at a reference axis which coincides with a center axis of the target position; and
causing the robot to move the end effector by a motion of the motion unit so that a center axis of the picked object coincides with the reference axis.
28. The system of Claim 27, wherein the first part and the second part are movable with respect to each other.
29. The system of Claim 27, wherein the first part and the second part are fixed with respect to each other.
30. The system of any preceding claim, wherein the motion type of the effector unit comprises a linear motion, a rotary motion, or a combination thereof.
31. The system of Claim 25 or 26, wherein the robot further comprises a camera arranged at said at least one arm, and wherein the control unit is configured to correct the deviation by:
causing the camera to capture images of the target position;
calculating a deviation between the second position and the target position based on the captured images of the target position; and
in response to the deviation being smaller than a predefined threshold, causing the robot to move the effector unit to correct the deviation.
32. The system of Claim 31, wherein the control unit is configured to correct the deviation by:
in response to the deviation being larger than or equal to the predefined threshold, causing the robot to move to reduce the deviation.
33. The system of Claim 25, wherein the robot is a serial robot, and the effector unit is integrated with the end of an end arm of the at least one arm, and
wherein the deviation is corrected by a movement of the end arm during which the other arms of the at least one arm remain motionless.
34. The system of any of Claims 26 to 33, wherein the control unit is further configured to:
receive images of the picked object captured by a second camera arranged in a path along which the robot moves and under the end effector of the robot;
determine whether an orientation or a center point of the picked object mismatches with that of the end effector based on the images of the picked object; and
in response to the determination of mismatch, cause the robot to perform an adjustment regarding the mismatch.
35. The system of any preceding claim, wherein the industrial robot is an assembly robot.
36. The system of any preceding claim, wherein the end effector comprises a sucker or a gripper.

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2014/071452 WO2015109555A1 (en) 2014-01-26 2014-01-26 Method, apparatus and robot system for moving objects to target position
CN201480074081.8A CN105934313B (en) 2014-01-26 2014-01-26 For object to be moved to the method, apparatus and robot system of target location


Publications (1)

Publication Number Publication Date
WO2015109555A1 true WO2015109555A1 (en) 2015-07-30

Family ID=53680659


Country Status (2)

Country Link
CN (1) CN105934313B (en)
WO (1) WO2015109555A1 (en)


Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106584463B (en) * 2016-12-29 2019-09-24 南京天祥智能设备科技有限公司 Assembly system and assembly method based on inertia measurement
CN108181374A (en) * 2018-02-08 2018-06-19 聚光科技(杭州)股份有限公司 The method of work of plasma-mass spectrometry system
US20210260761A1 (en) * 2018-07-04 2021-08-26 Abb Schweiz Ag Method And Control System For Controlling An Industrial Actuator
CN114364569B (en) * 2019-05-31 2024-03-15 Abb电动汽车有限责任公司 Device and method for charging an electric vehicle, and method for calibrating a device for charging an electric vehicle
WO2021009800A1 (en) * 2019-07-12 2021-01-21 株式会社Fuji Robot control system and robot control method
JP7447676B2 (en) * 2020-05-26 2024-03-12 株式会社デンソーウェーブ Robot arm control device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005342832A (en) * 2004-06-02 2005-12-15 Fanuc Ltd Robot system
CN103264738A (en) * 2013-06-07 2013-08-28 上海发那科机器人有限公司 Automatic assembling system and method for vehicle windshield glass
CN103342240A (en) * 2013-07-10 2013-10-09 深圳先进技术研究院 Bagged material car-loading system and method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2259011A4 (en) * 2008-03-28 2011-04-27 Honda Motor Co Ltd Work measuring method, method for attaching suspension assembly and apparatus for attaching suspension assembly
CN103192386B (en) * 2012-01-06 2014-10-22 沈阳新松机器人自动化股份有限公司 Image-vision-based automatic calibration method of clean robot


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220016770A1 (en) * 2019-02-01 2022-01-20 Mitsubishi Electric Corporation Work determination apparatus and work determination method
US11865721B2 (en) * 2019-02-01 2024-01-09 Mitsubishi Electric Corporation Work determination apparatus and work determination method

Also Published As

Publication number Publication date
CN105934313B (en) 2018-02-06
CN105934313A (en) 2016-09-07


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14879342

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14879342

Country of ref document: EP

Kind code of ref document: A1