WO2019183859A1 - Method and device for robot control - Google Patents

Method and device for robot control

Info

Publication number
WO2019183859A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
placing
picking
determining
time
Prior art date
Application number
PCT/CN2018/080959
Other languages
French (fr)
Inventor
Wenyao SHAO
Shaojie Cheng
Jiajing TAN
Original Assignee
ABB Schweiz AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ABB Schweiz AG
Priority to PCT/CN2018/080959
Publication of WO2019183859A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/39Robotics, robotics to robotics hand
    • G05B2219/39102Manipulator cooperating with conveyor
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/45Nc applications
    • G05B2219/45063Pick and place manipulator

Definitions

  • With the embodiments of the present disclosure, the operation strategy for a plurality of robots and a plurality of objects is optimized and the global production efficiency is improved, without increasing an operation speed of a robot.
  • The embodiments of the present disclosure may also reduce an object drop ratio and balance workloads of the plurality of robots.
  • Fig. 3 illustrates a block diagram of a device 300 that can be used to implement the embodiments of the present disclosure.
  • The device 300 comprises a Central Processing Unit (CPU) 301 which can perform various appropriate actions and processing based on computer program instructions stored in a Read Only Memory (ROM) 302 or computer program instructions loaded from the storage unit 308 into a Random Access Memory (RAM) 303.
  • The CPU 301, the ROM 302 and the RAM 303 are connected to one another via a bus 304.
  • An input/output (I/O) interface 305 can also be connected to the bus 304.
  • A plurality of components in the device 300 are connected to the I/O interface 305, including an input unit 306, such as a keyboard or a mouse; an output unit 307, such as displays of various types and loudspeakers; a storage unit 308, such as a magnetic disk or an optical disk; and a communication unit 309, such as a network card, a modem, or a wireless communication transceiver.
  • The communication unit 309 allows the device 300 to exchange information/data with other devices via computer networks, such as the Internet, and/or various telecommunication networks.
  • The method 200, for instance, can be performed by the CPU 301.
  • For example, the method 200 can be implemented as a computer software program which is tangibly contained in a machine readable medium, such as the storage unit 308.
  • In some embodiments, the computer program can be partly or wholly loaded and/or installed on the device 300 via the ROM 302 and/or the communication unit 309.
  • When the computer program is loaded into the RAM 303 and executed by the CPU 301, one or more steps of the method 200 described above can be executed.
  • The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer readable storage medium.
  • The computer program product comprises computer-executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out the process or method 200 as described above with reference to Fig. 2.
  • Program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types.
  • The functionality of the program modules may be combined or split between program modules as desired in various embodiments.
  • Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
  • Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented.
  • The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
  • The above program code may be embodied on a machine readable medium, which may be any tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • The machine readable medium may be a machine readable signal medium or a machine readable storage medium.
  • A machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
  • More specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.

Abstract

Embodiments of the present disclosure provide a method and a device for robot control, a computer readable medium, and a computer program product. The method comprises determining correspondence between a plurality of robots and a plurality of objects to be operated by the plurality of robots comprising for each of the plurality of objects, selecting at least one of the plurality of robots to perform an operation on the object; determining a start time for the at least one robot to perform the operation on the object; and determining a time length for the at least one robot to perform the operation on the object. The correspondence is determined such that a sum of the time lengths for the plurality of objects meets a predetermined condition. With the method and device in accordance with the embodiments of the present disclosure, the operation strategy for a plurality of robots and a plurality of objects is optimized and the global production efficiency is improved.

Description

METHOD AND DEVICE FOR ROBOT CONTROL
TECHNICAL FIELD
Embodiments of the present disclosure generally relate to industrial robots, and more particularly, to a method and a device for robot control, a computer readable medium, and a computer program product.
BACKGROUND
There is a lot of repetitive work in various industries such as the food industry, the pharmaceutical industry, and so on. Robots are increasingly used in these industries to perform operations, such as picking and placing certain objects, in order to reduce heavy human labor. An optimal operation strategy for the robots performing their operations can improve efficiency and reduce cost.
There are some known picking strategies, such as Load balancing and Adaptive Task Completion, which can be utilized to assign target objects to different robots. These strategies generally fail to optimize the order of the operations on the target objects. In addition, researchers have proposed picking path optimization methods to find an efficient solution for a single robot. However, in many use cases a plurality of robots are employed, and maximizing the efficiency of every single robot does not guarantee a global maximum efficiency of all the robots. Thus, directly applying the known operation strategies to respective ones of a plurality of robots cannot achieve a global optimization for the robots and typically cannot satisfy the performance requirements of industrial production.
SUMMARY
Embodiments of the present disclosure present a method and a device for robot control, a computer readable medium, and a computer program product.
In a first aspect, the embodiments of the present disclosure provide a method for robot control. The method comprises determining correspondence between a plurality of robots and a plurality of objects to be operated by the plurality of robots. Determining correspondence comprises for each of the plurality of objects, selecting at least one of the plurality of robots to perform an operation on the object, determining a start time for the at  least one robot to perform the operation on the object, and determining a time length for the at least one robot to perform the operation on the object. The correspondence is determined such that a sum of the time lengths for the plurality of objects meets a predetermined condition.
In some embodiments, the correspondence is determined according to an optimization model, an objective function of the optimization model related to the sum of the time lengths.
In some embodiments, for each object, the at least one robot is selected such that a robot operates a predetermined number of objects at a time; an object is operated by a robot only if the object is within a work zone of the robot; if an object is in a work zone of one robot only and is not to be in a work zone of a further robot, the object is to be operated by the robot; and/or if a first robot has a lower workload than a second robot, the first robot has a priority over the second robot to operate an object.
In some embodiments, at least one of the plurality of objects is a moving object. Determining the start time for the at least one robot to perform the operation comprises determining a work zone of the at least one robot; determining an initial position and a first speed of the moving object; determining, based on the initial position and the first speed of the moving object, an entry time when the moving object enters the work zone and an exit time when the moving object exits the work zone; and selecting the start time between the entry time and the exit time.
In some embodiments, the operation includes a picking-and-placing operation. Determining the time length for the at least one robot to perform the operation comprises determining a picking position in which the object is to be picked by the at least one robot and a placing position in which the object is to be placed by the at least one robot; determining a picking-and-placing distance based on the picking position and the placing position; and determining the time length for the at least one robot to perform the picking-and-placing operation on the object, based on the picking-and-placing distance and a picking-and-placing speed of the at least one robot.
In some embodiments, at least one of the plurality of objects is a moving object. Determining the picking position comprises determining an initial position and a first speed of the moving object; and determining the picking position based on the initial position and the first speed of the moving object and the start time for the  picking-and-placing operation.
In some embodiments, the placing position changes over time. Determining the placing position comprises determining an initial placing position and a second speed at which the placing position changes; and determining the placing position based on the initial placing position, the second speed and the start time for the picking-and-placing operation.
In some embodiments, determining the initial position and the first speed of the moving object comprises detecting the initial position and the first speed using an image capture device and/or a sensor.
In some embodiments, the predetermined condition includes the sum of the time lengths reaching a minimum.
In a second aspect, the embodiments of the present disclosure provide a device for robot control. The device comprises at least one processor and at least one memory including computer program instructions. The at least one memory and the computer program instructions are configured, with the processor, to cause the device to determine correspondence between a plurality of robots and a plurality of objects to be operated by the plurality of robots. Determining correspondence comprises for each of the plurality of objects, selecting at least one of the plurality of robots to perform an operation on the object, determining a start time for the at least one robot to perform the operation on the object, and determining a time length for the at least one robot to perform the operation on the object. The correspondence is determined such that a sum of the time lengths for the plurality of objects meets a predetermined condition.
In some embodiments, the correspondence is determined according to an optimization model, an objective function of the optimization model related to the sum of the time lengths.
In some embodiments, for each object, the at least one robot is selected such that a robot operates a predetermined number of objects at a time; an object is operated by a robot only if the object is within a work zone of the robot; if an object is in a work zone of one robot only and is not to be in a work zone of a further robot, the object is to be operated by the robot; and/or if a first robot has a lower workload than a second robot, the first robot has a priority over the second robot to operate an object.
In some embodiments, at least one of the plurality of objects is a moving object.  The at least one memory and the computer program instructions are further configured, with the processor, to cause the device to: determine a work zone of the at least one robot; determine an initial position and a first speed of the moving object; determine, based on the initial position and the first speed of the moving object, an entry time when the moving object enters the work zone and an exit time when the moving object exits the work zone; and select the start time between the entry time and the exit time.
In some embodiments, the operation includes a picking-and-placing operation. The at least one memory and the computer program instructions are further configured, with the processor, to cause the device to determine a picking position in which the object is to be picked by the at least one robot and a placing position in which the object is to be placed by the at least one robot; determine a picking-and-placing distance based on the picking position and the placing position; and determine the time length for the at least one robot to perform the picking-and-placing operation on the object, based on the picking-and-placing distance and a picking-and-placing speed of the at least one robot.
In some embodiments, at least one of the plurality of objects is a moving object. The at least one memory and the computer program instructions are further configured, with the processor, to cause the device to determine an initial position and a first speed of the moving object; and determine the picking position based on the initial position and the first speed of the moving object and the start time for the picking-and-placing operation.
In some embodiments, the placing position changes over time. The at least one memory and the computer program instructions are further configured, with the processor, to cause the device to determine an initial placing position and a second speed at which the placing position changes; and determine the placing position based on the initial placing position, the second speed and the start time for the picking-and-placing operation.
In some embodiments, the at least one memory and the computer program instructions are further configured, with the processor, to cause the device to detect the initial position and the first speed using an image capture device and/or a sensor.
In some embodiments, the predetermined condition includes the sum of the time lengths reaching a minimum.
In a third aspect, the embodiments of the present disclosure provide a computer readable medium having instructions stored thereon, the instructions, when executed on at  least one processor, cause the at least one processor to perform the method of the first aspect.
In a fourth aspect, the embodiments of the present disclosure provide a computer program product being tangibly stored on a computer readable storage medium and comprising instructions which, when executed on at least one processor, cause the at least one processor to perform the method of the first aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Through the following detailed descriptions with reference to the accompanying drawings, the above and other objectives, features and advantages of the embodiments disclosed herein will become more comprehensible. In the drawings, several embodiments disclosed herein will be illustrated in an example and in a non-limiting manner, wherein:
Fig. 1 illustrates a schematic diagram of an industrial robot system in which the embodiments of the present disclosure may be implemented.
Fig. 2 illustrates a flowchart of a method for robot control in accordance with the embodiments of the present disclosure.
Fig. 3 illustrates a block diagram of a device that can be used to implement the embodiments of the present disclosure.
Throughout the drawings, the same or corresponding reference symbols refer to the same or corresponding parts.
DETAILED DESCRIPTION
The subject matter described herein will now be discussed with reference to several embodiments. These embodiments are discussed only for the purpose of enabling those skilled persons in the art to better understand and thus implement the subject matter described herein, rather than suggesting any limitations on the scope of the subject matter.
The term “comprises” or “includes” and its variants are to be read as open terms that mean “includes, but is not limited to.” The term “or” is to be read as “and/or” unless the context clearly indicates otherwise. The term “based on” is to be read as “based at least in part on.” The term “being operable to” is to mean a function, an action, a motion or a state can be achieved by an operation induced by a user or an external mechanism. The terms “one embodiment” and “an embodiment” are to be read as “at least one embodiment.” The term “another embodiment” is to be read as “at least one other embodiment.”
Unless specified or limited otherwise, the terms “mounted,” “connected,” “supported,” and “coupled” and variations thereof are used broadly and encompass direct and indirect mountings, connections, supports, and couplings. Furthermore, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings. In the description below, like reference numerals and labels are used to describe the same, similar or corresponding parts in the Figures. Other definitions, explicit and implicit, may be included below.
As mentioned above, known operation strategies for a plurality of robots fail to achieve a global optimization for the robots, and thus typically cannot satisfy performance requirement in industrial production. In order to at least in part solve the above or other potential problems, embodiments of the present disclosure provide an improved robot control solution for a plurality of robots and a plurality of operation objects. Now some embodiments of the present disclosure will be discussed with reference to the figures.
Fig. 1 illustrates a schematic diagram of an industrial robot system 100 in which the embodiments of the present disclosure may be implemented. As shown in Fig. 1, the industrial robot system 100 includes a first conveyor 110, a second conveyor 130, and robots 150-1 and 150-2 (collectively or individually referred to as robots 150) . Although only two robots and two conveyors are shown in Fig. 1, it is to be understood that the industrial robot system 100 may include any number of robots and conveyors.
As illustrated, there are objects 120-1 to 120-7 (collectively or individually referred to as objects 120) located on the first conveyor 110 and outside of the first conveyor 110. In some embodiments, each of the robots 150 may perform an operation on one or more of the objects 120. The operations that can be performed by a robot 150 on an object 120 may include, but are not limited to, machining, welding, packaging, picking-and-placing, or the like. In some embodiments, a robot 150 may have its own work zone, and the robot 150 is capable of performing the corresponding operation on an object 120 only if the object 120 is within that work zone. Fig. 1 shows an example work zone 180 of the robot 150-1. The robot 150-1 can operate one or more objects 120 that are located in or moved into the work zone 180 by the first or second conveyor 110 or 130.
It is to be appreciated that although the work zone 180 is shown to have a particular shape (a circle in this example) and a particular size, in other examples, the work zone 180 may have a different shape and a different size. The scope of the embodiments of the present disclosure is not limited in this regard.
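The disclosure does not specify how containment in a work zone is tested. For a circular work zone such as the example work zone 180 of Fig. 1, a minimal sketch (in Python, with illustrative names and planar coordinates assumed) might simply compare the distance from the zone centre with the zone radius:

```python
import math

def in_work_zone(object_xy, zone_center_xy, zone_radius):
    """Return True if an object's planar position lies inside a circular work zone.

    The circular shape mirrors the example work zone 180 of Fig. 1; a work zone
    of a different shape would need its own containment test.
    """
    dx = object_xy[0] - zone_center_xy[0]
    dy = object_xy[1] - zone_center_xy[1]
    return math.hypot(dx, dy) <= zone_radius

# Example: an object about 0.36 m from a zone centre with a 0.5 m radius is operable.
print(in_work_zone((1.2, 0.3), (1.0, 0.0), 0.5))  # True
```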
In the cases where the objects 120 are to be picked and placed by the robots 150, the objects 120 may each have respective predefined placing positions after being picked by the robots 150 from their initial positions. It is assumed that the second conveyor 130 is a placing conveyor. A position 140 on the second conveyor 130 may be a placing position at which an object 120 is to be placed. Additionally, a placing position for an object 120 may also be outside of the second conveyor 130, such as a placing position 145 as shown in Fig. 1. It is to be understood that there may be more placing positions in the system 100 and Fig. 1 only shows two of them for simplicity.
In a picking-and-placing operation, the robots 150 need to pick the objects 120 and place them in their respective placing positions on the second conveyor 130 or outside the second conveyor 130. In the example scenario as shown in Fig. 1, the object 120-6 is within the work zone 180 of the robot 150-1 and has the position 140 as its placing position, so the robot 150-1 will pick the object 120-6 from the first conveyor 110 and place it at the placing position 140.
As shown in Fig. 1, the objects 120-1 to 120-6 may move with the first conveyor 110 at a first speed V1. In contrast, the object 120-7 is stationary outside the first conveyor 110. Similarly, the placing positions on the second conveyor (such as the placing position 140) may move with the second conveyor 130 at a second speed V2, while the placing position 145 is stationary outside of the second conveyor 130. It should be understood that, depending on the initial positions of the objects and the presetting of the placing positions, there may be only moving or only stationary objects and/or placing positions in some other cases.
The industrial robot system 100 further includes a computing device 160 and an image capture device 170. The computing device 160 is communicatively coupled or connected to the image capture device 170 and the robots 150. The computing device 160 may control the robots 150. The image capture device 170 may be used to capture  an image of the objects 120 to be operated by the robots 150 and send the image to the computing device 160 for the purpose of robot control. The computing device 160 may obtain, based on the received image, information related to the objects 120, for example, an initial position and a first speed V1 of the objects 120-1 to 120-6.
In some embodiments, the image capture device 170 may capture images of objects periodically or based on a trigger from the computing device 160. In response to receiving a new image of objects, the computing device 160 may determine the objects that need to be operated by identifying newly arriving objects in the new image together with the objects in the previous image which have not yet been operated on.
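The bookkeeping behind this determination is not detailed in the disclosure. One possible sketch, assuming each detected object carries a stable identifier, merges the unfinished objects from the previous image with the detections from the new image:

```python
def objects_to_operate(previous_pending, detected_now, handled_ids):
    """Combine objects still awaiting operation with newly detected objects.

    previous_pending: dict of object id -> detection info from the previous image
    detected_now:     dict of object id -> detection info from the new image
    handled_ids:      set of ids of objects already operated on or missed
    """
    pending = {}
    # Keep earlier objects that have not yet been operated on.
    for obj_id, info in previous_pending.items():
        if obj_id not in handled_ids:
            pending[obj_id] = info
    # Newly arriving objects are added; re-detections refresh stale information.
    for obj_id, info in detected_now.items():
        if obj_id not in handled_ids:
            pending[obj_id] = info
    return pending
```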
The image capture device 170 may include, for example, a digital camera, a video camera, a mobile phone with image capture capability, a tablet with image capture capability, and so on. Alternatively, or in addition, the information related to the objects 120 may be detected by a sensor, such as a gravity sensor, and/or other detecting devices. The embodiments of the present disclosure are not limited in this regard.
As will be further discussed below, based on the information related to the objects 120, the computing device 160 may determine correspondence between the robots 150 and the objects 120 to be operated by the robots 150. Such correspondence indicates, for each of the objects 120, which robot of the robots 150 is to perform an operation on the object and at what time. After determining the correspondence between the robots 150 and the objects 120, the computing device 160 may control the robots 150 to operate the objects 120 based on the determined correspondence. During the operation, once an object 120 has been operated on or missed by a robot 150, a state of the object 120 may be updated by the computing device 160. The state may be, for example, picked, placed, or missed.
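A minimal state-tracking sketch consistent with the example states named above might look as follows; the PENDING value and the update interface are assumptions added for completeness:

```python
from enum import Enum, auto

class ObjectState(Enum):
    PENDING = auto()  # not yet operated on (assumed initial state)
    PICKED = auto()
    PLACED = auto()
    MISSED = auto()

class TrackedObject:
    """Per-object record that the computing device 160 could keep up to date."""

    def __init__(self, object_id):
        self.object_id = object_id
        self.state = ObjectState.PENDING

    def update(self, event):
        """Record the outcome of a robot action, e.g. 'picked', 'placed' or 'missed'."""
        self.state = ObjectState[event.upper()]

# Example: object 120-6 is picked and then placed by the robot 150-1.
obj = TrackedObject("120-6")
obj.update("picked")
obj.update("placed")
print(obj.state)  # ObjectState.PLACED
```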
In some embodiments, the computing device 160 may need additional information in determining the correspondence, for example, the information related to the robots 150 and the industrial environment. The additional information may be obtained from a database related to the robots 150 and the industrial environment. Alternatively, or in addition, this information may be obtained from the image capture device 170 and/or a sensor. Example embodiments of a method for robot control that may be performed by the computing device 160 will be described in detail below with reference to Fig. 2.
Fig. 2 illustrates a flowchart of a method 200 for robot control in accordance  with the embodiments of the present disclosure. The method 200 can be implemented in the computing device 160 in the industrial robot system 100 as shown in Fig. 1. For the purpose of discussion, the method 200 will be described with reference to Fig. 1. It would be appreciated that in some other embodiments, the method 200 can also be implemented in other components or in more than one component in the industrial robot system 100.
At block 210, the computing device 160 determines correspondence between a plurality of robots 150 and a plurality of objects 120 to be operated by the plurality of robots 150. As mentioned, in determining the correspondence, the computing device 160 may determine, for each of the objects 120, which robot of the plurality of robots 150 at what time to perform an operation on the object. In the following, the object 120-6 will be taken as an example to describe the determination of the correspondence. It is to be appreciated that, for other objects 120-1 to 120-5 and 120-7, the computing device 160 may determine the correspondence in a similar manner.
In particular, at block 212, for each of the plurality of objects, such as the object 120-6, the computing device 160 selects at least one of the plurality of robots 150 to perform the operation on the object 120-6. For example, if the operation to be performed on the object 120-6 is a picking-and-placing operation, the computing device 160 may select one robot, such as robot 150-1, to perform the picking-and-placing operation on the object 120-6. Alternatively, if the operation to be performed needs more than one robot to cooperate, the computing device 160 may select more than one robot of the robots 150 to perform the operation on the object 120-6.
It is to be appreciated that a proper selection of a robot to perform the operation on a particular object 120 may improve the global performance of the plurality of robots 150, and thus the selection of the robot for an object needs to be optimized. In industrial practice, in selecting the at least one robot 150, there may be some constraint conditions to be met. Examples of the constraint conditions are provided below. It would be appreciated that one or more of those constraint conditions can be utilized by the computing device 160 in the selection.
One of these constraint conditions may be a limit on the number of objects 120 that can be operated by a robot 150 at a time. This limit can be set by a user of the industrial robot system 100. For example, the user can specify that a robot 150 can  operate only one object 120 at a time.
Another constraint condition may be that an object 120 is operated by a robot 150 only if the object 120 is within a work zone of the robot 150. As shown in Fig. 1, the object 120-2 is within the work zone 180 of the robot 150-1, and thus the object 120-2 may be operated by the robot 150-1.
A further constraint condition may be that if an object 120 is in a work zone of one robot 150 only and is not to be in a work zone of a further robot 150, the object 120 is to be operated by that robot 150. For example, the object 120-3 as shown in Fig. 1 is at the end of the first conveyor 110 and is only in the work zone (not shown) of the robot 150-2. Further, as shown in the example of Fig. 1, the object 120-3 is not to be in a work zone of a further robot 150, since there are no more robots after the robot 150-2. In such a case, the object 120-3 should be operated by the robot 150-2.
A still further constraint condition may be that if a first robot has a lower workload than a second robot, the first robot has a priority over the second robot to operate an object. For example, as shown in Fig. 1, if the robot 150-1 has three objects 120-2, 120-5, and 120-6 to be operated within its work zone whereas the robot 150-2 has only one object 120-3 to be operated within its work zone, a newly arriving object (such as the object 120-1) will preferably be operated by the robot 150-2, because the robot 150-2 has a lower workload than the robot 150-1.
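The four constraint conditions above could be encoded, for example, as a feasibility filter followed by a workload tie-break. The sketch below is illustrative only; the data structures (a will_enter predicate, a workload dictionary, a user-set capacity) are assumptions rather than the patent's notation:

```python
def select_robot(obj, robots, workload, capacity=1):
    """Select a robot for `obj` under the constraint conditions discussed above.

    obj:      object descriptor with a will_enter(work_zone) predicate
    robots:   iterable of robot descriptors, each with an `id` and a `work_zone`
    workload: dict mapping robot id -> number of objects currently assigned
    capacity: maximum number of objects a robot may operate at a time (user-set)
    """
    # An object may only be operated by a robot whose work zone it is, or will be, in.
    feasible = [r for r in robots if obj.will_enter(r.work_zone)]
    if not feasible:
        return None  # no robot can reach the object, so it will be missed

    # If exactly one robot can ever reach the object, that robot must take it.
    if len(feasible) == 1:
        return feasible[0]

    # Respect the per-robot limit on the number of simultaneously operated objects.
    available = [r for r in feasible if workload[r.id] < capacity] or feasible

    # Otherwise, the robot with the lower workload has priority.
    return min(available, key=lambda r: workload[r.id])
```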
Referring back to Fig. 2, at block 214, for each of the plurality of objects 120, such as the object 120-6, the computing device 160 determines a start time for the at least one robot, such as the robot 150-1, to perform the operation on the object. It is to be appreciated that the start time for performing the operation may influence the global performance of the plurality of robots 150, and thus the start time for each object 120 also needs to be optimized. Further, it is noted that when the start times for all the objects 120 are optimized and determined, the order of the objects 120 to be operated by the robots 150 is thus optimized and determined.
As shown in Fig. 1, in some embodiments, the object 120-6 may be a moving object which moves with the first conveyor 110 at the first speed V1. In such a case, the computing device 160 may determine a work zone 180 of the robot 150-1, so as to determine when the moving object 120-6 is within the work zone 180 of the robot 150-1. To this end, the computing device 160 also determines an initial position and a first speed  V1 of the moving object 120-6. For example, this may be done by using the image capture device 170, a sensor, and/or other detecting devices. In some embodiments, a size of the work zone 180 of the robot 150-1 may be set by the user of the industrial robot system 100.
Then, based on the initial position and the first speed of the moving object 120-6, the computing device 160 determines an entry time when the moving object 120-6 enters the work zone 180 and an exit time when the moving object 120-6 exits the work zone 180. According to the entry time and the exit time, the computing device 160 may select the start time between the entry time and the exit time for the moving object 120-6. In this manner, the selection of the start time will be improved.
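For a circular work zone and an object moving in a straight line with the conveyor, the entry and exit times follow from intersecting the object's path with the zone boundary. The sketch below assumes planar positions and a constant conveyor velocity, which is consistent with Fig. 1 but not mandated by the disclosure; the start time is then selected between the two returned values:

```python
import math

def entry_exit_times(initial_xy, velocity_xy, zone_center_xy, zone_radius):
    """Return (entry_time, exit_time) for an object crossing a circular work zone,
    or None if the object never enters the zone.

    Solves |initial + velocity * t - center|^2 = radius^2 for t >= 0.
    """
    dx = initial_xy[0] - zone_center_xy[0]
    dy = initial_xy[1] - zone_center_xy[1]
    vx, vy = velocity_xy
    a = vx * vx + vy * vy
    b = 2.0 * (dx * vx + dy * vy)
    c = dx * dx + dy * dy - zone_radius * zone_radius
    if a == 0.0:
        return None  # stationary object: no entry or exit event to compute
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # the path misses the work zone entirely
    root = math.sqrt(disc)
    t_entry = (-b - root) / (2.0 * a)
    t_exit = (-b + root) / (2.0 * a)
    if t_exit < 0.0:
        return None  # the zone lies behind the object's direction of motion
    return max(t_entry, 0.0), t_exit

# Example: an object 1 m upstream of the zone centre, moving at 0.2 m/s, radius 0.5 m.
print(entry_exit_times((-1.0, 0.1), (0.2, 0.0), (0.0, 0.0), 0.5))  # about (2.55, 7.45)
```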
At block 216, for each of the plurality of objects, such as the object 120-6, the computing device 160 determines a time length for the at least one robot, such as the robot 150-1, to perform the operation on the object. For example, the operation may include a picking-and-placing operation. In this event, the computing device 160 may determine a picking position in which the object 120-6 is to be picked by the robot 150-1 and a placing position 140 in which the object 120-6 is to be placed by the at least one robot 150-1.
In embodiments in which at least one of the plurality of objects 120 is a moving object, such as the object 120-6, the computing device 160 determines an initial position and a first speed V1 of the moving object 120-6. This information may be obtained by the image capture device 170, a sensor, and/or other detecting devices, for example. Then, the computing device 160 may determine the picking position based on the initial position and the first speed V1 of the moving object 120-6 and the start time for the picking-and-placing operation, for example, through a mathematical calculation.
Alternatively, or in addition, the placing position may also vary over time. For example, the placing position 140 as shown in Fig. 1 moves with the second conveyor 130 at a second speed V2. Accordingly, the computing device 160 may determine an initial placing position and a second speed V2 at which the placing position 140 changes. Again, this information may be obtained by the image capture device 170, a sensor, and/or other detecting devices, for example. Further, the computing device 160 may determine the placing position 140 based on the initial placing position, the second speed V2, and the start time for the picking-and-placing operation, for example, through a mathematical calculation.
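A minimal sketch of these two calculations, assuming one-dimensional motion at constant speed for both the object on the first conveyor and the placing position on the second conveyor (all names and values are illustrative only):

```python
def position_at(initial_position, speed, start_time):
    """Position of a point moving at constant speed, evaluated at the start time."""
    return initial_position + speed * start_time

# Picking position of the moving object at the start time of the operation (V1 = 0.5 m/s).
picking_pos = position_at(initial_position=0.2, speed=0.5, start_time=1.5)
# Placing position moving with the second conveyor at the same instant (V2 = 0.3 m/s).
placing_pos = position_at(initial_position=3.0, speed=0.3, start_time=1.5)
print(picking_pos, placing_pos)
```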
In the picking-and-placing operation, after the picking position and the placing position are determined, the computing device 160 may determine an operation distance, such as a picking-and-placing distance, based on the picking position and the placing position. For example, the picking-and-placing distance may be determined by a geometric method.
Based on the determined picking-and-placing distance and a picking-and-placing speed of the robot 150-1, the computing device 160 can determine the time length for the at least one robot, for example the robot 150-1, to perform the picking-and-placing operation on the object 120-6. In the case that errors are neglected, this time length may be calculated as the picking-and-placing distance divided by the picking-and-placing speed of the robot 150-1.
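Under the simplifying assumption that the picking and placing positions are points in a plane and that errors are neglected, the distance and time length might be computed as in the following sketch (the function name and values are hypothetical):

```python
import math

def picking_and_placing_time(picking_pos, placing_pos, robot_speed):
    """Time length of one picking-and-placing operation: the straight-line
    (geometric) distance between the two positions divided by the robot's
    picking-and-placing speed; errors are neglected."""
    distance = math.dist(picking_pos, placing_pos)  # geometric method
    return distance / robot_speed

# Example: pick at (1.0, 0.0) m, place at (1.6, 0.8) m, robot speed 2.0 m/s.
print(picking_and_placing_time((1.0, 0.0), (1.6, 0.8), 2.0))  # 0.5 s
```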
After the time lengths for all the objects 120 are determined, the computing device 160 may obtain a sum of the time lengths for the plurality of objects 120. As shown in block 210, to obtain an optimization solution from a global view of all the objects 120 and all the robots 150, the correspondence is determined such that the sum of the time lengths for the plurality of objects 120 meets a predetermined condition. The sum of the time lengths for the plurality of objects 120 reflects a total work time for all the robots 150. If the sum of the time lengths meets a predetermined condition for optimization, a global optimization solution may be achieved. In some embodiments, the predetermined condition includes the sum of the time lengths reaching a minimum, which means that the total work time and the whole workload are minimized.
In order to achieve this optimization solution, the correspondence may be determined according to an optimization model. In particular, the information regarding which robot operates on which object at what time is taken as the decision variables. Further, an objective function of the optimization model may be related to the sum of the time lengths.
Mathematically, the target distribution results (that is, which robot operates on which object at what time) can be expressed as x = (P, t, R), where P represents the object to be operated, t represents the operation (such as picking-and-placing) time of the object, and R represents the robot to operate (such as pick and place) the object. The objective function may be expressed as T = T_1 + T_2 + ... + T_i + ... + T_n, where T_i represents the time length for a particular robot i. In this mathematical form, the computing device 160 determines a particular value of x = (P, t, R), which may be represented as x_0 = (P_0, t_0, R_0), that causes the value of T to be a minimum.
It will be appreciated that various optimization models may be used to obtain the optimization solution according to the embodiments of the present disclosure. These optimization models include, but are not limited to, an artificial neural network model, the least mean square algorithm, maximum likelihood estimation, a method of exhaustion, a genetic algorithm, an ant colony algorithm, a tabu search algorithm, a simulated annealing algorithm, a hill climbing algorithm, and so on.
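By way of illustration only, the following sketch applies the simplest of the listed approaches, a method of exhaustion, to a toy instance: it enumerates every assignment of robots to objects, evaluates the objective T as the sum of the per-operation time lengths, and keeps the assignment with the minimum T. The problem data, object coordinates, and speeds are hypothetical placeholders; a real system would also optimize the operation times t and respect the work-zone constraints discussed above.

```python
import itertools
import math

# Toy problem data (hypothetical): object picking positions, a common placing
# position, and per-robot picking-and-placing speeds.
objects = {'120-2': (1.0, 0.0), '120-3': (1.8, 0.0), '120-6': (0.6, 0.0)}
placing = (1.5, 0.8)
robot_speed = {'150-1': 2.0, '150-2': 1.5}

def time_length(obj_pos, speed):
    # T_i for one operation: picking-and-placing distance divided by the speed.
    return math.dist(obj_pos, placing) / speed

best_assignment, best_total = None, float('inf')
# Method of exhaustion: try every way of assigning a robot R to each object P.
for robots in itertools.product(robot_speed, repeat=len(objects)):
    assignment = dict(zip(objects, robots))
    total = sum(time_length(objects[p], robot_speed[r]) for p, r in assignment.items())
    if total < best_total:
        best_assignment, best_total = assignment, total

print(best_assignment, round(best_total, 3))  # the assignment minimizing T
```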
With the method and device for robot control in accordance with the embodiments of the present disclosure, the operation strategy for a plurality of robots and a plurality of objects is optimized and the global production efficiency is improved, without increasing an operation speed of a robot. In addition, the embodiments of the present disclosure may also reduce an object drop ratio and balance workloads of the plurality of robots.
Fig. 3 illustrates a block diagram of a device 300 that can be used to implement the embodiments of the present disclosure. As shown in Fig. 3, the device 300 comprises a Central Processing Unit (CPU) 301 which can perform various appropriate actions and processing based on computer program instructions stored in a Read Only Memory (ROM) 302 or computer program instructions loaded from a storage unit 308 into a Random Access Memory (RAM) 303. The RAM 303 also stores various programs and data needed for the operation of the device 300. The CPU 301, the ROM 302, and the RAM 303 are connected to one another via a bus 304. An input/output (I/O) interface 305 is also connected to the bus 304.
A plurality of components in the device 300 are connected to the I/O interface 305, including an input unit 306, such as a keyboard, a mouse, and the like; an output unit 307, such as displays of various types and loudspeakers; a storage unit 308, such as a magnetic disk and an optical disk; and a communication unit 309, such as a network card, a modem, a wireless communication transceiver, and so on. The communication unit 309 allows the device 300 to exchange information/data with other devices via computer networks, such as the Internet, and/or various telecommunication networks.
The processes and processing described above, the method 200 for instance, can be performed by the CPU 301. For example, in some embodiments, the method 200 can be implemented as a computer software program that is tangibly contained in a machine readable medium, such as the storage unit 308. In some embodiments, the computer program can be partly or wholly loaded and/or installed on the device 300 via the ROM 302 and/or the communication unit 309. When the computer program is loaded into the RAM 303 and executed by the CPU 301, one or more steps of the method 200 described above can be executed.
The present disclosure also provides at least one computer program product tangibly stored on a non-transitory computer readable storage medium. The computer program product comprises computer-executable instructions, such as those included in program modules, being executed in a device on a target real or virtual processor, to carry out the process or method 200 as described above with reference to Fig. 2.
Generally, program modules include routines, programs, libraries, objects, classes, components, data structures, or the like that perform particular tasks or implement particular abstract data types. The functionality of the program modules may be combined or split between program modules as desired in various embodiments. Machine-executable instructions for program modules may be executed within a local or distributed device. In a distributed device, program modules may be located in both local and remote storage media.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on a machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
The above program code may be embodied on a machine readable medium, which may be any tangible medium that may contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine readable medium may be a machine readable signal medium or a machine readable storage medium. A machine readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing.
More specific examples of the machine readable storage medium would include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Likewise, while several specific implementation details are contained in the above discussions, these should not be construed as limitations on the scope of the present disclosure, but rather as descriptions of features that may be specific to particular embodiments. Certain features that are described in the context of separate embodiments may also be implemented in combination in a single embodiment. On the other hand, various features that are described in the context of a single embodiment may also be implemented in multiple embodiments separately or in any suitable sub-combination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (20)

  1. A method for robot control, comprising:
    determining correspondence between a plurality of robots and a plurality of objects to be operated by the plurality of robots comprising for each of the plurality of objects,
    selecting at least one of the plurality of robots to perform an operation on the object;
    determining a start time for the at least one robot to perform the operation on the object; and
    determining a time length for the at least one robot to perform the operation on the object,
    wherein the correspondence is determined such that a sum of the time lengths for the plurality of objects meets a predetermined condition.
  2. The method of claim 1, wherein the correspondence is determined according to an optimization model, an objective function of the optimization model related to the sum of the time lengths.
  3. The method of claim 1, wherein for each object, the at least one robot is selected such that:
    a robot operates a predetermined number of objects at a time;
    an object is operated by a robot only if the object is within a work zone of the robot;
    if an object is in a work zone of one robot only and is not to be in a work zone of a further robot, the object is to be operated by the robot; and/or
    if a first robot has a lower workload than a second robot, the first robot has a priority over the second robot to operate an object.
  4. The method of claim 1, wherein at least one of the plurality of objects is a moving object, and wherein determining the start time for the at least one robot to perform the operation comprises:
    determining a work zone of the at least one robot;
    determining an initial position and a first speed of the moving object;
    determining, based on the initial position and the first speed of the moving object, an entry time when the moving object enters the work zone and an exit time when the moving object exits the work zone; and
    selecting the start time between the entry time and the exit time.
  5. The method of claim 1, wherein the operation includes a picking-and-placing operation, and wherein determining the time length for the at least one robot to perform the operation comprises:
    determining a picking position in which the object is to be picked by the at least one robot and a placing position in which the object is to be placed by the at least one robot;
    determining a picking-and-placing distance based on the picking position and the placing position; and
    determining the time length for the at least one robot to perform the picking-and-placing operation on the object, based on the picking-and-placing distance and a picking-and-placing speed of the at least one robot.
  6. The method of claim 5, wherein at least one of the plurality of objects is a moving object, and wherein determining the picking position comprises:
    determining an initial position and a first speed of the moving object; and
    determining the picking position based on the initial position and the first speed of the moving object and the start time for the picking-and-placing operation.
  7. The method of claim 5, wherein the placing position changes over time, and wherein determining the placing position comprises:
    determining an initial placing position and a second speed at which the placing position changes; and
    determining the placing position based on the initial placing position, the second speed and the start time for the picking-and-placing operation.
  8. The method of claim 4, wherein determining the initial position and the first speed of the moving object comprises:
    detecting the initial position and the first speed using an image capture device  and/or a sensor.
  9. The method of claim 1, wherein the predetermined condition includes the sum of the time lengths reaching a minimum.
  10. A device for robot control, comprising:
    at least one processor, and
    at least one memory including computer program instructions, the at least one memory and the computer program instructions configured, with the processor, to cause the device to:
    determine correspondence between a plurality of robots and a plurality of objects to be operated by the plurality of robots comprising for each of the plurality of objects,
    selecting at least one of the plurality of robots to perform an operation on the object;
    determining a start time for the at least one robot to perform the operation on the object; and
    determining a time length for the at least one robot to perform the operation on the object,
    wherein the correspondence is determined such that a sum of the time lengths for the plurality of objects meets a predetermined condition.
  11. The device of claim 10, wherein the correspondence is determined according to an optimization model, an objective function of the optimization model related to the sum of the time lengths.
  12. The device of claim 10, wherein for each object, the at least one robot is selected such that:
    a robot operates a predetermined number of objects at a time;
    an object is operated by a robot only if the object is within a work zone of the robot;
    if an object is in a work zone of one robot only and is not to be in a work zone of a further robot, the object is to be operated by the robot; and/or
    if a first robot has a lower workload than a second robot, the first robot has a priority over the second robot to operate an object.
  13. The device of claim 10, wherein at least one of the plurality of objects is a moving object, and wherein the at least one memory and the computer program instructions are further configured, with the processor, to cause the device to:
    determine a work zone of the at least one robot;
    determine an initial position and a first speed of the moving object;
    determine, based on the initial position and the first speed of the moving object, an entry time when the moving object enters the work zone and an exit time when the moving object exits the work zone; and
    select the start time between the entry time and the exit time.
  14. The device of claim 10, wherein the operation includes a picking-and-placing operation, and wherein the at least one memory and the computer program instructions are further configured, with the processor, to cause the device to:
    determine a picking position in which the object is to be picked by the at least one robot and a placing position in which the object is to be placed by the at least one robot;
    determine a picking-and-placing distance based on the picking position and the placing position; and
    determine the time length for the at least one robot to perform the picking-and-placing operation on the object, based on the picking-and-placing distance and a picking-and-placing speed of the at least one robot.
  15. The device of claim 14, wherein at least one of the plurality of objects is a moving object, and wherein the at least one memory and the computer program instructions are further configured, with the processor, to cause the device to:
    determine an initial position and a first speed of the moving object; and
    determine the picking position based on the initial position and the first speed of the moving object and the start time for the picking-and-placing operation.
  16. The device of claim 14, wherein the placing position changes over time, and wherein the at least one memory and the computer program instructions are further  configured, with the processor, to cause the device to:
    determine an initial placing position and a second speed at which the placing position changes; and
    determine the placing position based on the initial placing position, the second speed and the start time for the picking-and-placing operation.
  17. The device of claim 13, wherein the at least one memory and the computer program instructions are further configured, with the processor, to cause the device to:
    detect the initial position and the first speed using an image capture device and/or a sensor.
  18. The device of claim 10, wherein the predetermined condition includes the sum of the time lengths reaching a minimum.
  19. A computer readable medium having instructions stored thereon, the instructions, when executed on at least one processor, cause the at least one processor to perform the method according to any of claims 1 to 9.
  20. A computer program product being tangibly stored on a computer readable storage medium and comprising instructions which, when executed on at least one processor, cause the at least one processor to perform the method according to any of claims 1 to 9.
PCT/CN2018/080959 2018-03-28 2018-03-28 Method and device for robot control WO2019183859A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/080959 WO2019183859A1 (en) 2018-03-28 2018-03-28 Method and device for robot control

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/080959 WO2019183859A1 (en) 2018-03-28 2018-03-28 Method and device for robot control

Publications (1)

Publication Number Publication Date
WO2019183859A1 true WO2019183859A1 (en) 2019-10-03

Family

ID=68059132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/080959 WO2019183859A1 (en) 2018-03-28 2018-03-28 Method and device for robot control

Country Status (1)

Country Link
WO (1) WO2019183859A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002023297A1 (en) * 2000-09-11 2002-03-21 Kunikatsu Takase Mobile body movement control system
CN102648442A (en) * 2009-09-11 2012-08-22 Abb技术有限公司 Improved pick and place
CN106598043A (en) * 2016-11-08 2017-04-26 中国科学院自动化研究所 High-speed pickup path optimizing method of parallel robots facing dynamic objects
CN107111307A (en) * 2014-11-11 2017-08-29 X开发有限责任公司 Dynamically maintaining a map of a fleet of robotic devices in an environment to facilitate robotic actions
CN107430708A (en) * 2015-03-30 2017-12-01 X开发有限责任公司 Cloud-based analysis of robotic system component usage

Similar Documents

Publication Publication Date Title
US11338436B2 (en) Assessing robotic grasping
US10500718B2 (en) Systems and methods for allocating tasks to a plurality of robotic devices
US9663292B1 (en) Forecasted robotic drive unit dispatch
US9536767B1 (en) Material handling method
US11144330B2 (en) Algorithm program loading method and related apparatus
KR20220101611A (en) Merge processing system, method and apparatus
WO2019036931A1 (en) Method, device, and system for placing goods, electronic device and readable storage medium
CN106249703B (en) For controlling and/or the system and method for the process of analytical industry
US10606266B2 (en) Tracking a target moving between states in an environment
EP4116906A1 (en) Method for warehouse storage-location monitoring, computer device, and non-volatile storage medium
US20200241511A1 (en) System for manufacturing dispatching using deep reinforcement and transfer learning
JP2022160552A (en) Robotic system with dynamic motion adjustment mechanism and method for operating the same
US10926952B1 (en) Optimizing storage space utilizing artificial intelligence
WO2019183859A1 (en) Method and device for robot control
KR102527522B1 (en) System for supporting work of aircraft mechanic, and method therefor
JPWO2018189851A1 (en) Transfer operation control device, system, method and program
US9766811B1 (en) Flash-based storage warehouse apparatuses and methods thereof
CN114399247A (en) Task allocation method, electronic device, storage medium, and computer program product
WO2017172013A1 (en) System for optimizing storage location arrangement
US11755003B2 (en) Autonomous task management industrial robot
US20240116172A1 (en) Intelligent robotic system for palletizing and depalletizing
EP3009970B1 (en) A material handling method
CN112454354B (en) Working method and device of industrial robot and storage medium
US20230056286A1 (en) System and method for service enablement and resource allocation in storage facilities
US20240043214A1 (en) Industrial internet of things for intelligent three-dimensional warehouse, controlling methods and storage medium thererof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18912388

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18912388

Country of ref document: EP

Kind code of ref document: A1