WO2017121456A1 - A robot system and a method for operating the robot system - Google Patents


Info

Publication number
WO2017121456A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
data
group
generating
information
Prior art date
Application number
PCT/EP2016/050368
Other languages
French (fr)
Inventor
Hongyu Pei BREIVOLD
Kristian SANDSTRÖM
Larisa RIZVANOVIC
Marko Lehtola
Saad AZHAR
Original Assignee
Abb Schweiz Ag
Priority date
Filing date
Publication date
Application filed by Abb Schweiz Ag filed Critical Abb Schweiz Ag
Priority to PCT/EP2016/050368 priority Critical patent/WO2017121456A1/en
Publication of WO2017121456A1 publication Critical patent/WO2017121456A1/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B19/4185 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by the network communication
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1682 Dual arm manipulator; Coordination of several manipulators
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/31 From computer integrated manufacturing till monitoring
    • G05B2219/31261 Coordination control
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/33 Director till display
    • G05B2219/33149 Publisher subscriber, publisher, master broadcasts data to slaves, subscriber
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39139 Produce program of slave from path of master and desired relative position
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39146 Swarm, multiagent, distributed multitask fusion, cooperation multi robots
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Definitions

  • the present disclosure relates to the field of industrial robots, and to hardware information sharing between the industrial robots in order to perform a common task of the robots.
  • the disclosure relates to a robot system for a group of industrial robots, and a method for the robot system.
  • the microcomputer receives data of a welding seam captured by the vision system, processes the data to coordinate adjustments, and sends it to a connected robot.
  • the connected robot interprets the adjustments to its learned path and thereafter welds the seam.
  • this setup, however, requires special programming and configuration of the microcomputer.
  • the disclosure relates to a robot system with a group of industrial robots where each robot in the group is arranged for cooperation with at least one other robot in the group in order to perform a common task of the group.
  • a first robot in the group includes a sensing system configured to generate certain data needed by a second robot in the group for performing at least part of the common task of the group.
  • the robot system has a publish-subscribe architecture where each robot in the group is related to as a node.
  • the robot system includes a publishing module configured to generate information including the certain data from the first robot in the group related to a first node, for sharing with at least one subscribing module.
  • the robot system further includes a subscribing module of a second node related to the second robot in the group, wherein the subscribing module is configured to subscribe to the information published by the publishing module, and wherein the second robot is configured to use the information for performing at least part of the common task of the group.
  • the robot system provides improved utilization of hardware of a robot by enabling sharing of data between industrial robots.
  • With the herein described robot system, expensive hardware that one robot is equipped with, for example a vision system, may be shared with other robots, as information that the one robot obtains from the vision system can be shared with the other robots. As additional sensor hardware information becomes available, product quality may thereby be improved. Further, the robot system offers improved utilization of existing hardware by enabling sharing in an easier way than before, as the need to use difficult I/O configuration tools is obviated.
  • the robot system implements a collaboration system acting as a scalable platform for the information exchange between the robots, and in addition other devices, such that they may cooperate to achieve a common task.
  • the collaboration system is based on a framework, e.g. Data Distribution Service (DDS) or Message Queue Telemetry Transport (MQTT), using the data-centric publish-subscribe programming model, which is used for sending data between nodes.
  • DDS: Data Distribution Service
  • MQTT: Message Queue Telemetry Transport
  • One feature of these frameworks is to enable real-time distribution, i.e., using a machine-to-machine data-centric middleware standard that allows real-time data exchange.
  • PLCs: Programmable Logic Controllers
  • the collaboration system implements robot controller to robot controller communication without going through any PLC, can be scaled to handle a large number of heterogeneous devices, and therefore saves costs in cabling, programming and configuration.
  • Publishing modules are loosely coupled to subscribing modules, and need not even know of their existence. With the topic being the focus, publishing modules and subscriber modules are allowed to remain unaware of the system topology. Each can continue to operate normally regardless of the other.
  • the publishing/subscribing architecture provides the opportunity for better scalability than traditional client-server architecture, through parallel operation, message caching, tree-based or network-based routing, etc. Nodes communicate simply by sending the data they have and asking for the data they need.
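The loose coupling described above can be sketched with a minimal in-memory publish-subscribe bus. This is an illustration only, with invented names; a real deployment would use DDS or MQTT middleware rather than an in-process dictionary:

```python
from collections import defaultdict
from typing import Any, Callable


class Bus:
    """Minimal in-memory publish-subscribe bus (illustrative sketch)."""

    def __init__(self) -> None:
        # topic name -> list of subscriber callbacks
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, data: Any) -> None:
        # The publisher never learns who (if anyone) is listening:
        # it just hands data to the topic.
        for cb in self._subs[topic]:
            cb(data)


# The first robot publishes vision data; the second robot subscribes.
bus = Bus()
received: list[Any] = []
bus.subscribe("Vision Sensor Data", received.append)
bus.publish("Vision Sensor Data", {"object": "wood plank", "x": 0.42, "y": 1.10})
print(received)  # the subscriber got the sample without knowing the publisher
```

Note that the publisher and subscriber only share the topic name, which is exactly the decoupling the publish-subscribe architecture relies on for scalability.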
  • the sensing system includes a feedback device arranged for generating feedback data, and wherein the sensing system is arranged to generate the certain data including position data of at least one object determined from the feedback data.
  • the feedback device includes a vision sensor arranged for generating vision data, a proximity sensor arranged for generating proximity data, a tactile sensor arranged for generating tactile data and/or a range sensor arranged for generating range data, and wherein the sensing system is arranged to generate the certain data including position data of at least one object determined from the vision data, proximity data, the tactile data and/or the range data.
  • in the robot system, e.g., the sensing system and/or a robot controller of the first robot is arranged to generate the certain data including a characteristic of a common reference object for the first robot and the second robot.
  • the position data is expressed in coordinates in a robot coordinate system of the first robot.
  • the position data is expressed in coordinates of a common coordinate system of the first robot and the second robot.
  • each robot in the group is arranged to be connected to a common network for the information sharing.
  • the disclosure relates to a method for a robot system with a group of industrial robots. Each robot in the group is arranged for cooperation with at least one other robot in the group in order to perform a common task of the group.
  • a first robot in the group includes a sensing system configured to generate certain data needed by a second robot in the group for performing at least part of the common task of the group.
  • the robot system has a publish-subscribe architecture where each robot in the group is related to as a node.
  • the method includes generating, by the sensing system of the first robot, the certain data needed by the second robot in the group for performing at least part of the common task of the group.
  • the method further includes publishing information including the certain data generated from the first robot in the group related to a first node for sharing, receiving the information to a second node related to the second robot in the group subscribing to the information published, and controlling the second robot for performing at least part of the common task of the group based on the received information.
  • the sensing system includes a feedback device generating feedback data, the method further including generating the certain data including position data of at least one object determined from the feedback data.
  • the sensing system includes a vision sensor generating vision data, a proximity sensor generating proximity data, a tactile sensor generating tactile data and/or a range sensor generating range data, the method further including generating the certain data including position data of at least one object determined from the vision data, the proximity data, the tactile data and/or the range data.
  • the method includes generating the certain data including a characteristic of a common reference object for the first robot and the second robot.
  • the position data is expressed in coordinates in a robot coordinate system of the first robot.
  • the position data is expressed in coordinates of a common coordinate system of the first robot and the second robot.
  • the disclosure relates to a computer program P, wherein the computer program P comprises a computer program code to cause a processing unit, or a computer unit connected to the processing unit, to perform the method according to any of the steps disclosed herein.
  • the disclosure relates to a computer program product comprising a computer program code stored on a computer-readable medium to perform the method according to any of the steps herein, when said computer program code is executed by a processing unit or by a computer unit connected to the processing unit.
  • Fig. 1 shows a robot system with a group of devices including two industrial robots implementing a collaboration system for information sharing.
  • Fig. 2 shows a device controller according to one embodiment.
  • Fig. 3 shows a publish-subscribe architecture of the collaboration system of the robot system according to one embodiment.
  • Fig. 4 shows a class chart of the collaboration system according to one embodiment.
  • Fig. 5 shows a flow chart of a method according to one embodiment.
  • Fig. 6 shows a collaborative group according to one embodiment.
  • Fig. 1 illustrates a working area for a first robot 1 and a second robot 2, each arranged at a fixture 3, 4 for working on the fixture 3, 4.
  • raw material is arranged in the shape of pre-stacked pieces, here wood planks 5.
  • the first robot 1 and the second robot 2 are arranged to collaborate in order to assemble pallets 7 from the pre-stacked wood planks 5 and organize them in a stack.
  • a conveyor belt 6 is located between the first robot 1 and the second robot 2 such that the first robot 1 and the second robot 2 may transfer items such as wood planks 5, assembled pallets 7 and half-assembled pallets in between them.
  • Each of the first robot 1 and the second robot 2 is a programmable industrial robot including a device controller 13 (Fig. 2) in the form of a robot controller with a device processor 14 and a device memory 15.
  • the term "industrial robot” refers to an automatically controlled, reprogrammable, multipurpose manipulator with a plurality of degrees of freedom and an ability to perform work tasks independently.
  • the conveyor belt 6 includes a device controller 13 (Fig. 2) in the form of a conveyor belt controller with a device processor 14 and a device memory 15.
  • a collaborative group 11 can be defined as a set of devices that cooperate in order to achieve a common goal or task.
  • the first robot 1, the second robot 2 and the conveyor belt 6 may be referred to as devices defined as members of a collaborative group 11.
  • Devices of a collaborative group are arranged for cooperation with at least one other device 1, 2, 6 in the group 11 in order to perform a common task of the group 11.
  • a robot system 40 may be defined including a collaborative group 11 of industrial robots 1, 2 where each robot 1, 2 in the group 11 is arranged for cooperation with at least one other robot 1, 2 in the group 11 in order to perform a common task of the group 11.
  • a collaborative group 11 of a robot system 40 may however include more devices as members of the group 11.
  • Each robot controller includes a robot program that specifies what the robot 1, 2 should do in order to perform its part of the common task of the group 11.
  • the conveyor belt controller includes a conveyor belt program that specifies what the conveyor belt 6 should do in order to perform its part of the common task of the group 11.
  • the illustrated collaborative group 11 in Fig. 1 is only for illustrative purposes and the collaborative group 11 may include more devices.
  • Each device includes a device controller 13 (Fig. 2) with a device processor 14 (Fig. 2) and a device memory 15 (Fig. 2).
  • Each device controller 13 also includes a device program P specifying what the device should do in order to perform its part of the common task of the group 11.
  • the device program P may be loaded into the device memory 15.
  • the device processor 14 may be made up of one or more Central Processing Units (CPU).
  • the device memory 15 may be made up of one or more memory units.
  • a memory unit may include a volatile and/or a non-volatile memory, such as a flash memory or Random Access Memory (RAM).
  • the first robot 1 includes a sensing system 10 configured to generate certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11.
  • the second robot 2, or any other robot in the group 11, may be equipped with a sensing system 10 configured to generate certain data needed by another robot in the group 11 for performing at least part of the common task.
  • the sensing system 10 may include a feedback device 38 such as a vision sensor arranged for generating feedback data such as vision data.
  • the sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the vision data.
  • the vision sensor may e.g. include a laser, a sonar sensor, a camera or a video camera.
  • the sensing system 10 may then be arranged to identify one or several objects from the vision data, and find a position for the at least one object from the vision data. The position is included in the position data.
  • the sensing system 10 includes a feedback device 38 such as a proximity sensor arranged for generating feedback data such as proximity data.
  • the sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the proximity data.
  • the sensing system 10 includes a feedback device 38 such as a tactile sensor arranged for generating feedback data such as tactile data.
  • the sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the tactile data.
  • the sensing system 10 includes a feedback device 38 such as a range sensor arranged for generating feedback data such as range data.
  • the sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the range data.
  • the sensing system 10, and/or the robot controller of the first robot 1, may be arranged to generate a characteristic, e.g. position or coordinate, of a common reference object for the first robot 1 and the second robot 2.
  • the reference object may e.g. be a mark or another object in the working area of the first robot 1 and the second robot 2.
  • the reference object may be a part of the second robot 2, or arranged at the second robot 2, and have a position that is previously known to the second robot 2.
  • the working area of the first robot 1 and the second robot 2 may include a reference object, and the second robot 2 may know or be able to find out the distance from the second robot 2 to the reference object, e.g. in a robot coordinate system of the second robot 2.
  • the sensing system 10, or the robot controller of the first robot 1, may be arranged to find out the position of the reference object in the data generated by the sensor of the sensing system 10, expressed as a position in a robot coordinate system of the first robot 1.
  • a relation between the robot coordinate system of the first robot 1 and the robot coordinate system of the second robot 2 may then be established by the robot controller of the second robot 2, such that a position or coordinate in the robot coordinate system of the first robot 1 may be translated to a position or coordinate in the robot coordinate system of the second robot 2.
  • the position data included in the certain data of the at least one object may be expressed in coordinates of the robot coordinate system of the first robot 1.
  • the robot controller of the second robot 2 may then translate the position data expressed in coordinates of the robot coordinate system of the first robot 1 to position data expressed in coordinates of the robot coordinate system of the second robot 2.
  • the robot coordinate systems of the first robot 1 and the second robot 2 may of course be correlated beforehand, such that a coordinate in any of the robot coordinate systems may be translated to a coordinate in the other one of the coordinate systems.
  • the position data may then be said to be expressed in coordinates of a common coordinate system of the first robot 1 and the second robot 2.
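The coordinate translation via a common reference object can be sketched as below. The sketch assumes, for simplicity, that the two robot frames differ only by a translation (no rotation); a full solution would estimate a rigid transform from several reference points. All names and numbers are illustrative:

```python
# Translate a point from robot 1's coordinate system to robot 2's,
# given the common reference object's coordinates in both frames.
# Assumption: the frames differ only by a translation (no rotation).

def translate(p_in_r1, ref_in_r1, ref_in_r2):
    """p_in_r1: detected point in robot 1's frame.
    ref_in_r1 / ref_in_r2: the common reference object as each robot sees it."""
    # Offset that maps robot 1's frame onto robot 2's frame.
    offset = tuple(b - a for a, b in zip(ref_in_r1, ref_in_r2))
    return tuple(p + o for p, o in zip(p_in_r1, offset))


ref_r1 = (1.0, 2.0, 0.0)    # reference object as seen by robot 1
ref_r2 = (4.0, -1.0, 0.0)   # same object as seen by robot 2
point_r1 = (1.5, 2.5, 0.3)  # object detected by robot 1's sensing system

print(translate(point_r1, ref_r1, ref_r2))  # -> (4.5, -0.5, 0.3)
```

The reference object must yield the same physical point in both frames; the offset then captures how the two frames are displaced from each other.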
  • the collaborating group 11 is managed from a collaboration system 16 (Fig. 3) via a user interface of the robot system 40.
  • a user 8 (Fig. 1) may via the user interface set up the group 11 and configure the group 11.
  • the user interface may be accessed via an application installed on a mobile device 9 (Fig. 1) such as a tablet, a mobile phone or other computer unit.
  • Such a mobile device 9 may be referred to in the collaborative group 11 as an associated device.
  • a collaborative group 11 can have many associated devices, which are devices that interact with the collaborative group 11 but do not cooperate with other devices 1, 2, 6 in the group in order to achieve a common goal.
  • the associated device may e.g. read some information from the collaborative group 11.
  • a collaborative group 11 may include shared items 3, 4, 5, 7.
  • a shared item is e.g. a fixture, a working object or a collaboration area.
  • the fixtures 3, 4, the pre-stacked wood pieces 5 and the assembled pallet 7 in Fig. 1 are referred to as shared items in the group 11.
  • the collaboration system 16 of the robot system 40 is generally illustrated in Fig. 3, and has a publish-subscribe architecture where each device 1, 2, 6 in the group 11 is related to as a node 17, 18...N.
  • the collaboration system 16 may be based on DDS, MQTT or any other framework offering a publish-subscribe architecture, and software of the collaboration system 16 is implemented as middleware in the devices 1, 2, 6, 9, e.g. in the device controllers 13, and sometimes also in the shared items 3, 4, 5, 7, e.g. in shared item controllers (not shown). Further, some middleware may be implemented on a server connected to a common network 23 (Fig. 3).
  • the collaboration system 16 includes at least one publishing module 17A configured to generate information from a first device in the group 11 related to a first node 17 for sharing with at least one subscribing module.
  • the first device may be the first robot 1 .
  • the information may then include the certain data from the first robot 1 in the group.
  • the collaboration system 16 further includes at least one subscribing module 18B of a second node 18 related to the second robot 2 in the group 11.
  • the subscribing module 18B is configured to subscribe to the information published by the publishing module 17A.
  • each node 17, 18...N related to a device 1, 2, 6 in the group may include at least one publishing module 17A, 18A...NA configured to generate information from the related device for sharing, and at least one subscribing module 17B, 18B...NB configured to subscribe to information published by a publishing module of a node of another device in the group 11.
  • the robot system 40 further includes a management module 19 configured for controlling the collaboration system 16.
  • the management module 19 is configured to be controlled via a user authorized by the robot system 40.
  • the management module 19 may be accessed via the application downloaded to the associated device 9.
  • the associated device 9 is in the publish-subscribe architecture related to as a node, but is not configured to cooperate with the devices 1, 2, 6 of the group 11 for performing the common task of the group 11.
  • the management module 19 also includes a subscribing module 20 that may subscribe to information from the devices 1, 2, 6 and shared items 3, 4, 5, 7 of the group 11.
  • the shared items 3, 4, 5, 7 may also each be related to as a node that each includes a publication module.
  • the shared items 3, 4, 5, 7 may publish, via their respective publication modules, the status of the shared item, e.g. how many pieces are left in the stack of pre-stacked wood pieces 5, or how many pallets 7 are assembled in the stack.
  • Each device 1, 2, 6, associated device 9 and shared item 3, 4, 5, 7 in the group 11 may further be arranged to be connected to a common network 23 for the information sharing, e.g. by wireless communication such as radio, or by wire.
  • the network 23 may e.g. be an Ethernet network.
  • Each device 1, 2, 6, e.g. the first robot 1 and the second robot 2, associated device 9 and shared item 3, 4, 5, 7 in the group 11 may then be arranged with a wireless communication module for receiving and/or transmitting wireless signals.
  • the management module 19 may include an approval submodule 21 configured for applying an approval procedure in order for a device 1, 2, 6 to leave or join the group 11.
  • a device 1, 2, 6 that joins a collaborative group 11 for the first time has to be approved or acknowledged by an authorized user 8 via the management module 19.
  • when a device 1, 2, 6 leaves the collaborative group 11, the rest of the devices 1, 2, 6 within the collaborative group 11 will be notified.
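The approval and notification behaviour described above can be sketched as follows. All class and method names are illustrative, not from the patent:

```python
class Group:
    """Sketch of a collaborative group with an approval procedure for
    joining and member notification on leaving (illustrative only)."""

    def __init__(self) -> None:
        self.members: set[str] = set()
        self.notifications: list[tuple[str, str]] = []

    def join(self, device: str, approved_by_user: bool) -> bool:
        # A device joining for the first time must be approved by an
        # authorized user via the management module.
        if not approved_by_user:
            return False
        self.members.add(device)
        return True

    def leave(self, device: str) -> None:
        self.members.discard(device)
        # The remaining members are notified of the departure.
        for member in sorted(self.members):
            self.notifications.append((member, f"{device} left the group"))


g = Group()
g.join("robot1", approved_by_user=True)
g.join("robot2", approved_by_user=True)
g.join("intruder", approved_by_user=False)  # rejected: not approved
g.leave("robot1")
print(g.members, g.notifications)
```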
  • the management module 19 may further include a configuring submodule 22 arranged for configuring the publishing module or modules and/or the subscribing module or modules of the collaboration system 16 in order to perform the common task of the group 11.
  • the associated device 9 and the shared items 3, 4, 5, 7 are also connected to the common network 23, e.g. by wireless communication such as radio, or by wire.
  • Fig. 4 shows a class chart of the collaboration system 16 according to one embodiment.
  • the class chart here illustrated is only for illustrative purposes, and may have a different layout.
  • the class chart describes a data model of the collaboration system 16 according to this embodiment in a high level language.
  • the architecture of the collaboration system 16 is thus here illustrated in classes, where a class includes a set of properties.
  • a property may point to another class or to one or several methods.
  • Collaborative Group 25 may be defined with a set of properties: “Collaborative Service List”, “Device List”, “Shared Item List”, “User Account” etc.
  • the "User Account” may point to another class “Account” 26 where properties for an account are specified.
  • the "Device List” may point to a class “Devices” 30.
  • Data types 29, 32, 33, 34, 35 specifying each device 1, 2, 6 and shared item 9 are specified.
  • a data type 32, 33, 34 may e.g. be a feeder (not shown), a conveyor belt 6 or a robot 1, 2.
  • a further data type may be a vision system (not shown) that is referred to as an individual node if it does not belong to a certain robot 1, 2.
  • the vision system or a robot 1, 2 may publish a topic "Vision Sensor Data" that a robot 1, 2 etc. may subscribe to.
  • a data type may further define properties, e.g. a data type being a "Robot” may specify the properties "Collaborative Areas",
  • a "Robot Service" may be a separate class 35 and may be a task that each robot performs and that is defined by the robot program, e.g. a RAPID program.
  • a separate class 36 for "Robot Data” may also be specified, as well as a separate class 28 for "Item Data” of the data type “Shared Item”.
  • the "Collaborate Service” 27 may be dependent on a “Service” 31 on a device level.
  • the "Service" 31 may describe a set of properties and methods such as "Start()" and "Stop()".
  • the user 8 may define data types, topics, data writers and data readers when the collaboration system 16 is to be configured.
  • a publishing module of a node of a robot may include a data writer publishing robot data such as the position of the robot.
  • a subscribing module of a node of a robot 1, 2 may include a data reader subscribing to robot data such as the position of another robot 1, 2, etc.
  • the user may define each data writer and data reader on a device level via the management module 19. That is, the user defines which data should be published or subscribed per device.
  • a robot 1, 2 may publish a topic "Robot Data", being data such as position and status of the robot 1, 2, and subscribe to the topic "Shared Item Data", being data such as dimension and position of a shared item 3, 4, 5, 7.
  • An associated device 9 may subscribe to the topic "Robot Data”.
  • the data may be filtered such that not all data is subscribed to.
  • the topic "Robot Data” may be filtered on e.g. uptime and downtime.
  • the first robot 1 of Fig. 1 may publish the topic "Vision Sensor Data” being vision data such as the described certain data.
  • the second robot 2 of Fig. 1 may subscribe to the topic "Vision Sensor Data” and then receive the certain data.
  • the robots may be configured on a device level with robot services, "Robot Service" 34, where a plurality of data writers and/or data readers are set at the same time.
  • the user may then configure a whole sequence of actions or movements of a robot simultaneously by only specifying a robot service.
  • a robot service may be dependent on "Robot Data" 35 or any other data within the collaborative group 11.
  • Other services may also be defined. For example, a "Collaborative Service" 27 on a collaboration group level may be pre-set. The user then does not need to be aware of whether data is to be communicated within the collaborative group 11 within this service.
  • the class chart also has defined a class for user accounts 26 which has defined rights, username etc.
  • Each device 1, 2, 6 is described on a device level 30.
  • a subscription may be filtered; thus, the subscription to "Robot Data" may be filtered on e.g. uptime and status.
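Subscription filtering as described above can be sketched as a predicate wrapped around a subscriber callback, so that only matching samples are delivered. The field names below are illustrative:

```python
from typing import Any, Callable


def filtered(callback: Callable[[dict], None],
             predicate: Callable[[dict], bool]) -> Callable[[dict], None]:
    """Wrap a subscriber callback so it only receives matching samples."""
    def wrapper(sample: dict) -> None:
        if predicate(sample):
            callback(sample)
    return wrapper


# "Robot Data" samples published on the topic; the subscriber filters
# on status so it only sees robots that are up.
samples = [
    {"robot": 1, "status": "up", "uptime": 120},
    {"robot": 2, "status": "down", "uptime": 0},
]
received: list[dict] = []
deliver = filtered(received.append, lambda s: s["status"] == "up")
for s in samples:
    deliver(s)

print([s["robot"] for s in received])  # -> [1]
```

In DDS this role is played by content-filtered topics; in MQTT, coarser filtering is done with topic hierarchies and wildcards. The sketch only shows the idea of delivering a subset of the published data.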
  • the disclosure also relates to a method for the robot system 40 (Fig. 1).
  • the robot system 40 includes a collaboration system 16 having a publish-subscribe architecture where each device 1, 2, 6 in the group 11 is related to as a node.
  • the first robot 1 in the group 11 includes the sensing system 10 configured to generate certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11.
  • the sensing system 10 includes a feedback device 38 such as a vision sensor generating vision data, a proximity sensor generating proximity data, a tactile sensor generating tactile data and/or a range sensor generating range data.
  • the method includes generating, by means of the sensing system 10 of the first robot 1, the certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11, see A1 in Fig. 5.
  • the method may include generating the certain data including position data of at least one object determined from the feedback data, e.g. the vision data, the proximity data, the tactile data and/or the range data, or any other kind of sensor data.
  • the method further includes publishing information including the certain data generated from the first robot 1 in the group 11 related to a first node 17 (Fig. 3) for sharing, see A2 in Fig. 5.
  • the published information may be "Vision Sensor Data" from the first robot 1 in Fig. 1.
  • the method further includes receiving the information at a second node 18 (Fig. 3) related to the second robot 2 in the group 11 subscribing to the information published, see A3 in Fig. 5.
  • the method further includes controlling the second robot 2 for performing at least part of the common task of the group 11 based on the received information, see A4 in Fig. 5.
  • the received information is needed for the second robot 2 to perform its share of the common task.
  • the share of the common task is specified in the robot program of the second robot 2. It is to be understood that in order to accomplish the common task, the robots 1, 2 and other devices and shared items of the group may need to publish and/or subscribe continuously in order to exchange information.
  • the method may include generating the certain data including a characteristic of a common reference object for the first robot 1 and the second robot 2.
  • The characteristic may be the position or coordinate of the reference object expressed in a coordinate system of the first robot 1.
  • The position or coordinate of the common reference object may be known to the second robot 2.
  • The second robot 2 may translate any position or coordinate in the coordinate system of the first robot 1 into a position or coordinate in the coordinate system of the second robot 2.
  • The coordinate systems of the first robot 1 and the second robot 2 may of course be synchronized beforehand.
  • The position data may then be expressed in coordinates of a common coordinate system of the first robot 1 and the second robot 2.
  • In the collaborative group of Fig. 6, the first robot 1 includes the described sensing system 10 arranged on the first robot 1.
  • The sensing system 10 here includes a laser sensor 38 arranged to generate a laser beam 39 and laser data of scanned objects such as the wooden pieces 5 and the fixture 3. By scanning an object with the laser beam, the sensing system 10 may create an image of the object from the laser data.
  • The sensing system 10 or the robot controller of the first robot 1 may process the laser data or the image in order to find features such as wood knots on the wooden pieces.
  • The user 8 defines the collaborative group with the first robot 1 and the second robot 2 as approved devices and the fixture 3 and the plurality of wooden pieces 5 as approved shared items via a user interface in the mobile device 9, and defines the mobile device 9 as an associated device of the group.
  • The user then configures the first robot 1 and the second robot 2 and the associated device 9 by defining the publishing and subscribing of the group, thus creating the topics, data readers, data writers etc. necessary for the group to share data such that it can perform its common task, here to assemble wooden pallets 7.
  • At least the first robot 1 and the second robot 2 and the associated device 9 are connected to a common network. If the shared items 3, 5 are also to publish and/or subscribe to any information, they too have to be configured and connected to the common network.
  • The user initiates the collaboration task: assemble wooden pallets.
  • The first robot 1 scans a wooden piece of the stack 5 of wooden pieces with a laser sensor before picking it up and placing it on the fixture 3.
  • The first robot 1 finds wood knots on the wooden piece and publishes the positions of the wood knots.
  • The second robot 2 has subscribed to information of the position of the first robot 1 and the positions of the wood knots.
  • The second robot 2 is notified with the updated information of the wood knot positions.
  • The second robot 2 moves towards the position of the first robot 1 at the fixture.
  • The second robot 2 starts nailing and avoids the wood knots.


Abstract

A robot system (40) with a group (11) of industrial robots (1, 2), and a method for the robot system(40). A first robot (1) in the group (11) includes a sensing system (38) configured to generate certain data needed by a second robot (2) in the group (11). The robot system (40) has a publish-subscribe architecture where each robot (1, 2) in the group (11) is related to as a node (17, 18...N). The robot system (40) includes: a publishing module (17A) configured to generate information including the certain data from the first robot (1) in the group (11) related to a first node (17), for sharing with at least one subscribing module (18B...NB); and a subscribing module (18B) of a second node (18) related to the second robot (2) in the group (11), wherein the subscribing module (18B) is configured to subscribe to the information published by the publishing module (17A), wherein the second robot (2) is configured to use the information for performing at least part of the common task of the group (11).

Description

A ROBOT SYSTEM AND A METHOD FOR OPERATING THE ROBOT SYSTEM
Technical field
The present disclosure relates to the field of industrial robots, and to hardware information sharing between the industrial robots in order to perform a common task of the robots. In particular, the disclosure relates to a robot system for a group of industrial robots, and a method for the robot system.
Background
For many robot operations that are to be performed by industrial robots, guidance or feedback from a feedback device is essential. Therefore many industrial robots have to be equipped with expensive feedback devices such as vision system hardware and software. Feedback devices enable industrial robots to perform robot operations with improved performance. For instance, a robot guided by a vision system may locate parts to be picked up, determine where to apply a weld, inspect parts that have been assembled, and determine where to place a part. However, installing, integrating and programming a vision system on every robot is very expensive. From US5572102A it is known to have an independent vision system that can be shared among welding robots and aid them in their respective independent welding. Each robot has to be connected via a microcomputer to the vision system. The microcomputer receives data of a welding seam captured by the vision system, processes the data into coordinate adjustments, and sends it to a connected robot. The connected robot interprets the adjustments to its learned path and thereafter welds the seam. This setup requires special programming and configuration of the microcomputer.
Summary
Traditionally, industrial robots use centralized PLC control to coordinate the robots to perform a common task. The PLC is programmed via a Man-Machine-Interface (MMI) or Graphical User Interface (GUI) and requires proprietary PLC software and PLC programming expertise. For each robot that is connected to the PLC, an I/O configuration procedure has to be performed such that the PLC and the robot controller can communicate. It is an object of the disclosure to alleviate at least some of the drawbacks of the prior art. It is a further object of the disclosure to provide a robot system that enables sharing of hardware between industrial robots in order to perform a common task of the robots, and which is easily handled and set up by an operator. It is a still further object of the disclosure to provide a robot system that is scalable. It is a still further object of the disclosure to provide a robot system that does not need any access hardware. These objects and others are at least partly achieved by the robot system and method according to the independent claims, and by the embodiments according to the dependent claims. According to one aspect, the disclosure relates to a robot system with a group of industrial robots where each robot in the group is arranged for cooperation with at least one other robot in the group in order to perform a common task of the group. A first robot in the group includes a sensing system configured to generate certain data needed by a second robot in the group for performing at least part of the common task of the group. The robot system has a publish-subscribe architecture where each robot in the group is related to as a node. The robot system includes a publishing module configured to generate information including the certain data from the first robot in the group related to a first node, for sharing with at least one subscribing module.
The robot system further includes a subscribing module of a second node related to the second robot in the group, wherein the subscribing module is configured to subscribe to the information published by the publishing module, and wherein the second robot is configured to use the information for performing at least part of the common task of the group. The robot system provides improved utilization of the hardware of a robot by enabling sharing of data between industrial robots. With the herein described robot system, expensive hardware that one robot is equipped with, for example a vision system, may be shared with other robots, as information that the one robot knows from the vision system can be shared with the other robots. As additional sensor hardware information is available, product quality may thereby be improved. Further, the robot system offers improved utilization of existing hardware by sharing in an easier way than before, as the need to use difficult I/O configuration tools is obviated.
The robot system implements a collaboration system acting as a scalable platform for the information exchange between the robots, and in addition other devices, such that they may cooperate to achieve a common task. The collaboration system is based on a framework, e.g. Data Distribution Service (DDS) or Message Queue Telemetry Transport (MQTT), using the data-centric publish-subscribe programming model, which is used for sending data between nodes. One feature of these frameworks is to enable real-time distribution, i.e., using a machine-to-machine data-centric middleware standard, which allows real-time data exchanges between publishers and subscribers for embedded robot systems.
With the robot system, the need for external Programmable Logic Controllers (PLCs) is eliminated. The traditional centralized PLC control can be replaced with the scalable collaboration system that enables communication between distributed connected industrial robots.
The collaboration system implements robot controller to robot controller communication that does not go through any PLC and that can be scaled to handle a large number of heterogeneous devices, therefore saving costs in cabling, programming and configuration. Publishing modules are loosely coupled to subscribing modules, and need not even know of their existence. With the topic being the focus, publishing modules and subscribing modules are allowed to remain ignorant of the system topology. Each can continue to operate normally regardless of the other. The publish-subscribe architecture provides the opportunity for better scalability than a traditional client-server architecture, through parallel operation, message caching, tree-based or network-based routing, etc. Nodes communicate simply by sending the data they have and asking for the data they need.
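The loose coupling described above can be illustrated with a minimal in-process publish-subscribe sketch. This is illustrative only: the `Broker` class and topic names are hypothetical, and a real deployment would use a DDS or MQTT implementation rather than direct in-process callbacks.

```python
from collections import defaultdict

class Broker:
    """Minimal in-process publish-subscribe hub: publishers and
    subscribers only share a topic name, never direct references,
    so each side stays ignorant of the system topology."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, data):
        # The publisher does not know who (if anyone) receives the data.
        for callback in self._subscribers[topic]:
            callback(data)

broker = Broker()
received = []
broker.subscribe("Vision Sensor Data", received.append)
broker.publish("Vision Sensor Data", {"knots": [(120, 45)]})
```

A subscriber added or removed at runtime would not require any change to the publishing side, which is the scalability property the text describes.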
As no PLC programming has to be done, the engineering effort is reduced and simplified. Savings may be made in collaboration costs, cabling, configurations, system solutions etc. There is also no need for proprietary PLC software, communication drivers or cables, or specific PLC expertise.
According to one embodiment, the sensing system includes a feedback device arranged for generating feedback data, and wherein the sensing system is arranged to generate the certain data including position data of at least one object determined from the feedback data. According to other embodiments, the feedback device includes a vision sensor arranged for generating vision data, a proximity sensor arranged for generating proximity data, a tactile sensor arranged for generating tactile data and/or a range sensor arranged for generating range data, and wherein the sensing system is arranged to generate the certain data including position data of at least one object determined from the vision data, the proximity data, the tactile data and/or the range data. According to one embodiment, the robot system, e.g. the sensing system and/or a robot controller of the first robot, is arranged to generate the certain data including a characteristic of a common reference object for the first robot and the second robot. According to one embodiment, the position data is expressed in coordinates in a robot coordinate system of the first robot. According to another embodiment, the position data is expressed in coordinates of a common coordinate system of the first robot and the second robot. According to one embodiment, each robot in the group is arranged to be connected to a common network for the information sharing. According to a second aspect, the disclosure relates to a method for a robot system with a group of industrial robots. Each robot in the group is arranged for cooperation with at least one other robot in the group in order to perform a common task of the group. A first robot in the group includes a sensing system configured to generate certain data needed by a second robot in the group for performing at least part of the common task of the group. The robot system has a publish-subscribe architecture where each robot in the group is related to as a node.
The method includes generating, by the sensing system of the first robot, the certain data needed by the second robot in the group for performing at least part of the common task of the group. The method further includes publishing information including the certain data generated from the first robot in the group related to a first node for sharing, receiving the information to a second node related to the second robot in the group subscribing to the information published, and controlling the second robot for performing at least part of the common task of the group based on the received information.
The same positive effects as by the system may be achieved by the method. According to one embodiment, the sensing system includes a feedback device generating feedback data, the method further including generating the certain data including position data of at least one object determined from the feedback data. According to other embodiments, the sensing system includes a vision sensor generating vision data, a proximity sensor generating proximity data, a tactile sensor generating tactile data and/or a range sensor generating range data, the method further including generating the certain data including position data of at least one object determined from the vision data, the proximity data, the tactile data and/or the range data. According to one embodiment, the method includes generating the certain data including a characteristic of a common reference object for the first robot and the second robot. According to another embodiment, the position data is expressed in coordinates in a robot coordinate system of the first robot. According to a further embodiment, the position data is expressed in coordinates of a common coordinate system of the first robot and the second robot. According to a third aspect, the disclosure relates to a computer program P, wherein the computer program P comprises a computer program code to cause a processing unit, or a computer unit connected to the processing unit, to perform the method according to any of the steps disclosed herein.
According to a fourth aspect, the disclosure relates to a computer program product comprising a computer program code stored on a computer-readable medium to perform the method according to any of the steps herein, when said computer program code is executed by a processing unit or by a computer unit connected to the processing unit.
Brief description of the drawings
Fig. 1 shows a robot system with a group of devices including two industrial robots implementing a collaboration system for information sharing.
Fig. 2 shows a device controller according to one embodiment.
Fig. 3 shows a publish-subscribe architecture of the collaboration system of the robot system according to one embodiment.
Fig. 4 shows a class chart of the collaboration system according to one embodiment.
Fig. 5 shows a flow chart of a method according to one embodiment.
Fig. 6 shows a collaborative group according to one embodiment.
Detailed description
Fig. 1 illustrates a working area for a first robot 1 and a second robot 2, each arranged at a fixture 3, 4 for working on the fixture 3, 4. In the working area, raw material is arranged in the shape of pre-stacked pieces, here wood planks 5. The first robot 1 and the second robot 2 are arranged to collaborate in order to assemble pallets 7 from the pre-stacked wood planks 5 and organize them in a stack. A conveyor belt 6 is located between the first robot 1 and the second robot 2 such that the first robot 1 and the second robot 2 may transfer items such as wood planks 5, assembled pallets 7 and half-assembled pallets between them.
Each of the first robot 1 and the second robot 2 is a programmable industrial robot including a device controller 13 (Fig. 2) in the form of a robot controller with a device processor 14 and a device memory 15. In the context of this disclosure the term "industrial robot" refers to an automatically controlled, reprogrammable, multipurpose manipulator with a plurality of degrees of freedom and an ability to perform work tasks independently. The conveyor belt 6 includes a device controller 13 (Fig. 2) in the form of a conveyor belt controller with a device processor 14 and a device memory 15.
A collaborative group 11 can be defined as a set of devices that cooperate in order to achieve a common goal or task. The first robot 1, the second robot 2 and the conveyor belt 6 may be referred to as devices defined as members of a collaborative group 11. Devices of a collaborative group are arranged for cooperation with at least one other device 1, 2, 6 in the group 11 in order to perform a common task of the group 11. A robot system 40 may be defined including a collaborative group 11 of industrial robots 1, 2 where each robot 1, 2 in the group 11 is arranged for cooperation with at least one other robot 1, 2 in the group 11 in order to perform a common task of the group 11. A collaborative group 11 of a robot system 40 may however include more devices as members of the group 11.
Each robot controller includes a robot program that specifies what the robot 1, 2 should do in order to perform its part of the common task of the group 11. The conveyor belt controller includes a conveyor belt program that specifies what the conveyor belt 6 should do in order to perform its part of the common task of the group 11. The illustrated collaborative group 11 in Fig. 1 is only for illustrative purposes and the collaborative group 11 may include more devices. Each device includes a device controller 13 (Fig. 2) with a device processor 14 (Fig. 2) and a device memory 15 (Fig. 2). Each device controller 13 also includes a device program P specifying what the device should do in order to perform its part of the common task of the group 11. The device program P may be loaded into the device memory 15. The device processor 14 may be made up of one or more Central Processing Units (CPUs). The device memory 15 may be made up of one or more memory units. A memory unit may include a volatile and/or a non-volatile memory, such as a flash memory or Random Access Memory (RAM). In Fig. 1, the first robot 1 includes a sensing system 10 configured to generate certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11. Alternatively, the second robot 2, or any other robot in the group 11, may be equipped with a sensing system 10 configured to generate certain data needed by another robot in the group 11 for performing at least part of the common task.
The sensing system 10 may include a feedback device 38 such as a vision sensor arranged for generating feedback data such as vision data. The sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the vision data. The vision sensor may e.g. include a laser, a sonar sensor, a camera or a video camera. The sensing system 10 may then be arranged to identify one or several objects from the vision data, and find a position for the at least one object from the vision data. The position is included in the position data. According to one embodiment, the sensing system 10 includes a feedback device 38 such as a proximity sensor arranged for generating feedback data such as proximity data. The sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the proximity data. According to another embodiment, the sensing system 10 includes a feedback device 38 such as a tactile sensor arranged for generating feedback data such as tactile data. The sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the tactile data. According to a further embodiment, the sensing system 10 includes a feedback device 38 such as a range sensor arranged for generating feedback data such as range data. The sensing system 10 is then arranged to generate the certain data including position data of at least one object determined from the range data.
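As an illustration of how position data may be derived from feedback data, the following sketch flags the indices of a range scan where a reading deviates from the expected surface distance. The function name, the sample values and the simple thresholding rule are all hypothetical; an actual sensing system would involve calibrated sensor processing.

```python
def object_positions(range_scan, surface_distance, tolerance=2.0):
    """Derive position data from feedback (range) data: scan indices
    where the measured range deviates from the expected surface
    distance by more than `tolerance` are taken as object positions."""
    return [i for i, r in enumerate(range_scan)
            if abs(r - surface_distance) > tolerance]

# Hypothetical range scan: three readings stand out against a 50.0 surface.
scan = [50.0, 50.1, 44.0, 43.8, 50.0, 50.2, 45.1]
positions = object_positions(scan, surface_distance=50.0)
```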
The sensing system 10, and/or the robot controller of the first robot 1 , may be arranged to generate a characteristic, e.g. position or coordinate, of a common reference object for the first robot 1 and the second robot 2. The reference object may e.g. be a mark or another object in the working area of the first robot 1 and the second robot 2. The reference object may be a part of the second robot 2, or arranged to the second robot 2 and have a position that is previously known to the second robot 2. For example, the working area of the first robot 1 and the second robot 2 may include a reference object, and the second robot 2 may know or be able to find out the distance from the second robot 2 to the reference object e.g. in a robot coordinate system of the second robot 2. The sensing system 10, or the robot controller of the robot 1 , may be arranged to find out the position of the reference object in the data generated by the sensor of the sensing system 10 expressed as a position in a robot coordinate system of the first robot 1 . A relation between the robot coordinate system of the first robot 1 and the robot coordinate system of the second robot 2 may then be established by the robot controller of the second robot 2, such that a position or coordinate in the robot coordinate system of the first robot 1 may be translated to a position or coordinate in the robot coordinate system of the second robot 2. The position data included in the certain data of the at least one object may be expressed in coordinates of the robot coordinate system of the first robot 1 . The robot controller of the second robot 2 may then translate the position data expressed in coordinates of the robot coordinate system of the first robot 1 , to position data expressed in coordinates of the robot coordinate system of the second robot 2. 
The robot coordinate systems of the first robot 1 and the second robot 2 may of course be correlated beforehand, such that a coordinate in any of the robot coordinate systems may be translated to a coordinate in the other one of the coordinate systems. The position data may then be said to be expressed in coordinates of a common coordinate system of the first robot 1 and the second robot 2. The collaborating group 11 is managed from a collaboration system 16 (Fig. 3) via a user interface of the robot system 40. A user 8 (Fig. 1) may via the user interface set up the group 11 and configure the group 11. The user interface may be accessed via an application installed on a mobile device 9 (Fig. 1) such as a tablet, a mobile phone or other computer unit. Such a mobile device 9 may be referred to in the collaborative group 11 as an associated device. A collaborative group 11 can have many associated devices, which are devices that interact with the collaborative group 11, but do not cooperate with other devices 1, 2, 6 in the group in order to achieve a common goal. The associated device may e.g. read some information from the collaborative group 11.
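The translation of a position between the two robot coordinate systems via a common reference object can be sketched as follows. This is a simplified illustration under an explicit assumption: the two frames are taken to differ only by a translation (axes aligned); a general solution would also require a rotation. All names and values are hypothetical.

```python
def to_robot2_frame(point_r1, ref_r1, ref_r2):
    """Translate a point from robot 1's coordinate system into robot 2's,
    using a reference object whose position is known in both frames.
    Simplifying assumption: the frames differ only by a translation."""
    # Offset between the frames, derived from the common reference object.
    offset = tuple(b - a for a, b in zip(ref_r1, ref_r2))
    return tuple(p + o for p, o in zip(point_r1, offset))

# Reference object seen at (100, 50) by robot 1 and at (-200, 400) by robot 2;
# a wood knot found by robot 1 at (130, 80) is translated into robot 2's frame.
knot_r2 = to_robot2_frame((130, 80), (100, 50), (-200, 400))
```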
Besides devices 1, 2, 6, a collaborative group 11 may include shared items 3, 4, 5, 7. A shared item (e.g., a fixture, a working object, a collaboration area) is visible within a collaborative group 11, but it does not define the collaborative group 11. The fixtures 3, 4, the pre-stacked wood pieces 5 and the assembled pallet 7 in Fig. 1 are referred to as shared items in the group 11.
The collaboration system 16 of the robot system 40 is generally illustrated in Fig. 3, and has a publish-subscribe architecture where each device 1, 2, 6 in the group 11 is related to as a node 17, 18...N. The collaboration system 16 may be based on DDS, MQTT or any other framework offering a publish-subscribe architecture, and software of the collaboration system 16 is implemented as middleware in the devices 1, 2, 6, 9, e.g. in the device controllers 13, and sometimes also in the shared items 3, 4, 5, 7, e.g. in shared item controllers (not shown). Further, some middleware may be implemented on a server connected to a common network 23 (Fig. 3). In a basic set-up, the collaboration system 16 includes at least one publishing module 17A configured to generate information from a first device in the group 11 related to a first node 17 for sharing with at least one subscribing module. The first device may be the first robot 1. The information may then include the certain data from the first robot 1 in the group. The collaboration system 16 further includes at least one subscribing module 18B of a second node 18 related to a second device in the group 11. The subscribing module 18B is configured to subscribe to the information published by the publishing module 17A, wherein the second device of the group 11 is configured to use the information for performing at least part of the common task of the group 11. The second device may be the second robot 2. Generally, each node 17, 18...N related to a device 1, 2, 6 in the group may include at least one publishing module 17A, 18A...NA configured to generate information from the related device for sharing, and at least one subscribing module 17B, 18B...NB configured to subscribe to information published by a publishing module of a node of another device in the group 11. The robot system 40 further includes a management module 19 configured for controlling the collaboration system 16. The management module 19 is configured to be controlled via a user authorized by the robot system 40. The management module 19 may be accessed via the application downloaded to the associated device 9. The associated device 9 is in the publish-subscribe architecture related to as a node, but is not configured to cooperate with the devices 1, 2, 6 of the group 11 for performing the common task of the group 11. The management module 19 also includes a subscribing module 20 that may subscribe to information from the devices 1, 2, 6 and shared items 3, 4, 5, 7 of the group 11. The shared items 3, 4, 5, 7 may also each be related to as a node that each includes a publication module. The shared items 3, 4, 5, 7 may publish, via the publication module, the status of the respective shared item, e.g. how many pieces are left in the stack of pre-stacked wood pieces 5, or how many pallets 7 are assembled in the stack. Each device 1, 2, 6, associated device 9 and shared item 3, 4, 5, 7 in the group 11 may further be arranged to be connected to a common network 23 for the information sharing, e.g. by wireless communication such as radio, or by wire. The network 23 may e.g. be an Ethernet network. Each device 1, 2, 6, e.g. the first robot 1 and the second robot 2, associated device 9 and shared item 3, 4, 5, 7 in the group 11 may then be arranged with a wireless communication module for receiving and/or transmitting wireless signals.
The management module 19 may include an approval submodule 21 configured for applying an approval procedure in order for a device 1, 2, 6 to leave or join the group 11. A device 1, 2, 6 that joins a collaborative group 11 for the first time has to be approved or acknowledged by an authorized user 8 via the management module 19. When a device 1, 2, 6 leaves the collaborative group 11, the rest of the devices 1, 2, 6 within the collaborative group 11 will be notified.
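The join/leave procedure described above can be sketched in a few lines. The class and method names are hypothetical, and the boolean approval flag stands in for the interaction with an authorized user via the management module.

```python
class GroupMembership:
    """Sketch of the approval procedure: a device joining for the first
    time must be approved by an authorized user; on leaving, the
    remaining members are notified."""
    def __init__(self):
        self.members = set()
        self.notifications = []  # (recipient, message) pairs

    def join(self, device, approved_by_user):
        if not approved_by_user:
            return False  # the approval submodule rejects the device
        self.members.add(device)
        return True

    def leave(self, device):
        self.members.discard(device)
        # Notify every remaining member of the group.
        for member in self.members:
            self.notifications.append((member, f"{device} left the group"))

group = GroupMembership()
group.join("robot1", approved_by_user=True)
group.join("robot2", approved_by_user=True)
group.leave("robot1")
```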
The management module 19 may further include a configuring submodule 22 arranged for configuring the publishing module or modules and/or the subscribing module or modules of the collaboration system 16 in order to perform the common task of the group 11. The associated device 9 and the shared items 3, 4, 5, 7 are also connected to the common network 23, e.g. by wireless communication such as radio, or by wire. In Fig. 4 a class chart of the collaboration system 16 according to one embodiment is illustrated. The class chart here illustrated is only for illustrative purposes, and may have a different layout. The class chart describes a data model of the collaboration system 16 according to this embodiment in a high level language. The architecture of the collaboration system 16 is thus here illustrated in classes, where a class includes a set of properties. A property may point to another class or to one or several methods. For example, a class called "Collaborative Group" 25 may be defined with a set of properties: "Collaborative Service List", "Device List", "Shared Item List", "User Account" etc. The "User Account" may point to another class "Account" 26 where properties for an account are specified. Further, the "Device List" may point to a class "Devices" 30. Data types 29, 32, 33, 34, 35 specifying each device 1, 2, 6 and shared item 9 are specified. A data type 32, 33, 34 may e.g. be a feeder (not shown), a conveyor belt 6 or a robot 1, 2. A further data type may be a vision system (not shown) that is referred to as an individual node if it does not belong to a certain robot 1, 2. The vision system or a robot 1, 2 may publish a topic "Vision Sensor Data" that a robot 1, 2 etc. may subscribe to. A data type may further define properties, e.g. a data type being a "Robot" may specify the properties "Collaborative Areas", "Robot Data", "Robot Service" etc. A "Robot Service" may be a separate class 35 and may be a task that each robot performs and that is defined by the robot program, e.g. a Rapid program. A separate class 36 for "Robot Data" may also be specified, as well as a separate class 28 for "Item Data" of the data type "Shared Item". The "Collaborative Service" 27 may be dependent on a "Service" 31 on a device level. The "Service" 31 may describe a set of properties and methods such as "Start()" and "Stop()".
The user 8 may define data types, topics, data writers and data readers when the collaboration system 16 is to be configured. For example, a publishing module of a node of a robot may include a data writer publishing robot data such as the position of the robot. A subscribing module of a node of a robot 1, 2 may include a data reader subscribing to robot data such as the position of another robot 1, 2 etc. The user may define each data writer and data reader on a device level via the management module 19. That is, the user defines which data should be published or subscribed to per device. For example, a robot 1, 2 may publish a topic "Robot Data" being data such as position and status of the robot 1, 2 and subscribe to the topic "Shared Item Data" being data such as dimension and position of a shared item 3, 4, 5, 7. An associated device 9 may subscribe to the topic "Robot Data". The data may be filtered such that not all data is subscribed to. For example, the topic "Robot Data" may be filtered on e.g. uptime and downtime. As a further example, the first robot 1 of Fig. 1 may publish the topic "Vision Sensor Data" being vision data such as the described certain data. The second robot 2 of Fig. 1 may subscribe to the topic "Vision Sensor Data" and then receive the certain data. Alternatively, the robots may be configured on a device level with robot services, "Robot Service" 34, where a plurality of data writers and/or data readers are set at the same time. The user may then configure a whole sequence of actions or movements of a robot simultaneously by only specifying a robot service. For example, a robot service may be dependent on "Robot Data" 35 or any other data within the collaborative group 11. Other services may also be defined. For example, a "Collaborative Service" 27 may be pre-set on a collaboration group level. The user then does not need to be aware of whether data is to be communicated within the collaborative group 11 within this service.
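The filtering of a subscription can be sketched as a data reader that keeps only the fields it was configured for, e.g. filtering the topic "Robot Data" on uptime and status. The function name and sample fields are hypothetical; a DDS deployment would express this as a content-filtered topic instead.

```python
def filtered_reader(wanted_fields):
    """Build a data reader that keeps only the subscribed-to fields
    of a topic sample, discarding everything else."""
    def read(sample):
        return {k: v for k, v in sample.items() if k in wanted_fields}
    return read

# A reader for "Robot Data" filtered on uptime and status only.
read_uptime_status = filtered_reader({"uptime", "status"})
sample = {"position": (10, 20, 0), "uptime": 4321, "status": "running"}
filtered = read_uptime_status(sample)
```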
The class chart also defines a class for user accounts 26, which defines rights, username, etc. Each device 1, 2, 6 is described on a device level 30. A subscription may be filtered; thus, the subscription to "Robot Data" may be filtered on e.g. uptime and status.
The disclosure also relates to a method for the robot system 40 (Fig. 1). As explained, the robot system 40 includes a collaboration system 16 having a publish-subscribe architecture where each device 1, 2, 6 in the group 11 is related to as a node. As previously described, the first robot 1 in the group 11 includes the sensing system 10 configured to generate certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11. The sensing system 10 includes a feedback device 38, such as a vision sensor generating vision data, a proximity sensor generating proximity data, a tactile sensor generating tactile data and/or a range sensor generating range data.
The method will now be explained with reference to the flow chart in Fig. 5. It is now assumed that the group 11 has been configured and its members are ready to collaborate. The method includes generating, by means of the sensing system 10 of the first robot 1, the certain data needed by the second robot 2 in the group 11 for performing at least part of the common task of the group 11, see A1 in Fig. 5. The method may include generating the certain data including position data of at least one object determined from the feedback data, e.g. the vision data, the proximity data, the tactile data and/or the range data, or any other kind of sensor data. The method further includes publishing information including the certain data generated from the first robot 1 in the group 11 related to a first node 17 (Fig. 3) for sharing, see A2 in Fig. 5. The published information may be "Vision Sensor Data" from the first robot 1 in Fig. 1. The method further includes receiving the information at a second node 18 (Fig. 3) related to the second robot 2 in the group 11 subscribing to the information published, see A3 in Fig. 5. The method further includes controlling the second robot 2 for performing at least part of the common task of the group 11 based on the received information, see A4 in Fig. 5. Thus, the received information is needed for the second robot 2 to perform its share of the common task. The share of the common task is specified in the robot program of the second robot 2. It is to be understood that, in order to accomplish the common task, the robots 1, 2 and other devices and shared items of the group may need to publish and/or subscribe continuously in order to exchange information. In order for the second robot 2 to be able to interpret any position data from the first robot 1, the method may include generating the certain data including a characteristic of a common reference object for the first robot 1 and the second robot 2.
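Steps A1 to A4 can be sketched as a single publish-subscribe exchange, assuming a minimal callback-based node abstraction. The function and topic names are illustrative, not taken from the patent.

```python
def generate_certain_data(sensing_system: dict) -> dict:
    # A1: the first robot's sensing system produces the certain data,
    # here position data determined from feedback (e.g. vision) data.
    return {"object_position": sensing_system["detected_position"]}


class NodeNetwork:
    """Toy stand-in for the publish-subscribe layer between nodes."""

    def __init__(self):
        self.handlers = {}

    def subscribe(self, topic, handler):
        # A3 (setup): the second node subscribes to the published topic.
        self.handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, info):
        # A2: the first node publishes; subscribers receive the info (A3).
        for handler in self.handlers.get(topic, []):
            handler(info)


commands = []

def control_second_robot(info):
    # A4: control the second robot based on the received information.
    commands.append(("move_to", info["object_position"]))


network = NodeNetwork()
network.subscribe("Vision Sensor Data", control_second_robot)
network.publish("Vision Sensor Data",
                generate_certain_data({"detected_position": (0.4, 0.1, 0.0)}))
```

In a running system these steps would repeat continuously, since the robots may need to exchange information throughout the common task.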
The characteristic may be the position or coordinate of the reference object expressed in a coordinate system of the first robot 1. The position or coordinate of the common reference object may be known to the second robot 2. Based on these data, the second robot 2 may translate any position or coordinate in the coordinate system of the first robot 1 into a position or coordinate in the coordinate system of the second robot 2. The coordinate systems of the first robot 1 and the second robot 2 may of course be synchronized beforehand. The position data may then be expressed in coordinates of a common coordinate system of the first robot 1 and the second robot 2. Fig. 6 shows an example of a robot system 40 with a collaborative group with a working area with the first industrial robot 1 and the second industrial robot 2, a stack of wooden pieces 5 and a fixture 3. The first robot 1 includes the described sensing system 10 arranged to the first robot 1. The sensing system 10 here includes a laser sensor 38 arranged to generate a laser beam 39 and laser data of scanned objects such as the wooden pieces 5 and the fixture 3. By scanning an object with the laser beam, the sensing system 10 may create an image of the object from the laser data. The sensing system 10, or the robot controller of the first robot 1, may process the laser data or the image in order to find characteristics of the object, such as positions of wood knots on the wooden pieces 5. The user 8 defines the collaborative group with the first robot 1 and the second robot 2 as approved devices and the fixture 3 and the plurality of wooden pieces 5 as approved shared items via a user interface in the mobile device 9, and defines the mobile device 9 as an associated device of the group. The user then configures the first robot 1, the second robot 2 and the associated device 9 by defining the publishing and subscribing of the group, thus creating the topics, data readers, data writers etc. necessary for the group to share data such that it can perform its common task, here to assemble wooden pallets 7. At least the first robot 1, the second robot 2 and the associated device 9 are connected to a common network. If the shared items 3, 5 are also to publish and/or subscribe to any information, they too have to be configured and connected to the common network.
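The coordinate translation via a common reference object described above can be given as a worked sketch: knowing the reference object's position in both robots' coordinate systems, the second robot maps any point published by the first robot into its own frame. For brevity a pure translation (no rotation between the frames) is assumed; a real calibration would typically also handle orientation.

```python
def translate(point_in_r1, ref_in_r1, ref_in_r2):
    """Map a point from robot 1's coordinate system to robot 2's,
    using the common reference object as the shared anchor.
    Assumes the two frames differ only by a translation."""
    # Offset between the frames, derived from the shared reference object.
    offset = tuple(b - a for a, b in zip(ref_in_r1, ref_in_r2))
    return tuple(p + o for p, o in zip(point_in_r1, offset))


ref_in_r1 = (1.0, 2.0, 0.0)   # reference object, in robot 1's frame
ref_in_r2 = (4.0, -1.0, 0.0)  # the same object, in robot 2's frame

# A position published by robot 1, re-expressed in robot 2's frame:
point = translate((1.5, 2.5, 0.3), ref_in_r1, ref_in_r2)
```

If the coordinate systems are instead synchronized beforehand into a common frame, this translation step disappears and published positions can be used directly.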
An example scenario of how shared vision-system hardware between the first robot 1 and the second robot 2 can be used to improve product quality when they work on assembling wooden pallets 7 will now be described:
1. The user initiates the collaboration task: assemble wooden pallets.
2. The first robot 1 scans a wooden piece of the stack 5 of wooden pieces with a laser sensor before picking it up and placing it on the fixture 3.
3. The first robot 1 finds wood knots on the wooden piece and publishes information on their positions.
4. The second robot 2 has subscribed to information on the position of the first robot 1 and the positions of the wood knots.
5. The second robot 2 is notified with the updated information on the position of the first robot 1 and the positions of the wood knots, published by the first robot 1.
6. The second robot 2 moves towards the position of the first robot 1 at the fixture.
7. The second robot 2 starts nailing and avoids the wood knots.
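The scenario steps above can be sketched as one publish-subscribe exchange. The topic and field names are illustrative assumptions; a real system would use the collaboration middleware rather than this toy broker.

```python
subscribers = {}

def subscribe(topic, callback):
    subscribers.setdefault(topic, []).append(callback)

def publish(topic, message):
    for callback in subscribers.get(topic, []):
        callback(message)


nailing_plan = []

def second_robot_on_update(message):
    # Steps 5-7: move to the first robot's position at the fixture and
    # nail while avoiding the published wood-knot positions.
    nailing_plan.append({"move_to": message["robot1_position"],
                         "avoid": message["knot_positions"]})


# Step 4: the second robot has subscribed to the topic beforehand.
subscribe("Robot Data", second_robot_on_update)

# Steps 2-3: the first robot scans the piece, finds the wood knots, and
# publishes its own position together with the knot positions.
publish("Robot Data", {"robot1_position": (0.8, 0.2, 0.1),
                       "knot_positions": [(0.75, 0.25), (0.9, 0.3)]})
```

Because only data flows through the topic, the same vision hardware on the first robot effectively serves both robots, which is the quality improvement the scenario illustrates.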
The present invention is not limited to the above-described preferred embodiments. Various alternatives, modifications and equivalents may be used. Therefore, the above embodiments should not be taken as limiting the scope of the invention, which is defined by the appended claims.

Claims

1. A robot system (40) with a group (11) of industrial robots (1, 2) where each robot (1, 2) in the group (11) is arranged for cooperation with at least one other robot (1, 2) in the group (11) in order to perform a common task of the group (11), characterized in that a first robot (1) in the group (11) includes a sensing system (10) configured to generate certain data needed by a second robot (2) in the group (11) for performing at least part of the common task of the group (11), and wherein the robot system (40) has a publish-subscribe architecture where each robot (1, 2) in the group (11) is related to as a node (17, 18...N), the robot system (40) includes:
- a publishing module (17A) configured to generate information including the certain data from the first robot (1) in the group (11) related to a first node (17), for sharing with at least one subscribing module (18B...NB);
- a subscribing module (18B) of a second node (18) related to the second robot (2) in the group (11), wherein the subscribing module (18B) is configured to subscribe to the information published by the publishing module (17A), wherein the second robot (2) is configured to use the information for performing at least part of the common task of the group (11).
2. The robot system (40) according to claim 1, wherein the sensing system (10) includes a feedback device (38) arranged for generating feedback data, and wherein the sensing system (10) is arranged to generate the certain data including position data of at least one object determined from the feedback data.
3. The robot system (40) according to claim 1 or 2, wherein the feedback device (38) includes a vision sensor arranged for generating vision data, a proximity sensor arranged for generating proximity data, a tactile sensor arranged for generating tactile data and/or a range sensor arranged for generating range data, and wherein the sensing system (10) is arranged to generate the certain data including position data of at least one object determined from the vision data, the proximity data, the tactile data and/or the range data.
4. The robot system (40) according to claim 3, arranged to generate the certain data including a characteristic of a common reference object for the first robot (1) and the second robot (2).
5. The robot system (40) according to claim 3 or 4, wherein the position data is expressed in coordinates in a robot coordinate system of the first robot (1).
6. The robot system (40) according to any of the claims 3 to 5, wherein the position data is expressed in coordinates of a common coordinate system of the first robot (1) and the second robot (2).
7. The robot system (40) according to any of the preceding claims, wherein each robot (1, 2) in the group (11) is arranged to be connected to a common network (23) for the information sharing.
8. A method for a robot system (40) with a group (11) of industrial robots (1, 2) where each robot (1, 2) in the group (11) is arranged for cooperation with at least one other robot (1, 2) in the group (11) in order to perform a common task of the group (11), wherein a first robot (1) in the group (11) includes a sensing system (10) configured to generate certain data needed by a second robot (2) in the group (11) for performing at least part of the common task of the group (11), and wherein the robot system (40) has a publish-subscribe architecture where each robot (1, 2) in the group (11) is related to as a node (17, 18, ...N), and wherein the method includes:
- generating, by the sensing system (10) of the first robot (1), the certain data needed by the second robot (2) in the group (11) for performing at least part of the common task of the group (11);
- publishing information including the certain data generated from the first robot (1) in the group (11) related to a first node (17) for sharing;
- receiving the information to a second node (18) related to the second robot (2) in the group (11) subscribing to the information published;
- controlling the second robot (2) for performing at least part of the common task of the group (11) based on the received information.
9. The method according to claim 8, wherein the sensing system (10) includes a feedback device (38) generating feedback data, the method further including generating the certain data including position data of at least one object determined from the feedback data.
10. The method according to claim 8 or 9, wherein the feedback device (38) includes a vision sensor generating vision data, a proximity sensor generating proximity data, a tactile sensor generating tactile data and/or a range sensor generating range data, the method further including generating the certain data including position data of at least one object determined from the vision data, the proximity data, the tactile data and/or the range data.
11. The method according to claim 10, including generating the certain data including a characteristic of a common reference object for the first robot (1) and the second robot (2).
12. The method according to any of the claims 9 to 11, wherein the position data is expressed in coordinates in a robot coordinate system of the first robot (1).
13. The method according to any of the claims 9 to 11, wherein the position data is expressed in coordinates of a common coordinate system of the first robot (1) and the second robot (2).
14. The method according to any of the claims 8 to 13, wherein each robot (1, 2) in the group (11) is connected to a common network (23) for the information sharing.
PCT/EP2016/050368 2016-01-11 2016-01-11 A robot system and a method for operating the robot system WO2017121456A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2016/050368 WO2017121456A1 (en) 2016-01-11 2016-01-11 A robot system and a method for operating the robot system


Publications (1)

Publication Number Publication Date
WO2017121456A1 2017-07-20

Family

ID=55083419

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/050368 WO2017121456A1 (en) 2016-01-11 2016-01-11 A robot system and a method for operating the robot system

Country Status (1)

Country Link
WO (1) WO2017121456A1 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040199290A1 (en) * 2003-04-03 2004-10-07 Stoddard Kenneth A. Method and control system for controlling a plurality of robots
US20110046781A1 (en) * 2009-08-21 2011-02-24 Harris Corporation, Corporation Of The State Of Delaware Coordinated action robotic system and related methods
US20140309762A1 (en) * 2011-11-16 2014-10-16 Nissan Motor Co., Ltd. Manufacturing method and manufacturing device for manufacturing a joined piece


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019058694A1 (en) * 2017-09-20 2019-03-28 ソニー株式会社 Control device, control method, and control system
US11389949B2 (en) 2017-09-20 2022-07-19 Sony Corporation Control device, control method, and control system
CN112911012A (en) * 2021-02-07 2021-06-04 珠海市一微半导体有限公司 Robot sensor data distribution and subscription system, chip and robot

Similar Documents

Publication Publication Date Title
Grau et al. Industrial robotics in factory automation: From the early stage to the Internet of Things
EP2476032B1 (en) Method for configuration soa-based automation devices and for developing an orchestration machine, and production method
KR102257938B1 (en) Skill interface for industrial applications
JP2021057033A (en) Industrial control system hyperconverged architecture
US9632494B2 (en) Method for generating and handling applications for components of a distributed control system and engineering system for implementing the process
Terzimehic et al. Towards an industry 4.0 compliant control software architecture using IEC 61499 & OPC UA
CN110968050B (en) Production module
WO2017121457A1 (en) A collaboration system and a method for operating the collaboration system
US20120030310A1 (en) Redundant Communication In A Communication System
RU2670553C1 (en) Production module for implementation of production function
US11155170B2 (en) Transport system
US11271790B2 (en) Interconnection device, communication method, and system including robot
Rojas et al. Implementation of industrial internet of things and cyber-physical systems in SMEs for distributed and service-oriented control
WO2017121456A1 (en) A robot system and a method for operating the robot system
Lee et al. Implementation of distributed smart factory platform based on edge computing and OPC UA
de las Morenas et al. Shop floor control: A physical agents approach for PLC-controlled systems
Park et al. An extended agent communication framework for rapid reconfiguration of distributed manufacturing systems
Ghodsian et al. Toward designing an integration architecture for a mobile manipulator in production systems: Industry 4.0
Crăciunescu et al. IIoT gateway for edge Computing applications
Starke et al. Flexible collaboration and control of heterogeneous mechatronic devices and systems by means of an event-driven, SOA-based automation concept
Spies et al. Wiring of control cabinets using a distributed control within a robot-based production cell
Etz et al. Functional safety use cases in the context of reconfigurable manufacturing systems
Martinez et al. Setup of the yaskawa sda10f robot for industrial applications, using ros-industrial
De Sousa et al. Distributed mas with leaderless consensus to job-shop scheduler in a virtual smart factory with modular conveyors
US20220011755A1 (en) Device and method for configuring a production machine on the basis of product data

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16700302; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16700302; Country of ref document: EP; Kind code of ref document: A1)