WO2022085699A1 - System control device, robot control method, terminal device, terminal control method, and robot control system - Google Patents
System control device, robot control method, terminal device, terminal control method, and robot control system
- Publication number
- WO2022085699A1 (PCT/JP2021/038669)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- information
- work
- library
- robot controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1661—Programme controls characterised by programming, planning systems for manipulators characterised by task planning, object-oriented languages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/1633—Programme controls characterised by the control loop compliant, force, torque control, e.g. combined with position control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40564—Recognize shape, contour of object, extract position and orientation
Definitions
- the present disclosure relates to a system control device, a robot control method, a terminal device, a terminal control method, and a robot control system.
- Patent Document 1 describes a system control device that operates a robot having shafts with various types of joints by executing a program that drives and controls those shafts.
- the system control device includes a first interface, a second interface, and a control unit.
- the first interface is communicably connected to at least one terminal device that receives input information input by the user.
- the second interface is communicably connected to at least one robot controller that controls at least one robot based on at least one library that indicates robot operation and includes an undefined portion.
- the control unit acquires job information that is generated based on the input information, that identifies the library corresponding to the work to be executed by the at least one robot, and that can complement the undefined portion.
- the control unit outputs the work instruction to the at least one robot controller based on the job information.
- a robot control method includes communicating with at least one terminal device that accepts input information input by a user, and with at least one robot controller that controls at least one robot based on at least one library that indicates robot operation and includes an undefined portion.
- the robot control method includes acquiring, from the terminal device, job information that is generated based on the input information, that identifies the library corresponding to the work to be executed by the at least one robot, and that can complement the undefined portion.
- the robot control method includes outputting the work instruction to the at least one robot controller based on the job information.
- the terminal device includes a user interface, a communication interface, and a terminal control unit.
- the user interface accepts input information input by the user.
- the communication interface is communicably connected to a system control device that outputs information to at least one robot controller that controls at least one robot based on at least one library that indicates robot operation and includes an undefined portion.
- based on the input information, the terminal control unit generates job information that identifies the library corresponding to the work to be executed by the at least one robot and that can complement the undefined portion.
- the communication interface outputs the job information to the system control device.
- the terminal control method includes accepting input information input by a user.
- the terminal control method includes, based on the input information, specifying, from among at least one library that indicates robot operation and includes an undefined portion, the library corresponding to the work to be executed by at least one robot controlled by at least one robot controller, and generating and outputting job information including complementary information that complements the undefined portion.
- the robot control system includes at least one robot controller, a terminal device, and a system control device.
- the at least one robot controller controls at least one robot based on at least one library that indicates robot operation and includes an undefined portion.
- the terminal device has a user interface. The user interface accepts input information input by the user.
- the system control device is communicably connected to each of the robot controller and the terminal device.
- based on the input information, the terminal device generates job information that identifies the library corresponding to the work to be executed by the at least one robot and that can complement the undefined portion, and outputs the job information to the system control device.
- the system control device outputs the work instruction to the at least one robot controller based on the job information acquired from the terminal device.
- the robot controller causes the robot to perform the work based on the instruction of the work.
- An object of the present disclosure is to provide a system control device, a robot control method, a terminal device, a terminal control method, and a robot control system that can change the work content to be executed by a robot in a short time and at low cost.
- the robot control system 1 includes a system control device 10 and a terminal device 20.
- the robot control system 1 according to the present embodiment further includes a robot controller 30 (see FIG. 2) that controls the robot 40, as will be described later.
- the system control device 10, the terminal device 20, and the robot 40 or the robot controller 30 are communicably connected to each other via the network 80.
- the system control device 10, the terminal device 20, and the robot 40 or the robot controller 30 may be communicably connected to each other without going through the network 80.
- the network 80 and the robot 40 or the robot controller 30 may be communicably connected via the access point 82.
- the network 80 and the robot 40 or the robot controller 30 may be communicably connected without going through the access point 82.
- the number of the system control device 10 and the terminal device 20 is not limited to the three exemplified, and may be two or less, or may be four or more.
- the robot control system 1 receives input of information from the user to specify the work to be executed by the robot 40 by the terminal device 20.
- the robot control system 1 outputs information instructing the robot controller 30 to perform work to be executed by the robot 40 from the system control device 10.
- the robot control system 1 causes the robot 40 to perform work by the robot controller 30.
- the robot control system 1 can make the robot 40 execute the work by the abstracted work instruction.
- the user can configure data indicating the contents of the work to be executed by the robot 40 through, for example, abstract settings in a GUI (Graphical User Interface), and instruct the robot 40 to execute the work. That is, in the robot control system 1 according to the present embodiment, work can be defined or instructed at an abstract granularity, in the manner of a work instruction given between humans, in terms of which robot 40 does what, when, where, and how.
- a system in which the user sets the robot's movement in detail by teaching work and causes the robot to execute the work can be considered.
- in such a system, teaching work is required for each type or each state of the object to be carried.
- in the robot control system 1, by contrast, work can be instructed to the robot 40 at an abstract granularity.
- the robot 40 can be made to execute atypical work simply by rearranging the data indicating the contents of the work.
- the work contents can be rearranged in a short time and at low cost, and the robot can be used for more general purposes.
- the robot control system 1 further includes a robot controller 30 that controls the robot 40.
- the system control device 10 is communicably connected to each of the terminal device 20 and the robot controller 30.
- the system control device 10 may be communicably connected to each of the terminal device 20 and the robot controller 30 via the network 80.
- the system control device 10 includes a control unit 11, a first interface 13, and a second interface 14.
- the system control device 10 further includes a robot simulator 50 (described later), although it is not essential.
- the system control device 10 is communicably connected to the terminal device 20 via the first interface 13.
- the system control device 10 is communicably connected to the robot controller 30 via the second interface 14.
- the system control device 10 may be connected to the terminal device 20 and the robot controller 30 so as to be able to communicate with each other by wire, or may be connected to each other so as to be able to communicate wirelessly.
- the robot controller 30 and the robot 40 may be connected so as to be able to communicate by wire, or may be connected so as to be able to communicate wirelessly.
- Each component of the robot control system 1 may be communicably connected via a radio base station or an access point 82 (see FIG. 1), or may be communicably connected without going through a radio base station or an access point 82.
- the access point 82 refers to a wireless device for connecting terminals having a wireless connection function to each other or to another network, and is typically a device that operates with the communication protocols of the first layer (physical layer) and the second layer (data link layer) of the OSI (Open Systems Interconnection) reference model.
- Each component of the robot control system 1 may be communicably connected via a dedicated line.
- Each component of the robot control system 1 may be communicably connected to each other in various other forms, not limited to these examples.
- the control unit 11 may be configured to include at least one processor in order to realize various functions of the system control device 10.
- the processor can execute a program that realizes various functions of the system control device 10.
- the processor may be realized as a single integrated circuit.
- the integrated circuit is also referred to as an IC (Integrated Circuit).
- the processor may be realized as a plurality of communicably connected integrated circuits and discrete circuits.
- the processor may be configured to include a CPU (Central Processing Unit).
- the processor may be configured to include a DSP (Digital Signal Processor) or a GPU (Graphics Processing Unit).
- the processor may be implemented on the basis of various other known techniques.
- the system control device 10 further includes a storage unit 12.
- the storage unit 12 may be configured to include an electromagnetic storage medium such as a magnetic disk, or may be configured to include a memory such as a semiconductor memory or a magnetic memory.
- the storage unit 12 may be configured as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
- the storage unit 12 stores various information and programs executed by the control unit 11.
- the storage unit 12 may function as a work memory of the control unit 11. At least a part of the storage unit 12 may be included in the control unit 11.
- the first interface 13 or the second interface 14 may be configured to include a communication device configured to be able to communicate by wire or wirelessly.
- the communication device may be configured to be communicable by a communication method based on various communication standards.
- the first interface 13 or the second interface 14 can be configured by a known communication technique. Detailed description of the hardware of the first interface 13 or the second interface 14 will be omitted.
- the functions of the first interface 13 and the second interface 14 may be realized by one interface or may be realized by separate interfaces.
- the system control device 10 may be communicably connected to the network 80 by the first interface 13 or the second interface 14.
- the system control device 10 may be communicably connected to the terminal device 20 and the robot controller 30 via the network 80.
- the system control device 10 may be configured as a server device.
- the server device may be configured to include at least one information processing device.
- the server device may be configured to cause a plurality of information processing devices to execute parallel processing.
- the server device does not need to be configured to include a physical housing, and may be configured based on virtualization technology such as a virtual machine or a container orchestration system.
- the server device may be configured using a cloud service. When the server device is configured using a cloud service, it can be configured by combining managed services. That is, the function of the system control device 10 can be realized as a cloud service.
- the server device may include at least one server group and at least one database group.
- the server group functions as the control unit 11.
- the database group functions as a storage unit 12.
- the number of server groups may be one or two or more. When the number of server groups is one, the functions realized by one server group include the functions realized by each server group.
- the servers are connected to each other so as to be able to communicate with each other by wire or wirelessly.
- the number of database groups may be one or two or more. The number of databases may be increased or decreased as appropriate based on the amount of data managed by the server device and the availability requirements of the server device.
- the database group is connected to each server group so as to be able to communicate by wire or wirelessly.
- Although the system control device 10 is depicted as a single component in FIGS. 1 and 2, a plurality of components may be regarded as one system and operated as necessary. That is, the system control device 10 is configured as a platform with variable capacity.
- each of the plurality of components is connected to the others by wired or wireless lines and is configured to be able to communicate with them.
- This plurality of configurations may be built across cloud services and on-premises environments.
- the system control device 10 is connected to the terminal device 20, the robot controller 30, and the robot 40 controlled by the robot controller 30 by wired or wireless lines.
- the system control device 10, the terminal device 20, and the robot controller 30 each have an interface using a standard protocol, and can communicate in both directions.
- the terminal device 20 includes a terminal control unit 21, a communication interface 22, and a user interface 23.
- the terminal control unit 21 may be configured to include at least one processor.
- the terminal control unit 21 may be configured to be the same as or similar to the control unit 11 of the system control device 10.
- the terminal control unit 21 may execute an application that provides a GUI (Graphical User Interface) corresponding to the user interface 23 described later.
- the terminal control unit 21 may provide the GUI by executing a GUI program distributed from another device such as the system control device 10 on the web browser.
- the terminal control unit 21 may be configured to receive the GUI program from another device such as the system control device 10 based on a request input by the user to the web browser, and to draw the GUI on the web browser.
- the terminal device 20 may be constructed across a cloud service and an on-premises environment. That is, for example, the user interface 23 may be constructed in an on-premises environment, and the terminal control unit 21 may be constructed as a cloud service.
- the communication interface 22 may be configured to be the same as or similar to the first interface 13 or the second interface 14 of the system control device 10.
- the user interface 23 is configured to provide the above-mentioned GUI to the user.
- the user interface 23 includes an output device that outputs information to the user and an input device that receives input from the user.
- the output device may be configured to include a display device.
- the display device may include, for example, a liquid crystal display (LCD), an organic EL (Electro-Luminescence) display or an inorganic EL display, a plasma display (PDP: Plasma Display Panel), or the like.
- the display device is not limited to these displays, and may be configured to include various other types of displays.
- the display device may be configured to include a light emitting device such as an LED (Light Emitting Diode).
- the display device may be configured to include various other devices.
- the output device may be configured to include an audio output device such as a speaker that outputs auditory information such as audio.
- the output device is not limited to these examples, and may be configured to include various other devices.
- the input device may be configured to include, for example, a touch panel or a touch sensor, or a pointing device such as a mouse.
- the input device may be configured to include a physical key.
- the input device may be configured to include a voice input device such as a microphone.
- the input device is not limited to these examples, and may be configured to include various other devices.
- the terminal device 20 may be configured to include at least one information processing device.
- the number of terminal devices 20 included in the robot control system 1 is not limited to one, and may be two or more.
- each terminal device 20 may accept input from the user.
- the terminal device 20 may be configured as a tablet terminal.
- the terminal device 20 may be configured as a mobile phone terminal such as a feature phone or a smartphone.
- the terminal device 20 may be configured as a PC terminal such as a desktop PC (Personal Computer) or a notebook PC.
- the terminal device 20 is not limited to these examples, and may be configured as various devices capable of providing a GUI and a communication function.
- the terminal device 20 is used by the user to instruct the robot controller 30 and the robot 40 to work via the system control device 10.
- the terminal device 20 may be used, for example via a browser or a dedicated application, to instruct the addition or deletion of software of the library group 333 (see FIG. 3) that defines the operation of the robot 40, or to change settings of the robot control system 1.
- the terminal device 20 may be used to monitor the state of the robot 40.
- the terminal device 20 is not limited to these examples, and can provide various other functions.
- the robot controller 30 includes a robot control unit 31.
- the robot control unit 31 may be configured to include at least one processor.
- the robot control unit 31 may be configured to be the same as or similar to the control unit 11 of the system control device 10.
- the robot controller 30 can acquire job information and task information that specify the work to be executed by the robot 40 from the system control device 10.
- the robot controller 30 outputs information that identifies the operation of the robot 40 based on the job information and the task information.
- the work to be executed by the robot 40 may include, for example, the work of moving the work object between two points.
- the robot controller 30 may include an interface for acquiring job information and task information from the system control device 10. Further, the robot controller 30 may include an interface for performing signal processing with the robot 40. These interfaces of the robot controller 30 may be configured to be the same as or similar to the first interface 13 or the second interface 14 of the system control device 10.
- the robot controller 30 may include a processor that generates information that identifies the operation of the robot 40 based on job information and task information.
- the processor of the robot controller 30 may be configured to be the same as or similar to the processor constituting the control unit 11 of the system control device 10.
- one robot controller 30 is connected to one robot 40.
- One robot controller 30 may be connected to two or more robots 40.
- One robot controller 30 may control only one robot 40, or may control two or more robots 40.
- the number of the robot controller 30 and the robot 40 is not limited to two, and may be one or three or more.
- the robot controller 30 may be constructed across a cloud service and an on-premises environment. That is, for example, the interface with the robot 40 may be constructed in an on-premises environment, and the robot control unit 31 may be constructed as a cloud service.
- the robot 40 may be configured as a robot arm including an arm.
- the arm may be configured as, for example, a 6-axis or 7-axis vertical articulated robot.
- the arm may be configured as a 3-axis or 4-axis horizontal articulated robot or a SCARA robot.
- the arm may be configured as a 2-axis or 3-axis Cartesian robot.
- the arm may be configured as a parallel link robot or the like.
- the number of axes constituting the arm is not limited to the example.
- the robot 40 may include an end effector attached to the arm.
- the end effector may include, for example, a gripping hand configured to grip the work object.
- the gripping hand may have multiple fingers. The number of fingers of the gripping hand may be two or more.
- the fingers of the gripping hand may have one or more joints.
- the end effector may include a suction hand configured to suck a work object.
- the end effector may include a scooping hand configured to be able to scoop the work object.
- the end effector may include a tool such as a drill and may be configured to be capable of performing various processing such as drilling a hole in a work object.
- the end effector is not limited to these examples, and may be configured to perform various other operations.
- the robot 40 can control the position of the end effector by operating the arm.
- the end effector may have a reference axis in the direction of action on the work object.
- the robot 40 can control the direction of the axis of the end effector by operating the arm.
- the robot 40 controls the start and end of the operation of the end effector acting on the work object.
- the robot 40 can move or process a work object by controlling the operation of the end effector while controlling the position of the end effector or the direction of the axis of the end effector.
- the robot 40 may be configured as an automatic guided vehicle (AGV).
- the robot 40 may be configured as a drone.
- the robot 40 is not limited to the robot arm or AGV, and may be configured in various other forms such as a vehicle, an electronic device, or a control machine.
- the robot 40 may further include a sensor for detecting the state of each component of the robot 40.
- the sensor may detect information about the actual position or posture of each component of the robot 40, or the speed or acceleration of each component of the robot 40.
- the sensor may detect the force acting on each component of the robot 40.
- the sensor may detect the current flowing through the motor driving each component of the robot 40 or the torque of the motor.
- the sensor can detect the information obtained as a result of the actual operation of the robot 40.
- the robot controller 30 can grasp the result of the actual operation of the robot 40 by acquiring the detection result of the sensor.
- the terminal device 20 receives input from the user, generates information specifying the work to be executed by the robot 40 based on the user's input, and outputs the information to the system control device 10.
- the information that identifies the work to be executed by the robot 40 based on the input of the user is divided into job information and task information.
- the job information specifies the content of the work to be executed by the robot 40.
- the job information corresponds to the contents described in the so-called work instruction sheet, which is a document that defines a work target or a work procedure, which is used when instructing work between humans.
- the task information specifies the robot 40 that executes the work specified by the job information, and also specifies the start condition and the end condition of the work specified by the job information.
- the task information corresponds to information instructing the start or end of work between humans.
- the system control device 10 outputs job information and task information to the robot controller 30.
- the robot controller 30 calls an action library that defines the operation of the robot 40 based on the job information and the task information, and causes the robot 40 to execute the work.
- the robot controller 30 acquires the result of having the robot 40 execute the work as feedback information, and outputs the result to the system control device 10.
- Feedback information is also referred to as FB information.
- the action library may be a program module installed in the robot controller 30.
- the action library is included in the library group 333.
- the robot controller 30 installs at least one action library.
- the library group 333 is composed of an action library installed by the robot controller 30.
- the robot controller 30 can call an action library by passing, to the library group 333, information specifying which of the installed action libraries included in the library group 333 is to be called.
- the information that identifies the action library to call is also called an identifier. In other words, when the robot controller 30 calls an action library from the library group 333, the called action library is specified by an identifier.
- the action library defines a series of processing procedures for controlling the operation of the robot 40.
- the action library contains an undefined part in the processing procedure.
- the robot controller 30 can complete the processing procedure by complementing the undefined portion included in the processing procedure.
- when the robot controller 30 calls the action library, it can complete the processing procedure of the action library by passing to the action library a parameter that specifies information complementing the undefined portion of the action library.
- Information that complements the undefined portion of the action library is also referred to as complementary information.
- Job information can be configured to complement the undefined portion by including the complementary information.
- the complementary information has the first complementary information.
- the first complementary information may be, for example, information about a working environment in which the work is performed.
- the first complementary information may be information about a work object.
- the first complementary information may be specified as a physical quantity that controls each component of the robot 40, for example.
- the physical quantity that controls each component of the robot 40 may include, for example, the amount of movement of the arm or the like, the torque to be output to the motor that drives the arm or the like, or the like.
- the first complementary information may include, for example, information specifying the point at which the robot 40 applies force when gripping the work object, such as the portion at which the robot 40 grips the work object.
- the first complementary information is not limited to these examples, and may include various other information.
- the first complementary information may be represented by a character string, a numerical value, a boolean value, or the like. Further, if the robot 40 can perform the work without the first complementary information being supplied, the first complementary information may be a NULL value.
- the complementary information may have the second complementary information.
- the second complementary information is information for complementing other actions when other actions not defined in the action library are required in order to complete the processing of the action library.
- the second complementary information may be specified, for example, as an identifier that specifies an auxiliary library that supplementarily defines the operation of the robot 40.
- the auxiliary library may define algorithms such as procedures or conditions for the robot 40 to recognize a work object, for example.
- the robot controller 30 acquires information about the result produced by causing the robot 40 to perform an operation or process defined by the auxiliary library specified by the identifier.
- the robot controller 30 may control the operation of the robot 40 based on the processing procedure of the completed action library by complementing the undefined portion of the action library with the execution result of the auxiliary library.
- the second complementary information is not limited to these examples, and may include various other information.
- the second complementary information may be represented by a character string, a numerical value, a boolean value, or the like. Further, if the robot 40 can perform the work without the second complementary information being supplied, the second complementary information may be a NULL value.
- the complementary information may be acquired by the terminal device 20 from the robot controller 30 in response to the user's input, or may be acquired by the robot controller 30 from the robot 40. That is, for example, when the user makes an input, complementary information such as the work object or the work environment may be acquired through equipment (for example, a camera) provided in the robot 40. Further, when the user makes an input, complementary information such as the identifier of the auxiliary library to be executed may be acquired according to information on the robot 40 itself or on the equipment provided in the robot 40.
- the acquired complementary information may include both the first complementary information and the second complementary information, or may include only one of them. Further, both the first complementary information and the second complementary information may have NULL values.
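- As a rough sketch only (not part of the patent text), the relationship between an action library, its undefined portion, and the complementary information can be illustrated in Python as follows; the function name "pick_and_place" and the keys "object" and "recognizer" are hypothetical.

```python
# Illustrative sketch, not the patented implementation: an action library is a
# program module whose processing procedure contains undefined portions that the
# robot controller completes with complementary information at call time.

def pick_and_place(complement):
    """Action library: a fixed processing procedure with undefined portions."""
    target = complement.get("object")             # first complementary information
    recognizer_id = complement.get("recognizer")  # second complementary information
    # The undefined portions (what to handle, how to recognize the work place)
    # are filled in from the complementary information.
    return [
        f"recognize work place using auxiliary library '{recognizer_id}'",
        f"grip '{target}'",
        f"move '{target}' to the destination container",
        f"release '{target}'",
    ]

# The robot controller completes the procedure by passing complementary
# information as run-time arguments when it calls the action library.
for step in pick_and_place({"object": "screw", "recognizer": "aux.ai_object_recognition"}):
    print(step)
```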
- Job information includes identifiers and run-time arguments.
- the identifier is, for example, an ID (Identification) that identifies an action library.
- the run-time argument of the job information is, for example, an argument passed to another program when the work is executed, and the run-time argument may include complementary information that complements the undefined part of the library described above.
- Job information may include multiple identifiers.
- the robot controller 30 calls the action library specified by each identifier and causes the robot 40 to execute the work.
- the job information may include information that identifies the action library that the robot controller 30 calls first among the plurality of action libraries.
- the information that identifies the action library to be called first among multiple action libraries is also called an entry point. That is, an entry point may be defined in the job information. The entry point can be expressed numerically.
- the job information may include information for specifying the action library to be called next by the robot controller 30 based on the result of the operation of the robot 40 controlled based on the action library first called by the robot controller 30. That is, the job information may include information that determines the action library to be called next by the robot controller 30 based on the result of the work first executed by the robot 40. Further, the job information may include information for determining an action library to be called next by the robot controller 30 in order to execute the next work generated based on the result of the work previously executed by the robot 40.
- the work executed first by the robot 40 may include not only the work executed one before but also the work executed two or more times before.
- the job information may include information that defines subsequent processing by the robot controller 30. Subsequent processing can be defined based on the context information output when the robot controller 30 controls the operation of the robot 40.
- the context information corresponds to a series of information output to the storage unit or the like when the program of the action library called by the robot controller 30 is executed.
- the context information may include, for example, data used by the robot 40 for determining the operation, or information indicating the success or failure of the work (process) executed by the robot 40.
- the context information is not limited to these information, and may include various other information. Contextual information can be appropriately defined based on the implementation of the action library.
- the job information may include information that defines a conditional branch based on the operation result of the robot 40.
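- As a non-normative illustration, job information containing the elements described above (identifiers, an entry point, run-time arguments, and a conditional branch) could be laid out as follows; all field names are assumptions made for the sketch.

```python
# Hypothetical job information record; the patent describes its contents but not
# a concrete format, so the field names below are illustrative assumptions.
job_information = {
    # identifiers of the action libraries to be called
    "identifiers": ["lib.pick_and_place", "lib.notify_operator"],
    # entry point: the action library the robot controller calls first
    "entry_point": 0,
    # run-time arguments, including complementary information for undefined portions
    "runtime_args": {
        "lib.pick_and_place": {"object": "screw", "recognizer": "aux.ai_object_recognition"},
    },
    # conditional branch based on the operation result (context information)
    "branches": {
        "lib.pick_and_place": {"success": None, "failure": "lib.notify_operator"},
    },
}
```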
- the task information includes information that specifies the start condition and end condition of the work specified in the job information.
- the task information further includes job information and information that identifies the robot 40 that executes the work specified by the job information.
- the robot 40 that executes the work is also referred to as a referent.
- As information related to the task information, information indicating the progress of the work of each robot 40 may be defined when the robot controller 30 causes a plurality of robots 40 to execute the work.
- As information related to the task information, information requesting the suspension of the work of the robot 40 or information for canceling the work of the robot 40 may be defined.
- the task information may include information that defines subsequent processing by the robot controller 30. That is, the robot controller 30 may define a conditional branch based on the operation result of the robot 40, and output job information for specifying the work to be performed next based on the condition.
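- A corresponding sketch of task information, again with hypothetical field names, might look like this.

```python
# Hypothetical task information record: it designates the robot (referent), the
# job information to execute, and the start and end conditions of the work.
task_information = {
    "robot": "X",                                   # referent robot 40
    "job_id": "pick_and_place_job",                 # job information specifying the work
    "start_condition": {"type": "immediately"},
    "end_condition": {"type": "count", "value": 5},
    "progress": None,                               # progress of the work of each robot 40
    "cancel_requested": False,                      # request to cancel or suspend the work
}
```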
- the library group 333 includes an action library and an auxiliary library.
- the action library includes a program that defines a series of actions (process flow) of the robot 40.
- the program of the action library includes an undefined part in which the operation of the robot 40 is not defined.
- the action library contains information that defines the format of the information that can be accepted as complementary information to the undefined part. That is, the action library contains information that defines the format of the run-time arguments passed from the robot controller 30 when called from the robot controller 30. Information that defines the format of run-time arguments is also referred to as argument definition information.
- the robot controller 30 can complement the undefined portion of the called action library with the complementary information specified by the run-time argument configured according to the format defined by the argument definition information. For example, when the argument definition information defines that the identifier of the auxiliary library can be accepted as a run-time argument, the robot controller 30 passes the identifier of the auxiliary library as a run-time argument and calls the action library.
- the auxiliary library includes a program that is called when the action library is executed.
- the robot controller 30 calls the auxiliary library specified by the identifier and executes the program during the execution of the program of the action library.
- the auxiliary library may include, for example, a program that implements an AI (Artificial Intelligence) inference algorithm used by the robot 40 to recognize a work object.
- the auxiliary library is not limited to this example, and may include various other programs.
- the action library or auxiliary library is created, for example, by programming by input by the user or a third party other than the user, or by machine learning of AI.
- the created action library or the like may be used by a robot 40 different from the robot 40 used at the time of creation.
- the library group 333 may be managed by meta information.
- the meta information may include an identifier of an action library or an auxiliary library.
- the meta information may include incidental information such as the display name of the action library or the auxiliary library.
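- The library group 333 and its meta information could, purely for illustration, be organized as a registry keyed by identifier; this layout and the identifiers used are assumptions, not part of the disclosure.

```python
# Sketch of a library group keyed by identifier, with meta information.
def _pick_and_place(args):        # placeholder action library body
    return f"pick and place {args.get('object')}"

def _recognize_objects(args):     # placeholder auxiliary library body (e.g. AI inference)
    return {"containers": ["red", "blue"]}

library_group = {
    "lib.pick_and_place": {
        "kind": "action",
        "meta": {"identifier": "lib.pick_and_place", "display_name": "Pick and place"},
        "entry": _pick_and_place,
    },
    "aux.ai_object_recognition": {
        "kind": "auxiliary",
        "meta": {"identifier": "aux.ai_object_recognition", "display_name": "AI object recognition"},
        "entry": _recognize_objects,
    },
}

def call_library(identifier, runtime_args):
    # the library to call is specified by its identifier
    return library_group[identifier]["entry"](runtime_args)
```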
- the robot control system 1 can make the robot 40 execute the work by the abstracted work instruction.
- the user can configure data indicating the contents of the work to be executed by the robot 40 through, for example, abstract settings in a GUI (Graphical User Interface), and instruct the robot 40 to execute the work. That is, in the robot control system 1 according to the present embodiment, work can be defined or instructed at an abstract granularity, in the manner of a work instruction given between humans, in terms of which robot 40 does what, when, where, and how.
- the software executed by the control unit 11 of the system control device 10 includes a work content management procedure 322, a work instruction management procedure 323, a work instruction output procedure 326, and a feedback management procedure 328.
- the software executed by the control unit 11 is also referred to as an internal module.
- the control unit 11 refers to the databases 324, 325 and 329 to register data and acquire data. It is assumed that the databases 324, 325 and 329 are stored in the storage unit 12.
- the control unit 11 registers, replaces, or deletes job information in the database 324.
- the control unit 11 registers, replaces, or deletes task information in the database 325.
- the control unit 11 registers, replaces, or deletes feedback information in the database 329.
- the control unit 11 registers the job information in the database 324 based on the request input to the application or the browser executed by the terminal device 20 by executing the work content management procedure 322. Specifically, the control unit 11 acquires job information from the terminal device 20 and registers it in the database 324 by executing the work content management procedure 322. Further, the control unit 11 acquires the job information from the database 324 based on the job information acquisition request by executing the work content management procedure 322, and outputs the job information to the request source.
- the control unit 11 registers the task information in the database 325 based on the request input to the application or the browser executed by the terminal device 20 by executing the work instruction management procedure 323. Specifically, the control unit 11 acquires task information from the terminal device 20 and registers it in the database 325 by executing the work instruction management procedure 323. Further, the control unit 11 acquires the task information from the database 325 based on the task information acquisition request by executing the work instruction management procedure 323, and outputs the task information to the request source.
- the control unit 11 may generate job information or task information based on a request from the terminal device 20 and register it in the database 324 or 325.
- By executing the work instruction management procedure 323, the control unit 11 can cancel the task information output to the robot controller 30 or temporarily stop the operation of the robot 40 based on the task information, in response to a request from the terminal device 20.
- By executing the work instruction output procedure 326, the control unit 11 outputs the task information registered in the database 325 to the robot controller 30 and instructs the robot 40 to perform the work.
- the robot controller 30 causes the robot 40 to start the work based on the start condition specified in the task information, and ends the work based on the end condition.
- the control unit 11 may determine the timing at which the task information is output to the robot controller 30. Specifically, the control unit 11 may output the task information to the robot controller 30 at a timing based on the work start condition specified by the task information. For example, when the work start condition is to start the work immediately, the control unit 11 may, by executing the work instruction output procedure 326, output the task information to the robot controller 30 immediately after obtaining it from the terminal device 20. For example, when the work start condition specifies a start time, the control unit 11 may output the task information to the robot controller 30 at the designated start time by executing the work instruction output procedure 326.
- the work start condition is not limited to these examples, and may be set in various other forms such as a condition based on the state of the work object.
- So that the work of the robot 40 is ended based on the end condition specified by the task information, the control unit 11 may output, as task information, an instruction to end the work of the robot 40 at a timing based on that end condition. For example, when the work end condition is specified as the number of times the work specified in the job information is to be executed, the control unit 11 may, by executing the work instruction output procedure 326, output an instruction to end the work of the robot 40 to the robot controller 30 after the work has been executed the specified number of times. For example, when the work end condition specifies an end time, the control unit 11 may output an instruction to end the work of the robot 40 to the robot controller 30 at the specified end time by executing the work instruction output procedure 326.
- in this case, the robot controller 30 does not determine the end time of the work of the robot 40 based on its own internal clock.
- the control unit 11 can control the end time of the work of the robot 40 regardless of the reliability of the time inside the robot controller 30.
- the work end condition is not limited to these examples, and may be set in various other forms such as a condition based on the state of the work object.
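- How the work instruction output procedure 326 could act on such start and end conditions is sketched below; the condition encoding and function names are assumptions made for the example.

```python
# Illustrative timing checks for the work instruction output procedure 326.
import datetime

def should_output_now(task, now=None):
    """Whether the task information should be output to the robot controller now."""
    now = now or datetime.datetime.now()
    start = task["start_condition"]
    if start["type"] == "immediately":
        return True
    if start["type"] == "start_time":
        return now >= start["value"]
    return False

def should_end(task, executed_count, now=None):
    """Whether an instruction to end the work of the robot should be output."""
    now = now or datetime.datetime.now()
    end = task["end_condition"]
    if end["type"] == "count":          # end after the specified number of executions
        return executed_count >= end["value"]
    if end["type"] == "end_time":       # end at the specified end time
        return now >= end["value"]
    return False
```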
- When the control unit 11 acquires a request to cancel the task information from the terminal device 20, the control unit 11 may output an instruction to cancel the work of the robot 40 to the robot controller 30 as task information.
- When the control unit 11 acquires a request from the terminal device 20 to suspend the work of the robot 40, the control unit 11 may output an instruction to suspend the work of the robot 40 to the robot controller 30 as task information.
- the control unit 11 may output an instruction to end the work of the robot 40, an instruction to cancel, an instruction to pause, or the like to the robot controller 30 as information different from the task information.
- the software executed by the robot control unit 31 of the robot controller 30 includes a work instruction acquisition procedure 331 and a work execution procedure 332.
- the robot controller 30 executes the work execution procedure 332 with reference to the library group 333 including the action library installed in advance.
- By executing the work instruction acquisition procedure 331, the robot controller 30 acquires job information and task information as a work instruction from the work instruction output procedure 326 executed by the system control device 10. The robot controller 30 analyzes the acquired work instruction by executing the work instruction acquisition procedure 331.
- the robot controller 30 controls the operation of the robot 40 based on the analysis result of the work instruction by the work instruction acquisition procedure 331 by executing the work execution procedure 332.
- When the robot controller 30 determines, by the work instruction acquisition procedure 331, that an instruction to start the work of the robot 40 has been acquired as task information, the robot controller 30 may control the robot 40 to start the work by executing the work execution procedure 332.
- the robot controller 30 may queue (store in a queue) instructions to start the work of the robot 40, take the queued instructions out one by one, and cause the robot 40 to perform the work.
- the robot controller 30 may control the order in which the robot 40 executes operations by queuing. For example, when the robot controller 30 acquires an instruction to start work of the robot 40 while the robot 40 is executing other work, the robot controller 30 may cause the robot 40 to perform the work based on the acquired instruction after the robot 40 completes the work being executed first.
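- A minimal sketch of this queuing behaviour, assuming a plain FIFO queue inside the work execution procedure 332, is shown below.

```python
# Illustrative FIFO queuing of work-start instructions in the robot controller.
from collections import deque

instruction_queue = deque()

def enqueue_start_instruction(task_information):
    # queue (store) the instruction to start the work of the robot 40
    instruction_queue.append(task_information)

def next_instruction(robot_is_busy):
    # take queued instructions out one by one; an instruction received while the
    # robot 40 is working is handled only after the current work completes
    if robot_is_busy or not instruction_queue:
        return None
    return instruction_queue.popleft()
```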
- the robot controller 30 may control the robot 40 to cancel the work by executing the work execution procedure 332.
- When the robot controller 30 determines, by the work instruction acquisition procedure 331, that an instruction to suspend the work of the robot 40 has been acquired, the robot controller 30 may control the robot 40 to suspend the work by executing the work execution procedure 332. If the instruction for the work to be canceled or suspended is still queued and has not been started, the robot controller 30 may delete the queued instruction. The robot controller 30 may acquire the instruction to suspend the work of the robot 40 as task information.
- the robot controller 30 retrieves an instruction to start the work of the robot 40 from the queue by executing the work execution procedure 332.
- the robot controller 30 analyzes the job information and the task information constituting the instruction, and controls the hardware of the robot 40 so that the robot 40 executes the work.
- the robot controller 30 determines the execution procedure of the work content specified in the job information based on the end condition specified in the task information. For example, when the end condition is specified as the number of times the work is to be executed, the robot controller 30, by executing the work execution procedure 332, controls the hardware of the robot 40 so that the work content specified by the job information is repeatedly executed the specified number of times.
- the robot controller 30 causes the robot 40 to execute the work by calling and executing the program included in the library group 333 in the work execution procedure 332.
- the library group 333 includes an action library.
- the robot controller 30 causes the robot 40 to execute the work by calling and executing the action library specified by the identifier included in the job information in the work execution procedure 332.
- the library group 333 may further include an auxiliary library. When the identifier of an auxiliary library is passed as a run-time argument in calling the action library, the robot controller 30 further calls and executes the auxiliary library when executing the action library.
- the robot controller 30 outputs information regarding the work status of the robot 40 to the system control device 10 by executing the work execution procedure 332.
- Information regarding the work status of the robot 40 is also referred to as feedback information (FB information).
- the FB information may include information that identifies the work being performed by the robot 40.
- the FB information may include information indicating whether the robot 40 has started or ended the work, or information indicating the progress of the work of the robot 40.
- the control unit 11 of the system control device 10 acquires FB information through the second interface 14. By executing the feedback management procedure 328, the control unit 11 registers the FB information in the database 329 and updates, in the database 325, the execution status of the work specified by the task information registered in the database 325.
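- For illustration only, a feedback (FB) information record carrying the items mentioned above might look like this; the field names are assumptions.

```python
# Hypothetical FB information output by the work execution procedure 332.
fb_information = {
    "task_id": "pick_and_place_task",     # identifies the work being performed
    "robot": "X",
    "status": "in_progress",              # e.g. started / in_progress / ended
    "progress": {"executed": 2, "requested": 5},
}
# The control unit 11 registers such a record in the database 329 and updates the
# execution status of the corresponding task information in the database 325.
```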
- the robot control system 1 may be configured as a cloud robotics system.
- the robot control system 1 may include at least a part of the configuration illustrated in FIG. 3 as a basic configuration, and may include a configuration capable of communicating with an external system as another configuration.
- the robot control system 1 may be configured to be able to receive a request from an external system and output a request to the external system.
- the system control device 10 may be made redundant by being configured to include a plurality of server devices.
- the system control device 10 can be configured to meet various requirements by making it redundant.
- the server devices are configured to be able to communicate with each other.
- the terminal device 20 receives input of job data, which is definition information indicating the contents of robot work, from the user.
- the terminal device 20 may allow the user to input job data using the GUI of the application installed in the terminal device 20 or the application distributed from an external device such as the system control device 10 on the web browser.
- the terminal device 20 accepts an input that specifies an action library that causes the robot 40 to perform a pick-and-place operation.
- the terminal device 20 accepts an input for specifying a run-time argument to be passed when calling the action library, if necessary.
- how to specify run-time arguments is defined for each action library. In this example, it is assumed that the run-time arguments passed when calling the action library of the pick-and-place operation are defined as information about the work object.
- When the terminal device 20 accepts an input specifying the action library for the pick-and-place operation, it further accepts an input (first complementary information) regarding the work object.
- the terminal device 20 may accept an input that designates a screw as an object of pick-and-place operation.
- the first complementary information indicating a screw is added to the job information. Therefore, for example, based on the code related to screws in the action library, the robot 40 can be made to perform an operation suitable for picking and placing a screw.
- When the terminal device 20 accepts the input specifying the action library for the pick-and-place operation, it further accepts an input regarding the method of recognizing the work place.
- the terminal device 20 accepts an input that specifies inference of object recognition by AI as a method of recognizing a work place.
- the terminal device 20 may recognize that the identifier (second complementary information) of the auxiliary library for AI object-recognition inference has been input as a run-time argument.
- the terminal device 20 may accept an input that specifies an auxiliary library.
- the terminal device 20 may accept an input regarding a condition for selecting a container for picking a work object and a container for placing the work object from the container candidates recognized by the camera mounted on the robot 40.
- the terminal device 20 may allow the user to select the color of the container as a feature of the container.
- the terminal device 20 may accept input of the condition that, for example, the container for picking the work object is red and the container for placing the work object is blue.
- the terminal device 20 can acquire information specifying how to operate the robot 40 by accepting an input for designating an action library. Further, the terminal device 20 can acquire information for specifying what is the work object of the robot 40 by receiving the input regarding the work object. Further, the terminal device 20 can acquire information specifying where to operate the robot 40 by accepting an input for designating an auxiliary library as a method of recognizing a work place. Specifically, the terminal device 20 can receive a request from the user to cause the robot 40 to perform the work of "picking and placing" the "screw” from the "red” container to the "blue” container.
- this work content is referred to as a pick and place job.
- in this example, the first complementary information indicating that the object is a screw is accepted, but, for example, second complementary information specifying an auxiliary library that can recognize a screw as an object may be accepted instead.
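- Putting the inputs above together, the job information for the pick-and-place job could, as a non-normative sketch, look as follows; the identifiers and keys are illustrative.

```python
# Hypothetical job information generated from the GUI inputs for the pick-and-place job.
pick_and_place_job = {
    "identifiers": ["lib.pick_and_place"],
    "entry_point": 0,
    "runtime_args": {
        "lib.pick_and_place": {
            "object": "screw",                          # first complementary information
            "recognizer": "aux.ai_object_recognition",  # second complementary information
            "pick_container": {"color": "red"},         # container to pick from
            "place_container": {"color": "blue"},       # container to place into
        },
    },
}
```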
- At time t2, the terminal device 20 generates job information that identifies the pick-and-place job based on the user's input through the GUI, and outputs the job information and a job information registration request to the system control device 10.
- the system control device 10 confirms the content of the job information acquired together with the registration request.
- the system control device 10 saves the job information in the database when there is no problem such as inconsistency in the contents of the job information.
- the system control device 10 registers the job information in the database and makes it persistent. Persistence of information means to keep it until instructed to delete it, or to keep it for a predetermined period of time.
- the system control device 10 can call the job information many times during the period in which the job information is stored as valid information.
- the system control device 10 outputs a registration response including a report that registration of job information is completed.
- the terminal device 20 confirms that the job information has been registered by acquiring the registration response.
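- a minimal sketch of this registration exchange and of persistence, assuming a hypothetical `register_job` method and an in-memory dict standing in for the database, might look as follows.

```python
# Minimal sketch of job registration and persistence; the class name, method name,
# and field names are assumptions made for illustration only.
class SystemControlDeviceSketch:
    def __init__(self):
        self.database = {}  # persisted job information, kept until deletion is instructed

    def register_job(self, job_info: dict) -> dict:
        # Confirm the content of the job information before saving it.
        if "job_id" not in job_info or "action_library_id" not in job_info:
            return {"status": "rejected", "reason": "inconsistent job information"}
        self.database[job_info["job_id"]] = job_info     # register and persist
        return {"status": "registered", "job_id": job_info["job_id"]}  # registration response

system = SystemControlDeviceSketch()
response = system.register_job({"job_id": "job-0001", "action_library_id": "pick_and_place"})
```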
- the terminal device 20 accepts input of task data, that is, information instructing the robot 40 to execute the work specified by the job information.
- the terminal device 20 may allow the user to input task data using the GUI of an application installed in the terminal device 20, or of an application distributed from an external device such as the system control device 10 and running on a web browser.
- the terminal device 20 accepts an input that designates the robot 40 that executes the pick-and-place job as a task as "X", specifies the start condition as "immediately", sets the end condition as "up to the specified number of executions", and specifies the specified number of times as "5 times".
- the terminal device 20 receives the information that specifies "who" by the input for selecting the robot 40, and receives the information that specifies "when" by the input that specifies the start condition or the end condition. Specifically, the terminal device 20 receives data of a task in which the robot 40 called "X" executes the pick-and-place job "immediately" and "five times".
- the task data specifying that the robot 40 "X" executes the pick-and-place job "immediately" and "five times" corresponds to a work instruction.
- this work instruction is referred to as a pick-and-place task.
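- for illustration, the task data gathered here ("who" and "when") could be represented as follows; the field names and the job reference are assumptions, not a format defined by the disclosure.

```python
# Illustrative sketch of the task data: which robot ("who") executes which job,
# and when it starts and ends. All field names are assumptions.
pick_and_place_task = {
    "task_id": "task-0001",
    "robot_id": "X",                                      # "who": the robot that executes the job
    "start_condition": "immediately",                     # "when" to start
    "end_condition": {"type": "count", "executions": 5},  # "when" to stop: after five executions
    "job": {"job_id": "job-0001"},                        # reference to the registered job information
}
```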
- at time t5, the terminal device 20 generates task information for specifying the pick-and-place task based on the user's input via the GUI, outputs the task information to the system control device 10, and outputs a task information registration request.
- the system control device 10 confirms the content of the task information acquired together with the registration request.
- the system control device 10 stores the task information in the database when there is no problem such as inconsistency in the contents of the task information. In other words, the system control device 10 registers the task information in the database and makes it persistent.
- the system control device 10 stores the task information together with the execution log of the work by the robot 40 as evidence that the work of the robot 40 is instructed by the user.
- the system control device 10 outputs a registration response including a report that registration of task information is completed.
- the terminal device 20 confirms that the task information has been registered by acquiring the registration response.
- the system control device 10 reads the task information registered in the database and outputs a work instruction to the robot controller 30.
- in accordance with the start condition being "immediately", the system control device 10 immediately outputs the task information as a work instruction to the robot controller 30 that controls the robot 40 "X" specified in the pick-and-place task.
- the robot controller 30 analyzes the task information as the acquired work instruction. If there is no problem such as inconsistency in the contents of the task information, the robot controller 30 proceeds to the procedure of instructing the robot 40 to execute the work based on the contents of the task information.
- the robot controller 30 outputs, to the system control device 10, feedback information reporting that the task information as a work instruction has been accepted without any problem and that the work of the robot 40 has started.
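- the dispatch according to the start condition and the acknowledgement feedback could be sketched as below; `RobotControllerStub`, `receive_work_instruction`, and the returned status fields are illustrative assumptions rather than interfaces defined by the disclosure.

```python
# Sketch of dispatching a registered task as a work instruction according to its
# start condition; the controller stub and status fields are illustrative only.
class RobotControllerStub:
    def receive_work_instruction(self, task: dict) -> dict:
        # Analyze the task; if consistent, report that the work instruction was accepted.
        if "job" not in task:
            return {"task_id": task.get("task_id"), "status": "rejected"}
        return {"task_id": task["task_id"], "status": "accepted", "work": "started"}

def dispatch(task: dict, controllers: dict) -> dict:
    controller = controllers[task["robot_id"]]             # controller of the specified robot
    if task["start_condition"] == "immediately":
        return controller.receive_work_instruction(task)   # output task information as a work instruction
    return {"task_id": task["task_id"], "status": "deferred"}  # other start conditions not modeled

feedback = dispatch(
    {"task_id": "task-0001", "robot_id": "X", "start_condition": "immediately", "job": {}},
    {"X": RobotControllerStub()},
)
```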
- during the period from time t8 to t9, the robot controller 30 generates information for controlling the hardware of the robot 40 based on the task information and the contents of the job information included in the task information, and outputs the information to the robot 40.
- the robot controller 30 generates information for controlling the hardware of the robot 40 based on the contents of the pick-and-place task and outputs the information to the robot 40.
- the robot 40 operates each component of the hardware by controlling the hardware based on the information acquired from the robot controller 30, and executes the work specified by the pick-and-place task.
- the robot controller 30 determines a work procedure to be executed by the robot 40 based on the end condition specified by the pick-and-place task. Subsequently, the robot controller 30 executes the job based on the determined work procedure.
- a pick-and-place job is specified as the job.
- execution 5 times is specified as the end condition. Therefore, the robot controller 30 causes the robot 40 to execute the work of repeating the pick-and-place job five times.
- the robot controller 30 reads the action library for pick-and-place as an execution module based on the identifier of the action library specified in the job information for specifying the pick-and-place job. Further, the robot controller 30 reads the auxiliary library used for recognizing a "screw" as the work object of the pick-and-place operation and the auxiliary library used for recognizing the containers for picking and placing the "screw". Further, the robot controller 30 reads the information designating the red container as the container to be picked from and the information designating the blue container as the container to be placed into.
- the robot controller 30 calls the action library by passing an identifier that specifies the auxiliary library and information that specifies the characteristics of the container as run-time arguments. By executing the called action library, the robot controller 30 can cause the robot 40 to find and pick a screw from the red container and place the picked screw in the blue container.
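- a minimal sketch of this call sequence, assuming a hypothetical library-group loader and an `execute` entry point on the action library, is shown below; the end condition of five executions is modeled as a simple loop.

```python
# Sketch of the call sequence on the robot controller: load the action library and the
# auxiliary library, then call the action library with run-time arguments, repeating the
# job until the end condition (five executions) is met. All interfaces are assumptions.
def run_pick_and_place(library_group, runtime_args: dict, executions: int) -> None:
    action = library_group.load("pick_and_place")                             # execution module
    recognizer = library_group.load(runtime_args["recognition_library_id"])   # auxiliary library
    for _ in range(executions):                                               # end condition as a loop
        action.execute(
            work_object=runtime_args["work_object"],          # "screw"
            recognizer=recognizer,                             # object recognition by AI
            pick_container=runtime_args["pick_container"],     # {"color": "red"}
            place_container=runtime_args["place_container"],   # {"color": "blue"}
        )
```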
- the robot controller 30 outputs feedback information to the system control device 10 for reporting the result of causing the robot 40 to execute the work based on the task information.
- the feedback information may include information indicating whether the robot 40 has started the execution of the work or the robot 40 has finished the execution of the work.
- the feedback information may include information indicating whether the execution of the work by the robot 40 is normally completed or interrupted in the middle due to the occurrence of an abnormality.
- the timing at which the robot controller 30 outputs feedback information is not necessarily limited to time t8 or t9.
- when the system control device 10 acquires the feedback information, it registers the information in the database and makes it persistent. Further, the system control device 10 updates, based on the feedback information, the task information corresponding to the feedback information among the task information registered in the database.
- the system control device 10 may output the information on the occurrence of the abnormality to the terminal device 20 to notify the user of the error.
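- feedback handling on the system control device side might be sketched as follows, assuming hypothetical field names such as `task_id` and `status` and a simple dict standing in for the database.

```python
# Sketch of feedback handling: persist the report, update the matching task information,
# and notify the user on abnormality. Field names and the dict database are assumptions.
def handle_feedback(task_database: dict, feedback: dict) -> None:
    task = task_database.get(feedback["task_id"])
    if task is None:
        return
    task["status"] = feedback.get("status")      # e.g. "started", "completed", "aborted"
    task["last_feedback"] = feedback             # keep the execution log with the task information
    if feedback.get("status") == "aborted":
        notify_user(feedback)                    # e.g. forward the abnormality to the terminal device

def notify_user(feedback: dict) -> None:
    print(f"task {feedback['task_id']} reported an abnormality: {feedback.get('detail')}")
```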
- the terminal device 20 receives an input from the user of a request for referring to the execution status of the pick-and-place task instructed to the robot 40.
- the terminal device 20 outputs a reference request to the system control device 10.
- the system control device 10 updates the task information registered in the database based on the feedback information. Therefore, the system control device 10 can read the task information registered in the database and acquire the information regarding the execution status of the pick-and-place task during the period from time t11 to t12.
- the system control device 10 outputs information regarding the execution status of the pick-and-place task to the terminal device 20 as a reference response to the task reference request.
- the terminal device 20 can acquire information on the execution status of the pick-and-place task and refer to it by the user.
- the terminal control unit 21 of the terminal device 20 may execute a terminal control method including the procedure of the flowchart illustrated in FIG.
- the terminal control method may be realized as a terminal control program to be executed by a processor constituting the terminal control unit 21.
- the terminal control program may be stored on a non-temporary computer-readable medium.
- the terminal control unit 21 accepts user input by the user interface 23 (step S51).
- the terminal control unit 21 generates job information based on the user's input (step S52).
- the terminal control unit 21 generates task information based on the user's input (step S53).
- the terminal control unit 21 outputs job information and task information to the system control device 10 (step S54). That is, the terminal control unit 21 outputs task information including job information to the system control device 10. After executing the procedure in step S54, the terminal control unit 21 ends the execution of the procedure shown in the flowchart of FIG.
- the terminal control unit 21 may repeat the execution of the procedure shown in the flowchart of FIG.
- the terminal control unit 21 may execute the procedure of step S52 and the procedure of step S53 in reverse order.
- the terminal control unit 21 may execute only one of steps S52 and S53.
- the terminal control unit 21 may search for job information based on the user's input information and acquire it from the database, instead of generating the job information.
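- as a sketch of steps S51 to S54, assuming placeholder callables standing in for the user interface 23 and for the output to the system control device 10, the terminal control method could look like this.

```python
# Sketch of the terminal control method (steps S51 to S54). The two callables stand in
# for the user interface and the communication interface; all field names are assumed.
def terminal_control_method(accept_user_input, send_to_system_control) -> None:
    user_input = accept_user_input()                        # S51: accept the user's input
    job_info = {                                            # S52: generate job information
        "action_library_id": user_input["action"],
        "runtime_args": user_input["args"],
    }
    task_info = {                                           # S53: generate task information
        "robot_id": user_input["robot"],
        "start_condition": user_input["start"],
        "end_condition": user_input["end"],
        "job": job_info,
    }
    send_to_system_control(task_info)                       # S54: output task info including job info
```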
- the control unit 11 may generate job information and task information.
- the control unit 11 may also search for job information based on the user's input information and acquire it from the database, instead of generating the job information.
- a part of the job information or a part of the task information may be created by the terminal control unit 21, and the other parts may be created by the control unit 11.
- the control unit 11 of the system control device 10 may execute a robot control method including the procedure of the flowchart illustrated in FIG.
- the robot control method may be realized as a robot control program to be executed by a processor constituting the control unit 11.
- the robot control program may be stored on a non-temporary computer-readable medium.
- the control unit 11 acquires job information from the terminal device 20 (step S41). Further, the control unit 11 may register the acquired job information in the database.
- the control unit 11 acquires task information from the terminal device 20 (step S42). Further, the control unit 11 may register the acquired task information in the database.
- the control unit 11 outputs job information and task information to the robot controller 30 (step S43). That is, the control unit 11 outputs task information including job information to the robot controller 30. After executing the procedure in step S43, the control unit 11 ends the execution of the procedure shown in the flowchart of FIG.
- the control unit 11 may repeat the execution of the procedure shown in the flowchart of FIG.
- the control unit 11 may execute the procedure of step S41 and the procedure of step S42 in reverse order.
- the control unit 11 may execute only one of step S41 and step S42. For example, the control unit 11 may acquire only the task information without reacquiring the job information that has already been acquired.
- the robot controller 30 may execute a robot control method including the procedure of the flowchart illustrated in FIG. 7.
- the robot control method may be realized as a robot control program to be executed by a processor constituting the robot controller 30.
- the robot control program may be stored on a non-temporary computer-readable medium.
- the robot controller 30 acquires job information and task information from the system control device 10 (step S61). That is, the robot controller 30 acquires task information including job information from the system control device 10.
- the robot controller 30 may acquire only one of the job information and the task information. For example, the robot controller 30 may acquire only the task information without reacquiring the job information that has already been acquired.
- the robot controller 30 outputs information instructing the robot 40 to operate (step S62).
- the robot controller 30 acquires the result of operation in response to an instruction from the robot 40 (step S63). After executing the procedure in step S63, the robot controller 30 ends the execution of the procedure shown in the flowchart of FIG. 7. The robot controller 30 may repeat the execution of the procedure shown in the flowchart of FIG.
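- steps S61 to S63 could be sketched in the same illustrative style, with the connections to the system control device 10 and to the robot 40 represented by simple callables; the names below are assumptions.

```python
# Sketch of the robot controller procedure (steps S61 to S63); the interfaces to the
# system control device and the robot are represented by callables for illustration.
def robot_controller_method(receive_task, instruct_robot, collect_result) -> dict:
    task_info = receive_task()            # S61: acquire task information including job information
    instruct_robot(task_info["job"])      # S62: output information instructing the robot to operate
    result = collect_result()             # S63: acquire the result of the operation from the robot
    return result                         # later reported to the system control device as feedback
```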
- the control unit 11 does not have to output the job information and the task information to the robot controller 30.
- the control unit 11 may instead output an identifier that identifies the task information to the robot controller 30 and, when requested by the robot controller 30, further output the task information and the job information, or the library identified by the task information and the job information.
- the robot control system 1 realizes an abstract work instruction for the robot 40 with improved autonomy.
- the user can configure, mainly through settings made in the GUI, the definition data indicating the work content to be executed by the robot 40 as job information, and can instruct the robot 40 to perform that work content.
- teaching is a process that needs to be performed by an engineer with specialized skills. That is, the teaching work is not easy, and it is difficult to perform the teaching work each time, which is one of the factors that hinder the expansion of the utilization destinations of the robot.
- the improvement of the autonomy of the robot 40 itself and the abstraction of the work instruction accompanying that improvement are realized. That is, by leaving an undefined part in the action library and complementing the undefined part based on the user's input information when the robot 40 is used, the robot 40 can carry out work even in an environment that is not exactly the same as the taught environment, for example. Since the work can be executed even in an environment different from the taught environment, the user does not need to make detailed settings in order to make the robot 40 work; in other words, the work instruction can be abstracted, and thus the load of the teaching work can be reduced. As a result, the work range of the robot 40 can be expanded, and the utilization destinations of the robot 40 can be expanded.
- the work defined by teaching is often a program managed inside the robot (robot controller).
- Programs managed inside the robot require a unique data format or protocol to instruct the robot controller to work over the network, or require a dedicated program to relay the program.
- Unique or dedicated configurations make it difficult to link with information processing systems operated at production sites such as factories. That is, a system using a program managed inside the robot has a low affinity with an information processing system operated at a production site such as a factory. As a result, the applications of robots are limited.
- the robot control system 1 can define work at an abstract granularity similar to that used when one person instructs another person to work, for example, which robot 40, when, where, what, and how, and can instruct the robot 40 accordingly. By doing so, the robot 40 can be made to execute atypical work simply by rearranging the work definition data (job information) or the work instruction data (task information). As a result, it becomes possible to rearrange the work contents to be executed by the robot 40 in a short time and at low cost.
- the autonomous functions of the robot 40 are realized by library software (the action libraries and auxiliary libraries included in the library group 333) and by a group of software such as the procedures for executing those libraries.
- the user can additionally install the library software on the robot controller 30 or delete the library software from the robot controller 30.
- the robot 40 can acquire new movements and new judgment criteria.
- with the robot control system 1, it becomes easy for the user to create a work instruction for the robot 40 even if the user is an operator who does not have specialized skills such as program development.
- the affinity with other information processing systems used at production sites such as factories can be enhanced. Because of this high affinity, an interface that fits well with those information processing systems can be used as the interface, such as a GUI, through which the user inputs instructions. As a result, a system that causes the robot 40 to execute work in the manufacturing life cycle can be incorporated into a production site such as a factory as one service component.
- the system control device 10 further includes a robot simulator 50, although it is not essential.
- the system control device 10 may instead be communicably connected to a robot simulator 50 installed externally, without itself including the robot simulator 50.
- the robot simulator 50 can simulate the operation of the robot 40 realized as hardware by information processing.
- the control unit 11 of the system control device 10 may output task information to the robot simulator 50 to simulate the operation of the robot 40, and acquire the simulation result of the operation of the robot 40.
- the robot simulator 50 simulates the operation of the robot 40 by information processing based on the task information and the job information included in the task information.
- the robot simulator 50 outputs the simulation result of the operation of the robot 40 to the control unit 11.
- the control unit 11 may analyze the simulation result and determine whether the operation of the robot 40 based on the task information is completed normally or is stopped halfway due to an abnormality.
- the control unit 11 may be configured to output the task information determined that the operation of the robot 40 is normally completed to the robot controller 30.
- the control unit 11 may be configured not to output the task information determined that the operation of the robot 40 is not normally completed to the robot controller 30.
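- gating the work instruction on the simulation result could be sketched as follows; the `simulate` call and the `completed` flag in its result are assumptions made for illustration.

```python
# Sketch of gating a work instruction on a simulation result; the simulator API shown
# here is hypothetical and only illustrates the pass/fail decision described above.
def dispatch_if_simulation_ok(task: dict, simulator, controller) -> bool:
    result = simulator.simulate(task)            # simulate the operation of the robot 40
    if result.get("completed", False):           # operation finished normally in simulation
        controller.receive_work_instruction(task)
        return True
    return False                                 # abnormal result: do not output to the controller
```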
- the control unit 11 of the system control device 10 may acquire feedback information regarding the result of the work executed by the robot 40 based on the task information from the robot controller 30 by the second interface 14.
- the control unit 11 may update the task information registered in the database 325 based on the acquired feedback information. Further, the control unit 11 may generate new task information based on the acquired feedback information and register it in the database 325.
- the control unit 11 may output the updated or generated task information to the robot controller 30. That is, the control unit 11 may update or generate task information based on the feedback information and output it to the robot controller 30.
- the operation of the robot 40 can be improved by the control unit 11 updating or generating task information based on the feedback information.
- the control unit 11 of the system control device 10 acquires job information and task information from the terminal device 20 and registers them in the database.
- the task information includes job information, and specifies when to make which robot 40 execute the work content of the robot 40 specified by the job information.
- the user may request the robot 40 to perform a task that has been previously performed. In this case, the job information has not changed. Therefore, the control unit 11 can use the job information stored in the database 324 at the time of the previous execution. By doing so, the control unit 11 does not need to acquire the job information again in order to cause the robot 40 to execute the same work. As a result, it becomes easy to repeatedly execute the same work at an arbitrary timing.
- the control unit 11 not only temporarily stores the job information but also stores it persistently.
- the control unit 11 stores at least the job information associated with the task information that specifies the start condition and the end condition, so that the robot controller 30 can instruct the robot 40 to execute repetitive work. By doing so, it becomes easy to have the robot 40 perform the same work again.
- the same work that is repeatedly executed at an arbitrary timing is also called a repetitive work.
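- reusing persisted job information for such repetitive work could be sketched as below; the lookup by `job_id` and the task fields are illustrative assumptions.

```python
# Sketch of reusing persisted job information for repetitive work: a new task only
# references the job already registered in the database. Names are illustrative.
def make_repeat_task(job_database: dict, job_id: str, robot_id: str, executions: int) -> dict:
    job_info = job_database[job_id]              # reuse the stored job; no need to re-acquire it
    return {
        "robot_id": robot_id,
        "start_condition": "immediately",
        "end_condition": {"type": "count", "executions": executions},
        "job": job_info,
    }
```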
- the robot controller 30 installs the action library or the auxiliary library in advance, calls the action library or the auxiliary library from the library group 333 including the installed action library or the auxiliary library, and causes the robot 40 to execute the work.
- the robot controller 30 acquires and installs data of an action library or an auxiliary library from the system control device 10.
- the system control device 10 may output the data of the new action library or the auxiliary library to the robot controller 30 so that the robot controller 30 can install the new action library or the auxiliary library.
- the system control device 10 causes the robot controller 30 to perform a new operation to be executed by the robot 40, a new object recognition to be executed by the robot 40, or the like. You can acquire various new functions of. As a result, the convenience of the robot control system 1 is enhanced.
- the system control device 10 may output only the necessary action library to the robot controller 30 based on the job information and the task information. Further, only the auxiliary library may be installed in the robot controller 30.
- the terminal device 20 may have a function of registering job information and task information in a database.
- the robot control system 1 does not have to include the system control device 10.
- the system control device 10 may receive input directly from the user by having the function of the user interface 23. In this case, the robot control system 1 does not have to include the terminal device 20.
- it is also possible to implement an embodiment as a storage medium in which the program is recorded (for example, an optical disk, a magneto-optical disk, a CD-ROM, a CD-R, a CD-RW, a magnetic tape, a hard disk, a memory card, or the like).
- the implementation form of the program is not limited to an application program such as object code compiled by a compiler or program code executed by an interpreter, and may be in the form of, for example, a program module incorporated in the operating system.
- the program does not necessarily have to carry out all of the processing only in the CPU on the control board; part or all of the processing may be implemented, as needed, by an expansion board attached to the board or by another processing unit mounted on an expansion unit.
- each of the features described in this disclosure may be replaced with an alternative feature that works for the same, equivalent, or similar purpose, unless expressly denied. Therefore, unless expressly denied, each of the disclosed features is merely an example of a comprehensive set of identical or equal features.
- the embodiments according to the present disclosure are not limited to any of the specific configurations of the embodiments described above.
- the embodiments according to the present disclosure can extend to all of the novel features described in the present disclosure, or combinations thereof, or to all of the novel methods or processing steps described, or combinations thereof.
- the descriptions such as “first” and “second” are identifiers for distinguishing the configuration.
- the configurations distinguished by descriptions such as "first" and "second" in the present disclosure can have their numbers exchanged.
- the first interface 13 can exchange the identifiers “first” and “second” with the second interface 14.
- the exchange of identifiers is performed simultaneously, and the configurations remain distinguished even after the identifiers are exchanged.
- the identifier may be deleted.
- configurations from which the identifier has been removed are distinguished by reference signs. The description of identifiers such as "first" and "second" in the present disclosure shall not, by itself, be used to interpret the order of the configurations or as a basis for assuming the existence of an identifier with a smaller number.
- 1 Robot control system
- 10 System control device (11: control unit, 12: storage unit, 13: first interface, 14: second interface)
- 20 Terminal device (21: terminal control unit, 22: communication interface, 23: user interface)
- 30 Robot controller
- 40 Robot
- 50 Robot simulator
- 80 Network
- 82 Access point
- 322 Work content management procedure
- 323 Work instruction management procedure
- 326 Work instruction output procedure
- 328 Feedback management procedure
- 324, 325, 329 Database
- 331 Work instruction acquisition procedure
- 332 Work execution procedure
- 333 Library group
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202180070314.7A CN116390835A (zh) | 2020-10-19 | 2021-10-19 | 系统控制装置、机器人控制方法、终端装置、终端控制方法以及机器人控制系统 |
| EP21882840.8A EP4230361A4 (en) | 2020-10-19 | 2021-10-19 | SYSTEM CONTROL DEVICE, ROBOT CONTROL METHOD, TERMINAL DEVICE, TERMINAL CONTROL METHOD AND ROBOT CONTROL SYSTEM |
| US18/032,326 US20230381962A1 (en) | 2020-10-19 | 2021-10-19 | System control device, robot control method, terminal device, terminal control method, and robot control system |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2020175506A JP7278246B2 (ja) | 2020-10-19 | 2020-10-19 | ロボット制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム |
| JP2020-175506 | 2020-10-19 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022085699A1 true WO2022085699A1 (ja) | 2022-04-28 |
Family
ID=81290561
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2021/038669 Ceased WO2022085699A1 (ja) | 2020-10-19 | 2021-10-19 | システム制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム |
Country Status (5)
| Country | Link |
|---|---|
| US (1) | US20230381962A1 (en) |
| EP (1) | EP4230361A4 (en) |
| JP (3) | JP7278246B2 (en) |
| CN (1) | CN116390835A (en) |
| WO (1) | WO2022085699A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2024203437A1 (ja) * | 2023-03-31 | 2024-10-03 | ソニーグループ株式会社 | 制御方法、制御装置、生成方法、および生成装置 |
Families Citing this family (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP7278246B2 (ja) * | 2020-10-19 | 2023-05-19 | 京セラ株式会社 | ロボット制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム |
| EP4382264A4 (en) * | 2021-08-05 | 2025-11-26 | Kyocera Corp | LIBRARY DISPLAY DEVICE, LIBRARY DISPLAY METHOD, AND ROBOT CONTROL SYSTEM |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH054181A (ja) | 1991-06-24 | 1993-01-14 | Toshiba Corp | ロボツト制御装置 |
| JP2005011001A (ja) * | 2003-06-18 | 2005-01-13 | Ricoh Co Ltd | バスモニター装置、プログラムおよび記録媒体 |
| JP6181861B2 (ja) * | 2014-05-01 | 2017-08-16 | 本田技研工業株式会社 | 多関節ロボットのティーチングデータ作成装置及び作成方法 |
Family Cites Families (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP3098470B2 (ja) * | 1997-09-12 | 2000-10-16 | 鹿児島日本電気株式会社 | 無人搬送車の搬送制御装置およびその制御方法 |
| JP4296736B2 (ja) * | 2000-10-13 | 2009-07-15 | ソニー株式会社 | ロボット装置 |
| WO2010092981A1 (ja) * | 2009-02-12 | 2010-08-19 | 三菱電機株式会社 | 産業用ロボットシステム |
| US9840007B1 (en) * | 2014-08-25 | 2017-12-12 | X Development Llc | Robotic operation libraries |
| JP6764796B2 (ja) * | 2017-01-26 | 2020-10-07 | 株式会社日立製作所 | ロボット制御システムおよびロボット制御方法 |
| JP6879009B2 (ja) * | 2017-03-30 | 2021-06-02 | 株式会社安川電機 | ロボット動作指令生成方法、ロボット動作指令生成装置及びコンピュータプログラム |
| JP6557282B2 (ja) * | 2017-05-17 | 2019-08-07 | ファナック株式会社 | 工作機械制御装置及び生産システム |
| US11345040B2 (en) * | 2017-07-25 | 2022-05-31 | Mbl Limited | Systems and methods for operating a robotic system and executing robotic interactions |
| WO2018172593A2 (es) * | 2018-05-25 | 2018-09-27 | Erle Robotics, S.L | Método para integrar nuevos módulos en robots modulares, y un componente de robot del mismo |
| CN108748152B (zh) * | 2018-06-07 | 2021-06-29 | 上海大学 | 一种机器人示教方法及系统 |
| JP2020138255A (ja) * | 2019-02-27 | 2020-09-03 | セイコーエプソン株式会社 | ロボット制御システム及びロボット制御方法 |
| JP7523384B2 (ja) * | 2020-03-27 | 2024-07-26 | 三菱電機株式会社 | タスクを実行するロボットの動作を制御するコントローラおよび方法 |
| CN111596614B (zh) * | 2020-06-02 | 2021-06-25 | 中国科学院自动化研究所 | 基于云边协同的运动控制误差补偿系统及方法 |
| JP7442413B2 (ja) * | 2020-08-28 | 2024-03-04 | 川崎重工業株式会社 | シミュレーション装置及びシミュレーションシステム |
| JP7278246B2 (ja) * | 2020-10-19 | 2023-05-19 | 京セラ株式会社 | ロボット制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム |
-
2020
- 2020-10-19 JP JP2020175506A patent/JP7278246B2/ja active Active
-
2021
- 2021-10-19 CN CN202180070314.7A patent/CN116390835A/zh active Pending
- 2021-10-19 US US18/032,326 patent/US20230381962A1/en active Pending
- 2021-10-19 WO PCT/JP2021/038669 patent/WO2022085699A1/ja not_active Ceased
- 2021-10-19 EP EP21882840.8A patent/EP4230361A4/en active Pending
-
2023
- 2023-01-25 JP JP2023009609A patent/JP7478862B2/ja active Active
- 2023-12-26 JP JP2023219594A patent/JP2024023927A/ja active Pending
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH054181A (ja) | 1991-06-24 | 1993-01-14 | Toshiba Corp | ロボツト制御装置 |
| JP2005011001A (ja) * | 2003-06-18 | 2005-01-13 | Ricoh Co Ltd | バスモニター装置、プログラムおよび記録媒体 |
| JP6181861B2 (ja) * | 2014-05-01 | 2017-08-16 | 本田技研工業株式会社 | 多関節ロボットのティーチングデータ作成装置及び作成方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP7278246B2 (ja) | 2023-05-19 |
| CN116390835A (zh) | 2023-07-04 |
| JP2023038338A (ja) | 2023-03-16 |
| US20230381962A1 (en) | 2023-11-30 |
| JP2022066906A (ja) | 2022-05-02 |
| JP2024023927A (ja) | 2024-02-21 |
| JP7478862B2 (ja) | 2024-05-07 |
| EP4230361A4 (en) | 2025-03-05 |
| EP4230361A1 (en) | 2023-08-23 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP7478862B2 (ja) | ロボット制御装置、ロボットコントローラ、ロボット制御方法、端末装置、及び端末制御方法 | |
| JP6875578B2 (ja) | 柔軟な人間−機械協働のためのシステムおよび方法 | |
| Sanfilippo et al. | Controlling Kuka industrial robots: Flexible communication interface JOpenShowVar | |
| Chitta | MoveIt!: an introduction | |
| CN101286058B (zh) | 机器人模块化分布式自适应控制系统及方法 | |
| JP7741166B2 (ja) | プログラム管理装置、ロボット制御システム、及びプログラム管理方法 | |
| CN114932555A (zh) | 机械臂协同作业系统及机械臂控制方法 | |
| CN112068455A (zh) | 任务仿真方法、系统、装置、电子设备及可读存储介质 | |
| JP2023505631A (ja) | ロボット制御システム用の構成可能性フレームワーク | |
| CN120245013B (zh) | 一种基于ue4的机械臂实时仿真与可视化方法 | |
| JP7271806B2 (ja) | システム制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム | |
| EP4389367A1 (en) | Holding mode determination device for robot, holding mode determination method, and robot control system | |
| JP7770416B2 (ja) | 情報処理装置、ロボットコントローラ、ロボット制御システム、及び情報処理方法 | |
| JP7778272B2 (ja) | ロボットシステム、製造方法、及びプログラム | |
| CN114193459A (zh) | 一种机械手臂的控制系统及其测试方法 | |
| WO2024253174A1 (ja) | 教示データ生成装置、ロボットコントローラ、ロボット制御システム、及び教示データ生成方法 | |
| EP4393660A1 (en) | Trained model generation method, trained model generation device, trained model, and device for estimating maintenance state | |
| Cabrera-Gámez et al. | CoolBOT: A component-oriented programming framework for robotics | |
| JP2025060080A (ja) | 学習データ生成装置、学習データ生成方法、及びプログラム | |
| Junior et al. | Scara3D: 3-Dimensional HRI integrated to a distributed control architecture for remote and cooperative actuation | |
| WO2025121241A1 (ja) | 動作計画共通化方法、動作計画共通化システム、プログラム、及び動作計画共通化装置 | |
| CN121245923A (en) | Method and device for robot to interact with environment and electronic equipment | |
| WO2024219466A1 (ja) | ロボットシステム、ロボット制御方法、ロボット制御プログラム、およびプログラム生成システム | |
| JP2023111376A (ja) | ロボット制御システムおよび制御装置 | |
| Gurala | A case study on petri net modeling, animation, and simulation |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21882840 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 18032326 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2021882840 Country of ref document: EP Effective date: 20230519 |