WO2022131335A1 - Control device, robot system, and learning device - Google Patents
Control device, robot system, and learning device
- Publication number
- WO2022131335A1 (PCT application PCT/JP2021/046542; application number JP2021046542W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- work
- information
- robot
- data
- control device
Classifications
- B25J9/1656: Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664: Programme controls characterised by programming, planning systems for manipulators, characterised by motion, path, trajectory planning
- B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J9/163: Programme controls characterised by the control loop; learning, adaptive, model based, rule based expert control
- B25J9/1653: Programme controls characterised by the control loop; parameters identification, estimation, stiffness, accuracy, error analysis
- B25J9/1694: Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697: Vision controlled systems
- (All codes fall under B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES.)
Definitions
- This disclosure relates to control devices, robot systems and learning devices.
- Japanese Patent Application Laid-Open No. 62-199376 discloses a remote manipulation device using a master and a slave. From data indicating the operator's degree of skill and a work goal, this device creates a procedure plan in which manually operated parts and automatically operated parts are mixed, and controls the operation of the slave according to the procedure plan.
- An operation of the slave that requires the operator's judgment may correspond to the manual operation part disclosed in Japanese Patent Application Laid-Open No. 62-199376, and the operator performing the manual operation part is therefore required to have operation skills.
- The control device according to one aspect of the present disclosure is a control device that controls a robot so as to execute a predetermined work by automatic operation, and includes a first processor. While the predetermined work is being executed, the first processor acquires state information including the state of the work that is the target of the work, determines candidates for a work position related to the work based on the state information, transmits a selection request for selecting a work position from among the candidates to an operation terminal connected via a communication network, receives from the operation terminal information on the selected position, which is the work position that was selected, and operates the robot automatically according to the selected position.
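The claimed control flow above can be summarized as a simple loop. The following sketch is illustrative only; all class and method names (`get_state`, `plan_candidates`, `send_selection_request`, and so on) are hypothetical, not names from the disclosure.

```python
# Hypothetical sketch of the claimed flow: acquire state, determine
# candidates, ask the operator's terminal to pick one, then move.
from dataclasses import dataclass

@dataclass
class Candidate:
    position: tuple  # candidate work position, e.g. (x, y, z)

def run_selection_cycle(state_sensor, planner, terminal, robot):
    # 1. Acquire state information on the work during automatic operation.
    state = state_sensor.get_state()
    # 2. Determine candidates for the work position from the state.
    candidates = planner.plan_candidates(state)
    # 3. Transmit a selection request to the operation terminal.
    terminal.send_selection_request(candidates)
    # 4. Receive the selected position chosen by the operator.
    selected = terminal.receive_selected_position()
    # 5. Resume automatic operation according to the selected position.
    robot.move_to(selected)
    return selected
```

The point of the structure is that only step 4 involves human judgment; the robot never requires the operator to drive it manually.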
- FIG. 1 is a schematic diagram showing an example of a configuration of a robot system according to an embodiment.
- FIG. 2 is a diagram showing an example of the configuration of the robot area according to the embodiment.
- FIG. 3 is a block diagram showing an example of the hardware configuration of the control device according to the embodiment.
- FIG. 4 is a block diagram showing an example of the functional configuration of the control device according to the embodiment.
- FIG. 5 is a diagram showing an example of the work information on the workpiece included in the first attribute information according to the embodiment.
- FIG. 6 is a diagram showing an example of the work information on the workpiece included in the first attribute information according to the embodiment.
- FIG. 7 is a diagram showing an example of the work information on the surrounding environment included in the second attribute information according to the embodiment.
- FIG. 8 is a diagram showing an example of candidates for the gripping position to be presented, determined for the work to be gripped.
- FIG. 9 is a diagram showing an example of candidates for the placement position to be presented, determined for the transport vehicle to which the work is transferred.
- FIG. 10 is a diagram showing a display example of the planned operation of the robot according to the embodiment.
- FIG. 11A is a flowchart showing an example of the operation of the robot system according to the embodiment.
- FIG. 11B is a flowchart showing an example of the operation of the robot system according to the embodiment.
- FIG. 11C is a flowchart showing an example of the operation of the robot system according to the embodiment.
- FIG. 12 is a block diagram showing an example of the functional configuration of the control device and the learning device according to the modified example.
- FIG. 1 is a schematic diagram showing an example of the configuration of the robot system 1 according to the embodiment.
- the robot system 1 is a system that enables a user P, who is an operator at a position away from the robot 110, to operate the robot 110 in a remote access environment.
- The robot system 1 includes components arranged in one or more robot areas AR and components arranged in one or more user areas AU. Although this is not a limitation, in the present embodiment one robot area AR and a plurality of user areas AU exist as targets of the robot system 1.
- Robot area AR is an area where one or more robots 110 are arranged.
- the robot 110 is an industrial robot and performs work.
- the robot 110 may be a service robot, a construction machine, a tunnel excavator, a crane, a cargo handling vehicle, a humanoid, or the like, instead of the industrial robot.
- Service robots are robots used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, and product provision.
- components forming the surrounding environment for the robot 110 to perform work are also arranged.
- The user area AU is an area where a user P who operates the robot 110 stays.
- The user areas AU are located away from the robot area AR, and in the robot system 1, many users P in many user areas AU can operate the robot 110 in the robot area AR.
- The many user areas AU can be located in various places, such as within the premises of the factory that includes the robot area AR, in various facilities of the company that operates the factory, at various locations throughout Japan, or at various locations around the world.
- FIG. 2 is a diagram showing an example of the configuration of the robot area AR according to the embodiment.
- In the robot area AR, the robot system 1 includes one or more robots 110, peripheral devices 120 of the robot 110, image pickup devices 131 to 134, a control device 140, and a robot communication device 150.
- the control device 140 is connected to the robot 110, peripheral devices 120, image pickup devices 131 to 134, and the robot communication device 150 via wired communication, wireless communication, or a combination thereof. Any wired or wireless communication may be used.
- the robot communication device 150 is connected to the communication network N so as to be capable of data communication.
- the robot system 1 in each user area AU, includes an operation terminal 210, a user communication device 220, and a presentation device 230.
- the user communication device 220 is connected to the operation terminal 210 and the presentation device 230 via wired communication, wireless communication, or a combination thereof. Any wired or wireless communication may be used.
- The user communication device 220 is connected to the communication network N so as to be capable of data communication. For example, when a plurality of users P exist in one user area AU, one or more operation terminals 210, one or more presentation devices 230, and one or more user communication devices 220 may be arranged in that user area AU.
- the robot system 1 includes a server 310 that is connected to the communication network N so as to be capable of data communication.
- the server 310 manages communication via the communication network N.
- the server 310 includes a computer device.
- the server 310 manages authentication, connection, disconnection, and the like of communication between the robot communication device 150 and the user communication device 220.
- The server 310 stores identification information, security information, and the like of the robot communication devices 150 and the user communication devices 220 registered in the robot system 1, and uses this information to authenticate each device's qualification to connect to the robot system 1.
- the server 310 manages the transmission and reception of data between the robot communication device 150 and the user communication device 220, and the data may pass through the server 310.
- the server 310 may be configured to convert the data transmitted from the source into a data format available to the destination.
- The server 310 may be configured to accumulate and store information, commands, data, and the like transmitted and received between the operation terminal 210 and the control device 140 in the course of operating the robot 110.
- the server 310 is an example of an intermediary device.
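The intermediary role described above (authenticating registered devices and relaying data between the robot and user communication devices) can be illustrated with a minimal sketch. The `MediationServer` class and its methods are hypothetical names chosen for illustration, not part of the disclosure.

```python
# Hypothetical sketch of the server 310's intermediary role: it admits
# only registered devices and forwards data between connected endpoints.
class MediationServer:
    def __init__(self, registered_ids):
        # Identification information of devices registered in the system.
        self.registered_ids = set(registered_ids)
        self.connections = {}

    def authenticate(self, device_id):
        # Only devices whose identification info is stored may connect.
        return device_id in self.registered_ids

    def connect(self, device_id, handler):
        if not self.authenticate(device_id):
            raise PermissionError(f"{device_id} is not registered")
        self.connections[device_id] = handler

    def relay(self, src_id, dst_id, data):
        # Data between the endpoints passes through the server, which
        # forwards it to the destination's handler.
        if src_id not in self.connections or dst_id not in self.connections:
            raise ConnectionError("both endpoints must be connected")
        self.connections[dst_id](data)
```

A real deployment would also convert data formats per destination and log the traffic, as the surrounding text notes; those parts are omitted here.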
- the communication network N is not particularly limited, and may include, for example, a local area network (LAN), a wide area network (WAN), the Internet, or a combination of two or more of these.
- The communication network N may be configured to use short-range wireless communication such as Bluetooth (registered trademark) or ZigBee (registered trademark), a dedicated network line, a dedicated line of a telecommunications carrier, the public switched telephone network (PSTN), a mobile communication network, an Internet network, satellite communication, or a combination of two or more of these.
- the mobile communication network may use a 4th generation mobile communication system, a 5th generation mobile communication system, or the like.
- the communication network N can include one or more networks. In this embodiment, the communication network N is the Internet.
- the robot 110 includes a robot arm 111 and an end effector 112 attached to the tip of the robot arm 111.
- the robot arm 111 has a plurality of joints and can operate with multiple degrees of freedom.
- the robot arm 111 can move the end effector 112 to various positions and postures.
- the end effector 112 can act on the work W, which is the object of processing.
- the action of the end effector 112 is not particularly limited, but in the present embodiment, it is an action of gripping the work W.
- the robot arm 111 includes, but is not limited to, six joints JT1 to JT6 and servomotors RM1 to RM6 as drive devices for driving each of the joints JT1 to JT6.
- the number of joints of the robot arm 111 is not limited to six, and may be any number of five or less or seven or more.
- the end effector 112 includes a grip portion 112a capable of gripping operation, and a servomotor EM1 as a drive device for driving the grip portion 112a.
- the grip portion 112a may include two or more finger-shaped members that perform gripping operation and gripping release operation by driving the servomotor EM1.
- The drive device of the end effector 112 is not limited to the servomotor EM1 and may have a configuration corresponding to the configuration of the end effector 112. For example, when the end effector 112 attracts the work W by negative pressure, the end effector 112 is connected to a negative-pressure generating device as its drive device.
- the peripheral device 120 is arranged around the robot 110.
- the peripheral device 120 may be operated in cooperation with the operation of the robot 110.
- the operation of the peripheral device 120 may be an operation that gives an action to the work W, or may be an operation that does not give the action.
- In the present embodiment, the peripheral devices 120 include a belt conveyor 121 capable of transporting the work W, and automatic guided vehicles 122A and 122B capable of autonomously transporting the work W (hereinafter simply referred to as "transport vehicles").
- the automatic guided vehicle may be an AGV (Automatic Guided Vehicle).
- The peripheral devices 120 are not essential. In the following, the names "belt conveyor 121" and "transport vehicles 122A and 122B" are used when these devices are referred to individually, and the name "peripheral device 120" is used when they are referred to collectively.
- the image pickup devices 131 to 134 include a camera that captures a digital image, and are configured to send data of the image captured by the camera to the control device 140.
- The control device 140 may be configured to process the image data captured by the image pickup devices 131 to 134 into network-transmittable data and to send the data via the communication network N to the operation terminal 210, the presentation device 230, or both in the user area AU.
- the camera may be a camera capable of capturing an image for detecting a three-dimensional position of the subject with respect to the camera such as a distance to the subject.
- the three-dimensional position is a position in the three-dimensional space.
- The camera may be, for example, a stereo camera, a monocular camera, a TOF camera (Time-of-Flight camera), a pattern-light projection camera such as a fringe projection camera, or a camera using the light-section method.
- In the present embodiment, a stereo camera is used.
- the image pickup device 131 is arranged near the tip of the robot arm 111 and is oriented toward the end effector 112.
- the image pickup apparatus 131 can image the work W on which the end effector 112 acts.
- the image pickup apparatus 131 may be arranged at any position of the robot 110 as long as the work W can be imaged.
- the image pickup device 132 is fixedly arranged in the robot area AR, and images the robot 110 and the belt conveyor 121, for example, from above.
- the image pickup devices 133 and 134 are fixedly arranged in the robot area AR, respectively, and take images of the transport vehicles 122A and 122B waiting at the standby place near the robot 110, for example, from above.
- the image pickup devices 131 to 134 may be provided with a pan head that supports each camera and can operate so as to freely change the direction of the camera.
- the operation of the image pickup devices 131 to 134, that is, the operation of the camera and the pan head is controlled by the control device 140.
- the control device 140 includes an information processing device 141 and a robot controller 142.
- the robot controller 142 is configured to control the operation of the robot 110 and the peripheral device 120.
- the information processing device 141 is configured to process various information, commands, data, and the like transmitted and received between the robot communication device 150 and the user communication device 220.
- the information processing device 141 is configured to process commands, information, data, and the like received from the robot controller 142 and the image pickup device 131 and send them to the operation terminal 210, the presentation device 230, or both.
- the information processing apparatus 141 is configured to process commands, information, data, and the like received from the operation terminal 210 and send them to the robot controller 142.
- the information processing device 141 and the robot controller 142 include a computer device.
- The configuration of the information processing device 141 is not particularly limited; for example, it may be an electronic circuit board, an electronic control unit, a microcomputer, a personal computer, a workstation, a smart device such as a smartphone or tablet, or another electronic device.
- the robot controller 142 may include an electric circuit for controlling the electric power supplied to the robot 110 and the peripheral device 120.
- the robot communication device 150 includes a communication interface that can be connected to the communication network N.
- the robot communication device 150 is connected to the control device 140, specifically, the information processing device 141, and connects the information processing device 141 and the communication network N so as to be capable of data communication.
- the robot communication device 150 may include, for example, a communication device such as a modem, an ONU (Optical Network Unit), a router, and a mobile data communication device.
- the robot communication device 150 may include a computer device having a calculation function or the like.
- the robot communication device 150 may be included in the control device 140.
- the operation terminal 210 is configured to receive an input of a command, information, data, etc. by the user P, and output the received command, information, data, etc. to another device.
- the operation terminal 210 includes an operation input device 211 that accepts an input by the user P, and a terminal computer 212.
- The terminal computer 212 is configured to process the commands, information, data, and the like received via the operation input device 211 and output them to other devices, and to receive commands, information, data, and the like from other devices and process them.
- For example, the operation terminal 210 converts the image data from the image pickup devices 131 to 134 sent from the control device 140 into data that can be displayed on the presentation device 230, and outputs the data to the presentation device 230 for display.
- The operation terminal 210 may include the operation input device 211 and the terminal computer 212 as an integrated device or as separate devices.
- The configuration of the operation terminal 210 is not particularly limited; for example, the operation terminal 210 may be a computer such as a personal computer, a smart device such as a smartphone or tablet, a personal digital assistant, a game terminal, a known teaching device such as a teach pendant used for teaching work to a robot, a known robot operating device, another operating device, another terminal device, a device using any of these, or an improvement thereof.
- the operation terminal 210 may be a dedicated device devised for the robot system 1, but may be a general-purpose device available in the general market. In this embodiment, a known general-purpose device is used for the operation terminal 210.
- the device may be configured to realize the function of the operation terminal 210 of the present disclosure by installing dedicated software.
- The configuration of the operation input device 211 is not particularly limited; for example, it may include devices that receive input through operations by the user P, such as buttons, levers, dials, joysticks, mice, keys, touch panels, and motion capture.
- In the present embodiment, a known general-purpose device is used as the operation input device 211.
- In the present embodiment, the operation terminal 210 is a personal computer, a smartphone, or a tablet.
- When the operation terminal 210 is a personal computer, it does not include the user communication device 220 and the presentation device 230, although it may include one or both of them.
- When the operation terminal 210 is a smartphone or a tablet, the operation terminal 210 includes the user communication device 220 and the presentation device 230.
- The presentation device 230 includes a display that displays images to the user P.
- the presentation device 230 displays an image of image data received from the control device 140 via the operation terminal 210. Examples of the image data are image data captured by the image pickup apparatus 131 to 134 and screen data related to the operation of the robot 110.
- the presenting device 230 may include a speaker that emits voice to the user P.
- the presentation device 230 outputs the sound of the voice data received from the control device 140 via the operation terminal 210.
- the presentation device 230 may be included in the operation terminal 210.
- the user communication device 220 includes a communication interface that can be connected to the communication network N.
- the user communication device 220 is connected to the operation terminal 210, and connects the operation terminal 210 and the communication network N so as to be capable of data communication.
- the user communication device 220 may include communication devices such as modems, ONUs, routers and mobile data communication devices, for example.
- the user communication device 220 may include a computer device having a calculation function or the like.
- the user communication device 220 may be included in the operation terminal 210.
- FIG. 3 is a block diagram showing an example of the hardware configuration of the control device 140 according to the embodiment.
- The information processing device 141 includes a processor 1411, a memory 1412, a storage 1413, and input/output interfaces (I/F) 1414 to 1416 as components.
- the components of the information processing apparatus 141 are connected to each other by the bus 1417, but may be connected by any wired communication or wireless communication.
- the robot controller 142 includes a processor 1421, a memory 1422, an input / output I / F 1423, communication I / F 1424 and 1425, and a drive I / F 1426 as components.
- the robot controller 142 may include storage.
- the components of the robot controller 142 are interconnected by bus 1427, but may be connected by any wired or wireless communication. Not all of the components included in the information processing apparatus 141 and the robot controller 142 are indispensable.
- the information processing apparatus 141 includes a circuit, and the circuit includes a processor 1411 and a memory 1412.
- the robot controller 142 includes a circuit, which includes a processor 1421 and a memory 1422. These circuits may include processing circuits.
- the circuit, processor 1411 and memory 1412 of the information processing apparatus 141 may be separate or integrated with respect to the circuit, processor 1421 and memory 1422 of the robot controller 142, respectively.
- the circuit sends and receives commands, information, data, etc. to and from other devices.
- the circuit inputs signals from various devices and outputs control signals to each controlled object.
- Processors 1411 and 1421 are examples of the first processor.
- the memories 1412 and 1422 store programs executed by the processors 1411 and 1421, various data, and the like, respectively.
- The memories 1412 and 1422 may each include, for example, semiconductor storage devices such as a volatile memory and a non-volatile memory.
- The memories 1412 and 1422 include, but are not limited to, a RAM (Random Access Memory), which is a volatile memory, and a ROM (Read-Only Memory), which is a non-volatile memory.
- the memories 1412 and 1422 are examples of the first storage device.
- Storage 1413 stores various data.
- the storage 1413 may include a storage device such as a hard disk drive (HDD: Hard Disk Drive) and a solid state drive (SSD: Solid State Drive).
- the storage 1413 is an example of the first storage device.
- The processors 1411 and 1421 each form a computer system together with the RAM and ROM.
- the computer system of the information processing apparatus 141 may realize the function of the information processing apparatus 141 by the processor 1411 using the RAM as a work area to execute a program recorded in the ROM.
- the computer system of the robot controller 142 may realize the function of the robot controller 142 by the processor 1421 using the RAM as a work area to execute a program recorded in the ROM.
- Some or all of the functions of the information processing device 141 and the robot controller 142 may be realized by the above computer systems, by dedicated hardware circuits such as electronic circuits or integrated circuits, or by a combination of the computer systems and hardware circuits.
- The information processing device 141 and the robot controller 142 may each be configured to execute their processing by centralized control with a single device, or by distributed control through the cooperation of a plurality of devices.
- The processors 1411 and 1421 may be, for example, a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a processor core, a multiprocessor, an ASIC (Application-Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array), and each process may be realized by a logic circuit or a dedicated circuit formed in an IC (Integrated Circuit) chip, an LSI (Large Scale Integration), or the like.
- the plurality of processes may be realized by one or a plurality of integrated circuits, or may be realized by one integrated circuit.
- the information processing device 141 and the robot controller 142 may be configured to include at least a part of each other's functions, or may be integrated.
- the first input / output I / F 1414 of the information processing device 141 connects the information processing device 141 and the robot controller 142, and enables input / output of information, commands, data, etc. between them.
- the second input / output I / F 1415 connects the information processing device 141 and the robot communication device 150, and enables input / output of information, commands, data, and the like between them.
- the third input / output I / F 1416 connects the information processing device 141 and the image pickup devices 131 to 134, and enables input / output of information, commands, data, and the like between them.
- The input/output I/F 1423 of the robot controller 142 connects the robot controller 142 and the first input/output I/F 1414 of the information processing device 141, and enables input/output of information, commands, data, and the like between them.
- the first communication I / F 1424 connects the robot controller 142 and the belt conveyor 121 via wired communication, wireless communication, or a combination thereof, and enables transmission / reception of signals and the like between them.
- the first communication I / F 1424 may include a communication circuit.
- For example, the robot controller 142 may be configured to receive a signal indicating the operating state of the belt conveyor 121, such as operation execution, operation stop, and operating speed, and to control the operation of the robot 110 according to that state.
- the robot controller 142 may be configured to control the operation of the belt conveyor 121 by transmitting a signal instructing the operation state to the belt conveyor 121 according to the processing status of the work W such as the transfer status.
- the second communication I / F 1425 connects the robot controller 142 and the transport vehicles 122A and 122B via wired communication, wireless communication, or a combination thereof, and enables transmission / reception of signals and the like between them.
- the second communication I / F 1425 may include a communication circuit.
- the robot controller 142 may be configured to receive signals indicating the operating states of the transport vehicles 122A and 122B, such as the position with respect to the robot 110, arrival at the standby position, and departure from the standby position, and to control the operation of the robot 110 according to those operating states.
- the robot controller 142 may be configured to control the operation of the transport vehicles 122A and 122B by transmitting a signal instructing an operating state to the transport vehicles 122A and 122B according to the processing status of the work W, such as the loading status.
- the drive I / F 1426 connects the robot controller 142 and the drive circuit 113 of the robot 110, and enables transmission / reception of signals and the like between them.
- the drive circuit 113 is configured to control the power supplied to the servomotors RM1 to RM6 of the robot arm 111 and the servomotor EM1 of the end effector 112 according to the command values included in the signals received from the robot controller 142.
- the drive circuit 113 can drive the servomotors RM1 to RM6 and EM1 in cooperation with one another.
- the robot controller 142 may be configured to servo-control the servomotors RM1 to RM6 and EM1.
- the robot controller 142 receives, as feedback information from the drive circuit 113, the detection values of the rotation sensors included in the servomotors RM1 to RM6 and EM1 and the current command values for the servomotors RM1 to RM6 and EM1.
- the robot controller 142 determines the drive command values of the servomotors RM1 to RM6 and EM1 using the feedback information, and transmits the command values to the drive circuit 113.
- the robot controller 142 may be configured to coordinate the axis controls of the plurality of servomotors with one another.
- the robot controller 142 may be configured to control the servomotors RM1 to RM6 as robot axis controls, which are a part of the plurality of axis controls, and to control the servomotor EM1 as an external axis control, which is another part of the plurality of axis controls.
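The feedback-and-command cycle between the robot controller 142 and the drive circuit 113 described above can be sketched as follows. This is a minimal illustration only: the proportional update rule, the gain, and the target values are assumptions for exposition, not the control law of the embodiment.

```python
# Minimal sketch of the feedback cycle: the controller receives measured
# axis values as feedback, computes new drive command values, and sends
# them back. Gain and targets are illustrative assumptions.

def compute_drive_commands(targets, feedback, gain=0.5):
    """Derive new drive command values, one per servomotor axis."""
    commands = []
    for target, measured in zip(targets, feedback):
        error = target - measured
        commands.append(measured + gain * error)  # move part way toward target
    return commands

# Seven axes: six robot-arm axes (RM1-RM6) plus one external axis (EM1).
targets = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 5.0]
feedback = [0.0] * 7
for _ in range(20):                      # repeated control cycles
    feedback = compute_drive_commands(targets, feedback)
print([round(v, 2) for v in feedback])   # -> [10.0, 20.0, 30.0, 40.0, 50.0, 60.0, 5.0]
```

Treating EM1 as simply one more entry in the same loop mirrors the external-axis control described above, where the end-effector motor is handled as part of the same plurality of axis controls as the arm motors.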
- the terminal computer 212 of the operation terminal 210 includes a processor and a memory like the information processing device 141 and the like.
- the terminal computer 212 may also include input / output I / Fs for establishing connections between the terminal computer 212 and the operation input device 211, between the terminal computer 212 and the user communication device 220, and between the terminal computer 212 and the presentation device 230.
- FIG. 4 is a block diagram showing an example of the functional configuration of the control device 140 according to the embodiment.
- the information processing apparatus 141 includes, as functional components, a reception information processing unit 141a, a transmission information processing unit 141b, an image pickup control unit 141c, image processing units 141d1 to 141d3, a model generation unit 141e, a candidate determination unit 141f, a scheduled operation detection unit 141g, an operation command unit 141h, attribute information processing units 141i1 and 141i2, and storage units 141s1 to 141s5.
- the functions of the functional components other than the storage units 141s1 to 141s5 are realized by the processor 1411 and the like, and the functions of the storage units 141s1 to 141s5 are realized by the memory 1412, the storage 1413, or a combination thereof. Not all of the above functional components are required.
- the robot controller 142 includes a drive command unit 142a, a motion information processing unit 142b, and a storage unit 142c as functional components.
- the functions of the drive command unit 142a and the motion information processing unit 142b are realized by the processor 1421 and the like, and the functions of the storage unit 142c are realized by the memory 1422 and the like. Not all of the above functional components are required.
- the storage units 141s1 to 141s5 and 142c store various information and data, and enable the stored information and data to be read out.
- the first storage unit 141s1 stores a control program for causing the robot 110 to automatically execute a predetermined work.
- the control program may include a calculation program, calculation formulas, or a combination thereof for calculating the operation amount, operation direction, operation speed, acceleration, etc. of each part of the robot 110 in the process of causing the robot 110 to perform a desired operation.
- the control program is a program for causing the robot 110 to automatically execute the work of transferring the work W, which is a beverage bottle conveyed by the belt conveyor 121, to the transport vehicle 122A or 122B.
- the work W includes two types of work WA and WB, the work WA is transferred to the transport vehicle 122A, and the work WB is transferred to the transport vehicle 122B.
- the first storage unit 141s1 may store the information of the robot 110.
- the information of the robot 110 may include, for example, the type, identification information and characteristics of the robot arm 111 and the end effector 112.
- the characteristics of the robot arm 111 may include the position, model, shape, dimensions, operating direction and operating range of the robot arm 111, and the position, type, operating direction and operating range of the joint, and the like.
- the characteristics of the end effector 112 may include the shape and dimensions of the end effector 112, the position of the operating portion of the end effector 112, the operating direction, the operating range, and the like.
- the properties of the robot arm 111 and the end effector 112 may include their elasticity, plasticity, toughness, brittleness, malleability and the like.
- the information of the robot 110 may include a virtual model such as a two-dimensional model and a three-dimensional model of the robot arm 111 and the end effector 112.
- the second storage unit 141s2 stores information about the work W.
- the second storage unit 141s2 stores the first attribute information of the work W.
- the first attribute information includes the characteristics of the work W and the work work information which is information related to the predetermined work set in the work W.
- the characteristics of the work W may include, for example, the type, designation, identification information, characteristics, and the like of the work W.
- the characteristics of the work W may include the shape, dimensions, weight, elasticity, plasticity, toughness, brittleness, ductility, hollowness, solidity, center of gravity position, opening position and the like of the work W.
- the characteristics of the work W may include the presence / absence, type and amount of contents, and the presence / absence of closure of the opening.
- the first attribute information may include a virtual model of the work W such as a two-dimensional model and a three-dimensional model of the work W as a feature of the work W.
- the work work information may include the order in which the work W receives processing, the speed and acceleration that can be applied to the work W, the positions on the work W to which the end effector 112 can apply an action, and the state of the work W when the action by the end effector 112 is applied.
- the work work information may include a position where the work W can be gripped by the end effector 112, and may include, for example, a candidate for a gripping position.
- the work work information may include candidates for the posture of the work W during transfer by the robot 110.
- the work work information may include a gripping force that can be applied to the work W during gripping by the end effector 112, and may include, for example, a candidate for gripping force.
- the characteristics of the work W and the work work information can be set according to the work W and the predetermined work.
- FIG. 5 is a diagram showing an example of work work information included in the first attribute information according to the embodiment.
- In the work WA, a gripping position GP1 at the upper end, which is the open end, a gripping position GP2 at the bottom end, and a plurality of gripping positions GP3 to GP8 on the side portion between the upper end and the bottom end are preset as candidates for the gripping position.
- the work work information of the work WA includes the information of the gripping positions GP1 to GP8.
- the work work information of the work WB may also include information on the gripping position similar to the work work information of the work WA.
- FIG. 6 is a diagram showing another example of work work information included in the first attribute information according to the embodiment.
- the postures Pa, Pb, Pc and Pd of the work WA are preset as candidates for the posture of the work WA during transfer from the belt conveyor 121 to the transport vehicles 122A and 122B.
- the work work information of the work WA includes information on postures Pa, Pb, Pc and Pd.
- the work work information of the work WB may also include information of the same posture as the work work information of the work WA.
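As a non-limiting sketch, the first attribute information described above, that is, the features of the work W together with its work work information including the gripping-position candidates GP1 to GP8 and the posture candidates Pa to Pd, might be organized in memory as follows. All field names and the dimension and force fields are illustrative assumptions; only the identifiers GP1 to GP8 and Pa to Pd come from the description.

```python
# Hypothetical in-memory layout of the first attribute information for a
# work W. Field names are assumptions for illustration.

from dataclasses import dataclass, field

@dataclass
class WorkWorkInformation:
    gripping_positions: list = field(default_factory=list)   # e.g. GP1-GP8
    transfer_postures: list = field(default_factory=list)    # e.g. Pa-Pd
    max_gripping_force: float = 0.0                          # force candidate

@dataclass
class FirstAttributeInformation:
    work_type: str = ""
    shape: str = ""
    work_work_info: WorkWorkInformation = field(default_factory=WorkWorkInformation)

work_wa = FirstAttributeInformation(
    work_type="WA",
    shape="beverage bottle",
    work_work_info=WorkWorkInformation(
        gripping_positions=[f"GP{i}" for i in range(1, 9)],
        transfer_postures=["Pa", "Pb", "Pc", "Pd"],
    ),
)
print(work_wa.work_work_info.gripping_positions)
```

A record of this shape would be what the second storage unit 141s2 stores and what the candidate determination unit 141f later reads its gripping-position candidates from.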
- the third storage unit 141s3 stores information about components other than the robot 110 in the robot area AR.
- the third storage unit 141s3 stores the second attribute information and the third attribute information.
- the second attribute information includes the characteristics of the surrounding environment of the work W and the surrounding environment work information which is information about the predetermined work set in the surrounding environment.
- the characteristics of the surrounding environment of the work W include, for example, the characteristics of work handling elements such as devices, equipment, and equipment that handle the work W other than the robot 110.
- the characteristics of the surrounding environment of the work W include the characteristics of the peripheral device 120, specifically, the characteristics of the belt conveyor 121 and the transport vehicles 122A and 122B.
- the third attribute information includes the characteristics of the components in the robot area AR other than the surrounding environment included in the second attribute information.
- the third attribute information includes the features of the image pickup devices 131 to 134.
- the feature may include, for example, identification information and characteristics of the image pickup devices 131 to 134.
- the characteristics of the image pickup devices 131 to 134 may include the positions, postures, shapes, dimensions, installation methods, and required separation distances of the image pickup devices 131 to 134.
- the characteristics of the belt conveyor 121 may include, for example, the type, name, identification information, characteristics, and the like of the belt conveyor 121.
- the characteristics of the belt conveyor 121 may include the position, posture, shape, and dimensions of the belt conveyor 121; the position, posture, shape, and dimensions of the work area of the robot 110 on the belt conveyor 121; the presence or absence of an object that obstructs the space around the work area; and the position, posture, shape, dimensions, etc. of such an object.
- the characteristics of the transport vehicles 122A and 122B may include, for example, the types, names, identification information, characteristics, and the like of the transport vehicles 122A and 122B.
- the characteristics of the transport vehicles 122A and 122B may include the standby positions, standby postures, shapes, and dimensions of the transport vehicles 122A and 122B, and the characteristics of the mounting portions 122Aa and 122Ba for the work WA and WB in the transport vehicles 122A and 122B.
- the standby position and the standby posture are the positions and orientations of the transport vehicles 122A and 122B when the work WA and WB are processed by the robot 110.
- the properties of the mounting portions 122Aa and 122Ba may include the position, shape, dimensions, tilt amount, elasticity, plasticity, toughness, brittleness and ductility of the surfaces of the mounting portions 122Aa and 122Ba.
- the shape of the surface may include a planar shape in a plan view, a concave-convex shape in the vertical direction, and the like.
- the characteristics of the transport vehicles 122A and 122B may include the presence or absence of an object that blocks the space around the mounting portions 122Aa and 122Ba, and the position, shape, and dimensions of the object.
- the second attribute information may include virtual models, such as two-dimensional models and three-dimensional models of the belt conveyor 121 and the transport vehicles 122A and 122B, as features of the belt conveyor 121 and the transport vehicles 122A and 122B.
- the third storage unit 141s3 may include a virtual model of the constituent elements as information of the constituent elements other than the belt conveyor 121 and the transport vehicles 122A and 122B in the robot area AR.
- the surrounding environment work information includes, for example, information on predetermined work for a work handling element that handles a work W other than the robot 110.
- the surrounding environment work information may include the position, state, processing method, etc. of the work W when the work W is processed by the work handling element.
- the surrounding environment work information of the transport vehicles 122A and 122B may include candidates for the placement positions of the work WA and WB on the surfaces of the mounting portions 122Aa and 122Ba, candidates for the placement postures, candidates for the placement order, the placement direction, and the placement method.
- the placement direction may indicate the transfer direction of the work WA and WB to the mounting portions 122Aa and 122Ba.
- the arrangement method may indicate the degree of impact that the work WA and WB give to the mounting portions 122Aa and 122Ba at the time of mounting, the acceleration of the work WA and WB, and the like.
- FIG. 7 is a diagram showing an example of surrounding environment work information included in the second attribute information according to the embodiment.
- FIG. 7 shows a plan view of the mounting portion 122Aa of the transport vehicle 122A.
- the transport vehicle 122A includes a wall 122Ab that surrounds the mounting portion 122Aa and extends upward from the mounting portion 122Aa.
- the wall 122Ab forms a U-shape in a plan view from above, and a part of the mounting portion 122Aa is laterally opened between the open ends 122Aba and 122Abb.
- the wall 122Ab opens the mounting portion 122Aa upward.
- On the surface of the mounting portion 122Aa, a plurality of placement positions P1 to P20 for the bottom of the work WA are preset as candidates for the placement position.
- the arrangement positions P1 to P20 are arranged in 4 rows ⁇ 5 columns.
- the surrounding environment work information of the transport vehicle 122A includes the arrangement positions P1 to P20, the posture of the work WA at the arrangement positions P1 to P20, the arrangement direction of the work WA toward the arrangement positions P1 to P20, and a method of placing the work WA on the arrangement positions P1 to P20 with a low impact, that is, with a low acceleration.
- the surrounding environment work information of the transport vehicle 122B may include the same information as the surrounding environment work information of the transport vehicle 122A.
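The 4-row by 5-column grid of arrangement-position candidates P1 to P20 on the mounting portion 122Aa described above can be sketched as follows. The pitch values (spacing between positions) are invented for illustration; only the 4 × 5 layout and the names P1 to P20 come from the description.

```python
# Sketch of the 4 x 5 grid of placement-position candidates P1-P20 on the
# mounting portion 122Aa. Pitch values are illustrative assumptions.

def placement_grid(rows=4, cols=5, pitch_x=0.12, pitch_y=0.12):
    """Return {name: (x, y)} for placement positions P1..P(rows*cols)."""
    grid = {}
    for r in range(rows):
        for c in range(cols):
            name = f"P{r * cols + c + 1}"       # P1..P20, row-major order
            grid[name] = (c * pitch_x, r * pitch_y)
    return grid

positions = placement_grid()
print(len(positions), positions["P1"], positions["P20"])
```

Holding the candidates as named coordinates like this is one way the second attribute information could expose both the identifiers (for presentation to the user P) and the physical targets (for motion planning).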
- the fourth storage unit 141s4 stores route-related data.
- the route-related data includes information for determining the movement route of the end effector 112 when performing a predetermined work.
- the route-related data includes information for determining the movement route of the end effector 112 when performing the operation when at least a part of the operation included in the predetermined work is specified.
- the route-related data may include an arithmetic expression, an arithmetic program, or a combination thereof of the movement path of the end effector 112.
- the fifth storage unit 141s5 accumulates and stores the log information of the robot 110.
- the fifth storage unit 141s5 stores, as log information, information about the operation results of the robot arm 111 and the end effector 112 of the robot 110 executing the predetermined work, the commands for those operations, or both.
- the stored log information may include information about all the operation results of the robot arm 111 and the end effector 112.
- the stored log information may include at least information on the operation results in and near the work area of the end effector 112 on the belt conveyor 121 and information on the operation results in and near the transport vehicles 122A and 122B.
- the storage unit 142c stores information for generating a drive command using the operation command received from the operation command unit 141h by the drive command unit 142a.
- the reception information processing unit 141a receives commands, information, data, etc. from the operation terminal 210 via the communication network N and the robot communication device 150, and sends them to the corresponding functional components in the information processing device 141.
- the reception information processing unit 141a may have a function of converting received commands, information, data, and the like into a data type that can be processed in the information processing apparatus 141.
- the transmission information processing unit 141b transmits commands, information, data, etc. output by each functional component of the information processing device 141 to the operation terminal 210 via the robot communication device 150 and the communication network N.
- the transmission information processing unit 141b may have a function of converting commands, information, data, and the like to a transmission target into a data type capable of network communication.
- the image pickup control unit 141c controls the operation of the image pickup devices 131 to 134 and outputs the image data captured by the image pickup devices 131 to 134.
- the image pickup control unit 141c controls the operation of the camera and the pan head of the image pickup devices 131 to 134 according to the command received from the operation terminal 210.
- the image pickup control unit 141c receives image data from the image pickup device 131 and outputs the image data to the first image processing unit 141d1 and the transmission information processing unit 141b.
- the image pickup control unit 141c transmits to the operation terminal 210 the image data of the image pickup apparatus, among the image pickup apparatuses 131 to 134, designated by the command from the operation terminal 210.
- the operation terminal 210 outputs the image data to the presentation device 230 and displays it.
- the first image processing unit 141d1 processes the image data captured by the image pickup device 131, the image pickup device 132, or both of them.
- the first image processing unit 141d1 performs image processing for extracting the work W and surrounding components from the image indicated by the image data. Further, the first image processing unit 141d1 may detect the three-dimensional position of the subject projected on the pixel for displaying the work W and surrounding components.
- the first image processing unit 141d1 may extract edges from each of the two image data captured at the same time by the stereo camera of the image pickup apparatus 131 or 132. Further, the first image processing unit 141d1 may specify the edges of the work W by comparing the extracted edges with the shape of the work W included in the first attribute information stored in the second storage unit 141s2 by a pattern matching method or the like. The first image processing unit 141d1 may also specify the edges of the belt conveyor 121 and of an object that obstructs the space around the work area as the edges of surrounding components, by comparing the extracted edges with the shapes of the belt conveyor 121 and of the object included in the second attribute information stored in the third storage unit 141s3 by a pattern matching method or the like.
- the first image processing unit 141d1 may process the pixels projecting the work W and the surrounding components between the two image data by a stereo matching method or the like, and may detect the distance between the camera and the subject projected on each pixel. Further, the first image processing unit 141d1 may detect, for the subject projected on each pixel, a three-dimensional position in the three-dimensional space in which the robot system 1 exists.
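The stereo-matching step above yields, for each pixel, a disparity between the two simultaneously captured images; the standard pinhole-camera triangulation Z = f · B / d then converts that disparity into the camera-to-subject distance. The focal length and baseline values below are illustrative assumptions, not parameters of the image pickup apparatuses 131 and 132.

```python
# Standard stereo triangulation: distance Z = focal_length * baseline / disparity.
# Focal length (pixels) and baseline (metres) are illustrative assumptions.

def disparity_to_depth(disparity_px, focal_px=700.0, baseline_m=0.1):
    """Distance (metres) to the subject projected on a matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# With these values, a subject with 35 px disparity lies 2 m from the camera.
depth = disparity_to_depth(35.0)
print(depth)  # -> 2.0
```

Applying this per matched pixel, together with the camera pose stored as third attribute information, would give the three-dimensional positions in the space of the robot system 1 that the description refers to.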
- the model generation unit 141e generates a virtual model of the work W and the surrounding components extracted from the image data by the first image processing unit 141d1. For example, the model generation unit 141e generates a virtual model representing the work W and the surrounding components projected in the image data, using the information about the work W stored in the second storage unit 141s2 and the information about the surrounding components stored in the third storage unit 141s3. For example, the model generation unit 141e may generate a three-dimensional CAD (Computer-Aided Design) model of the work W and the surrounding components. By generating such a virtual model, the model generation unit 141e and the first image processing unit 141d1 can detect state information including the state of the work W. The virtual model representing the work W and its surrounding components is an example of state information regarding the work W.
- the state information may include, for example, various information indicating the state of the work W and surrounding components.
- the state information may include, in addition to or instead of the virtual models of the work W and the surrounding components, the positions and postures of the work W and the surrounding components, and the movement directions and movement speeds of those positions and postures.
- the state information may include various information indicating the state of the surrounding environment of the work W.
- the state information may include the arrangement state of the work W in the transport vehicle 122A or 122B to which the work W is transferred.
- the candidate determination unit 141f determines a candidate for a work position related to the work W based on the state information, and outputs the candidate to the second image processing unit 141d2.
- Candidates for the work position regarding the work W include candidates for the gripping position of the presentation target in the work W to be gripped by the robot 110, and candidates for the placement position of the presentation target in the transport vehicle 122A or 122B to which the work W is transferred.
- the candidate determination unit 141f may search the fifth storage unit 141s5 and output to the second image processing unit 141d2 information on the gripping positions and arrangement positions previously determined for works in the same state as the work W.
- the candidate determination unit 141f determines the candidates for the gripping position to be presented to the user P from among the candidates for the gripping position of the work W included in the first attribute information, using as the state information the models of the work W and the surrounding components generated by the model generation unit 141e. For example, when the model of the work WA is in the upright state, as in the posture Pa in FIG. 6, the candidate determination unit 141f determines the grippable gripping positions GP1 and GP3 to GP8 as the presentation targets, as shown in FIG. 8.
- FIG. 8 is a diagram showing an example of candidates for the gripping position of the presentation target determined for the work WA to be gripped.
- the candidate determination unit 141f may determine the gripping positions GP3 to GP8 as the presentation target when the work WA is lying down. For example, when another work is stacked on the lying work WA, the candidate determination unit 141f may determine the grippable gripping position among the gripping positions GP1 to GP8 as the presentation target.
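The posture-dependent filtering performed by the candidate determination unit 141f can be sketched as a simple rule table. The rules below reflect only the two examples given above (upright: GP1 and GP3 to GP8; lying: GP3 to GP8), and the occlusion check for stacked works is a hypothetical extension of that example.

```python
# Hedged sketch of posture-dependent gripping-candidate filtering.
# Rule table is an assumption built only from the examples in the text.

ALL_GRIPS = [f"GP{i}" for i in range(1, 9)]   # GP1-GP8

def grippable_candidates(posture, occluded=()):
    if posture == "upright":          # bottom-end GP2 faces the conveyor
        candidates = [g for g in ALL_GRIPS if g != "GP2"]
    elif posture == "lying":          # both end positions are unreachable
        candidates = [g for g in ALL_GRIPS if g not in ("GP1", "GP2")]
    else:
        candidates = list(ALL_GRIPS)
    # works stacked on a lying work may occlude some side positions
    return [g for g in candidates if g not in occluded]

print(grippable_candidates("upright"))          # GP1 and GP3-GP8
print(grippable_candidates("lying", {"GP4"}))   # GP3 and GP5-GP8
```

The surviving identifiers are what the second image processing unit 141d2 would then overlay on the image of the work WA for selection by the user P.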
- the candidate determination unit 141f determines a candidate for the arrangement position to be presented to the user P from the arrangement positions of the transfer destination transport vehicles 122A or 122B included in the second attribute information.
- the candidate determination unit 141f may determine the arrangement positions on which the work W can be placed as the presentation targets, using as the state information the information on the arrangement positions where works W are already arranged.
- the information on the arrangement position may be stored in the third storage unit 141s3, the fifth storage unit 141s5, or the like.
- the candidate determination unit 141f determines, as the presentation targets, the remaining arrangement positions P8 to P20, excluding the arrangement positions where works WA are already arranged.
- FIG. 9 is a diagram showing an example of candidates for the arrangement position of the presentation target determined on the transport vehicle 122A to which the work WA is transferred.
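The exclusion of already-occupied arrangement positions can be sketched as a set difference over P1 to P20; the occupied set P1 to P7 below matches the example in which P8 to P20 remain as presentation targets. This is an illustration, not the embodiment's implementation.

```python
# Sketch of excluding occupied arrangement positions, as the candidate
# determination unit 141f does. Illustrative only.

ALL_ARRANGEMENTS = [f"P{i}" for i in range(1, 21)]   # P1-P20, 4 rows x 5 columns

def placeable_candidates(occupied):
    """Arrangement positions still available as presentation targets."""
    return [p for p in ALL_ARRANGEMENTS if p not in occupied]

occupied = {f"P{i}" for i in range(1, 8)}            # P1-P7 already hold works
print(placeable_candidates(occupied))                # -> P8-P20
```

The occupied set would be maintained from the state information or the log information in the fifth storage unit 141s5 as each work WA is transferred.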
- the second image processing unit 141d2 converts the candidates determined by the candidate determination unit 141f into images and transmits them to the operation terminal 210.
- the candidates may include candidates for the gripping position of the presentation target in the work W and candidates for the placement position of the presentation target in the transport vehicle 122A or 122B.
- the second image processing unit 141d2 may also clearly indicate in the image, together with the candidates for the gripping position and the arrangement position to be presented, information on the gripping positions and arrangement positions previously determined for works in the same state as the work W. Thereby, the user of the operation terminal 210 can determine the gripping position and the arrangement position with reference to past information indicating a similar state.
- the second image processing unit 141d2 may generate the data of the image IA shown in FIG. 8, using the information on the candidates for the gripping position of the presentation target of the work WA and the model of the work WA generated by the model generation unit 141e.
- the image IA clearly shows the gripping positions GP1 and GP3 to GP8, which are candidates for presentation, on the image of the model of the work WA.
- the second image processing unit 141d2 transmits the image IA data and the request for selecting the gripping position to the operation terminal 210.
- the second image processing unit 141d2 may transmit the information of the gripping positions GP1 and GP3 to GP8 to the operation terminal 210 instead of the data of the image IA.
- the second image processing unit 141d2 may synthesize the image IA data with the image data captured by the image pickup device 131, the image pickup device 132, or both of them. As a result, the gripping positions GP1 and GP3 to GP8 are clearly indicated in the image of the work WA projected on the image captured by the image pickup device 131, the image pickup device 132, or both of them.
- the user P of the operation terminal 210 selects one of the gripping positions GP1 and GP3 to GP8 presented by the operation terminal 210 on the presentation device 230, inputs the selection result to the operation terminal 210 as the selected work position, and transmits it to the information processing device 141.
- the selected work position is an example of the selected position.
- the second image processing unit 141d2 may generate the data of the image IB shown in FIG. 9, using the information on the candidates for the arrangement position of the presentation target of the transport vehicle 122A or 122B and the information on the transport vehicle 122A or 122B included in the second attribute information.
- the image IB clearly indicates the placement positions P8 to P20, which are candidates for presentation, on the image of the mounting portion 122Aa or 122Ba.
- the second image processing unit 141d2 transmits the data of the image IB and the request for selecting the arrangement position to the operation terminal 210.
- the second image processing unit 141d2 may transmit the information of the arrangement positions P8 to P20 to the operation terminal 210 instead of the data of the image IB.
- the second image processing unit 141d2 may synthesize the image IB data with the image data captured by the image pickup apparatus 133 or 134. As a result, the placement positions P8 to P20 are clearly indicated in the image of the mounting portion 122Aa or 122Ba projected on the image captured by the image pickup apparatus 133 or 134.
- the user P of the operation terminal 210 selects one of the arrangement positions P8 to P20 presented by the operation terminal 210 on the presentation device 230, inputs the selection result to the operation terminal 210 as the selected work position, and transmits it to the information processing device 141.
- the scheduled motion detection unit 141g receives information on the selected work position selected from the work position candidates related to the work W from the operation terminal 210.
- the scheduled motion detection unit 141g detects the scheduled motion of the robot 110 according to the selected work position by using the information of the selected work position and the route-related data stored in the fourth storage unit 141s4.
- the scheduled motion detection unit 141g determines the gripping position of the work W included in the selected work position as the start point, determines the arrangement position of the work W included in the selected work position as the end point, and calculates the movement path of the end effector 112 from the start point to the end point using the arithmetic expression, the arithmetic program, or both of the route-related data.
- the scheduled motion detection unit 141g calculates the posture of the end effector 112 at each position on the movement path of the end effector 112.
- the scheduled motion detection unit 141g detects the moving path of the end effector 112 and the posture in the moving path as the scheduled motion of the robot 110 according to the selected work position.
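The start-to-end path computation described above can be sketched as follows. Straight-line interpolation stands in here for the arithmetic expression or arithmetic program of the route-related data, which the description does not specify, and the coordinates are hypothetical.

```python
# Minimal sketch: compute waypoints from the selected gripping position
# (start point) to the selected arrangement position (end point).
# Linear interpolation and all coordinates are illustrative assumptions.

def linear_path(start, end, steps=5):
    """Return steps + 1 waypoints (3-D points) from start to end."""
    waypoints = []
    for i in range(steps + 1):
        t = i / steps
        waypoints.append(tuple(s + t * (e - s) for s, e in zip(start, end)))
    return waypoints

grip_point = (0.5, 0.0, 0.2)    # hypothetical start: gripping position
place_point = (0.0, 0.8, 0.1)   # hypothetical end: arrangement position
path = linear_path(grip_point, place_point)
print(path[0], path[-1], len(path))
```

A real route-related program would additionally compute the posture of the end effector 112 at each waypoint, as described above, and avoid obstructing objects recorded in the second and third attribute information.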
- the scheduled motion detection unit 141g transmits the detected scheduled motion to the operation terminal 210 via the third image processing unit 141d3.
- the operation terminal 210 presents the received scheduled operation to the user P via the presentation device 230.
- the operation terminal 210 can accept an input approving the scheduled operation, an input modifying the scheduled operation, an input changing the selected gripping position and arrangement position, and an input changing the first attribute information and the second attribute information.
- when the operation terminal 210 receives approval of the scheduled operation, the operation terminal 210 transmits the approval result to the information processing apparatus 141.
- the scheduled motion detection unit 141g sends the approved scheduled motion to the motion command unit 141h.
- when the operation terminal 210 receives a correction of the scheduled operation, the operation terminal 210 transmits the received correction content to the information processing apparatus 141.
- the scheduled motion detection unit 141g modifies the scheduled motion so as to reflect the modified content of the scheduled motion, and generates the modified scheduled motion as a new scheduled motion.
- the scheduled motion detection unit 141g transmits a new scheduled motion to the operation terminal 210.
- when the operation terminal 210 receives a change of the gripping position, the arrangement position, or both, the operation terminal 210 transmits the received change content to the information processing apparatus 141.
- the scheduled motion detection unit 141g generates a new scheduled motion according to the changes in the gripping position, the placement position, or both of them.
- the scheduled motion detection unit 141g transmits a new scheduled motion to the operation terminal 210.
- when the operation terminal 210 receives a change of the first attribute information, the second attribute information, or both, the operation terminal 210 transmits the received change content to the information processing apparatus 141.
- the first attribute information processing unit 141i1 and the second attribute information processing unit 141i2 change, according to the received change content, the first attribute information stored in the second storage unit 141s2, the second attribute information stored in the third storage unit 141s3, or both.
- the first attribute information processing unit 141i1 and the second attribute information processing unit 141i2 store the changed first attribute information and second attribute information in the second storage unit 141s2, the third storage unit 141s3, or both, as new first attribute information and second attribute information.
- the scheduled motion detection unit 141g generates a new scheduled motion using the new first attribute information and the second attribute information, and transmits the new scheduled motion to the operation terminal 210.
- the scheduled motion detection unit 141g detects the presence or absence of interference between the robot 110 and components other than the robot 110 in the robot area AR in the course of the scheduled motion by using the information about those components stored in the third storage unit 141s3, and transmits the detection result to the operation terminal 210.
- the scheduled motion detection unit 141g may detect the occurrence of interference when the movement path of the end effector 112 passes through a region within a predetermined distance from the image pickup devices 131 to 134 based on the third attribute information.
- when the scheduled motion detection unit 141g detects the occurrence of interference, it may recalculate the movement path of the end effector 112 based on the separation distance from the interference target or the like so as to avoid the interference.
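A minimal sketch of such an interference check (hypothetical; the patent does not specify the algorithm) might test each waypoint of the movement path against spherical keep-out regions around components such as the image pickup devices:

```python
import math

def path_interferes(path, obstacle_points, clearance):
    """Return True if any waypoint on the end-effector movement path comes
    within `clearance` of an obstacle point (e.g. an image pickup device).

    Illustrative only: real interference detection would use full component
    geometry rather than point obstacles."""
    for waypoint in path:
        for obstacle in obstacle_points:
            if math.dist(waypoint, obstacle) < clearance:
                return True  # interference detected; path should be recalculated
    return False
```

When this check reports interference, the planner would recalculate the path so that every waypoint clears the keep-out regions.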
- the third image processing unit 141d3 visualizes the scheduled motion generated by the scheduled motion detecting unit 141g and transmits it to the operation terminal 210.
- the third image processing unit 141d3 may generate image data that clearly indicates the movement path of the end effector 112 by using information about components other than the robot 110 in the robot area AR stored in the third storage unit 141s3, information about the robot 110 stored in the first storage unit 141s1, and the scheduled motion detected by the scheduled motion detection unit 141g.
- the third image processing unit 141d3 may generate image IC data as shown in FIG.
- the image IC includes a movement path TP of the end effector 112 displayed as a broken line, a model of the robot arm 111, a model of the end effector 112, a model of the belt conveyor 121, a model of the transport vehicle 122A to which the work W is transferred, and models of the image pickup devices 132 and 133, which are other components in the robot area AR.
- when interference is detected, an image ID indicating the interference is displayed.
- when the scheduled motion is changed by the operation terminal 210 in order to avoid interference, the scheduled motion detection unit 141g generates a new scheduled motion according to the changed content, and the third image processing unit 141d3 clearly shows the movement path TP1 of the new scheduled motion on the image IC as an alternate long and short dash line.
- the user P of the operation terminal 210 can visually confirm the movement route and determine the approval of the scheduled operation.
- the first attribute information processing unit 141i1 changes the first attribute information stored in the second storage unit 141s2 according to the command received from the operation terminal 210, and stores the changed first attribute information in the second storage unit 141s2 as new first attribute information. That is, the first attribute information processing unit 141i1 updates the first attribute information.
- the first attribute information processing unit 141i1 may transmit the first attribute information to the operation terminal 210.
- the first attribute information processing unit 141i1 may transmit the first attribute information corresponding to the selected work position to the operation terminal 210.
- the first attribute information processing unit 141i1 may output the first attribute information to the second image processing unit 141d2 and the third image processing unit 141d3, and make the first attribute information clearly appear on the generated image.
- the second attribute information processing unit 141i2 changes the second attribute information stored in the third storage unit 141s3 according to the command received from the operation terminal 210, and stores the changed second attribute information in the third storage unit 141s3 as new second attribute information. That is, the second attribute information processing unit 141i2 updates the second attribute information.
- the second attribute information processing unit 141i2 may transmit the second attribute information to the operation terminal 210.
- the second attribute information processing unit 141i2 may transmit the second attribute information corresponding to the selected work position to the operation terminal 210.
- the second attribute information processing unit 141i2 may output the second attribute information to the second image processing unit 141d2 and the third image processing unit 141d3 so that the second attribute information is clearly shown on the generated image.
- the operation command unit 141h generates an operation command for moving and operating the end effector 112 according to the control program stored in the first storage unit 141s1.
- the operation command unit 141h generates an operation command for moving and operating the end effector 112 according to the approved scheduled operation generated by the scheduled operation detection unit 141g. That is, the operation command unit 141h generates an operation command according to the selected work position, the first attribute information, and the second attribute information.
- the operation command unit 141h transmits an operation command to the robot controller 142.
- the operation command includes at least one of the position command and the force command of the end effector 112, and in the present embodiment includes both. Further, the operation command includes a command of the gripping force of the end effector 112 with respect to the work W.
- the operation command unit 141h may store, as log information in the fifth storage unit 141s5, the operation command, the approved scheduled operation, the gripping position and the arrangement position included in the scheduled operation, the drive command acquired from the drive command unit 142a, or a combination of two or more thereof.
- the position command may include commands such as the target position of the end effector 112 in the three-dimensional space, the moving speed of the target position, the target posture, and the moving speed of the target posture.
- the force command may include commands such as the magnitude and direction of the force applied by the end effector 112 to the work W in three-dimensional space.
- the force command may include the acceleration that the end effector 112 applies to the work W.
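The operation command described above could be modeled, purely for illustration (field names are assumptions, not the patent's terminology), as a simple data container combining the position command, the force command, and the gripping-force command:

```python
from dataclasses import dataclass

@dataclass
class OperationCommand:
    """Hypothetical container mirroring the described operation command."""
    target_position: tuple   # (x, y, z) target of the end effector in 3D space
    target_posture: tuple    # e.g. (roll, pitch, yaw) target posture
    move_speed: float        # moving speed toward the target position
    force: tuple             # magnitude/direction of force applied to the work
    gripping_force: float    # gripping force of the end effector on the work W
```

Such a structure would be generated by the operation command unit 141h and consumed by the robot controller 142 when producing drive commands.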
- the drive command unit 142a generates a drive command for operating the robot arm 111 and the end effector 112 by using the information stored in the storage unit 142c so that the end effector 112 moves and grips according to the operation command.
- the drive command includes the command values of the currents of the servomotors RM1 to RM6 of the robot arm 111 and the servomotors EM1 of the end effector 112.
- the drive command unit 142a generates a drive command using the feedback information received from the operation information processing unit 142b.
- the operation information processing unit 142b acquires information on the rotation amount and current value from the servomotors RM1 to RM6 and EM1, and outputs the information to the drive command unit 142a as feedback information.
- the motion information processing unit 142b acquires the rotation amount of each servomotor from the rotation sensor included in the servomotor.
- the operation information processing unit 142b acquires the current value of each servomotor from the command value of the current of the drive circuit of the servomotor.
- the operation information processing unit 142b may acquire a current value from the current sensor.
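A hedged sketch of how the drive command unit might turn the rotation-sensor feedback into a servomotor current command value (a simple proportional controller, chosen here only for illustration; the patent does not specify the control law):

```python
def current_command(target_angle, measured_angle, gain=2.0, max_current=5.0):
    """Compute a bounded current command value for one servomotor from the
    feedback information (measured rotation amount vs. commanded target).

    gain and max_current are illustrative tuning constants."""
    error = target_angle - measured_angle      # feedback error
    current = gain * error                     # proportional control term
    return max(-max_current, min(max_current, current))  # saturate the command
```

In the described system, one such command value would be produced for each of the servomotors RM1 to RM6 and EM1 on every control cycle.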
- FIGS. 11A to 11C are flowcharts showing an example of the operation of the robot system 1 according to the embodiment.
- the user P inputs, to the operation terminal 210, a request to take charge of operating a robot that transfers a work, and the operation terminal 210 transmits the request to the server 310 (step S101).
- the server 310 searches for a robot 110 capable of performing the work, and connects the information processing device 141 of the searched robot 110 and the operation terminal 210 via the communication network N (step S102).
- when the user P receives the notification of connection completion from the server 310, the user P inputs a transfer work execution command to the operation terminal 210.
- the operation terminal 210 transmits the command to the information processing device 141 (step S103).
- the information processing apparatus 141 starts controlling the transfer work and automatically causes the robot 110 to operate according to the control program stored in the first storage unit 141s1 (step S104).
- the information processing device 141 processes the image data captured by the image pickup device 131 and extracts the work W to be transferred and its surrounding components projected in the image data (step S105).
- the information processing device 141 may process the image data captured by the image pickup device 132 to extract the work W or the like to be transferred.
- the information processing apparatus 141 further processes the image data and detects the three-dimensional positions of the extracted work W and its surrounding components (step S106).
- the information processing apparatus 141 generates a virtual model of the work W extracted in step S105 and its surrounding components (step S107), and determines candidates for the gripping position of the work W based on the virtual model and the first attribute information (step S108).
- the information processing device 141 generates image data indicating a candidate for the gripping position of the work W and transmits it to the operation terminal 210 (step S109). The information processing device 141 also transmits the first attribute information to the operation terminal 210.
- the operation terminal 210 displays the received image data and the first attribute information on the presentation device 230.
- the user P can visually recognize the candidate for the gripping position of the work W displayed on the presentation device 230 and select the gripping position.
- when the operation terminal 210 receives a command from the user P to specify the selected gripping position, which is one of the candidates for the gripping position of the work W, the operation terminal 210 transmits information on the selected gripping position to the information processing apparatus 141 (step S110).
- the information processing apparatus 141 determines a candidate for the arrangement position of the work W on the transport vehicle 122A or 122B based on the second attribute information (step S111).
- the information processing apparatus 141 generates image data indicating a candidate for the arrangement position of the work W on the transport vehicle 122A or 122B, and transmits the image data to the operation terminal 210 (step S112).
- the information processing device 141 also transmits the second attribute information to the operation terminal 210.
- the operation terminal 210 displays the received image data and the second attribute information on the presentation device 230.
- the user P can visually recognize the candidate for the arrangement position of the work W displayed on the presentation device 230 and select the arrangement position.
- when the operation terminal 210 receives a command from the user P to specify the selected arrangement position, which is one of the candidates for the arrangement position of the work W, the operation terminal 210 transmits information on the selected arrangement position to the information processing apparatus 141 (step S113).
- the information processing apparatus 141 detects the scheduled operation of the robot 110 by using the information on the gripping position and the arrangement position of the selected work W and the route-related data (step S114).
- the information processing device 141 generates image data indicating the scheduled operation of the robot 110, that is, the movement path of the end effector 112, and transmits it to the operation terminal 210 (step S115).
- the information processing device 141 also transmits the first attribute information and the second attribute information to the operation terminal 210.
- the information processing apparatus 141 determines whether or not there is interference between the robot 110 and surrounding components when the end effector 112 is moved along the movement path (step S116).
- the information processing apparatus 141 transmits the interference information to the operation terminal 210 when there is interference (Yes in step S116), and proceeds to step S118 when there is no interference (No in step S116).
- in step S118, the operation terminal 210 causes the presentation device 230 to present image data indicating the movement path of the end effector 112.
- when there is interference, the interference portion is displayed on the image displayed by the presentation device 230.
- in step S119, when the operation terminal 210 receives an input approving the scheduled operation of the robot 110 (Yes in step S119), the operation terminal 210 transmits the approval information to the information processing apparatus 141 (step S122) and proceeds to step S123.
- when the scheduled operation is not approved (No in step S119), the operation terminal 210 proceeds to step S120.
- in step S120, the user P inputs a command for changing the scheduled operation of the robot 110 to the operation terminal 210, and the operation terminal 210 transmits the command to the information processing device 141.
- the above-mentioned command includes a command to change the scheduled operation itself, a command to change the gripping position, the arrangement position, or both, of the work W, a command to change the first attribute information, the second attribute information, or both, or a combination of two or more of these.
- in step S121, the information processing apparatus 141 detects a new scheduled operation according to the command for changing the scheduled operation of the robot 110.
- the information processing apparatus 141 repeats the processing after step S115 by using the new scheduled operation.
- in step S123, the information processing apparatus 141 generates an operation command for the robot 110 using the approved scheduled operation and the three-dimensional position of the work W detected in step S106, and transmits the operation command to the robot controller 142.
- the robot controller 142 generates a drive command according to the operation command, and causes the robot arm 111 and the end effector 112 to operate according to the drive command (step S124). That is, the robot controller 142 causes the robot arm 111 and the end effector 112 to grip the work W on the belt conveyor 121 and transfer it to the transport vehicle 122A or 122B in accordance with the operation command.
- in step S125, when the information processing apparatus 141 receives a command to complete the transfer work from the operation terminal 210 (Yes in step S125), the information processing apparatus 141 ends the series of processes; when it does not receive the command (No in step S125), it proceeds to step S126.
- in step S126, after transferring the work W to the transport vehicle 122A or 122B, the information processing apparatus 141 automatically causes the robot arm 111 to move the end effector 112 to the vicinity of the belt conveyor 121, and repeats the processes from step S105.
- as described above, the information processing apparatus 141 requests the user P to select the gripping position of the work W and the arrangement position of the transfer destination at each timing when a new work W is to be gripped by the end effector 112, and operates the robot 110 according to the selection result of the user P.
- for the operations of the robot 110 that require the judgment of the user P as described above, the user P does not need to directly and manually operate the robot 110, and only has to select an appropriate element from the candidate elements for determining the operation. Therefore, various users can participate in the operation of the robot 110 from various places regardless of their robot operation skill.
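The select-and-approve flow of steps S105 to S124 can be outlined in code (an illustrative sketch only, not part of the patent disclosure; all callback names are hypothetical):

```python
def transfer_cycle(detect_candidates, ask_user, plan, execute):
    """One work-transfer cycle: the device proposes candidates, the user
    only selects and approves, and the robot then executes (cf. S105-S124)."""
    grip_candidates, place_candidates = detect_candidates()  # S105-S108, S111
    grip = ask_user("grip", grip_candidates)                 # S109-S110
    place = ask_user("place", place_candidates)              # S112-S113
    motion = plan(grip, place)                               # S114: scheduled motion
    while not ask_user("approve", motion):                   # S119: approval loop
        motion = plan(grip, place)                           # S121: new scheduled motion
    return execute(motion)                                   # S123-S124
```

The key design point the flow illustrates: the only user-facing calls are selections among candidates, so no direct manual robot operation is required.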
- This modification differs from the embodiment in that the robot system includes the learning device 400.
- this modification will be described mainly on the points different from those of the embodiment, and the description of the same points as those of the embodiment will be omitted as appropriate.
- FIG. 12 is a block diagram showing an example of the functional configuration of the control device 140 and the learning device 400 according to the modified example.
- the information processing device 141A of the control device 140 further includes the log information output unit 141j as a functional component.
- the log information output unit 141j outputs the log information stored in the fifth storage unit 141s5 to the request source in response to a request from the outside of the information processing apparatus 141A.
- the log information output unit 141j may output log information to a predetermined output destination at a predetermined timing according to a preset program.
- the function of the log information output unit 141j is realized by the processor 1411 or the like.
- the learning device 400, like the information processing device 141A, includes a computer device.
- the learning device 400 includes a circuit, which includes a processor and memory as described above.
- the learning device may include storage as described above.
- the processor of the learning device 400 is an example of the second processor, and the memory and storage of the learning device 400 are an example of the second storage device.
- the learning device 400 is a device different from the information processing device 141A and the robot controller 142, and is connected to the information processing device 141A via wired communication, wireless communication, or a combination thereof so as to be capable of data communication. Any wired or wireless communication method may be used.
- the learning device 400 may be incorporated in the information processing device 141A or the robot controller 142.
- the learning device 400 may be connected to the information processing device 141A via the communication network N.
- the learning device 400 may be connected to a plurality of information processing devices 141A.
- the learning device 400 may be configured to input / output data to / from the information processing device 141A via a storage medium.
- the storage medium can include a semiconductor-based or other integrated circuit (IC: Integrated Circuit), a hard disk drive (HDD), a hybrid hard drive (HHD: Hybrid Hard Disk Drive), an optical disc, an optical disc drive (ODD: Optical Disk Drive), a magneto-optical disc, a magneto-optical drive, a floppy disk drive (FDD), a magnetic tape, a solid-state drive (SSD), a RAM drive, a secure digital card or drive, any other suitable storage medium, or a combination of two or more of these.
- the learning device 400 is arranged in the robot area AR, but is not limited to this.
- the learning device 400 may be arranged in the user area AU, or may be arranged in a place different from the robot area AR and the user area AU.
- the learning device 400 may be arranged at the location of the server 310 and may be configured to communicate data with the information processing device 141A via the server 310.
- the learning device 400 may be incorporated in the server 310.
- the learning device 400 includes a learning data processing unit 401, a learning data storage unit 402, a learning unit 403, an input data processing unit 404, and an output data processing unit 405 as functional components.
- the function of the learning data storage unit 402 is realized by the memory, the storage, or a combination thereof that can be included in the learning device 400, and the functions of the functional components other than the learning data storage unit 402 are realized by the processor or the like included in the learning device 400.
- the learning data storage unit 402 stores various information and data, and makes it possible to read the stored information and data.
- the learning data storage unit 402 stores learning data.
- the learning data processing unit 401 receives log information from the log information output unit 141j of the information processing device 141A, and stores the log information as learning data in the learning data storage unit 402.
- the learning data processing unit 401 may request and acquire the log information from the log information output unit 141j, or may acquire log information sent from the log information output unit 141j at a predetermined timing.
- the learning data processing unit 401 may acquire, from the log information, state information related to the work, the first attribute information and the second attribute information corresponding to the state indicated by the state information, and information on the selected work position actually executed for the work, such as the selected gripping position and the selected arrangement position.
- the selected work position is a work position selected from the candidates for the work position based on the state information, and is a work position actually used.
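Under the assumption (made here for illustration) that each log record carries the state information, the two pieces of attribute information, and the actually executed selected work position, assembling learning samples could look like:

```python
def build_training_pairs(log_records):
    """Turn hypothetical log records into (input data, teacher data) pairs:
    the state plus attribute information is the input, and the work position
    actually selected and executed serves as the teacher data."""
    pairs = []
    for rec in log_records:
        features = (rec["state"], rec["attr1"], rec["attr2"])  # input data
        label = rec["selected_position"]                       # teacher data
        pairs.append((features, label))
    return pairs
```

The record keys used above are placeholders; a real log format would be defined by the fifth storage unit's schema.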
- the learning unit 403 includes a learning model, and in this modification, it includes a learning model that performs machine learning.
- the learning unit 403 trains the learning model using the learning data, and causes the learning model to improve the accuracy of the output data with respect to the input data.
- the learning model may include a neural network, a random forest, genetic programming, a regression model, a tree model, a Bayesian model, a time-series model, a clustering model, an ensemble learning model, or the like.
- the neural network includes a plurality of node layers including an input layer and an output layer.
- the node layer contains one or more nodes.
- when the neural network includes an input layer, an intermediate layer, and an output layer, the neural network sequentially performs, on the information input to the nodes of the input layer, output processing from the input layer to the intermediate layer and output processing from the intermediate layer to the output layer, and outputs an output result matching the input information.
- Each node in one layer is connected to each node in the next layer, and the connections between the nodes are weighted.
- the information of the nodes of one layer is weighted by the connection weights between the nodes and output to the nodes of the next layer.
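The weighted layer-to-layer propagation described above can be illustrated as follows (a bare-bones sketch, not the patent's implementation; activation functions, which practical networks would apply at each layer, are omitted for brevity):

```python
def forward(layer_input, weights):
    """One fully connected layer: each output node is the weighted sum of
    the previous layer's node values, using the inter-node connection weights."""
    return [sum(w * x for w, x in zip(row, layer_input)) for row in weights]

def network(x, weight_matrices):
    """Propagate input-layer values through successive layers to the output
    layer, weighting each inter-node connection along the way."""
    for w in weight_matrices:
        x = forward(x, w)
    return x
```

With an identity weight matrix the input passes through unchanged, which makes the role of the connection weights easy to see.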
- in the learning model, the state information related to the work is used as input data, and the reliability of each candidate for the work position related to the work corresponding to the state indicated by the state information is used as output data.
- the state information related to the work and the first attribute information and the second attribute information corresponding to the state indicated by the state information may be input data.
- the reliability may be a probability of being a correct answer, and may be expressed by, for example, a score or the like.
- during learning, the state information about the work included in the learning data is used as input data for learning, and the information on the selected work position actually executed for the work is used as teacher data.
- the learning model may use the state information related to the work included in the learning data and the first attribute information and the second attribute information corresponding to the state indicated by the state information as input data.
- the learning unit 403 adjusts the weighting of the connections between the nodes in the neural network so that the reliability of each candidate work position output by the learning model when the input data is input matches the selected work position of the teacher data, or so that the error between them is minimized.
- the learning model after such weighting adjustment can output the reliability of each candidate of the work position corresponding to the state indicated by the state information with high accuracy.
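One hypothetical way to perform such a weighting adjustment is a simple delta-rule update that raises the reliability score of the candidate the user actually selected (the teacher data) toward 1 and drives the others toward 0 (illustrative only; the patent does not prescribe this update rule):

```python
def train_step(weights, state, selected_index, lr=0.1):
    """Single-layer delta-rule update: weights[i] scores candidate i from the
    state vector; the selected candidate's score target is 1.0, others 0.0."""
    scores = [sum(w * x for w, x in zip(row, state)) for row in weights]
    for i, row in enumerate(weights):
        target = 1.0 if i == selected_index else 0.0
        error = target - scores[i]
        for j, x in enumerate(state):
            row[j] += lr * error * x  # move the score toward the teacher data
    return weights
```

After repeated updates on logged selections, the candidate that users actually chose in a given state receives the highest reliability, which is exactly the ranking behavior the adjusted model is said to acquire.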
- the input data processing unit 404 receives information to be input to the learning model of the learning unit 403 from the outside, converts the information into state information that can be input to the learning model, and outputs the information to the learning unit 403.
- the input data processing unit 404 may accept information about the work, first attribute information, and second attribute information.
- the information about the work includes information such as a virtual model generated by the model generation unit 141e of the information processing device 141A, data after image processing by the first image processing unit 141d1, and image data captured by the image pickup device 131.
- the input data processing unit 404 may convert information about such a work into information indicating the state of the work and the components of the surrounding environment such as the positions and postures of the components of the work and the surrounding environment.
- the input data processing unit 404 may receive the information from any device capable of outputting information about the work.
- the output data processing unit 405 determines the optimum work position using the reliability of each of the work position candidates related to the work output by the learning unit 403, and outputs the information on the optimum work position.
- the output data processing unit 405 may output the information to a device related to the control of the robot.
- the output data processing unit 405 may output the information to a device having a function such as the scheduled operation detection unit 141g of the information processing device 141A, the operation command unit 141h, or a combination thereof, or may output the information to a device such as the robot controller 142.
- the output data processing unit 405 may be configured to determine the most reliable work position among the work position candidates related to the work as the optimum work position.
- the most reliable working position is one of the working position candidates.
- the most reliable working position is one of the gripping positions GP1 to GP8 shown in FIG.
- the output data processing unit 405 may be configured to determine the optimum working position from an arbitrary position with respect to the work.
- the output data processing unit 405 may express the relationship between the work position and its reliability as a function by using the work position candidates and the information on each reliability, and may calculate the most reliable work position using the function.
- the most reliable working position is not limited to the candidate working position.
- the most reliable work position may be determined as an arbitrary position on the work WA, and may be a position on the work WA other than the gripping positions GP1 to GP8 shown in FIG. 5, for example, a position between the gripping positions GP1 and GP8.
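An illustrative sketch of this selection (the interpolation heuristic below is an assumption standing in for the function-based calculation the text describes): pick the candidate with the highest reliability, or optionally blend the two best candidates to obtain a position that is not itself a candidate, e.g. a point between two gripping positions.

```python
def best_work_position(candidates, reliabilities, refine=False):
    """Choose the most reliable work position from candidate positions.

    With refine=True, interpolate between the two most reliable candidates,
    weighting toward the runner-up by its relative reliability, yielding a
    position outside the candidate set (a stand-in for function fitting)."""
    ranked = sorted(zip(reliabilities, candidates), reverse=True)
    if not refine or len(ranked) < 2:
        return ranked[0][1]
    (r1, p1), (r2, p2) = ranked[0], ranked[1]
    t = r2 / (r1 + r2)  # interpolation weight toward the second-best candidate
    return tuple(a + t * (b - a) for a, b in zip(p1, p2))
```

Candidates are represented as coordinate tuples here; any comparable position encoding would work the same way.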
- the learning device 400 can learn the judgment results of the user P for the operations of the robot 110 that require the judgment of the user P, and can output the optimum judgment result for the operation of the robot 110 in place of the user P.
- the learning device 400 can learn various judgment results of the user P even from the log information of one information processing device 141A in one robot area AR.
- the learning device 400 can learn various judgment results from the log information of various information processing devices 141A in various robot areas AR.
- the learning device 400 can learn various determination results from the log information of the server 310. Such a learning device 400 can improve the accuracy of the output data.
- the learning model of the learning unit 403 may include the function of the output data processing unit 405 and may be configured to output the same output data as the output data processing unit 405.
- the work of the robot 110 targeted by the robot system 1 is the work of gripping and transferring the work W, but is not limited thereto.
- the robot system 1 may target any work.
- the robot system 1 may be targeted for operations such as assembling a work to an assembly target, assembling the work, welding, grinding, painting, and sealing.
- the first attribute information may be attribute information of the work, the welding target portion, the grinding target portion, the painting target portion, the sealing target portion, or the like.
- the working position related to the work may be, for example, a position on the work, a welding target portion, a grinding target portion, a painting target portion, a sealing target portion, or the like.
- the second attribute information may be attribute information of the assembly target of the work, other parts assembled together with the work, the welding target, the grinding target, the painting target, the sealing target, or the like.
- the work positions related to the work may be, for example, the position of the work with respect to the assembly target, the position of the work with respect to other parts, the position of the welding target portion on the welding target, the position of the grinding target portion on the grinding target, the position of the painting target portion on the painting target, the position of the sealing target portion on the sealing target, and the like.
- the control device 140 is configured to transmit candidates for the gripping position of the work W and candidates for the arrangement position of the work W on the transport vehicle 122A or 122B to the operation terminal 210 and to request the user P to make a selection.
- the elements requested by the control device 140 to be selected are not limited to these, and may be any elements that can be selected at the discretion of the user P.
- the control device 140 may be configured to send candidates such as the posture of the work W and the gripping force applied to the work W in each operation of the robot 110 to the operation terminal 210 and to request the user P to make a selection.
- the control device 140 is configured not to directly control the drive of the motor of the belt conveyor 121, but it may be configured to directly control the motor as an external axis. In that case, the control device 140 can control the operation of the robot 110 and the operation of the belt conveyor 121 with high accuracy.
- the control device 140 is configured to perform image processing, such as extracting the work W and the like and detecting the three-dimensional positions of the work W and the like, on the image data captured by the image pickup device 131 in order to detect the state information about the work W, but is not limited to this.
- the image data used for the image processing may be any image data of an image pickup device capable of capturing the work W.
- image data captured by the image pickup device 132 that captures the work W from above may be used.
- since the control device 140 can perform image processing on the image data of the image pickup device 132 before the robot arm 111 returns the end effector 112 to the vicinity of the belt conveyor 121 after the work W is transferred, quick work becomes possible.
- the control device 140 is configured to use image data in which the work W is captured for acquiring the state information about the work W, but the present disclosure is not limited to this.
- the control device 140 may be configured to acquire the state information about the work W by using the detection result of an external sensor, which is a sensor arranged separately from the work W, the detection result of an on-board sensor, which is a sensor arranged on the work W, or a combination thereof.
- the external sensor may be configured to detect the position and posture of the work W from the outside of the work W.
- the external sensor may detect the work W using light waves, a laser, magnetism, radio waves, electromagnetic waves, ultrasonic waves, or a combination of two or more thereof, and may be a photoelectric sensor, a laser sensor, a radio wave sensor, an electromagnetic wave sensor, an ultrasonic sensor, various lidars (LiDAR), or a combination of two or more thereof.
- the on-board sensor may be configured to move together with the work W and detect the position, posture, and the like of the work W.
- the on-board sensor may be an acceleration sensor, an angular velocity sensor, a magnetic sensor, a GPS (Global Positioning System) receiver, or a combination of two or more thereof.
- the information processing apparatus 141 may be configured to use AI (Artificial Intelligence) for processing.
- AI can be used for image processing of the image data captured by the image pickup devices 131 and 132, for processing to generate a virtual model using information such as the work W extracted from the image data, and for processing to determine candidates for the work position related to the work W using the attribute information, the virtual model, and the like.
- AI may include a learning model that performs machine learning.
- the learning model may include a neural network.
- the learning model that performs image processing may use image data as input data, and output, as output data, information such as the edges of a subject appearing in the image data, its three-dimensional position, or a combination thereof.
- the learning model that generates a virtual model may use, as input data, information on the edges of the subject extracted from the image data, its three-dimensional position, or a combination thereof, and use information on the virtual model of the subject as output data.
- the learning model for determining candidates for the work position related to the work W may use information on the virtual models of the work W and its surrounding elements as input data, and use candidates for the work position, such as the gripping position of the work W, as output data.
- each learning model may be a model that performs machine learning using learning data corresponding to the input data and teacher data corresponding to the output data.
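The chain of models described above (image data → extracted edges and 3D positions → virtual model → work-position candidates) can be illustrated with a minimal sketch. This is not the patent's implementation: the feature extractor, model builder, and candidate generator below are hypothetical stand-ins for the neural networks the text mentions, written in plain Python only to make the data flow between the three stages visible.

```python
# Hedged sketch of the three-stage pipeline suggested by the text:
# (1) image processing -> edge/3D-position extraction, (2) virtual-model
# generation, (3) work-position candidate determination.
# Every function is an illustrative stand-in, not the patent's model.

def extract_features(image):
    """Stage 1 stand-in: return (x, y, depth) points where the
    intensity exceeds a threshold (a crude 'edge' proxy)."""
    points = []
    for y, row in enumerate(image):
        for x, v in enumerate(row):
            if v > 0.5:                          # hypothetical threshold
                points.append((x, y, float(v)))  # v reused as fake depth
    return points

def build_virtual_model(points):
    """Stage 2 stand-in: summarize points as bounding box + centroid."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return {"bbox": (min(xs), min(ys), max(xs), max(ys)),
            "centroid": (sum(xs) / len(xs), sum(ys) / len(ys))}

def candidate_grip_positions(model, n=3):
    """Stage 3 stand-in: propose grip candidates along the box midline."""
    x0, y0, x1, y1 = model["bbox"]
    ymid = (y0 + y1) / 2
    step = (x1 - x0) / (n + 1)
    return [(x0 + step * (i + 1), ymid) for i in range(n)]

# Toy image: a bright 3x3 patch inside a 6x6 frame.
image = [[1.0 if 1 <= x <= 3 and 2 <= y <= 4 else 0.0
          for x in range(6)] for y in range(6)]
model = build_virtual_model(extract_features(image))
candidates = candidate_grip_positions(model)
```

In an actual system each stage would be a trained network; the point of the sketch is only that each stage's output data is the next stage's input data, as the text describes.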
- the server 310 is configured to connect a selected one of the plurality of operation terminals 210 to a robot group that is a combination of the robot 110 and its control device 140, but is not limited to this.
- the server 310 may be configured to connect a selected one of the plurality of operating terminals 210 to a selected one of the plurality of robot groups.
- the control device is a control device that controls a robot to execute a predetermined work by automatic operation, and includes a first processor. While the predetermined work is being executed, the first processor acquires state information including the state of the work that is the work target, determines candidates for a work position related to the work based on the state information, transmits a selection request requesting selection of the work position from among the candidates to an operation terminal connected via a communication network so as to be capable of data communication, and, upon receiving from the operation terminal information on the selected position, which is the selected work position, causes the robot to operate by automatic operation according to the selected position.
- the user of the operation terminal does not directly operate the robot manually by using the operation terminal, but performs an indirect operation for automatic operation by commanding the selected position.
- the user is not required to have operation skills for direct operation, and can make the robot perform the intended operation by a simple operation.
- by following the selected position determined by the user, the control device can cause the robot to perform an appropriate movement without being affected by the user's skill level. Since the user is not required to have direct operation skills, various users can participate in the operation of the robot.
- the operation terminal may be capable of selecting the selected position by the user and transmitting information on the selected position to the control device.
- the operation terminal is not limited to a terminal dedicated to the robot, and various terminals can be applied. Further, since the amount of communication data between the operation terminal and the control device is kept low, quick and reliable communication using various communication networks is possible. Therefore, the control device can be connected, via a communication network, to various operation terminals of various users in various places, and can cause the robot to operate according to each user's operation. The part of robot operation that requires the operator's judgment is thus automated, which makes it possible to diversify the operators who can operate the robot.
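The acquire → propose → select → execute cycle summarized above can be sketched as a small control loop. The names `determine_candidates`, `run_cycle`, and the callback-based terminal are illustrative assumptions, not the patent's interfaces; the sketch only shows that the user's sole input is a choice among candidates, after which operation is automatic.

```python
# Minimal sketch of the indirect-operation cycle the text describes:
# the control device proposes work-position candidates, the operation
# terminal (here a callback standing in for the remote user) picks one,
# and the robot then runs by automatic operation. Illustrative only.

def determine_candidates(state):
    """Derive grip-position candidates from state info (hypothetical)."""
    x, y = state["work_position"]
    return [(x - 1, y), (x, y), (x + 1, y)]

def run_cycle(state, terminal_select, execute):
    """One work cycle: candidates -> selection request -> automatic op."""
    candidates = determine_candidates(state)
    selected = terminal_select(candidates)     # the user's only input
    if selected not in candidates:
        raise ValueError("terminal must choose one of the candidates")
    return execute(selected)                   # automatic operation

log = []
def fake_execute(position):
    log.append(("grip", position))
    return "done"

# The simulated user always picks the middle candidate.
result = run_cycle({"work_position": (5, 2)},
                   terminal_select=lambda cs: cs[1],
                   execute=fake_execute)
```

Note that only the candidate list and the chosen position cross the network boundary in this scheme, which is why the communication volume stays low.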
- in the process of acquiring the state information, the first processor may acquire first image data, which is data of an image in which the work is captured, and detect the state information by performing image processing on the first image data.
- the control device can perform the process from the detection of the state information about the work to the determination of the candidates for the work position by itself.
- in the process of transmitting the selection request, the first processor may acquire the first image data, which is data of an image in which the work is captured, generate second image data, which is data of an image representing the candidates for the work position on the image of the first image data, and transmit the selection request using the second image data to the operation terminal.
- the control device can request the selection of the selected position by using an image showing the candidates for the work position on the image in which the work is captured.
- when the first processor receives the information on the selected position from the operation terminal, the first processor may detect the scheduled operation of the robot according to the selected position and further transmit information on the scheduled operation to the operation terminal for presentation.
- the control device can present to the user the scheduled motion of the robot according to the selected position. For example, the control device may cause the robot to execute the scheduled operation after the user permits the scheduled operation.
- the first processor may accept a change of the scheduled operation by the operation terminal and cause the robot to operate by automatic operation according to the changed scheduled operation.
- the user can change the scheduled motion presented on the operation terminal and cause the robot to execute the changed scheduled motion. For example, when the user confirms that the robot would interfere with a surrounding object in the scheduled motion, the user can change the scheduled motion so as to avoid the interference by using the operation terminal. In this case, the user may, for example, change the selected position or change the operation path of the robot in the scheduled operation. This enables reliable and safe robot operation.
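The preview-then-approve flow described in this run can be sketched as follows. The straight-line planner, the waypoint-list plan format, and the `review` callback are all assumptions made for illustration; the patent does not specify a planner, only that the scheduled motion is presented and may be changed before execution.

```python
# Sketch of the preview-then-approve flow: the control device computes a
# scheduled motion for the selected position, presents it, and accepts a
# user modification (e.g. a detour waypoint to avoid an obstacle) before
# executing. Planner and plan format are illustrative assumptions.

def plan_motion(start, goal):
    """Hypothetical planner: direct path as a list of waypoints."""
    return [start, goal]

def preview_and_execute(start, goal, review):
    """Present the scheduled motion; `review` may return a changed plan,
    which is then what gets executed."""
    scheduled = plan_motion(start, goal)
    approved = review(scheduled)          # user may edit the plan
    return approved                       # stands in for execution

def detour_review(plan):
    # Simulated user: insert a via-point to clear an obstacle.
    start, goal = plan[0], plan[-1]
    via = (start[0], goal[1] + 2)
    return [start, via, goal]

path = preview_and_execute((0, 0), (4, 0), review=detour_review)
```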
- the control device further includes a first storage device, the first storage device stores first attribute information including information on the characteristics of the work and the predetermined work set for the work, and the first processor may cause the robot to operate by automatic operation according to the first attribute information and the selected work position.
- the control device can cause the robot to perform an operation suitable for the work.
- the control device can determine, based on the first attribute information, the gripping force in the operation of the robot gripping the work.
- the control device can determine, based on the first attribute information, the posture and moving speed of the work in the motion of the robot moving the work.
- the first processor further executes transmitting the first attribute information corresponding to the selected position to the operation terminal for presentation, and the first processor may accept a change of the first attribute information from the operation terminal, change the first attribute information according to the accepted change content, and cause the robot to operate by automatic operation according to the changed first attribute information and the selected position.
- the control device can determine the operation of the robot according to the first attribute information according to the judgment of the user of the operation terminal. The control device can reflect the judgment result of the user other than the selected position in the operation control of the robot.
- the first attribute information includes information on positions where the robot can apply an action to the work, and in the process of determining candidates for the work position, the first processor may use the first attribute information to determine, as candidates for the work position, candidates for positions where the robot applies an action to the work.
- the control device stores, as the first attribute information, information on candidates for the action position set for the work. Therefore, the control device can keep the amount of processing for determining candidates for the work position low.
- the control device further includes a first storage device, the first storage device may store second attribute information including information on the characteristics of the surrounding environment of the work and the predetermined work set for the surrounding environment, and the first processor may cause the robot to operate by automatic operation according to the second attribute information and the selected position.
- the control device can cause the robot to perform an operation suitable for the surrounding environment.
- the control device can determine, based on the second attribute information, the speed and acceleration of the work in the motion of the robot placing the work on the placement surface.
- the control device can determine, based on the second attribute information, the arrangement order and arrangement direction of the work in the operation of the robot arranging the work.
- the first processor further executes transmitting the second attribute information corresponding to the selected position to the operation terminal for presentation, and the first processor may accept a change of the second attribute information from the operation terminal, change the second attribute information according to the accepted change content, and cause the robot to operate by automatic operation according to the changed second attribute information and the selected position.
- the control device can determine the operation of the robot according to the second attribute information according to the judgment of the user of the operation terminal. The control device can reflect the judgment result of the user other than the selected position in the operation control of the robot.
- the second attribute information includes information on the position of the work with respect to the surrounding environment, and in the process of determining candidates for the work position, the first processor may use the second attribute information to determine, as candidates for the work position, candidates for the position of the work with respect to the surrounding environment.
- the control device stores, as the second attribute information, information on candidates for the position of the work with respect to the surrounding environment, set for the surrounding environment. Therefore, the control device can keep the amount of processing for determining candidates for the work position low.
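The reason the processing cost stays low in both the first- and second-attribute cases is that candidate determination becomes a lookup against stored attribute information rather than a fresh geometric computation. A hedged sketch, with both attribute schemas invented purely for illustration:

```python
# Sketch of attribute-driven candidate lookup. The first attribute
# information carries pre-set grip points on the work; the second carries
# pre-set placement slots in the surrounding environment. Both schemas
# are hypothetical, but they show why the processing amount is kept low:
# candidates are read from storage, not recomputed from scene geometry.

FIRST_ATTR = {  # per work type: allowed grip points (local coordinates)
    "boxA": {"grip_points": [(0.0, 0.1), (0.0, -0.1)], "max_force": 20.0},
}
SECOND_ATTR = {  # per environment: placement slots and a speed limit
    "tray1": {"slots": [(0, 0), (0, 1), (1, 0)], "place_speed": 0.2},
}

def grip_candidates(work_type):
    """Candidates for where the robot may act on the work."""
    return FIRST_ATTR[work_type]["grip_points"]

def placement_candidates(environment, occupied):
    """Free placement slots; occupancy is the only runtime-varying input."""
    slots = SECOND_ATTR[environment]["slots"]
    return [s for s in slots if s not in occupied]

grips = grip_candidates("boxA")
places = placement_candidates("tray1", occupied={(0, 0)})
```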
- the robot system according to one aspect of the present disclosure includes a control device according to one aspect of the present disclosure and the robot controlled by the control device. According to the above aspect, the same effect as that of the control device according to one aspect of the present disclosure can be obtained.
- the robot system may further include a plurality of robot groups including a plurality of combinations of the control device and the robot controlled by the control device, and an intermediary device connected to the communication network so as to be capable of data communication, the intermediary device mediating the connection between a selected operation terminal among the plurality of operation terminals and the control device of a selected robot group among the plurality of robot groups.
- any user among the users of the plurality of operation terminals can cause the robot of any robot group among the plurality of robot groups to operate according to the selected position. For example, a plurality of users can take turns indirectly operating the robots of one robot group, which enables continuous work of the robots while reducing the operational load on each user. For example, among a plurality of users, a user suited to a given robot and a predetermined work can indirectly operate that robot.
- the learning device includes a second processor and a second storage device. The second storage device stores, as learning data, the state information acquired in one or more control devices according to one aspect of the present disclosure and the selected position corresponding to the state information, where the selected position corresponding to the state information is the selected position chosen from among the candidates for the work position based on that state information. The second processor executes learning using the state information of the learning data as learning input data and the information on the selected position of the learning data corresponding to the state information as teacher data, and executes accepting, as input data, input state information that is state information including the state of a work, and outputting, as output data, information on the optimum work position among the candidates for the work position related to the work corresponding to the input state information.
- the learning device can learn the selection result of the work position by the user of the operation terminal, that is, the judgment result of the user.
- the learning device after learning can determine and output, on behalf of the user of the operation terminal, the optimum work position among the candidate work positions. Therefore, the learning device can further automate the part of robot operation that requires the user's judgment.
- the second processor and the second storage device may be separate from or integrated with the first processor and the first storage device, respectively.
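The learning scheme above, where logged state information is the input and the user's past selected position is the teacher data, can be sketched with a toy learner. A 1-nearest-neighbour rule stands in for the trained model, and the vector encodings of states and positions are assumptions; the point is only the mapping from (state, candidates) to one selected candidate.

```python
# Sketch of the first learning device: (state information, selected
# position) pairs logged by control devices form the training set; after
# learning, the model picks the optimum among the offered candidates on
# the user's behalf. A 1-nearest-neighbour learner stands in for the
# neural network; the numeric state encoding is an assumption.

import math

class SelectionLearner:
    def __init__(self):
        self.examples = []  # list of (state_vector, selected_position)

    def learn(self, state, selected):
        """Store one logged judgment (teacher data = past selection)."""
        self.examples.append((state, selected))

    def optimum(self, state, candidates):
        """Return the candidate closest to the selection made in the
        most similar logged state."""
        _, past_sel = min(self.examples,
                          key=lambda e: math.dist(e[0], state))
        return min(candidates, key=lambda c: math.dist(c, past_sel))

model = SelectionLearner()
model.learn(state=(0.0, 0.0), selected=(1.0, 0.0))   # user chose right side
model.learn(state=(9.0, 9.0), selected=(-1.0, 0.0))  # different regime
choice = model.optimum(state=(0.1, 0.1),
                       candidates=[(-1.0, 0.0), (0.0, 1.0), (1.0, 0.0)])
```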
- the learning device includes a second processor and a second storage device. The second storage device stores, as learning data, the state information acquired in one or more control devices according to one aspect of the present disclosure and the selected position corresponding to the state information, where the selected position corresponding to the state information is the selected position chosen from among the candidates for the work position based on that state information. The second processor executes learning using the state information of the learning data as learning input data and the information on the selected position of the learning data corresponding to the state information as teacher data, and executes accepting, as input data, input state information that is state information including the state of a work, outputting, as output data, the reliability of candidates for the work position related to the work corresponding to the input state information, and, based on the reliability, determining the optimum work position from among arbitrary positions with respect to the work and outputting information on the optimum work position.
- the learning device can learn the selection result of the work position by the user of the operation terminal, that is, the judgment result of the user.
- the learning device after learning can determine and output, on behalf of the user of the operation terminal, the optimum work position from among arbitrary positions. Therefore, the learning device can further automate the part of robot operation that requires the user's judgment.
- the second processor and the second storage device may be separate from or integrated with the first processor and the first storage device, respectively.
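This second learning device differs from the first in its output head: instead of naming one candidate directly, it scores any queried position with a reliability, and the optimum is chosen by maximizing reliability over arbitrary positions. A hedged sketch, with a Gaussian-bump reliability function standing in for a trained network's output:

```python
# Sketch of the second learning device's inference step: the model emits
# a reliability for any queried position, and the optimum work position
# is the argmax of reliability over an arbitrary grid, not only over
# pre-listed candidates. The Gaussian-bump reliability is a stand-in for
# a trained model; the grid and centers are invented for illustration.

import math

def make_reliability(learned_centers):
    """Reliability = proximity to positions users historically chose."""
    def reliability(pos):
        return max(math.exp(-math.dist(pos, c) ** 2)
                   for c in learned_centers)
    return reliability

def optimum_position(reliability, grid):
    """Pick the grid position with the highest reliability."""
    scored = [(reliability(p), p) for p in grid]
    return max(scored)[1]

rel = make_reliability(learned_centers=[(2.0, 3.0)])
grid = [(x * 0.5, y * 0.5) for x in range(9) for y in range(9)]
best = optimum_position(rel, grid)
```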
- the functions disclosed herein can be executed using circuitry or processing circuitry including general purpose processors, dedicated processors, integrated circuits, ASICs, conventional circuits, and/or combinations thereof, configured or programmed to perform the disclosed functions. A processor is considered processing circuitry or circuitry because it includes transistors and other circuits.
- a circuit, unit, or means is hardware that performs the listed functions or is programmed to perform the listed functions.
- the hardware may be the hardware disclosed herein, or it may be other known hardware that is programmed or configured to perform the listed functions. If the hardware is a processor considered to be a type of circuit, the circuit, means, or unit is a combination of hardware and software, and the software is used to configure the hardware and / or processor.
- the division of blocks in the functional block diagram is an example; multiple blocks may be realized as one block, one block may be divided into multiple blocks, some functions may be transferred to other blocks, and two or more of these may be combined. A single piece of hardware or software may process the functions of multiple blocks having similar functions in parallel or in a time-division manner.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Human Computer Interaction (AREA)
- Manipulator (AREA)
Abstract
Description
The configuration of a robot system 1 according to an exemplary embodiment will be described. FIG. 1 is a schematic diagram showing an example of the configuration of the robot system 1 according to the embodiment. As shown in FIG. 1, the robot system 1 is a system that enables a user P, an operator located away from a robot 110, to operate the robot 110 in a remote access environment. The robot system 1 includes components arranged in one or more robot areas AR and components arranged in one or more user areas AU. Although not limited thereto, in the present embodiment, one robot area AR and a plurality of user areas AU exist as targets of the robot system 1.
An example of the components of the robot area AR will be described. As shown in FIGS. 1 and 2, in the present embodiment, the robot 110 includes a robot arm 111 and an end effector 112 attached to the tip of the robot arm 111. The robot arm 111 has a plurality of joints and can operate with multiple degrees of freedom. The robot arm 111 can move the end effector 112 to various positions and postures. The end effector 112 can apply an action to the work W, which is the object to be processed. The action of the end effector 112 is not particularly limited, but in the present embodiment it is an action of gripping the work W.
An example of the components of the user area AU will be described. As shown in FIG. 1, the operation terminal 210 is configured to accept inputs of commands, information, data, and the like from the user P, and to output the accepted commands, information, data, and the like to other devices. The operation terminal 210 includes an operation input device 211 that accepts inputs from the user P, and a terminal computer 212. The terminal computer 212 is configured to process commands, information, data, and the like accepted via the operation input device 211 and output them to other devices, and to accept inputs of commands, information, data, and the like from other devices and process them. Although not limited thereto, in the present embodiment, the operation terminal 210 converts the image data of the image pickup devices 131 to 134 sent from the control device 140 into data displayable on the presentation device 230, and outputs the data to the presentation device 230 for display. The operation terminal 210 may include the operation input device 211 and the terminal computer 212 as an integrated device or as separate devices.
An example of the hardware configuration of the control device 140 according to the embodiment will be described. FIG. 3 is a block diagram showing an example of the hardware configuration of the control device 140 according to the embodiment. As shown in FIG. 3, the information processing apparatus 141 includes, as components, a processor 1411, a memory 1412, a storage 1413, and input/output I/Fs (interfaces) 1414 to 1416. The components of the information processing apparatus 141 are interconnected by a bus 1417, but may be connected by any wired or wireless communication. The robot controller 142 includes, as components, a processor 1421, a memory 1422, an input/output I/F 1423, communication I/Fs 1424 and 1425, and a drive I/F 1426. The robot controller 142 may include a storage. The components of the robot controller 142 are interconnected by a bus 1427, but may be connected by any wired or wireless communication. Not all of the components included in each of the information processing apparatus 141 and the robot controller 142 are essential.
An example of the functional configuration of the control device 140 according to the embodiment will be described with reference to FIG. 4. FIG. 4 is a block diagram showing an example of the functional configuration of the control device 140 according to the embodiment. The information processing apparatus 141 includes, as functional components, a reception information processing unit 141a, a transmission information processing unit 141b, an image pickup control unit 141c, image processing units 141d1 to 141d3, a model generation unit 141e, a candidate determination unit 141f, a scheduled operation detection unit 141g, an operation command unit 141h, attribute information processing units 141i1 and 141i2, and storage units 141s1 to 141s5. The functions of the functional components other than the storage units 141s1 to 141s5 are realized by the processor 1411 and the like, and the functions of the storage units 141s1 to 141s5 are realized by the memory 1412, the storage 1413, or a combination thereof. Not all of the above functional components are essential.
An example of the operation of the robot system 1 according to the embodiment will be described with reference to FIGS. 11A to 11C. FIGS. 11A to 11C are flowcharts showing an example of the operation of the robot system 1 according to the embodiment. First, the user P inputs to the operation terminal 210 a request to take charge of operating a robot performing work transfer, and the operation terminal 210 transmits the request to the server 310 (step S101). The server 310 searches for a robot 110 capable of performing the work, and connects the information processing apparatus 141 of the found robot 110 to the operation terminal 210 via the communication network N (step S102).
The present modification differs from the embodiment in that the robot system includes a learning device 400. Hereinafter, the present modification will be described focusing on the differences from the embodiment, and descriptions of points similar to the embodiment will be omitted as appropriate.
Although examples of the embodiments of the present disclosure have been described above, the present disclosure is not limited to the above embodiments and modifications. That is, various modifications and improvements are possible within the scope of the present disclosure. For example, forms obtained by applying various modifications to the embodiments and modifications, and forms constructed by combining components of different embodiments and modifications, are also included within the scope of the present disclosure.
Claims (15)
- A control device that performs control to cause a robot to execute a predetermined work by automatic operation, the control device comprising a first processor, wherein the first processor executes: acquiring, during execution of the predetermined work, state information including a state of a work that is a work target; determining candidates for a work position related to the work based on the state information; transmitting a selection request requesting selection of the work position from among the candidates for the work position to an operation terminal connected via a communication network so as to be capable of data communication; and, upon receiving from the operation terminal information on a selected position that is the selected work position, causing the robot to operate by automatic operation according to the selected position.
- The control device according to claim 1, wherein, in the process of acquiring the state information, the first processor acquires first image data that is data of an image in which the work is captured, and detects the state information by performing image processing on the first image data.
- The control device according to claim 1 or 2, wherein, in the process of transmitting the selection request to the operation terminal, the first processor acquires first image data that is data of an image in which the work is captured, generates, by performing image processing on the first image data, second image data that is data of an image representing the candidates for the work position on the image of the first image data, and transmits the selection request using the second image data to the operation terminal.
- The control device according to any one of claims 1 to 3, wherein the first processor further executes: upon receiving the information on the selected position from the operation terminal, detecting a scheduled operation of the robot according to the selected position; and transmitting information on the scheduled operation to the operation terminal for presentation.
- The control device according to claim 4, wherein the first processor accepts a change of the scheduled operation by the operation terminal and causes the robot to operate by automatic operation according to the changed scheduled operation.
- The control device according to any one of claims 1 to 5, further comprising a first storage device, wherein the first storage device stores first attribute information including information on characteristics of the work and the predetermined work set for the work, and the first processor causes the robot to operate by automatic operation according to the first attribute information and the selected position.
- The control device according to claim 6, wherein the first processor further executes transmitting the first attribute information corresponding to the selected position to the operation terminal for presentation, and the first processor accepts a change of the first attribute information by the operation terminal, changes the first attribute information according to the accepted change content, and causes the robot to operate by automatic operation according to the changed first attribute information and the selected position.
- The control device according to claim 6 or 7, wherein the first attribute information includes information on positions where the robot can apply an action to the work, and, in the process of determining the candidates for the work position, the first processor uses the first attribute information to determine, as the candidates for the work position, candidates for positions where the robot applies an action to the work.
- The control device according to any one of claims 1 to 8, further comprising a first storage device, wherein the first storage device stores second attribute information including information on characteristics of a surrounding environment of the work and the predetermined work set for the surrounding environment, and the first processor causes the robot to operate by automatic operation according to the second attribute information and the selected position.
- The control device according to claim 9, wherein the first processor further executes transmitting the second attribute information corresponding to the selected position to the operation terminal for presentation, and the first processor accepts a change of the second attribute information by the operation terminal, changes the second attribute information according to the accepted change content, and causes the robot to operate by automatic operation according to the changed second attribute information and the selected position.
- The control device according to claim 9 or 10, wherein the second attribute information includes information on a position of the work with respect to the surrounding environment, and, in the process of determining the candidates for the work position, the first processor uses the second attribute information to determine, as the candidates for the work position, candidates for the position of the work with respect to the surrounding environment.
- A robot system comprising: the control device according to any one of claims 1 to 11; and the robot controlled by the control device.
- The robot system according to claim 12, further comprising: a plurality of robot groups including a plurality of combinations of the control device and the robot controlled by the control device; and an intermediary device connected to the communication network so as to be capable of data communication, the intermediary device mediating a connection between a selected operation terminal among a plurality of the operation terminals and the control device of a selected robot group among the plurality of robot groups.
- A learning device comprising a second processor and a second storage device, wherein the second storage device stores, as learning data, the state information acquired in one or more control devices according to any one of claims 1 to 13 and the selected position corresponding to the state information, the selected position corresponding to the state information is the selected position selected from among the candidates for the work position based on the state information, and the second processor executes: learning using the state information of the learning data as learning input data and the information on the selected position of the learning data corresponding to the state information as teacher data; and accepting, as input data, input state information that is state information including a state of a work, and outputting, as output data, information on an optimum work position among candidates for a work position related to the work corresponding to the input state information.
- A learning device comprising a second processor and a second storage device, wherein the second storage device stores, as learning data, the state information acquired in one or more control devices according to any one of claims 1 to 13 and the selected position corresponding to the state information, the selected position corresponding to the state information is the selected position selected from among the candidates for the work position based on the state information, and the second processor executes: learning using the state information of the learning data as learning input data and the information on the selected position of the learning data corresponding to the state information as teacher data; accepting, as input data, input state information that is state information including a state of a work, and outputting, as output data, reliabilities of candidates for a work position related to the work corresponding to the input state information; and determining, based on the reliabilities, an optimum work position from among arbitrary positions with respect to the work, and outputting information on the optimum work position.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022570057A JP7473685B2 (ja) | 2020-12-18 | 2021-12-16 | 制御装置、ロボットシステム及び学習装置 |
US18/267,309 US20240051134A1 (en) | 2020-12-18 | 2021-12-16 | Controller, robot system and learning device |
CN202180084729.XA CN116600952A (zh) | 2020-12-18 | 2021-12-16 | 控制装置、机器人系统以及学习装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2020210012 | 2020-12-18 | ||
JP2020-210012 | 2020-12-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022131335A1 true WO2022131335A1 (ja) | 2022-06-23 |
Family
ID=82059561
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/046542 WO2022131335A1 (ja) | 2020-12-18 | 2021-12-16 | 制御装置、ロボットシステム及び学習装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240051134A1 (ja) |
JP (1) | JP7473685B2 (ja) |
CN (1) | CN116600952A (ja) |
WO (1) | WO2022131335A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2024047924A1 (ja) * | 2022-08-29 | 2024-03-07 | パナソニックIpマネジメント株式会社 | 画像認識装置、画像認識方法およびプログラム |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62199376A (ja) * | 1986-02-26 | 1987-09-03 | 株式会社日立製作所 | 遠隔マニピユレ−シヨン方法及び装置 |
JP2012006097A (ja) * | 2010-06-23 | 2012-01-12 | Yaskawa Electric Corp | ロボット装置 |
JP2020131279A (ja) * | 2019-02-26 | 2020-08-31 | 株式会社神戸製鋼所 | 溶接線データ生成装置、溶接システム、溶接線データ生成方法及びプログラム |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS62199376A (ja) * | 1986-02-26 | 1987-09-03 | 株式会社日立製作所 | 遠隔マニピユレ−シヨン方法及び装置 |
JP2012006097A (ja) * | 2010-06-23 | 2012-01-12 | Yaskawa Electric Corp | ロボット装置 |
JP2020131279A (ja) * | 2019-02-26 | 2020-08-31 | 株式会社神戸製鋼所 | 溶接線データ生成装置、溶接システム、溶接線データ生成方法及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20240051134A1 (en) | 2024-02-15 |
JP7473685B2 (ja) | 2024-04-23 |
JPWO2022131335A1 (ja) | 2022-06-23 |
CN116600952A (zh) | 2023-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7338034B2 (ja) | リモートクライアントデバイスからの入力に基づく効率的なロボットの制御 | |
US10759051B2 (en) | Architecture and methods for robotic mobile manipulation system | |
US20220212340A1 (en) | Control device, control system, mechanical apparatus system, and controlling method | |
CN114728417A (zh) | 由远程操作员触发的机器人自主对象学习 | |
Annem et al. | Towards remote teleoperation of a semi-autonomous mobile manipulator system in machine tending tasks | |
Westerberg et al. | Virtual environment-based teleoperation of forestry machines: Designing future interaction methods | |
JP7281349B2 (ja) | 遠隔操作システム | |
WO2022131335A1 (ja) | 制御装置、ロボットシステム及び学習装置 | |
KR102518766B1 (ko) | 데이터 생성 장치, 데이터 생성 방법, 데이터 생성 프로그램 및 원격 조작 시스템 | |
JP6644104B2 (ja) | 多機能統合型作業テーブルおよびそれを用いた生産システム | |
WO2021070859A1 (ja) | 制御方法、制御装置、ロボットシステム、プログラム及び記録媒体 | |
US11618164B2 (en) | Robot and method of controlling same | |
KR20160116445A (ko) | 지능형 공구 심부름 로봇 | |
US20220374295A1 (en) | Systems and Methods for Inter-Process Communication within a Robot | |
Hentout et al. | A telerobotic human/robot interface for mobile manipulators: A study of human operator performance | |
Sylari et al. | Hand gesture-based on-line programming of industrial robot manipulators | |
Asavasirikulkij et al. | A Study of Digital Twin and Its Communication Protocol in Factory Automation Cell | |
Jia et al. | Distributed telerobotics system based on common object request broker architecture | |
Wozniak et al. | Virtual reality framework for better human-robot collaboration and mutual understanding | |
JP2022131206A (ja) | 情報処理装置、学習装置、情報処理システム及びロボットシステム | |
Chandrasekaran et al. | A Robotic System Architecture Based on Safety Controller and Priority Module Using Robot Operating System (ROS), Sensor Fusion and Human Robot Interaction for Control and Safety | |
JP2022035765A (ja) | 自律移動体、及びその制御システム、制御方法、及びプログラム | |
CN115225682A (zh) | 管理服务器、远程操作系统、远程操作方法以及存储介质 | |
CN115997182A (zh) | 模拟装置及模拟系统 | |
JP2022066982A (ja) | 教育システム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21906695 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022570057 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18267309 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202180084729.X Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 21906695 Country of ref document: EP Kind code of ref document: A1 |