WO2022181458A1 - Information processing device, learning device, information processing system, and robot system - Google Patents

Information processing device, learning device, information processing system, and robot system

Info

Publication number
WO2022181458A1
Authority
WO
WIPO (PCT)
Prior art keywords
evaluation
robot
information
action
motion
Prior art date
Application number
PCT/JP2022/006514
Other languages
French (fr)
Japanese (ja)
Inventor
Hitoshi Hasunuma
Masayuki Kamon
Takeshi Yamamoto
Original Assignee
Kawasaki Jukogyo Kabushiki Kaisha (Kawasaki Heavy Industries, Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kawasaki Jukogyo Kabushiki Kaisha (Kawasaki Heavy Industries, Ltd.)
Publication of WO2022181458A1 publication Critical patent/WO2022181458A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J3/00Manipulators of master-slave type, i.e. both controlling unit and controlled unit perform corresponding spatial movements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/20Education

Definitions

  • The present disclosure relates to information processing devices, learning devices, information processing systems, and robot systems.
  • The management system disclosed in Japanese Unexamined Patent Application Publication No. 2020-135362 allows an operator to operate a robot located near an object at a remote location while making the operator feel as if the remote object existed nearby.
  • A server provided in the management system acquires state information indicating the state of the robot or the state of the operator when the operator operates the robot, and compares the state indicated by the state information with a reference level to determine the operator's suitability.
  • In such a system, only the skill level of the robot operation is determined as aptitude; for example, the operator can learn from the determined level only an overall skill level for the work. The operator may therefore be unable to find specific measures for improving robot operation skill, and may not gain motivation for improving that skill or willingness to operate.
  • The present disclosure provides an information processing device, a learning device, an information processing system, and a robot system that improve a user's willingness to operate a robot.
  • An information processing device according to one aspect of the present disclosure is an information processing device that evaluates an operation for causing a robot to perform a task, and includes a processing circuit.
  • The processing circuit executes, during execution of the task: detection processing for detecting motion-related information, which is information related to the motion of the robot that operates in accordance with an operation command received as input from a user to an operation device; evaluation processing for evaluating the detected motion-related information; and presentation processing for presenting the evaluation result of the evaluation processing on a presentation device that presents information to the user while the user operates the operation device.
  • The processing circuit executes the evaluation processing at a first timing for each evaluation target motion, which is a motion of the robot to be evaluated, and at a second timing for each evaluation target stage, which is a stage of the task to be evaluated.
  • The processing circuit executes the evaluation processing in the same order as the execution order of the evaluation target motions.
  • The processing circuit presents, on the presentation device, the evaluation results regarding the evaluation target stages in the same order as the execution order of the evaluation target stages.
  • FIG. 1 is a diagram showing an example of the configuration of a robot system according to an embodiment.
  • FIG. 2 is a perspective view showing an example of the configuration of a robot according to the embodiment.
  • FIG. 3 is a block diagram showing an example of the hardware configuration of a control device according to the embodiment.
  • FIG. 4 is a block diagram showing an example of the functional configuration of the control device according to the embodiment.
  • FIG. 5 is a flow chart showing an example of the operation of the robot system according to the embodiment.
  • FIG. 6 is a plan view showing an example of the operation of the robot system according to the embodiment.
  • FIG. 7 is a block diagram showing an example of the functional configurations of an information processing device and a learning device according to a modification.
  • FIG. 1 is a diagram showing an example of a configuration of a robot system 1 according to an embodiment.
  • the robot system 1 includes one or more robots 100 , one or more operation terminals 200 and a server 300 .
  • the robot system 1 is configured to provide a service to a user P using a remotely operated robot 100 .
  • the robot system 1 can be used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, sales, rental, and article provision.
  • the robot 100 is a robot that can perform tasks such as providing services.
  • "Work" is defined as accomplishing, crafting, or creating something productive from which some useful result is expected, or any combination of two or more of these.
  • A series of motions performed by the robot 100 can be targeted as such work.
  • The "work" does not have to include a series of motions performed by the robot 100 during the process of teaching the robot 100.
  • The "work" also need not cover a series of motions performed by the robot 100 for play, games, entertainment, or a combination of two or more thereof.
  • two or more robots 100 are arranged in one service providing area AS where user P is provided with services. Furthermore, one or more operation terminals 200 are arranged in each of two or more operation areas AO located away from the service providing area AS.
  • the robot 100 is configured to be connected to the communication network N via wireless communication so as to enable data communication.
  • the communication connecting the robot 100 and the communication network N may be wired communication or a combination of wired and wireless communication.
  • the operation terminal 200 is configured to be connected to the communication network N for data communication via wired communication, wireless communication, or a combination thereof.
  • One robot 100 and one operation terminal 200 can be connected via a communication network N so as to be capable of data communication. Any wired or wireless communication may be used.
  • the server 300 manages communication via the communication network N.
  • Server 300 includes a computer device.
  • the server 300 manages communication authentication, connection, disconnection, and the like between the robot 100 and the operation terminal 200 .
  • the server 300 stores identification information, security information, etc. of the robot 100 and the operation terminal 200 registered in the robot system 1, and uses the information to determine the qualification of connection of the operation terminal 200 to the robot system 1.
  • the server 300 manages transmission and reception of data between the robot 100 and the operation terminal 200 , and the data may pass through the server 300 .
  • Server 300 may be configured to convert data sent from a source into a data format that can be used by a destination.
  • the server 300 may be configured to store and accumulate information, commands, data, etc. that are transmitted and received between the robot 100 and the operation terminal 200 during the operation of the robot 100 .
  • the communication network N is not particularly limited, and can include, for example, a local area network (LAN), a wide area network (WAN), the Internet, or a combination of two or more thereof.
  • The communication network N may be configured to use short-range wireless communication such as Bluetooth (registered trademark) and ZigBee (registered trademark), dedicated network lines, dedicated lines of communication carriers, the public switched telephone network (PSTN), mobile communication networks, Internet networks, satellite communication, or a combination of two or more thereof.
  • the mobile communication network may use a 4th generation mobile communication system, a 5th generation mobile communication system, or the like.
  • Communication network N may include one or more networks. In this embodiment, the communication network N is the Internet.
  • The configuration of the operation terminal 200 according to the embodiment will be described with reference to FIG. 1.
  • the operation terminal 200 is configured to receive commands, information, data, and the like input by the operator PO, and to output the received commands, information, data, and the like to other devices.
  • the operation terminal 200 includes an operation input device 201 , a terminal computer 202 , a presentation device 203 and a communication device 204 .
  • The operation input device 201, the terminal computer 202, the presentation device 203, and the communication device 204 may be integrated into one device; each of the four devices may independently form a device and be connected to the others; or two or more of the devices may form one device and be connected to the other devices.
  • the operation terminal 200 is an example of an operation device.
  • The configuration of the operation terminal 200 is not particularly limited; it may be a teaching device, a known robot operating device, another operating device, another terminal device, a device using any of these devices, or an improved version thereof.
  • the operation terminal 200 may be a dedicated device devised for the robot system 1, or may be a general-purpose device available on the general market. In this embodiment, a known general-purpose device is used as the operation terminal 200 .
  • the device may be configured to implement the functions of the operation terminal 200 of the present disclosure by installing dedicated software.
  • the operation input device 201 is configured to be able to receive input from the operator PO, and to output to the terminal computer 202 signals indicating input commands, information, data, and the like.
  • The configuration of the operation input device 201 is not particularly limited; for example, it may include a device that can be operated by the operator PO.
  • the operation input device 201 may include an imaging device that captures an image of the operator PO or the like, and a voice input device such as a microphone that receives voice input of the operator PO or the like.
  • the operation input device 201 may be configured to output to the terminal computer 202 the captured image data and the signal indicating the input voice.
  • The terminal computer 202 is configured to process commands, information, data, and the like received via the operation input device 201 and output them to other devices, to receive commands, information, data, and the like from other devices, and to execute commands and process information and data.
  • the presentation device 203 is configured to present information to the operator PO. Although not limited, in this embodiment the presentation device 203 includes a display capable of displaying an image to the operator PO. The presentation device 203 displays an image of image data received from the terminal computer 202 . The presentation device 203 may include an audio output device such as a speaker capable of emitting audio to the operator PO. The presentation device 203 outputs the sound of the sound data received from the terminal computer 202 .
  • the communication device 204 has a communication interface that can be connected to the communication network N.
  • the communication device 204 is connected to the terminal computer 202 and connects the terminal computer 202 and the communication network N so that data communication is possible.
  • Communication devices 204 may include, for example, communication equipment such as modems, Optical Network Units (ONUs), routers, and mobile data communication equipment.
  • Communication device 204 may include a computing device having computing capabilities and the like.
  • FIG. 2 is a perspective view showing an example configuration of the robot 100 according to the embodiment.
  • The robot 100 includes one carrier 110, one or more robot arms 120, one or more end effectors 130, a secondary battery module 141, a power supply circuit 142, a communication device 143, imaging devices 144, 145, and 146, a sound collector 147, a display device 148, an audio output device 149, and a control device 170.
  • the control device 170 includes an information processing device 180 and a robot controller 190 .
  • the information processing device 180 and the robot controller 190 may be separate devices or may be integrated.
  • the robot arm 120 is a robot arm that can also function for industrial purposes.
  • the quantity of each component described above is not limited to the above quantity, and can be changed as appropriate.
  • the carrier 110 is configured to run on its own.
  • the transport vehicle 110 includes two drive wheels 111 , four auxiliary wheels 112 , and a transport drive device 113 that drives the two drive wheels 111 .
  • the transport driving device 113 includes a servomotor as an electric actuator that is a driving source. Each servo motor is controlled by robot controller 190 .
  • By rotating the two drive wheels 111 individually in various rotational directions and at various rotational speeds, the transport driving device 113 can cause the transport vehicle 110 to travel at various speeds in the forward direction D1A, the backward direction D1B, and turning directions, as in the kinematics sketch below.
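  • As a rough illustration of that relationship, the following sketch applies a standard differential-drive model; the function names, wheel radius, and track width are assumptions for illustration and do not appear in the publication.

```python
# Hedged sketch: standard differential-drive kinematics for the transport
# vehicle 110. Driving the two drive wheels 111 at individual speeds yields
# forward (D1A), backward (D1B), and turning motion. All names and values
# are illustrative assumptions, not taken from the publication.

def body_velocity(omega_left: float, omega_right: float,
                  wheel_radius: float, track_width: float) -> tuple[float, float]:
    """Return (linear velocity [m/s], yaw rate [rad/s]) from wheel speeds [rad/s]."""
    v_left = wheel_radius * omega_left
    v_right = wheel_radius * omega_right
    v = (v_right + v_left) / 2.0                  # equal speeds: straight travel
    yaw_rate = (v_right - v_left) / track_width   # unequal speeds: turning
    return v, yaw_rate

# Equal wheel speeds drive straight in direction D1A; opposite speeds turn in place.
print(body_velocity(5.0, 5.0, wheel_radius=0.1, track_width=0.4))   # (0.5, 0.0)
print(body_velocity(-5.0, 5.0, wheel_radius=0.1, track_width=0.4))  # (0.0, 2.5)
```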
  • As the robot arm 120, two robot arms 120A and 120B are mounted on the carrier 110 via a base 120C. Both robot arms 120A and 120B are rotatable about an axis S1 extending in the direction from the carrier 110 toward the base 120C, forming coaxial dual-arm robot arms.
  • the robot arm 120A includes links 121A to 124A and arm drivers M1A to M4A that drive joints interconnecting the links 121A to 124A.
  • the robot arm 120B includes links 121B to 124B and arm drivers M1B to M4B that drive the joints interconnecting the links 121B to 124B.
  • The arm driving devices M1A to M4A and M1B to M4B are shown in FIG. 2.
  • Arm driving devices M1A to M4A and M1B to M4B include servo motors as electric actuators that are driving sources. Each servo motor is controlled by robot controller 190 .
  • Links 121A and 121B are connected to base 120C via joints.
  • Links 124A and 124B include mechanical interfaces that are connectable with the end effector 130.
  • the robot arms 120A and 120B as described above have a horizontal articulated arm configuration, but may have any configuration.
  • robotic arms 120A and 120B may be vertically articulated, polar, cylindrical, rectangular, or other types of robotic arms.
  • the number of robot arms 120 arranged on the carrier 110 may be one or more.
  • the rotation axes of the robot arms 120A and 120B may not be the same axis S1.
  • As the end effector 130, two end effectors 130A and 130B are detachably attached to the links 124A and 124B, respectively.
  • End effectors 130A and 130B are configured to act on objects handled by robot 100 .
  • End effectors 130A and 130B are operable and may include a drive.
  • the driving device may be powered by electric power, gas pressure, hydraulic pressure, or the like.
  • a drive device powered by electric power may include a servomotor as an electric actuator.
  • the drives may be controlled by robot controller 190 .
  • the robot 100 further includes a device housing 140 on the carrier 110 .
  • the secondary battery module 141 , the power supply circuit 142 , the communication device 143 , the information processing device 180 and the robot controller 190 are arranged inside the equipment housing 140 , but may be arranged at any position on the carrier 110 .
  • As shown in FIG. 2, a table 140a on which articles can be placed is arranged on the device housing 140.
  • the secondary battery module 141 functions as a power source for the robot 100.
  • the secondary battery module 141 includes one or more secondary batteries.
  • a secondary battery is a battery capable of charging and discharging power. Examples of secondary batteries include lead storage batteries, lithium ion secondary batteries, all-solid batteries, nickel-hydrogen storage batteries, nickel-cadmium storage batteries, and the like.
  • the power supply circuit 142 is a circuit that controls power supply and demand for the secondary battery module 141 .
  • the power supply circuit 142 is configured to control supply and demand of electric power in accordance with commands from the information processing device 180 and the robot controller 190 .
  • the power supply circuit 142 may include devices such as converters, inverters, transformers and amplifiers.
  • The power supply circuit 142 is connected to an external power supply such as a commercial power supply, supplies the power from the external power supply to the secondary battery module 141 for storage, and supplies the power stored in the secondary battery module 141 to the power-consuming components inside the robot 100.
  • the communication device 143 is a device for wireless communication, and is configured to connect to the communication network N via wireless communication.
  • The wireless communication used by the communication device 143 is not particularly limited, but may be, for example, mobile data communication, a wireless LAN such as Wi-Fi (Wireless Fidelity), short-range wireless communication such as Bluetooth (registered trademark) and ZigBee (registered trademark), wireless communication using a combination of two or more of these, or the like.
  • the communication device 143 has a device compatible with wireless communication to be used.
  • the display device 148 includes a display capable of displaying images.
  • the display may be configured to receive input from the user P, and may be a touch panel, for example.
  • the display device 148 can display an image of image data sent from the information processing device 180 or the like.
  • the display device 148 can display an image to the user P facing the robot 100 .
  • the sound collecting device 147 includes a microphone capable of acquiring sound from the surroundings and outputting an audio signal of the sound.
  • the sound collector 147 is configured to output an audio signal to the information processing device 180 or the like.
  • the information processing device 180 may be configured to convert a voice signal into voice data and transmit the data to the operation terminal 200, or may be configured to recognize voice from the voice signal.
  • the sound collector 147 can function as an external sensor of the robot 100 capable of sensing the outside of the robot 100 .
  • An internal sensor of the robot 100 can sense the internal state of the robot 100; one example of an internal sensor is a rotation sensor provided in the servomotors of the arm driving devices M1A to M4A and M1B to M4B.
  • the audio output device 149 includes a speaker capable of converting audio signals into sound waves and emitting them as sound.
  • the audio output device 149 can output audio corresponding to audio signals sent from the information processing device 180 or the like.
  • the voice output device 149 can output voice to the user P facing the robot 100 .
  • the imaging devices 144 , 145 and 146 each include a camera that captures digital images and are configured to send data of the captured images to the information processing device 180 .
  • the information processing device 180 may be configured to process the image data captured by the imaging devices 144 , 145 and 146 into network transmittable data, and transmit the data to the operation terminal 200 via the communication network N.
  • the information processing device 180 may be configured to perform image processing on the image data and use it for other processing.
  • the camera may be a camera capable of capturing an image for detecting the three-dimensional position of the subject with respect to the camera, such as the distance to the subject.
  • The camera may include a stereo camera, a monocular camera, a TOF camera (time-of-flight camera), a patterned-light projection camera such as a fringe projection camera, a camera using the light-section method, a combination of two or more of these, or the like.
  • Imaging devices 144, 145 and 146 may function as external sensors.
  • the imaging device 144 is arranged at the tip of either or both of the robot arms 120A and 120B. Without limitation, in this embodiment the imaging device 144 is located on the end effector 130A of the robot arm 120A.
  • the imaging device 145 is arranged on the display device 148 .
  • the imaging device 145 can capture an image of the user P to whom the service is provided facing the robot 100 .
  • The imaging device 145 may be configured to be able to change its imaging direction, to have a wide imaging field of view, or both, so that it can image the objects on which the end effectors 130A and 130B act.
  • the imaging device 146 is arranged on the carrier 110 in the forward direction D1A.
  • the robot controller 190 is configured to control the motion of the robot arms 120A and 120B, the end effectors 130A and 130B, and the vehicle 110.
  • the robot controller 190 controls the operation of the arm drives M1A to M4A and M1B to M4B of the robot arms 120A and 120B, the drives of the end effectors 130A and 130B, the transport drive 113 of the transport vehicle 110, and the like.
  • The information processing device 180 is connected to the robot controller 190 via wired communication, wireless communication, or a combination thereof, and is configured to perform arithmetic processing related to control of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110.
  • the information processing device 180 and the robot controller 190 include computer devices.
  • The configuration of the information processing device 180 is not particularly limited.
  • the information processing device 180 is connected to the terminal computer 202 of the operation terminal 200 via the communication device 143 , the communication network N, and the communication device 204 so as to be capable of data communication.
  • The configuration of the robot controller 190 is not particularly limited; it may be, for example, an electronic circuit board, an electronic control unit, a microcomputer, or other electronic equipment.
  • the robot controller 190 may include a drive circuit for controlling the power supplied to the arm drives M1A to M4A and M1B to M4B, the drives of the end effectors 130A and 130B, the transport drive 113, and the like.
  • The information processing device 180 and the robot controller 190 are configured to control the operation of control target elements such as the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110 by manual operation control when causing the robot 100 to perform a task, but are not limited to this.
  • the information processing device 180 and the robot controller 190 may be configured to control the operation of one or more of the control target elements by automatic operation control or a combination of automatic operation control and manual operation control.
  • In the combination of automatic operation control and manual operation control, automatic operation control or manual operation control may be assigned to each control target element, or part of the work of the robot 100 may be performed by automatic operation control and other parts of the work by manual operation control.
  • the manual operation control may be a control that causes the elements to be controlled to operate in accordance with the details of the operation input by the operator PO to the operation input device 201 of the operation terminal 200 .
  • the controlled element can perform an action according to the action of the operator PO who operates the operation input device 201 .
  • the automatic operation control may be a control that autonomously operates the control target element according to the control program.
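  • The per-element assignment of control modes described above can be pictured with the following minimal sketch; the enum, element names, and command interfaces are illustrative assumptions, not the publication's API.

```python
# Hedged sketch of per-element control-mode assignment: part of the work runs
# under automatic operation control while other parts are manually operated.
from enum import Enum, auto

class ControlMode(Enum):
    MANUAL = auto()     # follow operation commands from the operation terminal 200
    AUTOMATIC = auto()  # operate autonomously according to the control program

# Assumed assignment: both arms and end effectors manual, carrier automatic.
mode_by_element = {
    "robot_arm_120A": ControlMode.MANUAL,
    "robot_arm_120B": ControlMode.MANUAL,
    "end_effector_130A": ControlMode.MANUAL,
    "end_effector_130B": ControlMode.MANUAL,
    "carrier_110": ControlMode.AUTOMATIC,
}

def motion_command_for(element: str, operation_command, control_program):
    """Generate a motion command for one control target element.

    `operation_command` and `control_program` are hypothetical objects standing
    in for the operator PO's input and the stored control program.
    """
    if mode_by_element[element] is ControlMode.MANUAL:
        # manual operation control: reflect the content of the operator's input
        return operation_command.to_motion_command(element)
    # automatic operation control: next step of the control program
    return control_program.next_motion_command(element)
```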
  • FIG. 3 is a block diagram showing an example of the hardware configuration of the control device 170 according to the embodiment.
  • the information processing apparatus 180 includes a processor 1801, a memory 1802, a storage 1803, and input/output I/Fs (interfaces) 1804a to 1804f as components. Each component of the information processing device 180 is interconnected by a bus 1805, but may be connected by any other wired or wireless communication.
  • the robot controller 190 includes a processor 1901, a memory 1902, an input/output I/F 1903, and a drive I/F 1904 as components. Robot controller 190 may include storage.
  • Each component of robot controller 190 is interconnected by bus 1905, but may be connected by any other wired or wireless communication.
  • the information processing device 180 and the robot controller 190 as described above include processing circuitry, but may include additional circuitry. Not all of the components included in each of the information processing device 180 and the robot controller 190 are essential.
  • the set of processor 1801 and memory 1802 and the set of processor 1901 and memory 1902 each form a processing circuit.
  • the processing circuitry sends and receives commands, information, data, etc. to and from other devices.
  • the processing circuit inputs signals from various devices and outputs control signals to the elements to be controlled.
  • Memories 1802 and 1902 store programs executed by the processors 1801 and 1901, various data, and the like. Memories 1802 and 1902 may include storage devices such as volatile and non-volatile semiconductor memory. Although not so limited, in this embodiment memories 1802 and 1902 include RAM (random-access memory) as volatile memory and ROM (read-only memory) as non-volatile memory.
  • the storage 1803 stores various data.
  • The storage 1803 may include storage devices such as semiconductor memory, hard disk drives (HDDs), solid-state drives (SSDs), or combinations of two or more thereof.
  • Both processors 1801 and 1901 form a computer system together with RAM and ROM.
  • the computer system of the information processing device 180 may realize the functions of the information processing device 180 by the processor 1801 using the RAM as a work area and executing the program recorded in the ROM.
  • the computer system of the robot controller 190 may realize the functions of the robot controller 190 by the processor 1901 using the RAM as a work area and executing the program recorded in the ROM.
  • Some or all of the functions of the information processing device 180 and the robot controller 190 may be implemented by the computer system described above, may be implemented by dedicated hardware circuits such as electronic circuits or integrated circuits, or may be implemented by a combination of the computer system and hardware circuits.
  • Each of the information processing device 180 and the robot controller 190 may be configured to execute each process by centralized control by a single device, or may be configured to execute each process by distributed control by cooperation of a plurality of devices.
  • the information processing device 180 and the robot controller 190 may be configured to include at least some of the functions of each other, or may be integrated.
  • The processors 1801 and 1901 may include a CPU (central processing unit), an MPU (micro processing unit), a GPU (graphics processing unit), a microprocessor, a processor core, a multiprocessor, an ASIC (application-specific integrated circuit), an FPGA (field-programmable gate array), or the like, and each process may be realized by a logic circuit or a dedicated circuit formed on an IC (integrated circuit) chip, an LSI (large-scale integration), or the like.
  • The plurality of processes may be realized by one integrated circuit or by a plurality of integrated circuits.
  • the first input/output I/F 1804a of the information processing device 180 connects the information processing device 180 and the robot controller 190, enabling input/output of information, commands, data, etc. between them.
  • the second input/output I/F 1804b connects the information processing device 180 and the communication device 143 to enable input/output of information, commands, data, and the like therebetween.
  • the third input/output I/F 1804c connects the information processing device 180 and the imaging devices 144 to 146, enabling input/output of information, commands, data, and the like therebetween.
  • the fourth input/output I/F 1804d connects the information processing device 180 and the sound collector 147, and enables input/output of information, commands, data, and the like therebetween.
  • the fifth input/output I/F 1804e connects the information processing device 180 and the display device 148 to enable input/output of information, commands, data, and the like therebetween.
  • the sixth input/output I/F 1804f connects the information processing device 180 and the audio output device 149 and enables input/output of information, commands, data, and the like therebetween.
  • The input/output I/F 1903 of the robot controller 190 connects the robot controller 190 and the first input/output I/F 1804a of the information processing device 180, enabling input/output of information, commands, data, and the like between them.
  • the drive I/F 1904 connects the robot controller 190 and the drive circuit 142a, enabling transmission and reception of signals and the like between them.
  • The driving circuit 142a is configured to control the electric power supplied to the arm driving devices M1A to M4A and M1B to M4B, the driving devices of the end effectors 130A and 130B, and the transport driving device 113 according to command values included in signals received from the robot controller 190.
  • the driving circuit 142a can cause each driving device to cooperate with each other to drive.
  • the drive circuit 142a is formed as part of the power supply circuit 142 in this embodiment.
  • the robot controller 190 may be configured to servo-control the servo motor of each driving device.
  • the robot controller 190 receives from the drive circuit 142a, as feedback information, the detected value of the rotation sensor provided in the servomotor and the command value of the current from the drive circuit 142a to the servomotor.
  • the robot controller 190 determines command values for driving the servomotors using the feedback information, and transmits the command values to the drive circuit 142a.
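  • The feedback loop described above, in which the robot controller 190 determines the next command value from the rotation-sensor value and the previous current command, might look like the following proportional-control sketch; the control law, gain, and names are assumptions, not the publication's method.

```python
# Hedged sketch of the servo feedback loop: the robot controller 190 uses the
# rotation-sensor value and the previous current command as feedback to
# determine the next command value for the drive circuit 142a. The
# proportional law, gain, and limits are illustrative assumptions.

def next_current_command(target_angle: float,
                         measured_angle: float,
                         prev_current_command: float,
                         kp: float = 2.0,
                         max_step: float = 0.5) -> float:
    """Compute the next current command from the joint position error."""
    error = target_angle - measured_angle            # rotation-sensor feedback
    desired = kp * error                             # proportional control
    # limit the change relative to the previous command for smooth motion
    step = max(-max_step, min(max_step, desired - prev_current_command))
    return prev_current_command + step

# One control cycle: the sensor reports 0.90 rad against a 1.00 rad target.
cmd = next_current_command(target_angle=1.0, measured_angle=0.90,
                           prev_current_command=0.1)
print(cmd)  # 0.2 -> sent to the drive circuit 142a
```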
  • the terminal computer 202 of the operation terminal 200 includes a processor and memory, like the information processing device 180 and the like.
  • The terminal computer 202 may also include input/output I/Fs for establishing the connection between the terminal computer 202 and the operation input device 201, the connection between the terminal computer 202 and the communication device 204, and the connection between the terminal computer 202 and the presentation device 203.
  • FIG. 4 is a block diagram showing an example of the functional configuration of the control device 170 according to the embodiment.
  • The information processing device 180 includes a motion command unit 180a, a detection processing unit 180b, an evaluation processing unit 180c, a consideration determination unit 180d, a presentation processing unit 180e, an accumulation processing unit 180f, a reference determination unit 180g, a timer unit 180h, and storage units 181a and 181b as functional components.
  • the functions of the functional components other than the storage units 181a and 181b are implemented by the processor 1801 or the like, and the functions of the storage units 181a and 181b are implemented by storage devices such as memory 1802, storage 1803, or a combination thereof.
  • the functions of the storage units 181a and 181b may be implemented by separate storage devices or may be implemented by one storage device. Not all of the above functional building blocks are essential.
  • the robot controller 190 includes drive command units 190a to 190e as functional components. Functions of the drive command units 190a to 190e are implemented by the processor 1901 and the like. Not all of the above functional building blocks are essential.
  • the first storage unit 181a stores motion-related reference data for various tasks.
  • The motion-related reference data is denoted as "reference data" in FIG. 4.
  • The motion-related reference data includes information on motion-related criteria, which are criteria related to the motion of the robot 100.
  • The motion-related criteria include criteria for motion-related information, which is information related to the motion of the robot 100.
  • The motion-related information may include information on motion-related elements such as the motion of the robot 100, the state of the surrounding environment of the robot 100, the state of the object on which the robot 100 acts, and the state of the surrounding environment of the object.
  • The motion-related criteria may be criteria expressing the minimum required level of the motion-related elements, the standard level of the motion-related elements, the excellent level of the motion-related elements, or the ideal level of the motion-related elements.
  • The motion-related criteria may include criteria for evaluation target features, which are features for evaluating the motion-related elements.
  • one task includes a series of multiple motions of the robot 100.
  • The motion-related criteria for the motion of the robot 100 may include criteria for the series of multiple motions of each of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110 that form the task of interest.
  • The evaluation target features of the motion-related criteria for the motion of the robot 100 may include features such as the time required for the motion, the order of the motions, the position and posture of the robot 100 and its parts in the motion, the trajectory of the robot 100 and its parts in the motion, the velocity of the position in the motion, the velocity of the posture in the motion, the acceleration of the position in the motion, the acceleration of the posture in the motion, and the work performance of the robot 100 in the motion.
  • the state of the surrounding environment of the robot 100 may include the state of robot peripheral elements, which are components such as devices, equipment, and equipment that form the surrounding environment for the robot 100 to work.
  • The robot peripheral elements include peripheral elements such as devices, facilities, and equipment arranged in the work area of the robot 100 such as the service providing area AS; peripheral elements such as devices, facilities, and equipment cooperating with the robot 100; and peripheral elements such as sensors, for example imaging devices, placed in the work area.
  • The evaluation target features of the motion-related criteria for the robot peripheral elements may include features such as the force and impact exerted by the robot 100 on the robot peripheral elements; the changes in position, posture, shape, and state exerted by the robot 100 on the robot peripheral elements; and the position, posture, position velocity, posture velocity, position acceleration, and posture acceleration of the robot 100 and its parts relative to the robot peripheral elements.
  • Examples of the robot peripheral elements of the robot 100 include other robots 100, the storage racks SR in which the articles W, which are objects, are stored, and users other than the user P to whom the robot 100 provides a service.
  • The state of the object may include the state of the object handled by the robot 100 using the end effectors 130A and 130B.
  • The evaluation target features of the motion-related criteria for the object may include features such as the position, posture, position velocity, posture velocity, position acceleration, and posture acceleration of the object, changes in the shape of the object, changes in the state of the object, and the workmanship of the object.
  • In the example shown in FIG. 1, the target object is an article W.
  • the state of the surrounding environment of the object may include the state of the elements surrounding the object, which are constituent elements such as devices, equipment, and equipment that handle the object other than the robot 100 .
  • The object peripheral elements include peripheral elements such as devices, facilities, and equipment that support, hold, accommodate, load, mount, and transfer objects; peripheral elements such as devices, facilities, and equipment to which objects are assembled; and peripheral elements such as devices, facilities, and equipment that apply processing to objects.
  • The evaluation target features of the motion-related criteria for the object peripheral elements may include features such as the force and impact exerted by the robot 100 on the object peripheral elements; the changes in position, posture, shape, and state exerted by the robot 100 on the object peripheral elements; and the position, posture, position velocity, posture velocity, position acceleration, and posture acceleration of the robot 100 and its parts with respect to the object peripheral elements.
  • the object peripheral elements are the shelf board of the storage shelf SR, other articles in the storage shelf SR, the user P, and the like.
  • the second storage unit 181b stores accumulated data, which is data accumulated by the accumulation processing unit 180f.
  • the accumulated data includes information on the evaluation result determined by the evaluation processing unit 180c and information on the action-related criteria used to determine the evaluation result in association with each other. That is, in the accumulated data, action-related information, action-related reference information for the action-related information, and evaluation result information based on the action-related reference are associated.
  • the second storage unit 181b is an example of a functional unit realized by the first storage device.
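  • A minimal sketch of the accumulated-data record described above, assuming simple dictionary payloads; the field names are illustrative and not from the publication.

```python
# Sketch of one accumulated-data record in the second storage unit 181b:
# motion-related information, the motion-related criterion used, and the
# evaluation result are stored in association. Field names are assumptions.
from dataclasses import dataclass

@dataclass
class AccumulatedRecord:
    motion_related_info: dict       # detected motion-related information
    motion_related_criterion: dict  # criterion used to determine the result
    evaluation_result: int          # e.g. an evaluation level

accumulated_data: list[AccumulatedRecord] = []

def accumulate(info: dict, criterion: dict, result: int) -> None:
    """Store one evaluation outcome together with the data that produced it."""
    accumulated_data.append(AccumulatedRecord(info, criterion, result))
```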
  • During manual operation control, the motion command unit 180a generates motion commands for causing the robot arms 120A and 120B, the end effectors 130A and 130B, and the transport vehicle 110 to operate according to the operation commands received from the operation terminal 200, and outputs them to the robot controller 190.
  • During automatic operation control, the motion command unit 180a generates motion commands for causing the robot arms 120A and 120B, the end effectors 130A and 130B, and the transport vehicle 110 to operate according to the control program, and outputs them to the robot controller 190.
  • the operation command is a command indicating the content of the operation input to the operation input device 201 by the operator PO.
  • the operation command includes a command for operating the robot 100 under manual operation control according to the content of the operation.
  • the terminal computer 202 receives a signal indicating the content of the operation from the operation input device 201 , converts the signal into an operation command, and transmits the operation command to the information processing device 180 .
  • The motion command unit 180a generates a first motion command to the robot arm 120A, a second motion command to the robot arm 120B, a third motion command to the end effector 130A, a fourth motion command to the end effector 130B, and a fifth motion command to the transport vehicle 110.
  • The third and fourth motion commands may be generated when the end effectors 130A and 130B include driving devices.
  • The first and second motion commands may include target position commands and target force commands for each part of the robot arms 120A and 120B.
  • the third and fourth motion commands may include target state commands for the end effectors 130A and 130B.
  • the fifth motion command may include a target position command for the transport vehicle 110 .
  • the target position command may include commands such as target position, position target movement speed, target attitude, and attitude target movement speed.
  • the target force command may include commands such as target force magnitude and direction.
  • the target force command may include a target acceleration command.
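  • The structure of the first to fifth motion commands described above might be modeled as follows; the field names and types are illustrative assumptions.

```python
# Illustrative data model for the motion commands: target position/force
# commands for the arms, a target state for the end effectors, and a target
# position for the transport vehicle 110. Field names are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class TargetPositionCommand:
    position: tuple[float, float, float]   # target position
    position_speed: float                  # target movement speed of the position
    posture: tuple[float, float, float]    # target posture
    posture_speed: float                   # target movement speed of the posture

@dataclass
class TargetForceCommand:
    magnitude: float                       # target force magnitude
    direction: tuple[float, float, float]  # target force direction
    acceleration: Optional[float] = None   # optional target acceleration

@dataclass
class ArmMotionCommand:          # first and second motion commands
    position: TargetPositionCommand
    force: TargetForceCommand

@dataclass
class EndEffectorMotionCommand:  # third and fourth motion commands
    target_state: str            # e.g. "grip" or "release" (assumed states)

@dataclass
class CarrierMotionCommand:      # fifth motion command
    position: TargetPositionCommand
```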
  • The timer unit 180h measures elapsed time and outputs the timing results to the detection processing unit 180b, the evaluation processing unit 180c, the consideration determination unit 180d, the presentation processing unit 180e, and the like.
  • The timing results can be used for processing in the detection processing unit 180b, the evaluation processing unit 180c, the consideration determination unit 180d, the presentation processing unit 180e, and the like.
  • The function of the timer unit 180h may be realized by a clock or the like mounted on the computer device of the information processing device 180.
  • the detection processing unit 180b executes detection processing for detecting motion-related information of the robot 100 that operates according to the operation command while the robot 100 is performing work.
  • the detection processing unit 180b outputs the detection result to the evaluation processing unit 180c.
  • The detection processing unit 180b is configured to output the detection result to the evaluation processing unit 180c without setting a delay time after detecting the motion-related information, although a delay time may be provided.
  • the detection processing unit 180b detects information of action-related elements as action-related information during execution of work.
  • the detection processing unit 180b may be configured to detect information of motion-related elements in any manner.
  • the detection processing unit 180b detects the motion of the robot 100, specifically, the motions of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110 as motion-related elements.
  • The detection processing unit 180b may be configured to process an operation command and detect the target motion of the robot 100 indicated by the operation command as a motion-related element, or may be configured to process a motion command from the motion command unit 180a and detect the target motion indicated by the motion command as a motion-related element.
  • the detection processing unit 180b may be configured to acquire drive commands from the drive command units 190a to 190e of the robot controller 190, process them, and detect target motions of the robot 100 indicated by the drive commands as motion-related elements.
  • the detection processing unit 180b may be configured to acquire feedback information from the drive command units 190a to 190e of the robot controller 190, process it, and detect the motion of the robot 100 indicated by the feedback information as a motion-related element.
  • The detection processing unit 180b may be configured to acquire captured image data from the imaging devices 144, 145, and 146 and from imaging devices outside the robot 100, process the captured image data, and detect the motion of the robot 100 appearing in the image data as a motion-related element.
  • the detection processing unit 180b detects the state of robot peripheral elements forming the peripheral environment of the robot 100 as motion-related elements. For example, the detection processing unit 180b detects the state of the robot peripheral elements using the detection results received from the external sensor of the robot 100 and the information received from the robot peripheral elements.
  • Examples of the external sensors include force sensors arranged between the end effectors 130A and 130B and the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors such as imaging devices arranged in the service providing area AS.
  • Examples of information received from the robot peripheral elements are the state, vibration, impact, position, attitude, position velocity, attitude velocity, etc. of the robot peripheral elements.
  • the detection processing unit 180b detects, as an action-related element, the state of an action object, which is an object to which the robot 100 acts.
  • the detection processing unit 180b detects the state of the action object using the detection result received from the external sensor of the robot 100 and the information received from the action object.
  • the external sensors include the force sensors of the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors arranged in the service providing area AS.
  • Examples of the information received from the action object include the detection results of sensors provided on the action object, and the state, vibration, impact, position, attitude, position velocity, attitude velocity, position acceleration, attitude acceleration, and the like of the action object.
  • the detection processing unit 180b detects the state of an object peripheral element that forms the peripheral environment of the action object as an action-related element. For example, the detection processing unit 180b detects the state of the object peripheral element using the detection result received from the external sensor of the robot 100 and the information received from the object peripheral element.
  • the external sensors include the force sensors of the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors arranged in the service providing area AS.
  • Examples of information received from the object surrounding elements are the state, vibration, impact, position, orientation, positional velocity, and orientational velocity of the object surrounding elements.
  • the detection processing unit 180b may be configured to detect motion-related elements at any timing.
  • The detection processing unit 180b may be configured to detect motion-related elements at predetermined timings, such as at predetermined time intervals, at predetermined timings for executing predetermined motions of the robot 100, or at predetermined timings in predetermined work stages. Although not limited to this, in the present embodiment the detection processing unit 180b executes the detection processing at predetermined time intervals.
  • the detection processing unit 180b is an example of a functional unit implemented by a processing circuit.
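  • A minimal sketch of detection processing executed at predetermined time intervals during the work, assuming stubbed sensor accessors; the interval value and all names are illustrative, not from the publication.

```python
# Minimal sketch: the detection processing runs at a predetermined time
# interval while the robot 100 performs the work and forwards each result
# to the evaluation processing without a delay time.
import time

DETECTION_INTERVAL_S = 0.1  # predetermined time interval (assumed value)

def detect_motion_related_info(sensors) -> dict:
    """Collect motion-related elements from internal and external sensors."""
    return {
        "robot_motion": sensors.read_joint_states(),      # internal sensors
        "object_state": sensors.read_object_state(),      # e.g. force, images
        "environment_state": sensors.read_environment(),  # robot peripheral elements
    }

def detection_loop(sensors, evaluation_unit, work_is_running) -> None:
    """Run detection processing while the work is being executed."""
    while work_is_running():
        info = detect_motion_related_info(sensors)
        evaluation_unit.submit(info)   # output immediately, without a delay time
        time.sleep(DETECTION_INTERVAL_S)
```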
  • a predetermined motion of the robot 100 is an example of a motion to be processed.
  • the unit of operation to be processed may be set arbitrarily.
  • The delimitation of one unit of the processing target motion may be a temporal delimitation such as the execution period of the motion, a delimitation by motion change such as the timing at which the motion changes, a designated delimitation, another delimitation, or a combination of two or more of these.
  • For example, one unit of the processing target motion is set for a series of motions to be executed.
  • For example, one unit of the processing target motion may include a moving motion of the robot arm 120A that moves the end effector 130A to the target object, a gripping motion of the end effector 130A that grips the target object, a moving motion of the robot arm 120A that moves the end effector 130A onto the table 140a, and a placing motion of the end effector 130A that places the target object on the table 140a.
  • One processing target motion may be set for these four motions, or two or more processing target motions may be set.
  • a work stage can be set by dividing a series of actions included in the work.
  • the unit of work steps may be set arbitrarily.
  • The division of one unit of work stage may be a temporal division such as a work execution period, a motion-change division such as the timing at which the motions included in the work are switched, a designated division, another division, or a combination of two or more of these.
  • One unit of work stage may be set to include a portion of all the motions performed in the work.
  • For example, one unit of work stage includes a process in which the robot 100 moves from a position near the user P to a position near the point A; a process in which the robot 100 grips an object at the point A with the end effector 130A and places it on the robot 100; a process of moving the robot 100 from a position near the point A to a position near the user P; and a process of handing the object on the robot 100 to the user P using the end effector 130A.
  • One work stage may be set for these four processes, or two or more work stages may be set, as in the segmentation sketch below.
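  • The segmentation referred to above might be modeled as follows, using the delivery example; the class and stage names are illustrative assumptions.

```python
# Illustrative segmentation of the delivery example into work stages and
# processing target motions; the structure and names are assumptions.
from dataclasses import dataclass

@dataclass
class ProcessingTargetMotion:
    name: str
    evaluate: bool  # True if this is an evaluation target motion

@dataclass
class WorkStage:
    name: str
    motions: list[ProcessingTargetMotion]
    evaluate: bool  # True if this is an evaluation target stage

delivery_work = [
    WorkStage("move_to_point_A",
              [ProcessingTargetMotion("drive_to_point_A", True)], True),
    WorkStage("pick_object_at_point_A", [
        ProcessingTargetMotion("move_end_effector_130A_to_object", True),
        ProcessingTargetMotion("grip_object", True),
        ProcessingTargetMotion("place_object_on_table_140a", True),
    ], True),
    WorkStage("return_to_user_P",
              [ProcessingTargetMotion("drive_to_user_P", True)], True),
    WorkStage("hand_over_object",
              [ProcessingTargetMotion("hand_object_to_user_P", True)], True),
]
```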
  • While the robot 100 is performing a task, the evaluation processing unit 180c performs evaluation processing that compares the motion-related information of the robot 100 detected by the detection processing unit 180b with the motion-related criteria for that motion-related information and evaluates it. Specifically, the evaluation processing unit 180c compares the information of the motion-related elements included in the motion-related information with the motion-related criteria of the motion-related elements, determines the evaluation results of the motion-related elements, and determines the evaluation result of the motion-related information based on the evaluation results of the motion-related elements.
  • the action-related information to be evaluated by the evaluation processing unit 180c is action-related information corresponding to the process-target action to be evaluated among the process-target actions included in the work.
  • the processing target motion to be evaluated is referred to as an "evaluation target motion".
  • the evaluation target action is one or more of all the process target actions included in the work. All of the process target actions may be the evaluation target actions, or part of all the process target actions may be the evaluation target actions. Two or more evaluation target actions may be continuous with each other or may be discontinuous with each other.
  • The evaluation processing unit 180c may determine the evaluation result of either or both of the motion-related element and the evaluation target feature based on, for example, the deviation amount, deviation direction, and deviation frequency of the evaluation target feature with respect to its criterion. Furthermore, the evaluation processing unit 180c may determine the evaluation result of the motion-related information based on the evaluation results of either or both of the motion-related element and the evaluation target feature.
  • The evaluation processing unit 180c executes the evaluation processing at either or both of the timing of each evaluation target motion and the timing of each evaluation target stage, which is a work stage to be evaluated in the work. All the work stages included in the work may be the evaluation target stages, or a part of all the work stages may be the evaluation target stages. Furthermore, the motion-related information corresponding to all the processing target motions included in an evaluation target stage may be the evaluation target for the evaluation of that stage, or the motion-related information corresponding to only some of those processing target motions may be the evaluation target.
  • In the evaluation processing for an evaluation target motion, the evaluation processing unit 180c reads the motion-related reference data of the evaluation target motion from the first storage unit 181a, compares the motion-related reference data with the motion-related information of the evaluation target motion corresponding to that reference data, and determines the evaluation result of the motion-related information. For example, the evaluation processing unit 180c may determine the evaluation level of the motion-related information. For example, the evaluation processing unit 180c may compare the criteria for the evaluation target features included in the motion-related reference data with the evaluation target features of the motion-related elements of the motion-related information, and determine the evaluation level of each evaluation target feature. The evaluation processing unit 180c may determine one or both of the evaluation level of the motion-related element and the evaluation level of the motion-related information based on the evaluation levels of the evaluation target features.
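  • As a rough illustration of comparing evaluation target features with their criteria and deriving evaluation levels, consider the following sketch; the three-level scale, thresholds, and feature names are assumptions, not values from the publication.

```python
# Hedged sketch of the evaluation processing: each evaluation target feature
# is compared with its criterion, a feature-level evaluation is derived, and
# the feature levels are combined into an evaluation level for the
# motion-related information.

def evaluate_feature(measured: float, criterion: float, tolerance: float) -> int:
    """Return an evaluation level (higher is better) from the deviation."""
    deviation = abs(measured - criterion)
    if deviation <= tolerance:
        return 3       # e.g. at the ideal level
    if deviation <= 2 * tolerance:
        return 2       # e.g. at the standard level
    return 1           # e.g. at the minimum required level

def evaluate_motion_related_info(features: dict, criteria: dict) -> int:
    """Combine feature levels, e.g. required time and trajectory error."""
    levels = [
        evaluate_feature(features[name], c["value"], c["tolerance"])
        for name, c in criteria.items()
    ]
    return round(sum(levels) / len(levels))  # simple average as the level

criteria = {"required_time_s": {"value": 5.0, "tolerance": 1.0},
            "trajectory_error_m": {"value": 0.0, "tolerance": 0.02}}
print(evaluate_motion_related_info(
    {"required_time_s": 5.5, "trajectory_error_m": 0.01}, criteria))  # 3
```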
  • For example, the evaluation processing unit 180c can execute the evaluation processing of the motion-related information corresponding to a first evaluation target motion by the timing of a second evaluation target motion, which is the next evaluation target motion after the first evaluation target motion.
  • the evaluation processing unit 180c can execute and complete the evaluation processing of the action-related information corresponding to the first evaluation target action by the timing of the second evaluation target action.
  • The timing of the second evaluation target motion may be the reference timing of the second evaluation target motion included in the motion-related reference data, the detection timing of the motion-related information corresponding to the second evaluation target motion, a combination of these timings, or the like.
  • the reference timing and detection timing may be an operation start timing, an operation completion timing, or a predetermined timing therebetween.
  • the evaluation processing unit 180c may be configured to execute the evaluation process of the action-related information corresponding to the first evaluation target action by the timing of the next process target action after the first evaluation target action.
  • the timing of the next processing target motion may be the completion timing or a predetermined timing between the start timing and the completion timing of the actual motion of the robot 100 corresponding to the processing target motion.
  • The evaluation processing unit 180c starts the evaluation processing without setting a delay time after receiving the detection result of the motion-related information corresponding to the first evaluation target motion from the detection processing unit 180b.
  • In the evaluation processing for an evaluation target stage, the evaluation processing unit 180c reads from the first storage unit 181a the motion-related reference data of the processing target motions to be evaluated among the processing target motions included in the evaluation target stage, compares the motion-related information corresponding to each processing target motion to be evaluated with its motion-related reference data, and determines the evaluation result of the motion-related information.
  • the evaluation processing unit 180c integrates the evaluation results of the motion-related information corresponding to the processing target motions of all the evaluation targets, and determines the evaluation result of the evaluation target stage based on the integration result. For example, the evaluation processing unit 180c may determine the evaluation level of the evaluation target stage. Any integration method may be used.
  • The integration method may be a method of adding the evaluation results of a plurality of pieces of action-related information, a method of weighting and then adding those evaluation results, or a method of integrating the evaluation results by other statistical processing, as illustrated in the sketch below.
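  • The weighted-addition variant might look like the following minimal sketch; the weights and levels are invented example values, not values from the specification.

```python
# Hypothetical sketch of the weighted-addition integration: combine the
# per-motion evaluation levels of a stage into one stage-level result.
# The weights and levels are invented example values.

motion_levels = {"move_arm": 4, "grip": 3, "place": 5}   # per evaluation target motion
weights = {"move_arm": 0.3, "grip": 0.5, "place": 0.2}   # e.g. gripping weighted highest

stage_score = sum(weights[m] * level for m, level in motion_levels.items())
stage_level = round(stage_score)  # map back onto the discrete level scale
print(f"stage score = {stage_score:.2f}, stage level = {stage_level}")
```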
  • The evaluation processing unit 180c can, for example, execute the evaluation processing of a first evaluation target stage by the timing of a second evaluation target stage, which is the next evaluation target stage after the first evaluation target stage.
  • the evaluation processing unit 180c can execute and complete the evaluation processing of action-related information corresponding to the process target actions of all evaluation targets included in the first evaluation target stage by the timing of the second evaluation target stage.
  • The timing of the second evaluation target stage may be the reference timing of the second evaluation target stage included in the motion-related reference data, the detection timing of the second evaluation target stage, a timing combining these, or the like.
  • the reference timing and the detection timing may be the start timing of the first target operation in the second evaluation stage, the completion timing of the last target operation, or a predetermined timing therebetween.
  • the evaluation processing unit 180c may be configured to execute the evaluation process of the first evaluation target stage by the timing of the work stage following the first evaluation target stage.
  • the timing of the next work step may be the start timing of the first target operation of the work step, the completion timing of the last target operation, or a predetermined timing between these.
  • When the evaluation processing unit 180c receives, from the detection processing unit 180b, the action-related information corresponding to the last evaluation target processing action among all the evaluation target processing actions included in the first evaluation target stage, it starts the evaluation processing without providing a delay time.
  • the evaluation processing unit 180c is an example of a functional unit implemented by a processing circuit.
  • the compensation determination unit 180d determines compensation to be given to the operator PO for the operation of the robot 100 based on the evaluation result of the evaluation processing unit 180c.
  • The compensation determination unit 180d determines a compensation corresponding to the evaluation result of the action-related information each time the action-related information corresponding to an evaluation target action is evaluated.
  • the compensation determination unit 180d determines a compensation corresponding to the evaluation result of the evaluation target stage each time the evaluation process of the evaluation target stage is performed.
  • The compensation may be a reward amount, points, a score, a rating, or the like, given to the operator PO.
  • the reward amount may be a reward amount in real currency or may be a reward amount in virtual currency.
  • The compensation determination unit 180d sequentially accumulates the individual compensations determined during the execution of the work, and calculates the accumulation result of the compensations. Each time the compensation determination unit 180d determines a compensation, it outputs information on the determined compensation and information on the accumulation result including that compensation to the presentation processing unit 180e.
  • The compensation determination unit 180d may be configured to separately accumulate the compensations for the action-related information corresponding to the evaluation target actions and the compensations for the evaluation target stages, thereby calculating two accumulation results, or may be configured to integrate and accumulate them, thereby calculating one integrated result.
  • The compensation determination unit 180d may be configured to weight the evaluation level of the action-related information corresponding to the evaluation target action and determine the compensation for the action-related information.
  • The compensation determination unit 180d may be configured to determine the compensation for the action-related information by weighting the evaluation levels of the evaluation target features of the action-related information and integrating the weighted evaluation levels, and may be configured to further weight the compensation determined for the action-related information. For example, the difficulty level of the operation differs depending on the evaluation target motion.
  • The compensation determination unit 180d may be configured to change the weighting according to the evaluation target motion.
  • The compensation determination unit 180d may be configured to change the weighting according to the evaluation target feature.
  • The compensation determination unit 180d may be configured to determine the compensation by weighting the evaluation level of the evaluation target stage.
  • The compensation determination unit 180d may determine the compensation for the evaluation target stage by weighting the evaluation levels of the action-related information corresponding to the evaluation target processing actions included in the evaluation target stage and integrating the weighted evaluation levels, and may be configured to further weight the compensation determined for the evaluation target stage.
  • The compensation determination unit 180d may be configured to change the weighting according to the evaluation target stage and the processing target action. Any integration method may be used by the compensation determination unit 180d. A hedged sketch of this compensation logic appears after the next item.
  • The compensation determination unit 180d is an example of a functional unit implemented by a processing circuit.
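  • A minimal sketch of how such compensation determination and sequential accumulation could be implemented is shown below; the points-per-level rule and the difficulty weights are assumptions for illustration only.

```python
# Hypothetical sketch of compensation determination and sequential
# accumulation. The points-per-level rule and the difficulty weights
# are assumptions for illustration only.

class CompensationAccumulator:
    def __init__(self) -> None:
        self.total = 0.0  # running accumulation during the work

    def determine(self, evaluation_level: int, difficulty_weight: float = 1.0) -> float:
        """Convert an evaluation level into points, weighted by the
        difficulty of the evaluated motion or stage, and accumulate."""
        compensation = evaluation_level * 10.0 * difficulty_weight
        self.total += compensation
        return compensation

acc = CompensationAccumulator()
for level, weight in [(4, 1.0), (3, 1.5), (5, 0.8)]:  # successive evaluations
    points = acc.determine(level, weight)
    print(f"earned {points:.0f} points, cumulative {acc.total:.0f}")
```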
  • The presentation processing unit 180e executes presentation processing for causing the presentation device 203, which receives information presentation while the operator PO operates the operation input device 201, to present the evaluation results of the evaluation processing unit 180c during the execution of the work.
  • The presentation processing unit 180e causes the presentation device 203 to present the compensation determined by the compensation determination unit 180d and the accumulated result of the compensations during the execution of the work.
  • The presentation processing unit 180e generates data including the compensation and the accumulation result received from the compensation determination unit 180d, and transmits the data and a presentation command for the data to the operation terminal 200.
  • The presentation processing unit 180e may generate data indicating the compensation and the accumulation result that includes image data, audio data, character string data, or a combination of two or more of these, and transmit the generated data to the operation terminal 200.
  • The presentation processing unit 180e transmits the evaluation results of the action-related information corresponding to the evaluation target motions to the operation terminal 200 in the same order as the execution order of the evaluation target motions, and causes the presentation device 203 to present them. The evaluation results may include either or both of an evaluation level and a compensation.
  • the presentation processing unit 180e can execute the presentation processing of the evaluation result of the action-related information corresponding to the first evaluation target action by the timing of the next second evaluation target action.
  • The timing of the second evaluation target action here may be any one of the timings exemplified for the timing of the second evaluation target action of the evaluation processing unit 180c, and may be the same as or different from the timing used by the evaluation processing unit 180c.
  • The presentation processing unit 180e transmits the evaluation results of the evaluation target stages to the operation terminal 200 in the same order as the execution order of the evaluation target stages, and causes the presentation device 203 to present them. The evaluation results may include either or both of an evaluation level and a compensation.
  • the presentation processing unit 180e can execute the presentation processing of the evaluation result of the first evaluation target stage by the timing of the next second evaluation target stage.
  • The timing of the second evaluation target stage here may be any of the timings exemplified for the timing of the second evaluation target stage of the evaluation processing unit 180c, and may be the same as or different from the timing used by the evaluation processing unit 180c.
  • the presentation processing unit 180e is an example of a functional unit implemented by a processing circuit.
  • The accumulation processing unit 180f associates the action-related reference data, the evaluation result information of the action-related information based on that reference data, and the execution date and time of the action of the robot 100 corresponding to the action-related information, and stores them in the second storage unit 181b as accumulated data.
  • the accumulation processing unit 180f causes the second storage unit 181b to store the accumulated data regarding the motion to be evaluated and the accumulated data regarding the stage to be evaluated.
  • As the accumulated data on the evaluation target motion, the accumulation processing unit 180f may store, in the second storage unit 181b, information on the evaluation target motion, information on the evaluation result of the motion-related information corresponding to the evaluation target motion, and the motion-related reference data used in the evaluation processing, in association with one another.
  • the evaluation result information may include either or both of the evaluation level and the consideration.
  • As the accumulated data on the evaluation target stage, the accumulation processing unit 180f may store, in the second storage unit 181b, two or more of the following in association with one another: information on the evaluation target stage; information on the evaluation target processing actions included in the evaluation target stage; the evaluation result information of the action-related information corresponding to those processing actions; the evaluation result information of the evaluation target stage; and the action-related reference data used in the evaluation processing of those evaluation results. The evaluation result information may include either or both of the evaluation level and the compensation. A hedged sketch of such a record follows.
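  • For illustration, one such associated record could be modeled as below; all field names are hypothetical, and the list stands in for the second storage unit 181b.

```python
# Hypothetical sketch of one accumulated-data record. All field names
# are assumptions; the list stands in for the second storage unit 181b.

from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AccumulatedRecord:
    target_id: str            # evaluation target motion or stage
    evaluation_level: int     # evaluation result
    compensation: float       # compensation determined for the result
    reference_data: dict      # action-related reference data used
    executed_at: datetime = field(default_factory=datetime.now)

storage: list[AccumulatedRecord] = []
storage.append(AccumulatedRecord(
    target_id="grip_article_W1",
    evaluation_level=4,
    compensation=40.0,
    reference_data={"required_time_s": (3.0, 1.0)},
))
print(storage[0])
```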
  • the reference determination unit 180g uses the accumulated data in the second storage unit 181b to determine new action-related reference data, and stores the action-related reference data in the first storage unit 181a.
  • The reference determination unit 180g determines new second action-related reference data using the first action-related reference data included in the accumulated data of the second storage unit 181b and the evaluation results based on the first action-related reference data.
  • An evaluation result based on the first action-related reference data is an evaluation result determined using the first action-related reference data.
  • the reference determination unit 180g can accept specified conditions for determining the second action-related reference data.
  • the reference determination unit 180g receives a specified condition from the operation terminal 200.
  • the specified condition may be a condition for limiting the first action-related reference data and evaluation results used to determine the second action-related reference data.
  • The reference determination unit 180g extracts, from the data stored in the second storage unit 181b, the information of the evaluation results that satisfy the specified condition and the first action-related reference data, which is the action-related reference data used for those evaluation results. The second action-related reference data may be determined using the extracted evaluation result information and the first action-related reference data.
  • the specified condition may be a condition that limits the attributes of the evaluation results for which the first action-related reference data is used.
  • The attributes of the evaluation results may include the user targeted by the evaluation result, the robot targeted by the evaluation result, the task targeted by the evaluation result, the workplace targeted by the evaluation result, the work environment targeted by the evaluation result, the work execution date targeted by the evaluation result, and the work execution period targeted by the evaluation result.
  • the specified condition may be a condition that limits one or more evaluation result attributes.
  • the reference determination unit 180g can use the evaluation result of a specific user and the first action-related reference data to decide the second action-related reference data corresponding to the user. For example, if the user's evaluation level is low, the reference determination unit 180g may generate second action-related reference data with a lower reference than the first action-related reference data.
  • the user can command the information processing device 180 via the operation terminal 200 to request evaluation using the second motion-related reference data. As a result, the user can obtain a higher evaluation from the information processing device 180, and can be motivated to operate the robot.
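  • A possible form of this reference relaxation, as a hedged sketch: the 10%-per-level widening rule and the representation of the reference data as (target, tolerance) pairs are invented for illustration.

```python
# Hypothetical sketch of relaxing a reference for a low-rated user.
# The 10%-per-level widening rule and the (target, tolerance) form of
# the reference data are invented for illustration.

def adjust_reference(first_reference: dict, mean_user_level: float,
                     nominal_level: float = 4.0) -> dict:
    """Widen each tolerance in proportion to how far the user's mean
    evaluation level falls below a nominal level."""
    relax = 1.0 + max(0.0, nominal_level - mean_user_level) * 0.10
    return {feature: (target, tolerance * relax)
            for feature, (target, tolerance) in first_reference.items()}

first = {"required_time_s": (3.0, 1.0), "trajectory_error_m": (0.0, 0.05)}
second = adjust_reference(first, mean_user_level=2.5)
print(second)  # tolerances widened by 15% for this user
```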
  • the reference determination unit 180g is an example of a functional unit implemented by a processing circuit.
  • The first drive command unit 190a generates a drive command for driving the joints of the arm drive devices M1A to M4A of the robot arm 120A to the states of the target position command and the target force command included in the first motion command, and outputs it to the drive circuit 142a.
  • the drive command includes a current command value for the servomotor.
  • the first drive command section 190a generates a drive command using feedback information of the servomotor.
  • The second drive command unit 190b generates a drive command for driving the joints of the arm drive devices M1B to M4B of the robot arm 120B to the states of the target position command and the target force command included in the second motion command, and outputs it to the drive circuit 142a.
  • the drive command includes a current command value for the servomotor.
  • the second drive command section 190b generates a drive command using feedback information of the servomotor.
  • the third drive command section 190c generates a drive command for driving the drive device of the end effector 130A to the target state included in the third motion command, and outputs it to the drive circuit 142a.
  • the fourth drive command section 190d generates a drive command for driving the drive device of the end effector 130B to the target state included in the fourth operation command, and outputs it to the drive circuit 142a.
  • The fifth drive command unit 190e generates a drive command for causing the carrier drive device 113 of the carrier 110 to drive the drive wheels 111 to the state of the target position command included in the fifth operation command, and outputs it to the drive circuit 142a. The drive command includes a current command value for the servomotor.
  • the fifth drive command section 190e generates a drive command using feedback information of the servomotor.
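  • The specification does not give the servo control law, but as a rough sketch, a drive command unit of this kind might map the position error taken from servo feedback into a clamped current command value, for example proportionally:

```python
# Rough sketch only: the specification does not give the control law.
# A drive command unit of this kind might map the position error from
# servo feedback to a clamped current command value, e.g. proportionally.

def current_command(target_pos: float, feedback_pos: float,
                    kp: float = 5.0, limit: float = 10.0) -> float:
    """Proportional position error -> motor current command, clamped to
    the drive circuit's allowable range. Gains are invented values."""
    current = kp * (target_pos - feedback_pos)
    return max(-limit, min(limit, current))

print(current_command(target_pos=1.2, feedback_pos=1.0))  # -> 1.0
```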
  • FIG. 5 is a flow chart showing an example of the operation of the robot system 1 according to the embodiment.
  • FIG. 6 is a plan view showing an example of the operation of the robot system 1 according to the embodiment.
  • the information processing apparatus 180 executes the evaluation process and the presentation process each time the evaluation target motion is performed and each time the evaluation target stage is performed.
  • the operator PO inputs a request to operate the robot to the operation terminal 200, and the operation terminal 200 transmits the request to the server 300 (step S101).
  • the server 300 searches for the robot 100 that can be operated by the operator PO, and connects the information processing device 180 of the searched robot 100 and the operation terminal 200 via the communication network N (step S102).
  • When the operator PO receives the connection completion notification from the server 300, the operator PO inputs a robot operation execution start command to the operation terminal 200. The operation terminal 200 transmits the command to the information processing device 180.
  • the information processing device 180 activates the robot 100 so that it can operate according to the operation command received from the operation terminal 200 .
  • the information processing device 180 turns ON the work execution flag for the service of the robot 100, that is, starts work (step S103).
  • the operator PO inputs into the operation terminal 200 the specification conditions of the motion-related reference data used for evaluating the robot operation (step S104).
  • the operation terminal 200 transmits a command for the specified condition to the information processing device 180 .
  • the specified condition may be a condition that specifies standard action-related reference data or a condition that specifies attributes of evaluation results to be used.
  • the operator PO inputs a specified condition that designates the use of the evaluation results for the operator PO.
  • The information processing device 180 generates new action-related reference data using the specified condition received from the operation terminal 200 and the accumulated data accumulated in the information processing device 180, and determines to use that action-related reference data for evaluation (step S105).
  • the action-related reference data is action-related reference data adjusted in accordance with the evaluation result of the operator PO.
  • the operator PO searches for the user P requesting the service within the service providing area AS while checking the screen of the presentation device 203 that displays the captured image of the imaging device 145 and the like.
  • When the operator PO finds the user P, the operator PO operates the operation terminal 200 while visually recognizing the screen of the presentation device 203 that displays the captured images of one or both of the imaging devices 145 and 146, and causes the robot 100 to travel to a position near the user P (step S106).
  • the information processing device 180 detects motion-related information of the robot 100 at predetermined time intervals while the robot 100 is running, and executes evaluation processing and presentation processing of the motion-related information each time it is detected (step S107).
  • each piece of action-related information at a predetermined time interval corresponds to an evaluation target action.
  • The information processing device 180 causes the presentation device 203 to display the compensation for the action-related information and the accumulated result of the compensations.
  • the information processing device 180 evaluates the evaluation target features such as the running speed, running acceleration, running trajectory, and interference with the robot peripheral elements for the motion-related information.
  • The information processing apparatus 180 executes the detection process, the evaluation process, and the presentation process without providing a delay time between them. The operator PO can therefore check the evaluation result of each evaluation target motion substantially in real time, without delay from the time when the operation for that motion is input to the operation terminal 200. A hedged sketch of this pipelining follows.
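  • In the loop below, detection, evaluation, and presentation run back to back at each sampling interval with no added delay between the three steps; detect, evaluate, and present are placeholders for the processing described above, and the sampling period is an assumed value.

```python
# Hypothetical sketch of the no-added-delay pipeline: at each sampling
# interval, detection, evaluation, and presentation run back to back.

import time

def detect() -> dict:
    return {"travel_speed_mps": 0.85}   # stands in for the detection process

def evaluate(info: dict) -> int:
    return 4                            # stands in for the evaluation process

def present(level: int) -> None:
    print(f"level={level}")             # stands in for the presentation device

SAMPLE_PERIOD_S = 0.5
for _ in range(3):                      # while the robot is running
    t0 = time.monotonic()
    present(evaluate(detect()))         # no delay time between the steps
    time.sleep(max(0.0, SAMPLE_PERIOD_S - (time.monotonic() - t0)))
```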
  • The information processing device 180 executes the evaluation processing and presentation processing of the first work stage, which is the travel process from the starting position to the position near the user P, and then causes the presentation device 203 to display the integrated result (step S108).
  • When the operator PO confirms via the screen of the presentation device 203 that the robot 100 has arrived at the position near the user P, the operator PO makes an input to the operation terminal 200 and inquires of the user P about the requested article W via output from either or both of the display device 148 and the voice output device 149 of the robot 100 (step S109).
  • The operator PO inputs the requested article W1 to the operation terminal 200, and the operation terminal 200 requests the information processing device 180 for information on the position of the requested article W1 on the storage shelf SR.
  • the information processing device 180 can store information on the service providing area AS where the robot 100 is arranged and information on the service provided, in addition to the information on the robot 100 on which the information processing device 180 is mounted.
  • the operation terminal 200 receives the information on the position PW1 of the requested item W1 from the information processing device 180 and presents it to the operator PO (step S110).
  • the server 300 may hold information on the service providing area AS and information on the service provided, and the operation terminal 200 may request the information on the position of the requested article W1 from the server 300 .
  • The operator PO operates the operation terminal 200 while visually recognizing the screen of the presentation device 203 that displays the information of the position PW1 and the captured images of one or both of the imaging devices 145 and 146, and causes the robot 100 to travel to a position near the position PW1 (step S111).
  • the information processing device 180 detects motion-related information of the robot 100 at predetermined time intervals while the robot 100 is running, and executes evaluation processing and presentation processing of the motion-related information each time detection is performed. (step S112).
  • As in step S108, when the robot 100 reaches the position near the position PW1, the information processing device 180 executes the evaluation processing and presentation processing of the second work stage, which is the traveling process from the position near the user P to the position near the position PW1 (step S113).
  • When the operator PO confirms through the screen of the presentation device 203 that the robot 100 has arrived at the position near the position PW1, the operator PO inputs operations of the robot arms 120A and 120B and the end effectors 130A and 130B to the operation terminal 200 (step S114). This operation takes out the requested article W1 stored in the storage shelf SR using the end effectors 130A and 130B and places the requested article W1 on the table 140a of the robot 100.
  • the operator PO inputs an operation to the operation terminal 200 while viewing the screen of the presentation device 203 that displays the captured images of the imaging devices 144 and 145 .
  • the information processing device 180 detects the motion-related information of the robot 100 during the motion of the robot 100, and every time the motion-related information corresponding to the motion to be evaluated is detected, the motion-related information is evaluated and presented (step S115).
  • The motions to be evaluated are one or more of the processing target motions including: the movement motions of the robot arms 120A and 120B that move the end effectors 130A and 130B to the requested article W1; the gripping motions of the end effectors 130A and 130B that grip the requested article W1; the movement and placement motions of the robot arms 120A and 120B that move the requested article W1 onto the table 140a and place it there; and the grip release motions of the end effectors 130A and 130B that release the grip of the requested article W1. In this example, all the processing target motions are evaluation target motions.
  • the information processing device 180 evaluates the evaluation target feature for the action-related information corresponding to each evaluation target action.
  • The evaluation target features may include one or more of: the time required for the motion of the robot 100; the positions, postures, position movement speeds, posture movement speeds, position accelerations, posture accelerations, and movement trajectories of the end effectors 130A and 130B; the position, posture, and state of the requested article W1; the force and impact exerted by the robot 100 on the requested article W1; the contact, force, and impact exerted by the robot 100 on object peripheral elements; and interference between the robot 100 and robot peripheral elements.
  • the object peripheral elements may be shelves of the storage shelf SR, other articles, and the like.
  • The accelerations of the end effectors 130A and 130B can indicate, for example, the impact that the end effectors 130A and 130B give to the requested article W1.
  • The information processing device 180 may be configured to process an image of the requested article W1 captured by either or both of the imaging devices 144 and 145, detect the holding state of the requested article W1 gripped by the end effectors 130A and 130B, and evaluate the holding state as an evaluation target feature.
  • The information processing device 180 may be configured to process an image of the requested article W1 on the table 140a captured by the imaging device 144 or the like, detect the placement state of the requested article W1, and evaluate the placement state as an evaluation target feature.
  • The information processing apparatus 180 executes the evaluation processing and presentation processing of the third work stage, which is the take-out operation process from the arrival of the robot 100 at the position near the position PW1 to the completion of the grip release, and causes the presentation device 203 to display the compensation and the accumulated result (step S116).
  • When the operator PO confirms through the screen of the presentation device 203 that the requested article W1 is placed on the table 140a, the operator PO operates the operation terminal 200 while viewing the screen of the presentation device 203 that displays the captured images of one or both of the imaging devices 145 and 146, and causes the robot 100 to travel to a position near the user P (step S117).
  • the information processing device 180 detects motion-related information of the robot 100 at predetermined time intervals while the robot 100 is running, and executes evaluation processing and presentation processing of the motion-related information each time detection is performed. (step S118).
  • As in step S108, when the robot 100 reaches the position near the user P, the information processing device 180 executes the evaluation processing and presentation processing of the fourth work stage, which is the traveling process from the position near the position PW1 to the position near the user P (step S119).
  • The operator PO then inputs operations of the robot arms 120A and 120B and the end effectors 130A and 130B to the operation terminal 200 (step S120). This operation hands over the requested article W1 placed on the table 140a to the user P using the end effectors 130A and 130B.
  • the operator PO inputs an operation to the operation terminal 200 while viewing the screen of the presentation device 203 that displays the captured images of the imaging devices 144 and 145 .
  • the information processing device 180 detects the motion-related information of the robot 100 during the motion of the robot 100, and every time the motion-related information corresponding to the motion to be evaluated is detected, the motion-related information is evaluated and presented (step S121).
  • The processing target motions include the movement motions of the robot arms 120A and 120B that move the end effectors 130A and 130B to the requested article W1, the gripping motions of the end effectors 130A and 130B that grip the requested article W1, the movement motions of the robot arms 120A and 120B that move the requested article W1 toward the user P and present it, and the grip release motions of the end effectors 130A and 130B that release the grip of the requested article W1 in accordance with the holding state of the requested article W1 by the user P. One or more of these processing target motions are evaluation target motions.
  • the information processing device 180 evaluates the evaluation target feature for the action-related information corresponding to each evaluation target action.
  • The evaluation target features may include one or more of: the time required for the motion of the robot 100; the positions, postures, position movement speeds, posture movement speeds, position accelerations, posture accelerations, and movement trajectories of the end effectors 130A and 130B; the position, posture, and state of the requested article W1; the force and impact exerted by the robot 100 on the requested article W1; and the contact, force, and impact exerted by the robot 100 on object peripheral elements such as the user P.
  • The information processing device 180 may be configured to process an image of the requested article W1 captured by either or both of the imaging devices 144 and 145, detect the holding state of the requested article W1 gripped by the end effectors 130A and 130B, and evaluate the holding state as an evaluation target feature.
  • The information processing apparatus 180 executes the evaluation processing and presentation processing of the fifth work stage, which is the hand-over operation process from the arrival of the robot 100 at the position near the user P to the completion of the grip release, and causes the presentation device 203 to display the compensation and the accumulated result (step S122).
  • When the operator PO ends the robot operation, the operator PO inputs an operation end command to the operation terminal 200.
  • When the information processing apparatus 180 receives the operation end command from the operation terminal 200 (Yes in step S123), the information processing apparatus 180 ends the series of processes.
  • the information processing device 180 turns off the work execution flag and stops the robot 100 .
  • the server 300 cuts off the connection between the operation terminal 200 and the information processing device 180 .
  • the information processing apparatus 180 returns to step S106 when not receiving an operation end command (No in step S123).
  • Throughout the series of processes, the information processing device 180 stores and accumulates, in association with one another, the evaluation results of the action-related information corresponding to the evaluation target actions, the evaluation results of the work stages, the compensations, the accumulated compensations, and the action-related reference data used for the evaluations.
  • The operator PO can operate the robot 100 while confirming the evaluation of the operator PO's own operation through the compensation and the accumulated result of the compensations each time an evaluation target motion or a work stage is performed. As a result, the operator PO can find concrete measures for improving the skill of operating the robot, and can be motivated to improve that skill in order to obtain more compensation.
  • This modification differs from the embodiment in that the robot system includes a learning device 400 .
  • this modified example will be described with a focus on the points that are different from the embodiment, and the description of the points that are the same as the embodiment will be omitted as appropriate.
  • FIG. 7 is a block diagram showing an example of functional configurations of an information processing device 180A and a learning device 400 according to a modification.
  • the information processing device 180A further includes a data input/output unit 180i as a functional component.
  • The data input/output unit 180i inputs and outputs data to and from the learning device 400.
  • The data input/output unit 180i outputs the accumulated data stored in the second storage unit 181b to the learning device 400.
  • The data input/output unit 180i requests action-related reference data from the learning device 400 in response to a request from the reference determination unit 180g.
  • The data input/output unit 180i receives the specified condition and the information of the evaluation results satisfying the specified condition from the reference determination unit 180g and sends them to the learning device 400.
  • the learning device 400 sends the action-related reference data corresponding to the specified condition and the evaluation result to the data input/output unit 180i.
  • The learning device 400 includes a computer device, similar to the information processing device 180A. Although not limited to this, in this modification the learning device 400 is a device separate from the information processing device 180A and the robot controller 190, and is connected to the information processing device 180A so as to be capable of data communication via wired communication, wireless communication, or a combination thereof. Any wired or wireless communication may be used.
  • the learning device 400 is mounted on the robot 100 together with the information processing device 180A.
  • the learning device 400 may be incorporated in the information processing device 180A or the robot controller 190.
  • the learning device 400 and the information processing device 180A form an information processing system 500.
  • Such a learning device 400 may include processing circuitry including a processor and memory.
  • the learning device 400 may be configured to input/output data with the information processing device 180A via a storage medium.
  • Storage media may include semiconductor-based or other integrated circuits (ICs), hard disk drives (HDDs), hybrid hard disk drives (HHDs), optical disks, optical disk drives (ODDs), magneto-optical disks, magneto-optical drives, floppy disk drives (FDDs), magnetic tape, solid state drives (SSDs), RAM drives, secure digital cards or drives, any other suitable storage medium, or a combination of two or more of these.
  • the learning device 400 is mounted on the robot 100, but may be arranged outside the robot 100 and connected to the information processing device 180A via the communication network N.
  • the learning device 400 may be configured to connect with the information processing devices 180A of two or more robots 100 .
  • the learning device 400 may be placed in the service provision area AS, may be placed in the operation area AO, or may be placed elsewhere.
  • the learning device 400 may be arranged at the location of the server 300 and configured to perform data communication with the information processing device 180A via the server 300 .
  • the learning device 400 may be incorporated into the server 300 .
  • the learning device 400 includes a data input/output unit 401, a learning data storage unit 402, a machine learning unit 403, an input data processing unit 404, and an output data processing unit 405 as functional components.
  • the function of the learning data storage unit 402 is implemented by a memory, a storage, or a combination thereof that may be included in the learning device 400, and the functions of functional components other than the learning data storage unit 402 are implemented by a processor or the like included in the learning device 400. Realized.
  • the machine learning unit 403 is an example of a functional unit implemented by a machine learning circuit
  • the learning data storage unit 402 is an example of a functional unit implemented by a second storage device.
  • the data input/output unit 401 receives accumulated data from the data input/output unit 180i of the information processing device 180A, and stores the accumulated data in the learning data storage unit 402 as machine learning data.
  • The data input/output unit 401 receives, from the data input/output unit 180i, a request command for motion-related reference data, information on the specified condition, and information on the evaluation results satisfying the specified condition, and outputs them to the input data processing unit 404.
  • The data input/output unit 401 receives the action-related reference data from the output data processing unit 405 and sends the action-related reference data to the data input/output unit 180i.
  • The input data processing unit 404 converts the specified condition information and evaluation result information received from the data input/output unit 401 into data that can be input to the machine learning model of the machine learning unit 403, and outputs the data to the machine learning unit 403.
  • The output data processing unit 405 converts the output data of the machine learning unit 403 into motion-related reference data and outputs the data to the data input/output unit 401.
  • the machine learning unit 403 includes a machine learning model.
  • the machine learning unit 403 causes the machine learning model to perform machine learning using the machine learning data, and improves the accuracy of the output data with respect to the input data in the machine learning model.
  • Machine learning models may include neural networks, random forests, genetic programming, regression models, tree models, Bayesian models, time series models, clustering models, ensemble learning models, and the like, but in this modification the machine learning model includes a neural network.
  • A neural network includes multiple node layers, including an input layer and an output layer. Each node layer contains one or more nodes. When a neural network includes an input layer, an intermediate layer, and an output layer, it performs, on information input to the nodes of the input layer, output processing from the input layer to the intermediate layer, and then output processing from the intermediate layer to the output layer.
  • Each node in one layer is connected to each node in the next layer, and the connections between nodes are weighted.
  • the information of the nodes of one layer is weighted by the connection between nodes and output to the nodes of the next layer.
  • The machine learning model uses condition-related information related to the specified conditions and evaluation-related information related to the evaluation results as input data, and outputs, as output data, reference-related information related to the action-related reference data corresponding to the condition-related information and the evaluation-related information.
  • the condition-related information may be information indicating either or both of the specified condition and the attribute of the evaluation result, and the evaluation-related information may be information indicating the evaluation result.
  • In machine learning, the machine learning model uses the condition-related information and evaluation-related information included in the machine learning data as input data, and uses, as teacher data, the reference-related information related to the action-related reference data corresponding to the condition-related information and the evaluation-related information.
  • the reference-related information may be information indicating motion-related reference data.
  • The machine learning unit 403 compares the output data that the machine learning model produces when the machine learning input data is input with the teacher data corresponding to that input data, and adjusts the weights of the connections between nodes in the neural network so as to minimize the error between them.
  • When given input data, the weight-adjusted machine learning model can output reference-related information appropriate to that input. A toy sketch of this error-minimizing weight adjustment follows.
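  • As a toy illustration of this training process (not the patent's model), the following sketch trains a one-hidden-layer network with gradient descent on invented data mapping an evaluation level to a criterion value:

```python
# Toy sketch of error-minimizing weight adjustment (not the patent's
# model): a one-hidden-layer network trained by gradient descent on
# invented data mapping an evaluation level to a criterion value.

import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(1.0, 5.0, size=(64, 1))                    # evaluation-related input
Y = 1.5 - 0.25 * X + 0.05 * rng.standard_normal((64, 1))   # teacher data

W1 = rng.standard_normal((1, 8)) * 0.5; b1 = np.zeros(8)   # input -> hidden weights
W2 = rng.standard_normal((8, 1)) * 0.5; b2 = np.zeros(1)   # hidden -> output weights
lr = 0.01

for _ in range(2000):
    H = np.tanh(X @ W1 + b1)            # hidden layer output
    err = (H @ W2 + b2) - Y             # output minus teacher data
    # Backpropagate the squared error and adjust the connection weights.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    gH = (err @ W2.T) * (1.0 - H ** 2)
    gW1 = X.T @ gH / len(X); gb1 = gH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

H = np.tanh(X @ W1 + b1)
print(float((((H @ W2 + b2) - Y) ** 2).mean()))            # loss after training
```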
  • the learning device 400 as described above can output action-related reference data more suitable for the specified condition and the evaluation result than the action-related reference data determined by the reference determination unit 180g of the embodiment.
  • the machine learning model of the machine learning unit 403 may include the function of the output data processing unit 405 and may be configured to output the same output data as the output data processing unit 405 does.
  • the work of the robot 100 targeted by the robot system 1 is work related to the service industry, but is not limited to this.
  • the robot system 1 may target any work.
  • the robot system 1 may be intended for tasks related to industry, agriculture, and other industries.
  • the robot system 1 includes the robot 100, but the robot included in the robot system 1 may be any robot capable of performing work.
  • the robot may have a moving device with a configuration different from that of the carrier 110, and may have legs or the like.
  • the robot does not have to be equipped with a mobile device.
  • the robot may have components other than the robot arm, and may be, for example, a humanoid robot, an animal robot, or the like.
  • In the embodiment, the information processing devices 180 and 180A detect external elements other than the robot 100, such as robot peripheral elements, objects, and object peripheral elements, from captured images. However, the information processing devices 180 and 180A may be configured to detect the state of an external element using either or both of the detection result of an external sensor, which is a sensor arranged separately from the external element, and the detection result of a mounted sensor, which is a sensor arranged on the external element.
  • the external sensor may be configured to detect the position, orientation, etc. of the external element from the outside.
  • the external sensor is configured to perform detection using light waves, lasers, magnetism, radio waves, electromagnetic waves, ultrasonic waves, or a combination of two or more of these, and the like.
  • the on-board sensor may be configured to move with the external element, detect the position, orientation, etc. of the external element.
  • onboard sensors may include acceleration sensors, angular rate sensors, magnetic sensors, GPS (Global Positioning System) receivers, or combinations of two or more of these, and the like.
  • the information processing devices 180 and 180A may be configured to use AI (Artificial Intelligence) for processing.
  • AI can be used for the processing of determining the evaluation of the motion to be processed, the processing of determining the evaluation of the work stage, the processing of calculating the price and the accumulated result, image processing, and the like.
  • AI may include learning models that perform machine learning.
  • a learning model may include a neural network.
  • the server 300 may be configured to have at least some of the functions of the information processing devices 180 and 180A and the learning device 400.
  • server 300 may be configured to have the functionality of one or more of the functional components included in information processors 180 and 180A shown in FIGS.
  • Server 300 may be configured to have the functionality of one or more of the functional components included in learning device 400 shown in FIG.
  • An information processing device according to one aspect of the present disclosure is an information processing device that evaluates a user's operation of a robot for causing the robot to perform a task, the information processing device including a processing circuit. The processing circuit executes, during the execution of the work: detection processing for detecting motion-related information, which is information related to the motion of the robot that operates in accordance with an operation command received via an input from the user to an operation device; evaluation processing for evaluating the motion-related information by comparing it with an action-related criterion; and presentation processing for presenting the evaluation result of the evaluation processing on a presentation device that receives information presentation while the user operates the operation device.
  • The processing circuit executes the evaluation processing at either or both of a first timing each time an evaluation target motion, which is a motion of the robot to be evaluated, is performed, and a second timing each time an evaluation target stage, which is a work stage to be evaluated in the work, is performed.
  • The processing circuit causes the presentation device to present the evaluation results regarding the evaluation target motions in the same order as the execution order of the evaluation target motions. The processing circuit causes the presentation device to present the evaluation results regarding the evaluation target stages in the same order as the execution order of the evaluation target stages.
  • According to the above aspect, the information processing apparatus performs evaluation processing on either or both of the evaluation target motion and the evaluation target stage, at either or both of the first timing each time the evaluation target motion is performed and the second timing each time the evaluation target stage is performed, and presents the evaluation results to the user via the presentation device.
  • the user can check the evaluation of the operation at the above timing. Furthermore, the user can confirm the evaluation for the operation in the same order as the operation order. As a result, the user can recognize the evaluated operation and the evaluation result of the operation, so that the user can be motivated to improve the operation accuracy when performing the evaluation target operation next time. Therefore, the information processing device can increase the user's desire to operate the robot.
  • When the processing circuit executes the evaluation processing at the first timing, the processing circuit may execute the evaluation processing related to a first evaluation target action and the presentation processing of the evaluation result related to the first evaluation target action by the timing of a second evaluation target action following the first evaluation target action.
  • the information processing device executes the evaluation process and the presentation process regarding the evaluation target motion by the timing of the next evaluation target motion. Therefore, the user can accurately recognize the evaluation target action that has been evaluated and the evaluation result regarding the evaluation target action.
  • When the processing circuit executes the evaluation processing at the second timing, the processing circuit may execute the evaluation processing related to a first evaluation target stage and the presentation processing of the evaluation result related to the first evaluation target stage by the timing of a second evaluation target stage following the first evaluation target stage.
  • the information processing device executes the evaluation process and the presentation process for the evaluation target stage by the timing of the next evaluation target stage. Therefore, the user can accurately recognize the evaluation target stage that has been evaluated and the evaluation result related to the evaluation target stage.
  • The processing circuit may further determine, based on the evaluation result, a compensation to be given to the user for operating the robot, and may cause the presentation device to present the accumulated result of the compensations during the execution of the work.
  • According to the above aspect, the information processing device presents to the user, via the presentation device, the compensations indicating the individual evaluation results and the accumulated result of the compensations. The compensation may be a reward amount, points, a score, a rating, or the like.
  • The processing circuit may detect, as the motion-related information, one or more of: the operation command; a motion command generated for the robot to cause the robot to operate according to the operation command; an actual operation result of the robot; a state of the surrounding environment of the robot; a state of an object on which the robot acts; and a state of the surrounding environment of the object.
  • the information processing device can perform evaluation processing based on various information.
  • The processing circuit may compare two or more evaluation target features, which are features for evaluating the action-related information, with the criteria of the two or more evaluation target features included in the action-related criterion, and may evaluate the action-related information based on the evaluation results of the two or more evaluation target features.
  • the evaluation method in the information processing device can be clear and concise. Therefore, it is possible to reduce the processing amount of the evaluation process and speed up the process.
  • the features to be evaluated may include features such as required time, position, orientation, trajectory, position velocity, orientation velocity, position acceleration, and orientation acceleration for the motion of the robot.
  • the evaluation target features may include features such as force and impact applied by the robot to the surrounding environment of the robot, and features such as changes in the position, posture, shape and state of the surrounding environment applied by the robot.
  • The evaluation target features may include features such as the position, posture, position velocity, posture velocity, position acceleration, and posture acceleration of the object on which the robot acts; features such as shape changes, state changes, and workmanship of that object; features such as the force and impact exerted by the robot on the surrounding environment of the object; and features such as changes in the position, posture, shape, and state of that surrounding environment caused by the robot.
  • the information processing apparatus may determine the evaluation result of the motion-related information by adding weights to the evaluation results of each of the two or more evaluation target features.
  • The information processing apparatus may further include a first storage device that stores the information of the action-related criteria and the evaluation results of the action-related information based on the action-related criteria in association with each other, and the processing circuit may determine information of a second action-related criterion using the information of a first action-related criterion and the information of the evaluation results based on the first action-related criterion stored in the first storage device.
  • the information processing device can newly determine the action-related criteria according to the evaluation results accumulated in the first storage device.
  • the first storage device may be included in the processing circuitry or separate from the processing circuitry.
  • The processing circuit may extract, from the information accumulated in the first storage device, the information of the evaluation results that satisfy a specified condition and the information of the first action-related criterion, which is the action-related criterion used for those evaluation results, and may determine the information of the second action-related criterion using the extracted information of the evaluation results and the information of the first action-related criterion.
  • the information processing device can determine a new action-related criterion using evaluation results that satisfy specified conditions.
  • the information processing device can determine action-related criteria for various purposes.
  • the designated condition may be a condition that designates evaluation results corresponding to an object to be acted upon by the robot, the robot, the work, the work place, the work environment, the work execution date, the work execution period, and the like.
  • A learning device according to one aspect of the present disclosure includes: a second storage device that stores, as machine learning data, the information of the action-related criteria and the information of the evaluation results of the action-related information based on the action-related criteria in the information processing device according to one aspect of the present disclosure; and a machine learning circuit that performs machine learning using the information of the evaluation results included in the machine learning data as input data and the information of the action-related criteria included in the machine learning data and used for those evaluation results as teacher data. After machine learning, the machine learning circuit uses evaluation information corresponding to the information of an evaluation result as input data, and outputs, as output data, the information of the action-related criterion corresponding to the evaluation information.
  • According to the above aspect, when the learning device after machine learning receives input of evaluation information corresponding to evaluation results, it can output action-related criteria suitable for the evaluation information. For example, when evaluation information corresponding to a high evaluation result is input, the learning device outputs a simple, low-level action-related criterion with which a high evaluation is likely to be obtained. When evaluation information corresponding to a low evaluation result is input, the learning device outputs a high-level action-related criterion with which a high evaluation is difficult to obtain. By using the learning device, the user can acquire an action-related criterion of the desired difficulty level and use it for operating the robot.
  • The second storage device may be separate from the machine learning circuit, the processing circuit, and the first storage device; may be included in the machine learning circuit, the processing circuit, or the first storage device; or may include the first storage device.
  • An information processing system according to one aspect of the present disclosure includes the information processing device according to one aspect of the present disclosure and the learning device according to one aspect of the present disclosure, wherein the learning device receives input data from the operation device or the information processing device and outputs output data corresponding to the input data to the information processing device.
  • the information processing system can obtain the same effects as the information processing device and the learning device according to one aspect of the present disclosure.
  • the information processing system can use the action-related criteria output from the learning device to perform evaluation processing in the information processing device.
  • A robot system according to one aspect of the present disclosure includes the information processing device according to one aspect of the present disclosure, the robot, and a robot controller that controls the operation of the robot, wherein the information processing device is communicatively connected to the robot controller. According to the above aspect, the robot system can obtain the same effects as the information processing device according to one aspect of the present disclosure.
  • A robot system according to one aspect of the present disclosure includes the information processing system according to one aspect of the present disclosure, the robot, and a robot controller that controls the operation of the robot, wherein the information processing device is communicatively connected to the robot controller. According to the above aspect, the robot system can obtain the same effects as the information processing system according to one aspect of the present disclosure.
  • The functions of the elements disclosed herein can be performed using circuits or processing circuitry including general purpose processors, special purpose processors, integrated circuits, ASICs, conventional circuits, and/or combinations thereof, configured or programmed to perform the disclosed functions.
  • a processor is considered a processing circuit or circuit because it includes transistors and other circuits.
  • a circuit, unit, or means is hardware that performs or is programmed to perform the recited functions.
  • the hardware may be the hardware disclosed herein, or other known hardware programmed or configured to perform the recited functions.
  • a circuit, means or unit is a combination of hardware and software, where the hardware is a processor which is considered a type of circuit, the software being used to configure the hardware and/or the processor.
  • the division of blocks in the functional block diagram is an example. Multiple blocks may be implemented as one block, one block may be divided into multiple blocks, some functions may be moved to other blocks, and two or more of these may be combined. . A single piece of hardware or software may process the functions of multiple blocks having similar functions in parallel or in a time division manner.

Abstract

This information processing device comprises a processing circuit. While a robot which operates according to a manipulation command is executing a task, the processing circuit executes a detection process for detecting operation-related information of the robot, an evaluation process for comparing the operation-related information with a reference so as to carry out evaluation, and a presentation process for causing a presentation device to present an evaluation result. The processing circuit executes the evaluation process with first timing at which each of the evaluation target operations of the robot to be evaluated occurs, and/or second timing at which each of the evaluation target stages of the task to be evaluated occurs. The processing circuit causes the presentation device to present evaluation results pertaining to the evaluation target operations in the order in which the evaluation target operations are executed. The processing circuit causes the presentation device to present evaluation results pertaining to the evaluation target stages in the order in which the evaluation target stages are executed.

Description

Information processing device, learning device, information processing system, and robot system

Cross-reference to related applications
This application claims priority to, and the benefit of, Japanese Patent Application No. 2021-30029 filed with the Japan Patent Office on February 26, 2021, the entirety of which is incorporated by reference as part of this application.
The present disclosure relates to an information processing device, a learning device, an information processing system, and a robot system.
Technologies for evaluating robot operators exist. For example, the management system disclosed in Japanese Unexamined Patent Application Publication No. 2020-135362 provides an environment in which an operator operates a robot located near a remote object while making the operator feel as if the remote object were nearby. A server provided in the management system acquires state information indicating the state of the robot or the state of the operator when the operator operates the robot, and determines the operator's aptitude for robot operation by comparing the state indicated by the state information with a reference level.
In Japanese Unexamined Patent Application Publication No. 2020-135362, a skill level of robot operation is determined as the aptitude. Through the determined level, however, the operator can learn only the overall skill level of the operation for the work. The operator may be unable to find specific measures for improving robot operation skills, and may be unable to raise his or her motivation for skill improvement and willingness to operate.
The present disclosure provides an information processing device, a learning device, an information processing system, and a robot system that improve a user's willingness to operate a robot.
An information processing device according to one aspect of the present disclosure is an information processing device that evaluates operation of a robot for causing the robot to execute work, and includes a processing circuit. The processing circuit is configured to execute, during execution of the work: a detection process of detecting motion-related information, which is information related to the motion of the robot operating in accordance with operation commands input to an operation device by a user; an evaluation process of evaluating the detected motion-related information by comparing it with a motion-related criterion, which is a criterion related to the motion of the robot; and a presentation process of causing a presentation device, through which the user receives presented information while operating the operation device, to present the evaluation results of the evaluation process. The processing circuit executes the evaluation process at either or both of a first timing occurring for each evaluation target motion, which is a motion of the robot to be evaluated, and a second timing occurring for each evaluation target stage, which is a stage of the work to be evaluated. When executing the evaluation process at the first timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation target motions in the same order as the execution order of the evaluation target motions, and when executing the evaluation process at the second timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation target stages in the same order as the execution order of the evaluation target stages.
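Although the patent text contains no code, the claimed detect-evaluate-present flow can be illustrated with a minimal sketch under assumed data shapes (dictionary-based criteria keyed by motion or stage, a time-based score). Names such as EvaluationPipeline and on_motion_or_stage_finished are hypothetical, not part of the disclosure.

```python
from collections import deque

class EvaluationPipeline:
    """Hypothetical illustration of the claimed detect-evaluate-present flow."""

    def __init__(self, criteria, present):
        self.criteria = criteria      # motion-related criterion per motion/stage key
        self.present = present        # callback driving the presentation device
        self.results = deque()        # keeps evaluation results in execution order

    def on_motion_or_stage_finished(self, key, motion_info):
        """Run at the 'first timing' (each motion) or 'second timing' (each stage)."""
        criterion = self.criteria[key]
        score = self.evaluate(motion_info, criterion)   # compare with the criterion
        self.results.append((key, score))
        self.present(key, score)      # presented mid-task, in execution order

    @staticmethod
    def evaluate(motion_info, criterion):
        # Example feature: required time, one of the evaluation target features
        # listed later in the description. Score falls as time exceeds the criterion.
        over = max(0.0, motion_info["time_s"] - criterion["time_s"])
        return max(0.0, 1.0 - over / criterion["time_s"])

# Usage: evaluating and presenting one motion as soon as it completes.
pipeline = EvaluationPipeline(
    criteria={"grasp": {"time_s": 5.0}},
    present=lambda key, score: print(f"{key}: {score:.2f}"))
pipeline.on_motion_or_stage_finished("grasp", {"time_s": 6.0})  # grasp: 0.80
```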
FIG. 1 is a diagram showing an example of the configuration of a robot system according to an embodiment.
FIG. 2 is a perspective view showing an example of the configuration of a robot according to the embodiment.
FIG. 3 is a block diagram showing an example of the hardware configuration of a control device according to the embodiment.
FIG. 4 is a block diagram showing an example of the functional configuration of the control device according to the embodiment.
FIG. 5 is a flowchart showing an example of the operation of the robot system according to the embodiment.
FIG. 6 is a plan view showing an example of the operation of the robot system according to the embodiment.
FIG. 7 is a block diagram showing an example of the functional configurations of an information processing device and a learning device according to a modification.
Exemplary embodiments of the present disclosure will be described below with reference to the drawings. Each of the embodiments described below shows a comprehensive or specific example. Among the components in the following embodiments, components not recited in the independent claims, which represent the broadest concepts, are described as optional components. Each figure in the accompanying drawings is schematic and not necessarily drawn strictly to scale. In the figures, substantially identical components are given identical reference signs, and redundant descriptions may be omitted or simplified. In this specification and the claims, a "device" can mean not only a single device but also a system of multiple devices.
[Configuration of robot system]
The configuration of a robot system 1 according to an exemplary embodiment will be described with reference to FIG. 1. FIG. 1 is a diagram showing an example of the configuration of the robot system 1 according to the embodiment. The robot system 1 includes one or more robots 100, one or more operation terminals 200, and a server 300. Although not limiting, in this embodiment the robot system 1 is configured to provide services to a user P using remotely operated robots 100. The robot system 1 can be used in various service industries such as nursing care, medical care, cleaning, security, guidance, rescue, cooking, sales, rental, and goods provision.
The robot 100 is a robot capable of performing work such as providing services. In this specification and the claims, "work" can cover a series of motions that the robot 100 executes in order to accomplish, build, or create a productive matter from which some useful result is expected, or to realize a combination of two or more of these. For example, "work" need not cover a series of motions that the robot 100 executes in the course of being taught. For example, "work" need not cover a series of motions that the robot 100 executes for play, games, entertainment, or a combination of two or more of these.
Although not limiting, in this embodiment two or more robots 100 are arranged in one service providing area AS, which is a place where services are provided to the user P. Furthermore, one or more operation terminals 200 are arranged in each of two or more operation areas AO located away from the service providing area AS.
The robot 100 is configured to connect to a communication network N via wireless communication so that data communication is possible. The communication connecting the robot 100 and the communication network N may be wired communication or a combination of wired and wireless communication. The operation terminal 200 is configured to connect to the communication network N for data communication via wired communication, wireless communication, or a combination thereof. One robot 100 and one operation terminal 200 can be connected via the communication network N so as to be capable of data communication. Any wired or wireless communication may be used.
The server 300 manages communication via the communication network N. The server 300 includes a computer device. The server 300 manages authentication, connection, disconnection, and the like of communication between the robots 100 and the operation terminals 200. For example, the server 300 stores identification information, security information, and the like of the robots 100 and the operation terminals 200 registered in the robot system 1, and uses this information to authenticate whether an operation terminal 200 is qualified to connect to the robot system 1. The server 300 manages the transmission and reception of data between the robots 100 and the operation terminals 200, and this data may pass through the server 300. The server 300 may be configured to convert data sent from a source into a data format usable by the destination. The server 300 may be configured to store and accumulate information, commands, data, and the like transmitted and received between the robots 100 and the operation terminals 200 in the course of operating the robots 100.
The communication network N is not particularly limited and can include, for example, a local area network (LAN), a wide area network (WAN), the Internet, or a combination of two or more of these. The communication network N may be configured to use short-range wireless communication such as Bluetooth (registered trademark) and ZigBee (registered trademark), a network-dedicated line, a carrier's dedicated line, a public switched telephone network (PSTN), a mobile communication network, an Internet network, satellite communication, or a combination of two or more of these. The mobile communication network may use a fourth-generation mobile communication system, a fifth-generation mobile communication system, or the like. The communication network N can include one or more networks. In this embodiment, the communication network N is the Internet.
[Configuration of operation terminal]
The configuration of the operation terminal 200 according to the embodiment will be described with reference to FIG. 1. The operation terminal 200 is configured to receive commands, information, data, and the like input by an operator PO and to output the received commands, information, data, and the like to other devices. The operation terminal 200 includes an operation input device 201, a terminal computer 202, a presentation device 203, and a communication device 204. These four devices may be integrated to form a single device, each of the four devices may independently form a device connected to the others, or two or more of the devices may form one device connected to the remaining devices.
The operation terminal 200 is an example of an operation device. The configuration of the operation terminal 200 is not particularly limited; for example, the operation terminal 200 may be a computer such as a personal computer, a smart device such as a smartphone or tablet, a personal digital assistant, a game terminal, a known teaching device used for teaching robots, a known robot operating device, another operating device, another terminal device, a device utilizing these, an improved version of these, or the like. The operation terminal 200 may be a dedicated device devised for the robot system 1, or may be a general-purpose device available on the open market. In this embodiment, a known general-purpose device is used as the operation terminal 200. This device may be configured to realize the functions of the operation terminal 200 of the present disclosure by having dedicated software installed.
The operation input device 201 is configured to receive input from the operator PO and to output signals indicating the input commands, information, data, and the like to the terminal computer 202. The configuration of the operation input device 201 is not particularly limited; for example, the operation input device 201 may include devices to which input is given through the operator PO's manipulation, such as buttons, levers, dials, joysticks, a mouse, keys, a touch panel, and motion capture. The operation input device 201 may include an imaging device that captures images of the operator PO and the like, and a voice input device such as a microphone that receives voice input from the operator PO and the like. The operation input device 201 may be configured to output captured image data and signals representing the input voice to the terminal computer 202.
The terminal computer 202 is configured to process commands, information, data, and the like received via the operation input device 201 and output them to other devices, and to receive commands, information, data, and the like from other devices and process them.
The presentation device 203 is configured to present information to the operator PO. Although not limiting, in this embodiment the presentation device 203 includes a display capable of showing images to the operator PO. The presentation device 203 displays images of the image data received from the terminal computer 202. The presentation device 203 may include an audio output device, such as a speaker, capable of emitting sound to the operator PO. The presentation device 203 outputs the sound of the audio data received from the terminal computer 202.
The communication device 204 includes a communication interface connectable to the communication network N. The communication device 204 is connected to the terminal computer 202 and connects the terminal computer 202 and the communication network N so that data communication is possible. The communication device 204 may include, for example, communication equipment such as a modem, an ONU (Optical Network Unit), a router, and mobile data communication equipment. The communication device 204 may include a computer device having arithmetic functions and the like.
[Robot configuration]
The configuration of the robot 100 according to the embodiment will be described with reference to FIG. 2. FIG. 2 is a perspective view showing an example of the configuration of the robot 100 according to the embodiment. The robot 100 includes one carrier 110, one or more robot arms 120, one or more end effectors 130, a secondary battery module 141, a power supply circuit 142, a communication device 143, imaging devices 144, 145, and 146, a sound collector 147, a display device 148, an audio output device 149, and a control device 170. The control device 170 includes an information processing device 180 and a robot controller 190. The information processing device 180 and the robot controller 190 may be separate devices or may be integrated. Although not limiting, in this embodiment a robot arm that can also serve industrial uses is used as the robot arm 120. The quantities of the above components are not limited to those given and can be changed as appropriate.
The carrier 110 is configured to be self-propelled. Although not limiting, in this embodiment the carrier 110 includes two drive wheels 111, four auxiliary wheels 112, and a carrier drive device 113 that drives the two drive wheels 111. The carrier drive device 113 includes servomotors as electric actuators serving as drive sources. Each servomotor is controlled by the robot controller 190. By rotating the two drive wheels 111 individually in various rotational directions and at various rotational speeds, the carrier drive device 113 can cause the carrier 110 to travel at various speeds in the forward direction D1A, the backward direction D1B, and turning directions.
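As an illustration of the kind of differential drive that two individually driven wheels 111 enable, the following minimal sketch converts a body velocity command into left and right wheel speeds. It is not part of the patent text; the function name and the track-width and wheel-radius values are assumptions.

```python
import math

def wheel_speeds(v, omega, track_width, wheel_radius):
    """Convert a body velocity command (v [m/s], omega [rad/s]) into left/right
    wheel angular velocities [rad/s] for two independently driven wheels."""
    v_left = v - omega * track_width / 2.0   # inner wheel slows when turning
    v_right = v + omega * track_width / 2.0  # outer wheel speeds up
    return v_left / wheel_radius, v_right / wheel_radius

# Forward at 0.5 m/s while turning at 30 deg/s (hypothetical dimensions):
print(wheel_speeds(0.5, math.radians(30.0), track_width=0.4, wheel_radius=0.1))
```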
Although not limiting, in this embodiment two robot arms 120A and 120B are mounted as the robot arms 120 on the carrier 110 via a base 120C. The robot arms 120A and 120B are both rotatable about an axis S1 oriented in the direction from the carrier 110 toward the base 120C, forming a coaxial dual-arm robot arm. The robot arm 120A includes links 121A to 124A and arm drive devices M1A to M4A that drive the joints interconnecting the links 121A to 124A. The robot arm 120B includes links 121B to 124B and arm drive devices M1B to M4B that drive the joints interconnecting the links 121B to 124B. The arm drive devices M1A to M4A and M1B to M4B are shown, for example, in FIG. 3. The arm drive devices M1A to M4A and M1B to M4B include servomotors as electric actuators serving as drive sources. Each servomotor is controlled by the robot controller 190. The links 121A and 121B are connected to the base 120C via joints. The links 124A and 124B include mechanical interfaces connectable with the end effectors 130.
The robot arms 120A and 120B as described above have a horizontal articulated configuration, but may have any configuration. For example, the robot arms 120A and 120B may be vertically articulated, polar-coordinate, cylindrical-coordinate, rectangular-coordinate, or other types of robot arms. The number of robot arms 120 arranged on the carrier 110 may also be any number of one or more. The rotation axes of the robot arms 120A and 120B need not be the same axis S1.
As the end effectors 130, two end effectors 130A and 130B are detachably attached to the links 124A and 124B, respectively. The end effectors 130A and 130B are configured to apply actions to objects handled by the robot 100. The end effectors 130A and 130B are operable and may include drive devices. Such a drive device may use electric power, gas pressure, hydraulic pressure, or the like as its power source. A drive device powered by electric power may include a servomotor as an electric actuator. The drive devices may be controlled by the robot controller 190.
The robot 100 further includes an equipment housing 140 on the carrier 110. The secondary battery module 141, the power supply circuit 142, the communication device 143, the information processing device 180, and the robot controller 190 are arranged inside the equipment housing 140, but may be arranged at any position on the carrier 110. A table 140a on which articles can be placed is arranged on top of the equipment housing 140.
The secondary battery module 141 functions as the power source of the robot 100. The secondary battery module 141 includes one or more secondary batteries. A secondary battery is a battery capable of charging and discharging electric power. Examples of secondary batteries are lead-acid batteries, lithium-ion secondary batteries, all-solid-state batteries, nickel-metal-hydride batteries, and nickel-cadmium batteries.
The power supply circuit 142 is a circuit that controls the supply and demand of electric power to and from the secondary battery module 141. The power supply circuit 142 is configured to control the supply and demand of power in accordance with commands from the information processing device 180, the robot controller 190, and the like. For example, the power supply circuit 142 may include devices such as converters, inverters, transformers, and amplifiers. The power supply circuit 142 is connected to an external power supply such as a commercial power supply, supplies the power supplied from the external power supply to the secondary battery module 141 for storage, and supplies the power stored in the secondary battery module 141 to the power-consuming components within the robot 100.
The communication device 143 is a device for wireless communication and is configured to connect to the communication network N via wireless communication. The wireless communication used by the communication device 143 is not particularly limited and may be, for example, wireless communication using mobile data communication, a wireless LAN such as Wi-Fi (Wireless Fidelity), short-range wireless communication such as Bluetooth (registered trademark) and ZigBee (registered trademark), or a combination of two or more of these. The communication device 143 has equipment compatible with the wireless communication used.
The display device 148 includes a display capable of displaying images. The display may be configured to receive input from the user P and may be, for example, a touch panel. The display device 148 can display images of image data sent from the information processing device 180 and the like. The display device 148 can display images to the user P facing the robot 100.
The sound collector 147 includes a microphone capable of acquiring sound from the surroundings and outputting an audio signal of that sound. The sound collector 147 is configured to output the audio signal to the information processing device 180 and the like. The information processing device 180 may be configured to convert the audio signal into audio data and transmit it to the operation terminal 200, and may be configured to perform voice recognition on the audio signal. The sound collector 147 can function as an external sensor of the robot 100 capable of sensing the outside of the robot 100. An internal sensor of the robot 100 can sense the inside of the robot 100; one example of an internal sensor is the rotation sensors provided in the servomotors of the arm drive devices M1A to M4A and M1B to M4B and the like.
The audio output device 149 includes a speaker capable of converting audio signals into sound waves and emitting them as sound. The audio output device 149 can output sound corresponding to audio signals sent from the information processing device 180 and the like. The audio output device 149 can output sound to the user P facing the robot 100.
The imaging devices 144, 145, and 146 each include a camera that captures digital images and are configured to send the captured image data to the information processing device 180. The information processing device 180 may be configured to process the image data captured by the imaging devices 144, 145, and 146 into data that can be transmitted over a network and send it to the operation terminal 200 via the communication network N. The information processing device 180 may be configured to perform image processing on the image data and use it for other processing. The camera may be a camera capable of capturing images for detecting the three-dimensional position of a subject relative to the camera, such as the distance to the subject. For example, the camera may include a configuration such as a stereo camera, a monocular camera, a TOF camera (Time-of-Flight camera), a pattern-light-projection camera such as fringe projection, a camera using the light-section method, or a combination of two or more of these. The imaging devices 144, 145, and 146 can function as external sensors.
The imaging device 144 is arranged at the distal end of one or both of the robot arms 120A and 120B. Although not limiting, in this embodiment the imaging device 144 is arranged on the end effector 130A of the robot arm 120A. The imaging device 145 is arranged on the display device 148. The imaging device 145 can image the user P, the service recipient, facing the robot 100. The imaging device 145 may be configured to change its imaging direction, to have a wide-angle imaging field of view, or both, so that it can image the end effectors 130A and 130B together with the objects on which the end effectors 130A and 130B act. The imaging device 146 is arranged on the carrier 110 facing the forward direction D1A.
The robot controller 190 is configured to control the operation of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110. For example, the robot controller 190 is configured to control the operation of the arm drive devices M1A to M4A and M1B to M4B of the robot arms 120A and 120B, the drive devices of the end effectors 130A and 130B, the carrier drive device 113 of the carrier 110, and the like. The information processing device 180 is connected to the robot controller 190 via wired communication, wireless communication, or a combination thereof, and is configured to perform arithmetic processing related to the control of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110.
The information processing device 180 and the robot controller 190 include computer devices. The configuration of the information processing device 180 is not particularly limited; for example, the information processing device 180 may be an electronic circuit board, an electronic control unit, a microcomputer, a personal computer, a workstation, a smart device such as a smartphone or tablet, another electronic device, or the like. The information processing device 180 is connected to the terminal computer 202 of the operation terminal 200 via the communication device 143, the communication network N, and the communication device 204 so that data communication is possible. The configuration of the robot controller 190 is not particularly limited; for example, the robot controller 190 may be an electronic circuit board, an electronic control unit, a microcomputer, another electronic device, or the like. The robot controller 190 may include drive circuits for controlling the power supplied to the arm drive devices M1A to M4A and M1B to M4B, the drive devices of the end effectors 130A and 130B, the carrier drive device 113, and the like.
In this embodiment, when causing the robot 100 to perform work, the information processing device 180 and the robot controller 190 are configured to control the operation of the controlled elements, such as the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110, by manual operation control, but this is not limiting. The information processing device 180 and the robot controller 190 may be configured to control the operation of one or more of the controlled elements by automatic operation control or by a combination of automatic and manual operation control. For example, a combination of automatic and manual operation control may be configured such that automatic or manual operation control is assigned per controlled element, or such that part of the work of the robot 100 is executed by automatic operation control and the remainder by manual operation control.
For example, manual operation control may be control that causes a controlled element to operate by sequentially following the operation content that the operator PO inputs to the operation input device 201 of the operation terminal 200. For example, under manual operation control, a controlled element can execute motions that follow the motions of the operator PO operating the operation input device 201. Automatic operation control may be control that causes a controlled element to operate autonomously according to a control program.
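As a rough illustration of the per-element assignment of manual and automatic operation control mentioned above, the following sketch shows one possible arrangement. It is not part of the disclosure, and the element keys and function names are hypothetical.

```python
from enum import Enum

class Mode(Enum):
    MANUAL = "manual"   # follows operator inputs sequentially
    AUTO = "auto"       # follows a control program autonomously

# Hypothetical per-element assignment, as the described combination allows.
CONTROL_MODES = {
    "arm_120A": Mode.MANUAL,
    "arm_120B": Mode.MANUAL,
    "end_effector_130A": Mode.AUTO,
    "carrier_110": Mode.AUTO,
}

def select_command(element, operator_cmd, program_cmd):
    """Pick the command source for one controlled element based on its mode."""
    return operator_cmd if CONTROL_MODES[element] is Mode.MANUAL else program_cmd
```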
[Hardware configuration of control device]
The hardware configuration of the control device 170 according to the embodiment will be described with reference to FIG. 3. FIG. 3 is a block diagram showing an example of the hardware configuration of the control device 170 according to the embodiment. The information processing device 180 includes, as components, a processor 1801, a memory 1802, a storage 1803, and input/output interfaces (I/Fs) 1804a to 1804f. The components of the information processing device 180 are interconnected by a bus 1805, but may be connected by any other wired or wireless communication. The robot controller 190 includes, as components, a processor 1901, a memory 1902, an input/output I/F 1903, and a drive I/F 1904. The robot controller 190 may include storage. The components of the robot controller 190 are interconnected by a bus 1905, but may be connected by any other wired or wireless communication. The information processing device 180 and the robot controller 190 thus include processing circuitry, and may include further circuitry. Not all of the components included in the information processing device 180 and the robot controller 190 are essential.
The pair of the processor 1801 and the memory 1802 and the pair of the processor 1901 and the memory 1902 each form processing circuitry. The processing circuitry sends and receives commands, information, data, and the like to and from other devices. The processing circuitry receives signals from various devices and outputs control signals to the controlled elements.
The memories 1802 and 1902 store the programs executed by the processors 1801 and 1901, various data, and the like. The memories 1802 and 1902 may include storage devices such as semiconductor memories, including volatile and nonvolatile memory. Although not limiting, in this embodiment the memories 1802 and 1902 include RAM (Random Access Memory), which is volatile memory, and ROM (Read-Only Memory), which is nonvolatile memory.
The storage 1803 stores various data. The storage 1803 may include a storage device such as a semiconductor memory, a hard disk drive (HDD), a solid-state drive (SSD), or a combination of two or more of these.
The processors 1801 and 1901 each form a computer system together with the RAM and ROM. The computer system of the information processing device 180 may realize the functions of the information processing device 180 by the processor 1801 executing the programs recorded in the ROM while using the RAM as a work area. The computer system of the robot controller 190 may realize the functions of the robot controller 190 by the processor 1901 executing the programs recorded in the ROM while using the RAM as a work area.
Some or all of the functions of the information processing device 180 and the robot controller 190 may be realized by the above computer system, by dedicated hardware circuits such as electronic circuits or integrated circuits, or by a combination of the computer system and hardware circuits. The information processing device 180 and the robot controller 190 may each be configured to execute their processes by centralized control with a single device, or by distributed control with multiple cooperating devices. The information processing device 180 and the robot controller 190 may be configured to include at least part of each other's functions, and may be integrated.
Although not limiting, for example, the processors 1801 and 1901 may include a CPU (Central Processing Unit), an MPU (Micro Processing Unit), a GPU (Graphics Processing Unit), a microprocessor, a processor core, a multiprocessor, an ASIC (Application-Specific Integrated Circuit), an FPGA (Field Programmable Gate Array), and the like, and each process may be realized by a logic circuit or a dedicated circuit formed on an IC (integrated circuit) chip, an LSI (Large Scale Integration), or the like. Multiple processes may be realized by one or more integrated circuits, or by a single integrated circuit.
The first input/output I/F 1804a of the information processing device 180 connects the information processing device 180 and the robot controller 190 and enables the input and output of information, commands, data, and the like between them. The second input/output I/F 1804b connects the information processing device 180 and the communication device 143 and enables the input and output of information, commands, data, and the like between them. The third input/output I/F 1804c connects the information processing device 180 and the imaging devices 144 to 146 and enables the input and output of information, commands, data, and the like between them. The fourth input/output I/F 1804d connects the information processing device 180 and the sound collector 147 and enables the input and output of information, commands, data, and the like between them. The fifth input/output I/F 1804e connects the information processing device 180 and the display device 148 and enables the input and output of information, commands, data, and the like between them. The sixth input/output I/F 1804f connects the information processing device 180 and the audio output device 149 and enables the input and output of information, commands, data, and the like between them.
The input/output I/F 1903 of the robot controller 190 connects the robot controller 190 and the first input/output I/F 1804a of the information processing device 180 and enables the input and output of information, commands, data, and the like between them. The drive I/F 1904 connects the robot controller 190 and the drive circuit 142a and enables the transmission and reception of signals and the like between them. The drive circuit 142a is configured to control the power supplied to the arm drive devices M1A to M4A and M1B to M4B, the drive devices of the end effectors 130A and 130B, and the carrier drive device 113 in accordance with command values contained in the signals received from the robot controller 190. For example, the drive circuit 142a can drive the drive devices in coordination with one another. Although not limiting, in this embodiment the drive circuit 142a is formed as part of the power supply circuit 142.
The robot controller 190 may be configured to servo-control the servomotor of each drive device. The robot controller 190 receives from the drive circuit 142a, as feedback information, the detection values of the rotation sensors provided in the servomotors and the command values of the currents supplied from the drive circuit 142a to the servomotors. The robot controller 190 uses the feedback information to determine command values for driving the servomotors and sends them to the drive circuit 142a.
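The disclosure does not specify a control law; as one conventional possibility, a minimal PI position loop of the kind such servo control typically uses is sketched below. The gains, signature, and units are assumptions, not part of the patent text.

```python
def servo_step(target_angle, measured_angle, error_integral, dt, kp=2.0, ki=0.5):
    """One PI control cycle for a single servomotor.

    target_angle / measured_angle: joint angle [rad], the measured value coming
    from the motor's rotation sensor. Returns (current command, updated error
    integral); the current command would be sent on to the drive circuit.
    """
    error = target_angle - measured_angle
    error_integral += error * dt
    current_cmd = kp * error + ki * error_integral
    return current_cmd, error_integral

# One cycle at 1 kHz: target 1.0 rad, measured 0.9 rad, no accumulated error yet.
cmd, integ = servo_step(1.0, 0.9, 0.0, dt=0.001)
```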
Since a general-purpose device can be used as the operation terminal 200, a detailed description of the hardware configuration of the operation terminal 200 is omitted. The terminal computer 202 of the operation terminal 200 includes a processor and memory, like the information processing device 180 and the like. The terminal computer 202 may also include input/output I/Fs for establishing the connections between the terminal computer 202 and the operation input device 201, between the terminal computer 202 and the communication device 204, and between the terminal computer 202 and the presentation device 203.
[Functional Configuration of Control Device]
The functional configuration of the control device 170 according to the embodiment will be described with reference to FIG. 4. FIG. 4 is a block diagram showing an example of the functional configuration of the control device 170 according to the embodiment. The information processing device 180 includes, as functional components, a motion command unit 180a, a detection processing unit 180b, an evaluation processing unit 180c, a consideration determination unit 180d, a presentation processing unit 180e, an accumulation processing unit 180f, a criterion determination unit 180g, a timing unit 180h, and storage units 181a and 181b. The functions of the functional components other than the storage units 181a and 181b are realized by the processor 1801 and the like, and the functions of the storage units 181a and 181b are realized by a storage device such as the memory 1802, the storage 1803, or a combination thereof. The functions of the storage units 181a and 181b may be realized by separate storage devices or by a single storage device. Not all of the above functional components are essential. The robot controller 190 includes drive command units 190a to 190e as functional components. The functions of the drive command units 190a to 190e are realized by the processor 1901 and the like. Not all of the above functional components are essential.
The first storage unit 181a stores motion-related criterion data for various kinds of work. In FIG. 4, the motion-related criterion data is denoted as "criterion data". The motion-related criterion data includes information on motion-related criteria, which are criteria related to the motion of the robot 100. The motion-related criteria include criteria for motion-related information, which is information related to the motion of the robot 100. The motion-related information can include information on motion-related elements such as the motion of the robot 100, the state of the surrounding environment of the robot 100, the state of an object on which the robot 100 acts, and the state of the surrounding environment of that object.
For example, a motion-related criterion may be a criterion representing a minimum required level of a motion-related element, a standard level of a motion-related element, an excellent level of a motion-related element, an ideal level of a motion-related element, or the like. A motion-related criterion may include criteria for evaluation target features, which are features used to evaluate a motion-related element.
For example, one piece of work includes a series of multiple motions of the robot 100. The motion-related criteria for the motion of the robot 100 may include criteria for the series of multiple motions of each of the robot arms 120A and 120B, the end effectors 130A and 130B, and the carrier 110 that form the work. For example, the evaluation target features of a motion-related criterion for a motion of the robot 100 may include features such as the time required for the motion, the order of the motion, the position of the robot 100 and its parts during the motion, the posture of the robot 100 and its parts during the motion, the trajectory of the robot 100 and its parts during the motion, the velocity of the position during the motion, the velocity of the posture during the motion, the acceleration of the position during the motion, the acceleration of the posture during the motion, and the workmanship of the robot 100's work in the motion.
The state of the surrounding environment of the robot 100 may include the states of robot peripheral elements, which are components such as devices, facilities, and equipment forming the surrounding environment in which the robot 100 works. Examples of robot peripheral elements are peripheral elements such as devices, facilities, and equipment arranged in the work area of the robot 100, such as the service providing area AS; peripheral elements such as devices, facilities, and equipment that cooperate with the robot 100; and peripheral elements such as sensors, including imaging devices, arranged in the work area. The evaluation target features of the motion-related criteria for robot peripheral elements may include features such as the force and impact the robot 100 applies to a robot peripheral element; the changes in position, posture, shape, and state that the robot 100 causes in a robot peripheral element; and the position, posture, positional velocity, postural velocity, positional acceleration, and postural acceleration of the robot 100 and its parts relative to a robot peripheral element. For example, in FIG. 1, the robot peripheral elements of a robot 100 are the other robots 100, the storage shelf SR in which the articles W as objects are stored, users other than the user P to whom the robot 100 provides services, and the like.
The state of an object may include the state of an object that the robot 100 handles using the end effectors 130A and 130B and the like. The evaluation target features of the motion-related criteria for an object may include features such as the position, posture, positional velocity, postural velocity, positional acceleration, and postural acceleration of the object, changes in the shape of the object, changes in the state of the object, and the workmanship of the object. For example, in FIG. 1, the object is an article W.
The state of the surrounding environment of an object may include the states of object peripheral elements, which are components such as devices, facilities, and equipment other than the robot 100 that handle the object. Examples of object peripheral elements are peripheral elements such as devices, facilities, and equipment that support, hold, contain, load, mount, or transfer the object; peripheral elements such as devices, facilities, and equipment to which the object is assembled; and peripheral elements such as devices, facilities, and equipment that process the object. The evaluation target features of the motion-related criteria for object peripheral elements may include features such as the force and impact the robot 100 applies to an object peripheral element; the changes in position, posture, shape, and state that the robot 100 causes in an object peripheral element; and the position, posture, positional velocity, postural velocity, positional acceleration, and postural acceleration of the robot 100 and its parts relative to an object peripheral element. For example, in FIG. 1, the object peripheral elements are the shelf boards of the storage shelf SR, other articles in the storage shelf SR, the user P, and the like.
The second storage unit 181b stores accumulated data, which is data accumulated by the accumulation processing unit 180f. The accumulated data includes information on the evaluation results determined by the evaluation processing unit 180c in association with information on the motion-related criteria used to determine those evaluation results. That is, in the accumulated data, motion-related information, information on the motion-related criterion for that motion-related information, and information on the evaluation result based on that motion-related criterion are associated with one another. The second storage unit 181b is an example of a functional unit realized by a first storage device.
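One possible shape for such a record, sketched for illustration only; the field names and values are assumptions, not from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class AccumulatedRecord:
    motion_related_info: dict   # detected motion-related information
    criterion: dict             # motion-related criterion used for the evaluation
    evaluation_result: float    # evaluation result determined against the criterion

# Example record as the accumulation processing unit 180f might store it:
record = AccumulatedRecord(
    motion_related_info={"time_s": 6.0},
    criterion={"time_s": 5.0},
    evaluation_result=0.8)
```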
 During manual operation control, the motion command unit 180a generates, in accordance with the operation commands received from the operation terminal 200, motion commands for operating the robot arms 120A and 120B, the end effectors 130A and 130B, and the transport vehicle 110, and outputs them to the robot controller 190. During automatic operation control, the motion command unit 180a generates motion commands for operating the robot arms 120A and 120B, the end effectors 130A and 130B, and the transport vehicle 110 in accordance with a control program, and outputs them to the robot controller 190.
 An operation command is a command indicating the content of an operation input to the operation input device 201 by the operator PO. The operation command includes a command for causing the robot 100 to operate under manual operation control in accordance with the content of that operation. The terminal computer 202 receives a signal indicating the content of the operation from the operation input device 201, converts the signal into an operation command, and transmits the operation command to the information processing device 180.
 The motion command unit 180a generates a first motion command for the robot arm 120A, a second motion command for the robot arm 120B, a third motion command for the end effector 130A, a fourth motion command for the end effector 130B, and a fifth motion command for the transport vehicle 110. For example, the third and fourth motion commands can be generated when the end effectors 130A and 130B include drive devices. The first and second motion commands may include commands for the target positions of the parts of the robot arms 120A and 120B and commands for the target forces generated by those parts. The third and fourth motion commands may include commands for the target states of the end effectors 130A and 130B. The fifth motion command may include a command for the target position of the transport vehicle 110.
 Note that a target position command may include commands such as a target position, a target movement speed of the position, a target posture, and a target movement speed of the posture. A target force command may include commands such as the magnitude and direction of the target force, and may also include a target acceleration command.
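 As a hedged illustration of how such commands might be carried as data, the sketch below models the target position and target force commands as simple Python structures; every name is an assumption, since the actual command format is not specified in the disclosure.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TargetPositionCommand:
    position: List[float]                          # target position (e.g., x, y, z)
    posture: List[float]                           # target posture (e.g., roll, pitch, yaw)
    position_speed: Optional[List[float]] = None   # target movement speed of the position
    posture_speed: Optional[List[float]] = None    # target movement speed of the posture

@dataclass
class TargetForceCommand:
    magnitude: float                               # target force magnitude
    direction: List[float]                         # target force direction (unit vector)
    acceleration: Optional[List[float]] = None     # optional target acceleration

@dataclass
class ArmMotionCommand:
    """Hypothetical shape of the first or second motion command for a robot arm."""
    target_positions: List[TargetPositionCommand]  # per-part target position commands
    target_forces: List[TargetForceCommand]        # per-part target force commands
```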
 The timing unit 180h measures elapsed time and outputs the timing results to the detection processing unit 180b, the evaluation processing unit 180c, the consideration determination unit 180d, the presentation processing unit 180e, and the like. The timing results can be used in the processing of these units. For example, the function of the timing unit 180h may be realized by a clock or the like mounted on the computer device of the information processing device 180.
 The detection processing unit 180b executes detection processing for detecting, while the robot 100 is performing work, motion-related information of the robot 100 operating in accordance with the operation commands. The detection processing unit 180b outputs the detection results to the evaluation processing unit 180c. In the present embodiment, the detection processing unit 180b is configured to output the detection results to the evaluation processing unit 180c without providing a delay time after detecting the motion-related information; however, this is not limiting, and a delay time may be provided.
 During the execution of work, the detection processing unit 180b detects information on motion-related elements as the motion-related information. The detection processing unit 180b may be configured to detect the information on the motion-related elements by any method.
 For example, the detection processing unit 180b detects, as motion-related elements, the motions of the robot 100, specifically the motions of the robot arms 120A and 120B, the end effectors 130A and 130B, and the transport vehicle 110. For example, the detection processing unit 180b may be configured to process an operation command and detect the target motion of the robot 100 indicated by the operation command as a motion-related element, or to process a motion command and detect the target motion of the robot 100 indicated by the motion command as a motion-related element. The detection processing unit 180b may be configured to acquire and process drive commands from the drive command units 190a to 190e of the robot controller 190 and detect the target motion of the robot 100 indicated by the drive commands as a motion-related element. The detection processing unit 180b may be configured to acquire and process feedback information from the drive command units 190a to 190e of the robot controller 190 and detect the motion of the robot 100 indicated by the feedback information as a motion-related element. The detection processing unit 180b may be configured to acquire captured image data from the imaging devices 144, 145, and 146 and from imaging devices external to the robot 100, perform image processing, and detect the state of the robot 100 indicated by the processed image data as a motion-related element.
 For example, the detection processing unit 180b detects, as motion-related elements, the states of the robot peripheral elements forming the surrounding environment of the robot 100. For example, the detection processing unit 180b detects the states of the robot peripheral elements using the detection results received from the external sensors of the robot 100, information received from the robot peripheral elements, and the like. Examples of the external sensors are the force sensors arranged between the end effectors 130A and 130B and the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors such as imaging devices arranged in the service providing area AS. Examples of the information received from the robot peripheral elements are the state, vibration, impact, position, posture, positional velocity, and postural velocity of the robot peripheral elements.
 For example, the detection processing unit 180b detects, as a motion-related element, the state of an action object, which is an object on which the robot 100 acts. For example, the detection processing unit 180b detects the state of the action object using the detection results received from the external sensors of the robot 100, information received from the action object, and the like. Examples of the external sensors are the force sensors of the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors arranged in the service providing area AS. Examples of the information received from the action object are the detection results of sensors provided on the action object, as well as the state, vibration, impact, position, posture, positional velocity, postural velocity, positional acceleration, and postural acceleration of the action object.
 For example, the detection processing unit 180b detects, as motion-related elements, the states of the object peripheral elements forming the surrounding environment of the action object. For example, the detection processing unit 180b detects the states of the object peripheral elements using the detection results received from the external sensors of the robot 100, information received from the object peripheral elements, and the like. Examples of the external sensors are the force sensors of the robot arms 120A and 120B, the imaging devices 144 to 146, the sound collector 147, and sensors arranged in the service providing area AS. Examples of the information received from the object peripheral elements are the state, vibration, impact, position, posture, positional velocity, and postural velocity of the object peripheral elements.
 The detection processing unit 180b may be configured to detect the motion-related elements at any timing. For example, the detection processing unit 180b may be configured to detect the motion-related elements at predetermined timings, such as timings at predetermined time intervals, the execution timings of predetermined motions of the robot 100, and the timings of predetermined work stages. Although not limiting, in the present embodiment the detection processing unit 180b executes the detection processing at predetermined time intervals. The detection processing unit 180b is an example of a functional unit realized by a processing circuit.
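 A fixed-interval detection process like the one chosen in this embodiment could be realized with a simple timer loop. The sketch below is a non-authoritative illustration assuming a hypothetical detect() callable that gathers the motion-related information and an output() callable that forwards it to the evaluation process.

```python
import time

def run_detection_loop(detect, output, interval_s=0.1, is_working=lambda: True):
    """Invoke the detection process at a predetermined time interval while the
    work is in progress, forwarding each result without an added delay time."""
    next_tick = time.monotonic()
    while is_working():
        output(detect())  # detect motion-related information and hand it over immediately
        next_tick += interval_s
        time.sleep(max(0.0, next_tick - time.monotonic()))
```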
 Hereinafter, a motion of the robot 100 that is processed by the functional components of the information processing device 180 other than the detection processing unit 180b is referred to as a "processing target motion". For example, a predetermined motion of the robot 100 is an example of a processing target motion. The unit of a processing target motion may be set arbitrarily. For example, the delimitation of one unit of processing target motion may be set by a temporal delimitation such as the execution period of a motion, a motion-change delimitation such as the timing at which a motion changes, a designated delimitation, another delimitation, or a combination of two or more of these.
 For example, when the robot 100 executes a series of motions of gripping an object placed at a point A with the end effector 130A and placing it on the table 140a, one unit of processing target motion may be set to include all of the series of motions to be executed, or may be set to include part of the series. For example, one unit of processing target motion may be set to include one or more of the following four motions: the movement motion of the robot arm 120A moving the end effector 130A to the object, the gripping motion of the end effector 130A gripping the object, the movement motion of the robot arm 120A moving the end effector 130A onto the table 140a and placing the object, and the grip-release motion of the end effector 130A releasing its grip on the object. For example, one processing target motion may be set for these four motions, or two or more processing target motions may be set.
 A work stage can be set by dividing the series of motions included in a work. The unit of a work stage may be set arbitrarily. For example, the delimitation of one unit of work stage may be set by a temporal delimitation such as the execution period of the work, a motion-change delimitation such as the timing at which the motions included in the work switch, a designated delimitation, another delimitation, or a combination of two or more of these. For example, when the robot 100 receives a command requesting an object from a user P and executes the work of transporting the object placed at a point A and handing it to the user P, one unit of work stage may be set to include part of all the motions executed during the work.
 For example, one unit of work stage may be set to include one or more of the following four processes: a process in which the robot 100 moves from a position near the user P to a position near the point A, a process in which the robot 100 grips the object at the point A with the end effector 130A and places it on the robot 100, a process in which the robot 100 moves from the position near the point A to the position near the user P, and a process in which the robot 100 hands the object on the robot 100 to the user P with the end effector 130A. For example, one work stage may be set for these four processes, or two or more work stages may be set.
 The evaluation processing unit 180c executes evaluation processing for evaluating the motion-related information of the robot 100 detected by the detection processing unit 180b by comparing it, while the robot 100 is performing the work, with the motion-related criteria for that motion-related information. Specifically, the evaluation processing unit 180c compares the information on a motion-related element included in the motion-related information with the motion-related criteria for that motion-related element to determine the evaluation result of the motion-related element, and determines the evaluation result of the motion-related information based on the evaluation result of the element.
 The motion-related information subject to the evaluation processing of the evaluation processing unit 180c is the motion-related information corresponding to the processing target motions to be evaluated among the processing target motions included in the work. Hereinafter, a processing target motion to be evaluated is referred to as an "evaluation target motion". The evaluation target motions are one or more of all the processing target motions included in the work. All of the processing target motions may be set as evaluation target motions, or only some of them may be. Two or more evaluation target motions may be continuous with each other or discontinuous with each other.
 For example, the evaluation processing unit 180c may determine the evaluation result of a motion-related element of the motion-related information corresponding to an evaluation target motion, of an evaluation target feature of that element, or of both, based on the amount, direction, and frequency of deviation from the motion-related criteria for that element, that feature, or both. Furthermore, the evaluation processing unit 180c may determine the evaluation result of the motion-related information based on the evaluation result of the motion-related element, the evaluation target feature, or both.
 The evaluation processing unit 180c executes the evaluation processing at the timing of each evaluation target motion, at the timing of each evaluation target stage, which is a work stage to be evaluated in the work, or at both timings. All the work stages included in the work may be set as evaluation target stages, or only some of them may be. Furthermore, the motion-related information corresponding to all the processing target motions included in an evaluation target stage may be subject to the evaluation of that stage, or only the motion-related information corresponding to some of those processing target motions may be.
 In the evaluation processing at the timing of each evaluation target motion, the evaluation processing unit 180c reads the motion-related reference data for the evaluation target motion from the first storage unit 181a, compares the motion-related reference data with the motion-related information corresponding to the evaluation target motion, and determines the evaluation result of that motion-related information. For example, the evaluation processing unit 180c may determine an evaluation level of the motion-related information. For example, the evaluation processing unit 180c may compare the criteria for the evaluation target features included in the motion-related reference data with the evaluation target features of the motion-related elements of the motion-related information and determine the evaluation levels of the evaluation target features. The evaluation processing unit 180c may determine the evaluation level of the motion-related element, the evaluation level of the motion-related information, or both, based on the evaluation levels of the evaluation target features.
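 One plausible way to obtain an evaluation level from the comparison with a criterion, consistent with the deviation-based description above, is to normalize the deviation amount by a tolerance. The sketch below is written under that assumption; the linear scoring rule and all names are hypothetical.

```python
def evaluate_feature(observed: float, target: float, tolerance: float) -> float:
    """Evaluation level in [0, 1] for one evaluation target feature, decreasing
    linearly with the amount of deviation from the criterion (tolerance > 0)."""
    deviation = abs(observed - target)
    return max(0.0, 1.0 - deviation / tolerance)

def evaluate_motion_related_info(features: dict, criteria: dict) -> float:
    """Evaluation level of one piece of motion-related information, derived from
    the evaluation levels of its evaluation target features (simple average)."""
    levels = [
        evaluate_feature(features[name], c["target"], c["tolerance"])
        for name, c in criteria.items()
    ]
    return sum(levels) / len(levels)
```

 For instance, evaluate_motion_related_info({"travel_speed": 0.8}, {"travel_speed": {"target": 1.0, "tolerance": 0.3}}) yields an evaluation level of about 0.33 under this hypothetical scoring rule.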
 The evaluation processing unit 180c can, for example, execute the evaluation processing of the motion-related information corresponding to a first evaluation target motion by the timing of a second evaluation target motion, which is the evaluation target motion following the first. The evaluation processing unit 180c can execute and complete this evaluation processing by the timing of the second evaluation target motion. For example, the timing of the second evaluation target motion may be the reference timing of the second evaluation target motion included in the motion-related reference data, the detection timing of the motion-related information corresponding to the second evaluation target motion, or a timing combining these. The reference timing and the detection timing may each be the start timing of the motion, the completion timing of the motion, or a predetermined timing between them.
 The evaluation processing unit 180c may be configured to execute the evaluation processing of the motion-related information corresponding to the first evaluation target motion by the timing of the processing target motion following the first evaluation target motion. For example, the timing of the next processing target motion may be the completion timing of the actual motion of the robot 100 corresponding to that processing target motion, or a predetermined timing between its start timing and completion timing. Although not limiting, in the present embodiment the evaluation processing unit 180c starts the evaluation processing without providing a delay time after receiving the detection result of the motion-related information corresponding to the first evaluation target motion from the detection processing unit 180b.
 In the evaluation processing at the timing of each evaluation target stage, the evaluation processing unit 180c reads from the first storage unit 181a the motion-related reference data of the processing target motions to be evaluated among the processing target motions included in the evaluation target stage, compares the motion-related reference data with the motion-related information corresponding to the processing target motions to be evaluated, and determines the evaluation results of that motion-related information. The evaluation processing unit 180c integrates the evaluation results of the motion-related information corresponding to all the processing target motions to be evaluated, and determines the evaluation result of the evaluation target stage from the integration result. For example, the evaluation processing unit 180c may determine an evaluation level of the evaluation target stage. Any integration method may be used. For example, the integration method may be a method of adding the evaluation results of the plural pieces of motion-related information, a method of weighting and adding them, or a method of integrating them by other statistical processing.
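 For the integration step, the weighted-addition option mentioned above could look like the following minimal sketch; the uniform default weights are an assumption, and other statistical aggregations would fit the same interface.

```python
def integrate_stage_evaluation(results, weights=None):
    """Integrate the evaluation results of the evaluated processing target motions
    in one evaluation target stage by (weighted) addition."""
    if weights is None:
        weights = [1.0] * len(results)  # plain addition when no weights are given
    return sum(w * r for w, r in zip(weights, results))

# Example: three per-motion evaluation levels, with the middle motion weighted double.
stage_level = integrate_stage_evaluation([0.9, 0.7, 0.8], weights=[1.0, 2.0, 1.0])
```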
 The evaluation processing unit 180c can, for example, execute the evaluation processing of a first evaluation target stage by the timing of a second evaluation target stage, which is the evaluation target stage following the first. The evaluation processing unit 180c can execute and complete the evaluation processing of the motion-related information corresponding to all the processing target motions to be evaluated in the first evaluation target stage by the timing of the second evaluation target stage. For example, the timing of the second evaluation target stage may be the reference timing of the second evaluation target stage included in the motion-related reference data, the detection timing of the second evaluation target stage, or a timing combining these. The reference timing and the detection timing may each be the start timing of the first processing target motion of the second evaluation target stage, the completion timing of its last processing target motion, or a predetermined timing between them.
 The evaluation processing unit 180c may be configured to execute the evaluation processing of the first evaluation target stage by the timing of the work stage following the first evaluation target stage. For example, the timing of the next work stage may be the start timing of the first processing target motion of that work stage, the completion timing of its last processing target motion, or a predetermined timing between them. Although not limiting, in the present embodiment the evaluation processing unit 180c starts the evaluation processing without providing a delay time after receiving from the detection processing unit 180b the detection result of the motion-related information corresponding to the last of all the processing target motions to be evaluated in the first evaluation target stage. The evaluation processing unit 180c is an example of a functional unit realized by a processing circuit.
 The consideration determination unit 180d determines, based on the evaluation results of the evaluation processing unit 180c, the consideration to be given to the operator PO for operating the robot 100. Each time the motion-related information corresponding to an evaluation target motion is evaluated, the consideration determination unit 180d determines a consideration corresponding to the evaluation result of that motion-related information. Each time an evaluation target stage is evaluated, the consideration determination unit 180d determines a consideration corresponding to the evaluation result of that stage. For example, the consideration may be a reward amount, points, a score, or the like given to the operator PO. The reward amount may be an amount of real currency or an amount of virtual currency.
 During the execution of the work, the consideration determination unit 180d sequentially accumulates the individual considerations as they are determined, and calculates an accumulation result. Each time it determines a consideration, the consideration determination unit 180d outputs information on the determined consideration and information on the accumulation result including that consideration to the presentation processing unit 180e. The consideration determination unit 180d may be configured to accumulate the considerations for the motion-related information corresponding to the evaluation target motions and the considerations for the evaluation target stages separately and calculate two accumulation results, or to combine the two kinds of consideration and accumulate them into a single accumulation result.
 The consideration determination unit 180d may be configured to determine the consideration for the motion-related information corresponding to an evaluation target motion by applying a weight to the evaluation level of that motion-related information. The consideration determination unit 180d may be configured to determine the consideration for the motion-related information by applying weights to the evaluation levels of its evaluation target features and integrating the weighted evaluation levels, and may be configured to further apply a weight to the consideration thus determined. For example, the difficulty of operation differs depending on the evaluation target motion. The consideration determination unit 180d may be configured to vary the weighting according to the evaluation target motion, and may be configured to vary the weighting according to the evaluation target feature.
 The consideration determination unit 180d may be configured to determine the consideration by applying a weight to the evaluation level of an evaluation target stage. The consideration determination unit 180d may determine the consideration for the evaluation target stage by applying weights to the evaluation levels of the motion-related information corresponding to the processing target motions to be evaluated in that stage and integrating the weighted evaluation levels, and may be configured to further apply a weight to the consideration thus determined. The consideration determination unit 180d may be configured to vary the weighting according to the evaluation target stage and the processing target motion. Any integration method may be used by the consideration determination unit 180d in the above. The consideration determination unit 180d is an example of a functional unit realized by a processing circuit.
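 As a hedged sketch of the consideration determination described in the last few paragraphs, the class below weights an evaluation level by a difficulty-dependent factor, converts it into a consideration, and keeps the running accumulation result; the conversion rate and all names are hypothetical rather than specified in the disclosure.

```python
class ConsiderationAccumulator:
    """Weight an evaluation level, convert it to a consideration (e.g., points),
    and accumulate the considerations during the execution of the work."""

    def __init__(self, rate_per_level: float = 10.0):
        self.rate_per_level = rate_per_level  # points granted per unit evaluation level
        self.total = 0.0                      # accumulation result

    def add(self, evaluation_level: float, difficulty_weight: float = 1.0) -> float:
        consideration = self.rate_per_level * difficulty_weight * evaluation_level
        self.total += consideration           # accumulate during execution of the work
        return consideration

# Usage: a difficult gripping motion evaluated at level 0.8 earns 16.0 points.
accumulator = ConsiderationAccumulator()
consideration = accumulator.add(0.8, difficulty_weight=2.0)
```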
 The presentation processing unit 180e executes presentation processing for causing the presentation device 203, which presents information to the operator PO while the operator PO operates the operation input device 201, to present the evaluation results of the evaluation processing unit 180c during the execution of the work. Although not limiting, in the present embodiment the presentation processing unit 180e causes the presentation device 203 to present the consideration determined by the consideration determination unit 180d and the accumulation result of the considerations during the execution of the work. The presentation processing unit 180e generates data including the consideration and its accumulation result received from the consideration determination unit 180d, and transmits the data and a presentation command for the data to the operation terminal 200. For example, the presentation processing unit 180e may generate data including image data, audio data, character string data, or a combination of two or more of these, indicating the consideration and its accumulation result, and transmit it to the operation terminal 200.
 When the evaluation processing is executed by the evaluation processing unit 180c at the timing of each evaluation target motion, the presentation processing unit 180e transmits the evaluation results of the motion-related information corresponding to the evaluation target motions to the operation terminal 200 and causes the presentation device 203 to present them in the same order in which the evaluation target motions were executed. An evaluation result can include an evaluation level, a consideration, or both. For example, the presentation processing unit 180e can execute the presentation processing of the evaluation result of the motion-related information corresponding to the first evaluation target motion by the timing of the following second evaluation target motion. The timing of the second evaluation target motion may be any of the timings exemplified for the timing of the second evaluation target motion of the evaluation processing unit 180c, and may be the same as or different from that timing.
 When the evaluation processing is executed by the evaluation processing unit 180c at the timing of each evaluation target stage, the presentation processing unit 180e transmits the evaluation results of the evaluation target stages to the operation terminal 200 and causes the presentation device 203 to present them in the same order in which the evaluation target stages were executed. An evaluation result can include an evaluation level, a consideration, or both. For example, the presentation processing unit 180e can execute the presentation processing of the evaluation result of the first evaluation target stage by the timing of the following second evaluation target stage. The timing of the second evaluation target stage may be any of the timings exemplified for the timing of the second evaluation target stage of the evaluation processing unit 180c, and may be the same as or different from that timing. The presentation processing unit 180e is an example of a functional unit realized by a processing circuit.
 The accumulation processing unit 180f associates the motion-related reference data, the information on the evaluation results of the motion-related information based on that reference data, and the dates and times when the corresponding motions of the robot 100 were performed, and stores them in the second storage unit 181b as accumulated data. The accumulation processing unit 180f causes the second storage unit 181b to store accumulated data on the evaluation target motions and accumulated data on the evaluation target stages.
 For example, as the accumulated data on an evaluation target motion, the accumulation processing unit 180f may associate information on the evaluation target motion, information on the evaluation result of the motion-related information corresponding to that motion, and the motion-related reference data used in the evaluation processing of that result, and store them in the second storage unit 181b. The evaluation result information can include an evaluation level, a consideration, or both.
 For example, as the accumulated data on an evaluation target stage, the accumulation processing unit 180f may associate two or more of the following and store them in the second storage unit 181b: information on the evaluation target stage, information on the processing target motions to be evaluated in that stage, information on the evaluation results of the motion-related information corresponding to those motions, information on the evaluation result of the evaluation target stage, and the motion-related reference data used in the evaluation processing of those results. The evaluation result information can include an evaluation level, a consideration, or both.
 The reference determination unit 180g determines new motion-related reference data using the accumulated data in the second storage unit 181b, and stores the new reference data in the first storage unit 181a. The reference determination unit 180g determines new second motion-related reference data using first motion-related reference data included in the accumulated data of the second storage unit 181b and evaluation results based on the first motion-related reference data, that is, evaluation results determined using the first motion-related reference data.
 For example, the reference determination unit 180g can accept specified conditions for determining the second motion-related reference data, for example from the operation terminal 200. For example, a specified condition may be a condition for limiting the first motion-related reference data and the evaluation results used to determine the second motion-related reference data. The reference determination unit 180g may extract, from the accumulated data in the second storage unit 181b, the information on the evaluation results that satisfy the specified condition and the first motion-related reference data used for those evaluation results, and determine the second motion-related reference data using the extracted evaluation result information and first motion-related reference data.
 For example, a specified condition may be a condition that limits the attributes of the evaluation results for which the first motion-related reference data was used. The attributes of an evaluation result may be the user targeted by the evaluation result, the targeted robot, the targeted work, the targeted work location, the targeted work environment, the targeted work date, the targeted work period, and the like. A specified condition may limit one or more attributes of the evaluation results.
 For example, the reference determination unit 180g can determine second motion-related reference data corresponding to a specific user using that user's evaluation results and the first motion-related reference data. For example, when the user's evaluation level is low, the reference determination unit 180g may generate second motion-related reference data with lower criteria than the first motion-related reference data. When operating the robot 100, the user can command the information processing device 180 via the operation terminal 200 to request evaluation using the second motion-related reference data. This allows the user to obtain a higher evaluation from the information processing device 180, which can increase the user's motivation for operating the robot. The reference determination unit 180g is an example of a functional unit realized by a processing circuit.
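 To illustrate the kind of adjustment just described, the sketch below filters hypothetical accumulated records by a specified condition (here, a particular user) and relaxes the tolerances of the first reference data when that user's average evaluation level is low. The record layout, threshold, and relax factor are all assumptions, not the disclosed procedure.

```python
def determine_second_criteria(records, user_id, threshold=0.5, relax_factor=1.2):
    """Derive second motion-related reference data from the accumulated data,
    using only the records that satisfy the specified condition."""
    matched = [r for r in records if r["user_id"] == user_id]
    if not matched:
        return None  # nothing satisfies the specified condition
    mean_level = sum(r["evaluation_result"] for r in matched) / len(matched)
    base = matched[-1]["motion_related_criteria"]  # first motion-related reference data
    if mean_level >= threshold:
        return base  # evaluation levels are high enough: keep the criteria as-is
    return {  # low evaluation levels: generate lower criteria by widening tolerances
        name: {"target": c["target"], "tolerance": c["tolerance"] * relax_factor}
        for name, c in base.items()
    }
```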
 The first drive command unit 190a generates drive commands for causing the arm drive devices M1A to M4A of the robot arm 120A to drive the joints to the states of the target position command and target force command included in the first motion command, and outputs them to the drive circuit 142a. A drive command includes a current command value for a servomotor. The first drive command unit 190a generates the drive commands using the feedback information of the servomotors.
 The second drive command unit 190b generates drive commands for causing the arm drive devices M1B to M4B of the robot arm 120B to drive the joints to the states of the target position command and target force command included in the second motion command, and outputs them to the drive circuit 142a. A drive command includes a current command value for a servomotor. The second drive command unit 190b generates the drive commands using the feedback information of the servomotors.
 The third drive command unit 190c generates a drive command for driving the drive device of the end effector 130A to the target state included in the third motion command, and outputs it to the drive circuit 142a.
 The fourth drive command unit 190d generates a drive command for driving the drive device of the end effector 130B to the target state included in the fourth motion command, and outputs it to the drive circuit 142a.
 The fifth drive command unit 190e generates drive commands for causing the transport drive device 113 of the transport vehicle 110 to drive the drive wheels 111 to the state of the target position command included in the fifth motion command, and outputs them to the drive circuit 142a. A drive command includes a current command value for a servomotor. The fifth drive command unit 190e generates the drive commands using the feedback information of the servomotors.
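 The disclosure does not specify the control law inside the drive command units, but as a purely illustrative placeholder, a current command value could be derived from the target position and the servomotor feedback with a simple proportional rule; the gain and limit below are hypothetical.

```python
def make_drive_command(target_position: float, fed_back_position: float,
                       gain: float = 2.0, current_limit: float = 5.0) -> float:
    """Map the position error between a motion command's target and the servomotor
    feedback to a clamped current command value (one joint, proportional law)."""
    current = gain * (target_position - fed_back_position)
    return max(-current_limit, min(current_limit, current))
```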
 [Operation of the robot system]
 An example of the operation of the robot system 1 according to the embodiment will be described with reference to FIGS. 5 and 6. FIG. 5 is a flowchart showing an example of the operation of the robot system 1 according to the embodiment. FIG. 6 is a plan view showing an example of the operation of the robot system 1 according to the embodiment. In the following, an example is described in which the robot 100 performs a service of searching the storage shelf SR for an article W requested by a user P and handing it to that user P. In the examples of FIGS. 5 and 6, the information processing device 180 executes the evaluation processing and the presentation processing at the timing of each evaluation target motion and at the timing of each evaluation target stage.
 First, the operator PO inputs a request to take charge of robot operation into the operation terminal 200, and the operation terminal 200 transmits the request to the server 300 (step S101). The server 300 searches for a robot 100 that the operator PO can operate, and connects the information processing device 180 of the found robot 100 and the operation terminal 200 via the communication network N (step S102).
 When the operator PO receives a notification of the completed connection from the server 300, the operator PO inputs a command to start executing robot operation into the operation terminal 200, which transmits the command to the information processing device 180. The information processing device 180 activates the robot 100 so that it can operate in accordance with the operation commands received from the operation terminal 200. The information processing device 180 sets the execution flag of the work for the service of the robot 100 to the ON state, that is, starts the work (step S103).
 Next, the operator PO inputs into the operation terminal 200 the specified conditions for the motion-related reference data to be used for evaluating the robot operation (step S104). The operation terminal 200 transmits a command with the specified conditions to the information processing device 180. For example, a specified condition may be a condition designating standard motion-related reference data or a condition designating the attributes of the evaluation results to be used. In this example, the operator PO inputs a specified condition designating the use of the evaluation results for that operator PO.
 Next, the information processing device 180 generates new motion-related reference data using the specified conditions received from the operation terminal 200 and the accumulated data stored in the information processing device 180, and decides to use that reference data (step S105). This motion-related reference data is reference data adjusted to the evaluation results of the operator PO.
 Next, the operator PO searches for a user P requesting a service within the service providing area AS while checking the screen of the presentation device 203, which displays the images captured by the imaging device 145 and the like. When the operator PO finds a user P, the operator PO operates the operation terminal 200 while viewing the screen of the presentation device 203, which displays the images captured by the imaging device 145, the imaging device 146, or both, and causes the robot 100 to travel to a position near the user P (step S106).
 While the robot 100 is traveling, the information processing device 180 detects the motion-related information of the robot 100 at predetermined time intervals and, each time it is detected, executes the evaluation processing and presentation processing of the motion-related information (step S107). In this example, each piece of motion-related information at the predetermined time intervals corresponds to an evaluation target motion. The information processing device 180 causes the presentation device 203 to display the consideration for the motion-related information and the accumulation result of the considerations. For example, the information processing device 180 evaluates, for the motion-related information, evaluation target features such as the travel speed, travel acceleration, and travel trajectory of the robot 100 and interference with the robot peripheral elements. In this example, since the information processing device 180 executes the detection processing, evaluation processing, and presentation processing without providing delay times between them, the operator PO can check the consideration for an evaluation target motion and the accumulation result in real time, that is, without delay from the moment the operation for that evaluation target motion is input to the operation terminal 200.
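 The delay-free chain of step S107 could be organized as a single loop in which each detection result is immediately evaluated, priced, and presented. This is a sketch only; the four callables stand in for the processing units 180b to 180e and are hypothetical.

```python
def realtime_pipeline(detect, evaluate, determine_consideration, present, is_running):
    """Detect, evaluate, determine the consideration, and present it, with no
    delay time inserted between the steps, while the work is running."""
    total = 0.0
    while is_running():
        info = detect()                                 # detection processing (180b)
        level = evaluate(info)                          # evaluation processing (180c)
        consideration = determine_consideration(level)  # consideration determination (180d)
        total += consideration                          # running accumulation result
        present(consideration, total)                   # presentation processing (180e)
```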
 When the robot 100 arrives at the position near the user P, the information processing device 180 executes the evaluation processing and presentation processing of the first work stage, which is the travel process from the starting position to that nearby position, and causes the presentation device 203 to display the consideration and accumulation result of the first work stage (step S108).
 When the operator PO confirms, via the screen of the presentation device 203, the arrival of the robot 100 at the position near the user P, the operator PO asks the user P which article W is requested, via input to the operation terminal 200 and via output from the display device 148, the audio output device 149, or both of the robot 100 (step S109).
 Next, when the user P inputs the information on the requested article W1 into the sound collector 147, the touch panel of the display device 148, or both, the operator PO requests, via input to the operation terminal 200, the information on the position of the requested article W1 in the storage shelf SR from the information processing device 180. In addition to information on the robot 100 on which it is mounted, the information processing device 180 can also store information on the service providing area AS in which the robot 100 is arranged and information on the services provided. The operation terminal 200 receives the information on the position PW1 of the requested article W1 from the information processing device 180 and presents it to the operator PO (step S110). Alternatively, the server 300 may hold the information on the service providing area AS and the services provided, and the operation terminal 200 may request the information on the position of the requested article W1 from the server 300.
 Next, the operator PO operates the operation terminal 200 while viewing the screen of the presentation device 203, which displays the information on the position PW1 together with the images captured by the imaging device 145, the imaging device 146, or both, and causes the robot 100 to travel to a position near the position PW1 (step S111).
 As in the processing of step S107, the information processing device 180 detects the motion-related information of the robot 100 at predetermined time intervals while the robot 100 is traveling and, each time it is detected, executes the evaluation processing and presentation processing of the motion-related information (step S112).
 As in the processing of step S108, when the robot 100 arrives at the position near the position PW1, the information processing device 180 executes the evaluation processing and presentation processing of the second work stage, which is the travel process from the position near the user P to the position near the position PW1 (step S113).
 操作者PОは、提示装置203の画面を介して位置PW1の近傍位置へのロボット100の到着を確認すると、ロボットアーム120A及び120B並びにエンドエフェクタ130A及び130Bに動作させるための操作を操作端末200に入力する(ステップS114)。当該動作は、収納棚SRに収納されている要求物品W1をエンドエフェクタ130A及び130Bを用いて収納棚SRから取り出し、要求物品W1をロボット100のテーブル140a上に載置する動作である。操作者PОは、撮像装置144及び145の撮像画像等を表示する提示装置203の画面を視認しつつ、操作端末200に操作を入力する。 When the operator PO confirms through the screen of the presentation device 203 that the robot 100 has arrived at a position near the position PW1, the operator PO instructs the operation terminal 200 to operate the robot arms 120A and 120B and the end effectors 130A and 130B. Input (step S114). This operation is to take out the requested item W1 stored in the storage shelf SR from the storage shelf SR using the end effectors 130A and 130B and place the requested item W1 on the table 140a of the robot 100. FIG. The operator PO inputs an operation to the operation terminal 200 while viewing the screen of the presentation device 203 that displays the captured images of the imaging devices 144 and 145 .
 情報処理装置180は、ロボット100の動作中、ロボット100の動作関連情報を検出し、評価対象動作に対応する動作関連情報の検出の都度、当該動作関連情報の評価処理及び提示処理を実行する(ステップS115)。例えば、評価対象動作は、エンドエフェクタ130A及び130Bを要求物品W1へ移動するロボットアーム120A及び120Bの移動動作、要求物品W1を把持するエンドエフェクタ130A及び130Bの把持動作、要求物品W1をテーブル140a上に移動し載置するロボットアーム120A及び120Bの移動載置動作、並びに、要求物品W1の把持を解除するエンドエフェクタ130A及び103Bの把持解除動作等を含む処理対象動作のうちの1つ以上の処理対象動作である。本例では、全ての処理対象動作が、評価対象動作である。 The information processing device 180 detects the motion-related information of the robot 100 during the motion of the robot 100, and every time the motion-related information corresponding to the motion to be evaluated is detected, the motion-related information is evaluated and presented ( step S115). For example, the motions to be evaluated include the movement motions of the robot arms 120A and 120B that move the end effectors 130A and 130B to the required item W1, the gripping motion of the end effectors 130A and 130B that grip the required item W1, and the movement of the required item W1 onto the table 140a. one or more of the processing target operations including the movement and placement operation of the robot arms 120A and 120B that move and place the required article W1, and the grip release operation of the end effectors 130A and 103B that release the grip of the requested article W1. It is a target action. In this example, all the motions to be processed are motions to be evaluated.
 The information processing device 180 evaluates the evaluation-target features of the motion-related information corresponding to each evaluation-target motion. For example, the evaluation-target features may include one or more of: the time required for the motion of the robot 100; the positions, postures, positional velocities, postural velocities, positional accelerations, postural accelerations, and movement trajectories of the end effectors 130A and 130B; the position, posture, and state of the requested article W1; the force and impact the robot 100 applies to the requested article W1; the contact, force, and impact the robot 100 applies to elements surrounding the object; and interference between the robot 100 and elements surrounding the robot. For example, the elements surrounding the object may be the shelf boards of the storage shelf SR, other articles, and the like. For example, the accelerations of the end effectors 130A and 130B can indicate the impact and the like that the end effectors 130A and 130B apply to the requested article W1.
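By way of illustration only — the present disclosure does not prescribe any particular implementation — the per-feature comparison against the motion-related criterion described above might be sketched as follows. The feature names, the range-based criterion format, and the pass/fail scoring are all assumptions introduced here for the example:

```python
from typing import Dict, Tuple

# Hypothetical criterion: for each evaluation-target feature, an allowed
# (min, max) range derived from the motion-related reference data.
Criterion = Dict[str, Tuple[float, float]]

def evaluate_motion(features: Dict[str, float], criterion: Criterion) -> Dict[str, bool]:
    """Compare each detected feature value against its criterion range."""
    results = {}
    for name, value in features.items():
        lo, hi = criterion[name]
        results[name] = lo <= value <= hi
    return results

# Example: features detected for one gripping motion (values are made up).
features = {"required_time_s": 3.2, "peak_acceleration": 0.8, "grip_force_n": 12.0}
criterion = {
    "required_time_s": (0.0, 5.0),
    "peak_acceleration": (0.0, 1.0),  # large accelerations suggest impact on W1
    "grip_force_n": (5.0, 15.0),
}
print(evaluate_motion(features, criterion))
```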
 For example, the information processing device 180 may be configured to process an image of the requested article W1 captured by one or both of the imaging devices 144 and 145, detect the holding state of the requested article W1 gripped by the end effectors 130A and 130B, and evaluate that holding state as an evaluation-target feature. The information processing device 180 may be configured to process an image of the requested article W1 on the table 140a captured by the imaging device 144 or the like, detect the placement state of the requested article W1, and evaluate that placement state as an evaluation-target feature.
 When the grip-release motion of the end effectors 130A and 130B on the requested article W1 is detected, the information processing device 180 executes the evaluation process and the presentation process for the third work stage, which is the take-out process from the arrival of the robot 100 at the position near the position PW1 to the completion of the grip release, and causes the presentation device 203 to display the compensation and the accumulated result (step S116).
 Next, when the operator PO confirms, via the screen of the presentation device 203, that the requested article W1 has been placed on the table 140a, the operator PO operates the operation terminal 200 while viewing the screen of the presentation device 203, which displays the images captured by one or both of the imaging devices 145 and 146, and drives the robot 100 to a position near the user P (step S117).
 As in step S107, the information processing device 180 detects motion-related information of the robot 100 at predetermined time intervals while the robot 100 is traveling, and executes the evaluation process and the presentation process for the motion-related information each time it is detected (step S118).
 As in step S108, when the robot 100 arrives at the position near the user P, the information processing device 180 executes the evaluation process and the presentation process for the fourth work stage, which is the traveling process from the position near the position PW1 to the position near the user P (step S119).
 Next, when the operator PO confirms, via the screen of the presentation device 203, that the robot 100 has arrived at the position near the user P, the operator PO inputs to the operation terminal 200 operations for moving the robot arms 120A and 120B and the end effectors 130A and 130B (step S120). This motion hands the requested article W1 placed on the table 140a to the user P using the end effectors 130A and 130B. The operator PO inputs the operations to the operation terminal 200 while viewing the screen of the presentation device 203, which displays the images captured by the imaging devices 144 and 145.
 The information processing device 180 detects motion-related information of the robot 100 while the robot 100 is operating, and executes the evaluation process and the presentation process for that motion-related information each time motion-related information corresponding to an evaluation-target motion is detected (step S121). For example, the evaluation-target motions are one or more of the processing-target motions, which include the movement motion of the robot arms 120A and 120B that moves the end effectors 130A and 130B to the requested article W1, the gripping motion of the end effectors 130A and 130B that grips the requested article W1, the movement motion of the robot arms 120A and 120B that moves the requested article W1 in front of the user P and presents it, and the grip-release motion of the end effectors 130A and 130B that releases the grip on the requested article W1 according to how the user P is holding it. In this example, all of the processing-target motions are evaluation-target motions.
 The information processing device 180 evaluates the evaluation-target features of the motion-related information corresponding to each evaluation-target motion. For example, the evaluation-target features may include one or more of: the time required for the motion of the robot 100; the positions, postures, positional velocities, postural velocities, positional accelerations, postural accelerations, and movement trajectories of the end effectors 130A and 130B; the position, posture, and state of the requested article W1; the force and impact the robot 100 applies to the requested article W1; and the contact, force, and impact the robot 100 applies to elements surrounding the object, such as the user P. For example, the information processing device 180 may be configured to process an image of the requested article W1 captured by one or both of the imaging devices 144 and 145, detect the holding state of the requested article W1 gripped by the end effectors 130A and 130B, and evaluate that holding state as an evaluation-target feature.
 When the grip-release motion of the end effectors 130A and 130B on the requested article W1 is detected, the information processing device 180 executes the evaluation process and the presentation process for the fifth work stage, which is the hand-over process from the arrival of the robot 100 at the position near the user P to the completion of the grip release, and causes the presentation device 203 to display the compensation and the accumulated result (step S122).
 To end the robot operation, the operator PO inputs an operation-end command to the operation terminal 200. When the information processing device 180 receives the operation-end command from the operation terminal 200 (Yes in step S123), it ends the series of processes: the information processing device 180 sets the work execution flag to the OFF state and stops the robot 100, and the server 300 disconnects the operation terminal 200 from the information processing device 180. When no operation-end command is received (No in step S123), the information processing device 180 returns to step S106.
 In the course of the processing from steps S106 to S123, the information processing device 180 stores and accumulates, in association with one another, the motion-related information corresponding to the evaluation-target motions, the evaluation results for the work stages, the compensation and its accumulated result, and the motion-related reference data used for the evaluations.
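As an illustration of the kind of record that such accumulation implies — the present disclosure specifies no data format, so every name below is hypothetical:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class EvaluationRecord:
    # One evaluation, stored in association with its context.
    motion_info: Dict[str, float]  # detected motion-related information
    stage: str                     # work-stage identifier (e.g., "S116: 3rd stage")
    score: float                   # evaluation result
    reward: float                  # compensation granted for this evaluation
    reference: Dict[str, float]    # motion-related reference data used

@dataclass
class EvaluationLog:
    records: List[EvaluationRecord] = field(default_factory=list)

    def add(self, record: EvaluationRecord) -> None:
        self.records.append(record)

    def accumulated_reward(self) -> float:
        # Running total presented to the operator during the work.
        return sum(r.reward for r in self.records)
```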
 Through the processing of steps S101 to S123 described above, the operator PO can operate the robot 100 while confirming the evaluation of his or her operation through the compensation and the accumulated compensation presented for each evaluation-target motion and each work stage. This allows the operator PO to find concrete measures for improving his or her robot operation skills, and increases the operator PO's motivation to improve those skills in order to earn more compensation.
 (Modification)
 This modification differs from the embodiment in that the robot system includes a learning device 400. The following description of this modification focuses on the differences from the embodiment, and descriptions of points shared with the embodiment are omitted as appropriate.
 FIG. 7 is a block diagram showing an example of the functional configurations of an information processing device 180A and a learning device 400 according to the modification. As shown in FIG. 7, the information processing device 180A further includes a data input/output unit 180i as a functional component. The data input/output unit 180i inputs and outputs data to and from the learning device 400. For example, the data input/output unit 180i outputs the accumulated data stored in the second storage unit 181b to the learning device 400. In response to a request from the reference determination unit 180g, the data input/output unit 180i requests motion-related reference data from the learning device 400. The data input/output unit 180i receives a specified condition and information on the evaluation results satisfying that condition from the reference determination unit 180g and sends them to the learning device 400. The learning device 400 sends the motion-related reference data corresponding to the specified condition and the evaluation results to the data input/output unit 180i.
 Like the information processing device 180A, the learning device 400 includes a computer device. Although not limited to this, in this modification the learning device 400 is a device separate from the information processing device 180A and the robot controller 190, and is connected to the information processing device 180A so as to be capable of data communication via wired communication, wireless communication, or a combination thereof; any type of wired or wireless communication may be used. The learning device 400 is mounted on the robot 100 together with the information processing device 180A. The learning device 400 may instead be incorporated into the information processing device 180A or the robot controller 190. The learning device 400 and the information processing device 180A form an information processing system 500. Such a learning device 400 may include a processing circuit including a processor and memory.
 The learning device 400 may be configured to exchange data with the information processing device 180A via a storage medium. The storage medium may include a semiconductor-based or other integrated circuit (IC), a hard disk drive (HDD), a hybrid hard drive (HHD), an optical disc, an optical disc drive (ODD), a magneto-optical disc, a magneto-optical drive, a floppy disk drive (FDD), magnetic tape, a solid-state drive (SSD), a RAM drive, a Secure Digital card or drive, any other suitable storage medium, or a combination of two or more of these.
 In this modification, the learning device 400 is mounted on the robot 100, but it may instead be arranged outside the robot 100 and connected to the information processing device 180A via the communication network N. The learning device 400 may be configured to connect to the information processing devices 180A of two or more robots 100. For example, the learning device 400 may be placed in the service provision area AS, in the operation area AO, or elsewhere. For example, the learning device 400 may be placed at the location of the server 300 and configured to communicate data with the information processing device 180A via the server 300. The learning device 400 may be incorporated into the server 300.
 The learning device 400 includes, as functional components, a data input/output unit 401, a learning data storage unit 402, a machine learning unit 403, an input data processing unit 404, and an output data processing unit 405. The function of the learning data storage unit 402 is implemented by memory, storage, or a combination thereof that the learning device 400 may include, and the functions of the functional components other than the learning data storage unit 402 are implemented by a processor or the like included in the learning device 400. The machine learning unit 403 is an example of a functional unit implemented by a machine learning circuit, and the learning data storage unit 402 is an example of a functional unit implemented by a second storage device.
 The data input/output unit 401 receives the accumulated data from the data input/output unit 180i of the information processing device 180A and stores it in the learning data storage unit 402 as machine learning data. The data input/output unit 401 receives, from the data input/output unit 180i, a request command for motion-related reference data, information on the specified condition, and information on the evaluation results satisfying that condition, and outputs them to the input data processing unit 404. The data input/output unit 401 receives the motion-related reference data from the output data processing unit 405 and sends it to the data input/output unit 180i.
 The input data processing unit 404 converts the specified-condition information and the evaluation-result information received from the data input/output unit 401 into data that can be input to the machine learning model of the machine learning unit 403, and outputs the converted data to the machine learning unit 403. The output data processing unit 405 converts the output data of the machine learning unit 403 into motion-related reference data and outputs it to the data input/output unit 401.
 The machine learning unit 403 includes a machine learning model. The machine learning unit 403 trains the machine learning model using the machine learning data, improving the accuracy of the model's output data with respect to its input data. The machine learning model may include a neural network, random forest, genetic programming, a regression model, a tree model, a Bayesian model, a time-series model, a clustering model, an ensemble learning model, or the like; in this modification, it includes a neural network. A neural network includes a plurality of node layers, including an input layer and an output layer, and each node layer contains one or more nodes. When a neural network includes an input layer, an intermediate layer, and an output layer, it sequentially performs, on the information input to the nodes of the input layer, output processing from the input layer to the intermediate layer and output processing from the intermediate layer to the output layer, and outputs a result that fits the input information. Each node in one layer is connected to each node in the next layer, and the connections between nodes are weighted. The information of the nodes of one layer is output to the nodes of the next layer with the connection weights applied.
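A minimal sketch of the forward propagation just described is given below. The layer sizes and the tanh activation are assumptions; the present disclosure only specifies weighted node-to-node connections between successive layers:

```python
import numpy as np

def forward(x, weights, biases):
    """Propagate an input vector through fully connected node layers.

    Each node of one layer feeds every node of the next layer, with the
    connection weights applied, as described above.
    """
    a = x
    for W, b in zip(weights, biases):
        a = np.tanh(W @ a + b)  # weighted sum, then a nonlinearity (assumed)
    return a

# Example: 4 input nodes -> 8 intermediate nodes -> 2 output nodes.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((8, 4)), rng.standard_normal((2, 8))]
biases = [np.zeros(8), np.zeros(2)]
print(forward(np.ones(4), weights, biases))
```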
 The machine learning model takes as input data condition-related information related to the specified condition and evaluation-related information related to the evaluation results, and outputs information related to the motion-related criterion corresponding to that condition-related information and evaluation-related information. The condition-related information may be information indicating the specified condition, the attributes of the evaluation results, or both, and the evaluation-related information may be information indicating the evaluation results.
 The machine learning model uses the condition-related information and the evaluation-related information included in the machine learning data as machine learning input data, and uses, as teacher data, the criterion-related information related to the motion-related reference data corresponding to that condition-related information and evaluation-related information. The criterion-related information may be information indicating the motion-related reference data. For example, in machine learning, the machine learning unit 403 adjusts the weights of the connections between the nodes in the neural network so that the output data of the machine learning model when given the machine learning input data matches the corresponding teacher data, or so that the error between them is minimized. After such weight adjustment, when condition-related information and evaluation-related information are input, the machine learning model can output criterion-related information appropriate to that input.
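A minimal supervised training loop consistent with this description might look as follows. Gradient descent on a mean-squared error, the single linear layer, and the data encoding are all assumptions; the present disclosure only requires that the error between the model output and the teacher data be minimized by adjusting the connection weights:

```python
import numpy as np

def train(inputs, teachers, lr=0.01, epochs=500):
    """Adjust connection weights so the model output approaches the teacher data.

    inputs:   (N, d_in) machine-learning input data (encoded condition-related
              and evaluation-related information)
    teachers: (N, d_out) teacher data (encoded criterion-related information)
    """
    rng = np.random.default_rng(0)
    W = rng.standard_normal((teachers.shape[1], inputs.shape[1])) * 0.1
    for _ in range(epochs):
        pred = inputs @ W.T                       # model output
        err = pred - teachers                     # deviation from teacher data
        W -= lr * (err.T @ inputs) / len(inputs)  # weight adjustment (MSE gradient)
    return W

# Toy usage with randomly encoded data.
X = np.random.default_rng(1).standard_normal((32, 4))
Y = X @ np.array([[1.0, -0.5, 0.0, 2.0]]).T
W = train(X, Y)
```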
 The learning device 400 as described above can output motion-related reference data better suited to the specified condition and the evaluation results than the motion-related reference data determined by the reference determination unit 180g of the embodiment. In this modification, the machine learning model of the machine learning unit 403 may include the function of the output data processing unit 405 and be configured to output the same output data as the output data processing unit 405.
 (Other embodiments)
 Exemplary embodiments and modifications of the present disclosure have been described above, but the present disclosure is not limited to those embodiments and modifications; various modifications and improvements are possible within the scope of the present disclosure. For example, forms obtained by applying various modifications to the embodiments and modifications, and forms constructed by combining components of different embodiments and modifications, are also included within the scope of the present disclosure.
 For example, in the embodiment and the modification, the work of the robot 100 handled by the robot system 1 is work related to the service industry, but the work is not limited to this. The robot system 1 may handle any kind of work, for example, work related to industry, agriculture, and other fields.
 In the embodiment and the modification, the robot system 1 includes the robot 100, but the robot included in the robot system 1 may be any robot capable of performing work. For example, the robot may include a moving device configured differently from the carrier 110, such as legs, or may include no moving device at all. The robot may also include components other than a robot arm; for example, it may be a humanoid robot, an animal-type robot, or the like.
 In the embodiment and the modification, the information processing devices 180 and 180A are configured to detect the states of external elements, that is, elements other than the robot 100 such as elements surrounding the robot, objects, and elements surrounding the objects, using image data in which those elements are captured, but the configuration is not limited to this. For example, the information processing devices 180 and 180A may be configured to detect the state of an external element using the detection results of an external sensor, which is a sensor arranged separately from the external element, the detection results of an on-board sensor, which is a sensor arranged on the external element, or both. The external sensor may be configured to detect the position, posture, and the like of the external element from outside. For example, the external sensor may perform detection using light waves, lasers, magnetism, radio waves, electromagnetic waves, ultrasonic waves, or a combination of two or more of these, and may include a photoelectric sensor, a laser sensor, a radio-wave sensor, an electromagnetic-wave sensor, an ultrasonic sensor, various lidars (LiDAR), or a combination of two or more of these. The on-board sensor may be configured to move together with the external element and detect its position, posture, and the like. For example, the on-board sensor may include an acceleration sensor, an angular velocity sensor, a magnetic sensor, a GPS (Global Positioning System) receiver, or a combination of two or more of these.
 The information processing devices 180 and 180A according to the embodiment and the modification may be configured to use AI (artificial intelligence) for their processing. For example, AI can be used for the process of determining the evaluation of a processing-target motion, the process of determining the evaluation of a work stage, the process of calculating the compensation and its accumulated result, image processing, and the like. For example, the AI may include a learning model that performs machine learning, and the learning model may include a neural network.
 In the embodiment and the modification, the server 300 may be configured to have at least some of the functions of the information processing devices 180 and 180A and the learning device 400. For example, the server 300 may be configured to have the functions of one or more of the functional components included in the information processing devices 180 and 180A shown in FIGS. 4 and 7, or the functions of one or more of the functional components included in the learning device 400 shown in FIG. 7.
 Examples of aspects of the technology of the present disclosure are as follows. An information processing device according to one aspect of the present disclosure is an information processing device that evaluates an operation of a robot for causing the robot to perform work, and includes a processing circuit. The processing circuit is configured to execute, during execution of the work: a detection process of detecting motion-related information, which is information related to the motion of the robot operating in accordance with operation commands input to an operation device by a user; an evaluation process of comparing and evaluating the detected motion-related information against a motion-related criterion, which is a criterion related to the motion of the robot; and a presentation process of causing a presentation device, on which the user receives presentation of information while operating the operation device, to present the evaluation results of the evaluation process. The processing circuit executes the evaluation process at either or both of a first timing for each evaluation-target motion, which is a motion of the robot to be evaluated, and a second timing for each evaluation-target stage, which is a work stage of the work to be evaluated. When executing the evaluation process at the first timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation-target motions in the same order as the execution order of the evaluation-target motions; when executing the evaluation process at the second timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation-target stages in the same order as the execution order of the evaluation-target stages.
 According to the above aspect, the information processing device executes the evaluation process for the evaluation-target motions, the evaluation-target stages, or both, at the first timing for each evaluation-target motion, at the second timing for each evaluation-target stage, or at both timings, and presents the evaluation results to the user via the presentation device. The user can confirm the evaluations of his or her operations at these timings, and can confirm them in the same order as the operations. Because the user can thereby recognize each evaluated operation and its evaluation result, the user can be motivated to improve the operation accuracy the next time the evaluation-target operation is performed. The information processing device can therefore increase the user's motivation to operate the robot.
 In the information processing device according to one aspect of the present disclosure, when executing the evaluation process at the first timing, the processing circuit may execute the evaluation process for a first evaluation-target motion and the presentation process of the evaluation result for the first evaluation-target motion by the timing of a second evaluation-target motion that follows the first evaluation-target motion.
 According to the above aspect, the information processing device executes the evaluation process and the presentation process for an evaluation-target motion by the timing of the next evaluation-target motion. The user can therefore accurately recognize the evaluated evaluation-target motion and its evaluation result.
 In the information processing device according to one aspect of the present disclosure, when executing the evaluation process at the second timing, the processing circuit may execute the evaluation process for a first evaluation-target stage and the presentation process of the evaluation result for the first evaluation-target stage by the timing of a second evaluation-target stage that follows the first evaluation-target stage.
 According to the above aspect, the information processing device executes the evaluation process and the presentation process for an evaluation-target stage by the timing of the next evaluation-target stage. The user can therefore accurately recognize the evaluated evaluation-target stage and its evaluation result.
 In the information processing device according to one aspect of the present disclosure, the processing circuit may further determine, based on the evaluation results, a compensation to be given to the user for the operation of the robot, and cause the presentation device to present the determined compensation and the result of accumulating the compensation during execution of the work.
 According to the above aspect, the information processing device presents to the user, via the presentation device, the compensation indicating each individual evaluation result and the result of accumulating the compensation. The user can thereby be motivated to operate the robot in order to earn more compensation. For example, the compensation may be a reward amount, points, a score, or the like.
 In the information processing device according to one aspect of the present disclosure, the processing circuit may detect, as the motion-related information, one or more of: the operation commands; the motion commands generated for the robot to cause it to operate in accordance with the operation commands; the actual motion results of the robot; the state of the environment surrounding the robot; the state of an object on which the robot acts; and the state of the environment surrounding the object. According to the above aspect, the information processing device can perform the evaluation process based on diverse information.
 In the information processing device according to one aspect of the present disclosure, the processing circuit may compare and evaluate two or more evaluation-target features, which are features for evaluating the motion-related information, against the criteria for those two or more evaluation-target features included in the motion-related criterion, and evaluate the motion-related information based on the evaluation results of the two or more evaluation-target features.
 According to the above aspect, the evaluation method of the information processing device can be made clear and concise, which makes it possible to reduce the processing load of the evaluation process and speed it up. For example, the evaluation-target features may include features such as the required time, position, posture, trajectory, positional velocity, postural velocity, positional acceleration, and postural acceleration of the robot's motion. The evaluation-target features may include features such as the force and impact the robot applies to its surrounding environment, and the changes in position, posture, shape, and state that the robot causes in that surrounding environment. The evaluation-target features may include features such as the position, posture, positional velocity, postural velocity, positional acceleration, and postural acceleration of the object on which the robot acts; the change in shape, the change in state, and the workmanship of the object on which the robot acts; the force and impact the robot applies to the environment surrounding the object; and the changes in position, posture, shape, and state that the robot causes in that surrounding environment. The information processing device may determine the evaluation result of the motion-related information by weighting the evaluation results of each of the two or more evaluation-target features.
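A minimal sketch of this weighted aggregation is shown below; the feature names, the weights, and the 0-100 scoring scale are assumptions, since the present disclosure leaves them unspecified:

```python
def aggregate_evaluation(feature_scores, weights):
    """Combine per-feature evaluation results into a single evaluation
    by applying a weight to each feature's score."""
    total_weight = sum(weights[name] for name in feature_scores)
    weighted = sum(score * weights[name] for name, score in feature_scores.items())
    return weighted / total_weight

# Hypothetical per-feature scores on a 0-100 scale.
scores = {"required_time": 80.0, "trajectory": 65.0, "impact": 90.0}
weights = {"required_time": 1.0, "trajectory": 2.0, "impact": 1.5}
print(aggregate_evaluation(scores, weights))  # weighted overall evaluation
```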
 The information processing device according to one aspect of the present disclosure may further include a first storage device. The first storage device accumulates information on the motion-related criteria and information on the evaluation results of the motion-related information based on those motion-related criteria, and the processing circuit may determine information on a second motion-related criterion using the information on a first motion-related criterion and the information on the evaluation results based on the first motion-related criterion accumulated in the first storage device. According to the above aspect, the information processing device can newly determine a motion-related criterion according to the evaluation results accumulated in the first storage device. The first storage device may be included in the processing circuit or separate from the processing circuit.
 In the information processing device according to one aspect of the present disclosure, the processing circuit may extract, from the information accumulated in the first storage device, information on the evaluation results satisfying a specified condition and information on the first motion-related criterion, which is the motion-related criterion used for those evaluation results, and determine the information on the second motion-related criterion using the extracted evaluation-result information and first-motion-related-criterion information.
 According to the above aspect, the information processing device can determine a new motion-related criterion using evaluation results that satisfy a specified condition, and can therefore determine motion-related criteria for various purposes. For example, the specified condition may be a condition designating the evaluation results corresponding to the object on which the robot acts, the robot, the work, the work location, the work environment, the work execution date, the work execution period, and the like.
 A learning device according to one aspect of the present disclosure includes: a second storage device that stores, as machine learning data, the information on the motion-related criteria and the information on the evaluation results of the motion-related information based on the motion-related criteria in the information processing device according to one aspect of the present disclosure; and a machine learning circuit. The machine learning circuit performs machine learning using the evaluation-result information included in the machine learning data as machine learning input data, and using, as teacher data, the information on the motion-related criteria included in the machine learning data and used for those evaluation results. The machine learning circuit takes, as input data, evaluation information corresponding to the evaluation-result information, and outputs, as output data, the motion-related criterion information corresponding to the evaluation information.
 According to the above aspect, after machine learning, when the learning device receives input of evaluation information corresponding to evaluation results, it can output a motion-related criterion suited to that evaluation information. For example, when evaluation information corresponding to high evaluation results is input, the learning device outputs a simple, low-level motion-related criterion on which high evaluations are easy to obtain. When evaluation information corresponding to low evaluation results is input, the learning device outputs an advanced, high-level motion-related criterion on which high evaluations are difficult to obtain. By using the learning device, the user can thus obtain a motion-related criterion of the difficulty level he or she desires and use it for robot operation. The second storage device may be separate from the machine learning circuit, the processing circuit, and the first storage device; may be included in the machine learning circuit, the processing circuit, or the first storage device; or may include the first storage device.
 An information processing system according to one aspect of the present disclosure includes the information processing device according to one aspect of the present disclosure and the learning device according to one aspect of the present disclosure, and the learning device is configured to receive input data from the operation device or the information processing device and to output output data corresponding to the input data to the information processing device. According to the above aspect, the information processing system can obtain the same effects as the information processing device and the learning device according to one aspect of the present disclosure, and can execute the evaluation process in the information processing device using the motion-related criteria output from the learning device.
 A robot system according to one aspect of the present disclosure includes the information processing device according to one aspect of the present disclosure, the robot, and a robot controller that controls the motion of the robot, and the information processing device is communicably connected to the robot controller. According to the above aspect, the robot system can obtain the same effects as the information processing device according to one aspect of the present disclosure.
 A robot system according to one aspect of the present disclosure includes the information processing system according to one aspect of the present disclosure, the robot, and a robot controller that controls the motion of the robot, and the information processing device is communicably connected to the robot controller. According to the above aspect, the robot system can obtain the same effects as the information processing system according to one aspect of the present disclosure.
 The functions of the elements disclosed herein can be performed using circuitry or processing circuitry that includes general-purpose processors, special-purpose processors, integrated circuits, ASICs, conventional circuits, and/or combinations thereof configured or programmed to perform the disclosed functions. A processor is regarded as processing circuitry or circuitry because it includes transistors and other circuits. In the present disclosure, a circuit, unit, or means is hardware that performs the recited functions or hardware that is programmed to perform the recited functions. The hardware may be the hardware disclosed herein, or other known hardware programmed or configured to perform the recited functions. When the hardware is a processor, which is considered a type of circuit, the circuit, means, or unit is a combination of hardware and software, and the software is used to configure the hardware and/or the processor.
 All numbers used above, such as ordinal numbers and quantities, are examples for specifically describing the technology of the present disclosure, and the present disclosure is not limited to the illustrated numbers. The connection relationships between components are examples for specifically describing the technology of the present disclosure, and the connection relationships realizing the functions of the present disclosure are not limited to them.
 The division of blocks in the functional block diagrams is an example. A plurality of blocks may be implemented as one block, one block may be divided into a plurality of blocks, some functions may be moved to other blocks, or two or more of these may be combined. The functions of a plurality of blocks having similar functions may be processed by a single piece of hardware or software in parallel or in a time-division manner.
 Since the present disclosure may be embodied in various forms without departing from the spirit of its essential characteristics, the scope of the present disclosure is defined by the appended claims rather than by the preceding description, and the exemplary embodiments and modifications are illustrative and not restrictive. All changes that come within the claims and their scope, or equivalents of the claims and their scope, are intended to be embraced by the claims.

Claims (12)

  1.  An information processing device that evaluates an operation of a robot for causing the robot to perform work, the information processing device comprising:
     a processing circuit,
     wherein the processing circuit is configured to execute, during execution of the work:
     a detection process of detecting motion-related information, which is information related to a motion of the robot operating in accordance with an operation command input to an operation device by a user;
     an evaluation process of comparing and evaluating the detected motion-related information against a motion-related criterion, which is a criterion related to the motion of the robot; and
     a presentation process of causing a presentation device, on which the user receives presentation of information while operating the operation device, to present an evaluation result of the evaluation process,
     wherein the processing circuit executes the evaluation process at either or both of a first timing for each evaluation-target motion, which is a motion of the robot to be evaluated, and a second timing for each evaluation-target stage, which is a work stage of the work to be evaluated,
     when executing the evaluation process at the first timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation-target motions in the same order as an execution order of the evaluation-target motions, and
     when executing the evaluation process at the second timing, the processing circuit causes the presentation device to present the evaluation results for the evaluation-target stages in the same order as an execution order of the evaluation-target stages.
  2.  The information processing device according to claim 1, wherein, when executing the evaluation process at the first timing, the processing circuit executes the evaluation process for a first evaluation-target motion and the presentation process of the evaluation result for the first evaluation-target motion by a timing of a second evaluation-target motion following the first evaluation-target motion.
  3.  The information processing device according to claim 1 or 2, wherein, when executing the evaluation process at the second timing, the processing circuit executes the evaluation process for a first evaluation-target stage and the presentation process of the evaluation result for the first evaluation-target stage by a timing of a second evaluation-target stage following the first evaluation-target stage.
  4.  The information processing device according to any one of claims 1 to 3, wherein the processing circuit further determines, based on the evaluation result, a compensation to be given to the user for the operation of the robot, and causes the presentation device to present the determined compensation and a result of accumulating the compensation during execution of the work.
  5.  The information processing device according to any one of claims 1 to 4, wherein the processing circuit detects, as the motion-related information, one or more of the operation command, a motion command generated for the robot to cause the robot to operate in accordance with the operation command, an actual motion result of the robot, a state of an environment surrounding the robot, a state of an object on which the robot acts, and a state of an environment surrounding the object.
  6.  The information processing device according to any one of claims 1 to 5, wherein the processing circuit compares and evaluates two or more evaluation-target features, which are features for evaluating the motion-related information, against criteria for the two or more evaluation-target features included in the motion-related criterion, and evaluates the motion-related information based on evaluation results of the two or more evaluation-target features.
  7.  The information processing device according to any one of claims 1 to 6, further comprising a first storage device,
     wherein the first storage device accumulates information on the motion-related criterion and information on the evaluation result of the motion-related information based on the motion-related criterion, and
     the processing circuit determines information on a second motion-related criterion using information on a first motion-related criterion and information on the evaluation result based on the first motion-related criterion accumulated in the first storage device.
  8.  The information processing device according to claim 7, wherein the processing circuit extracts, from the information accumulated in the first storage device, information on the evaluation result satisfying a specified condition and information on the first motion-related criterion, which is the motion-related criterion used for the evaluation result, and determines the information on the second motion-related criterion using the extracted information on the evaluation result and information on the first motion-related criterion.
9.  A learning device comprising: a second storage device that stores, as machine learning data, the information on the motion-related criteria and the information on the evaluation results of the motion-related information based on the motion-related criteria in the information processing device according to any one of claims 1 to 8; and a machine learning circuit, wherein the machine learning circuit performs machine learning using the information on the evaluation results included in the machine learning data as input data for machine learning and the information on the motion-related criteria included in the machine learning data and used for those evaluation results as teacher data, and wherein the machine learning circuit takes, as input data, evaluation information corresponding to the information on the evaluation results and outputs, as output data, information on the motion-related criterion corresponding to the evaluation information.
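A rough illustration of the learning device of claim 9: evaluation-result vectors serve as machine-learning input and the criteria used to produce them serve as teacher data, so that at inference time an evaluation level maps to a suggested criterion. A least-squares linear model stands in for the unspecified learner, and all numbers are illustrative placeholders.

```python
# Sketch of claim 9's learning scheme with a stand-in linear model.
import numpy as np

# Machine learning data: evaluation result (input) -> criterion used for it (teacher).
eval_results = np.array([[0.90, 0.80], [0.60, 0.50], [0.75, 0.70]])  # inputs
criteria = np.array([[8.0], [12.0], [10.0]])                         # teacher data

# Fit criterion = W @ eval + b by least squares (bias via appended ones column).
X = np.hstack([eval_results, np.ones((len(eval_results), 1))])
W, *_ = np.linalg.lstsq(X, criteria, rcond=None)

# Inference: evaluation information in, matching motion-related criterion out.
new_eval = np.array([[0.80, 0.70]])
predicted = np.hstack([new_eval, np.ones((1, 1))]) @ W
print(predicted)   # criterion suggested for this evaluation level
```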
10.  An information processing system comprising: the information processing device according to any one of claims 1 to 8; and the learning device according to claim 9, wherein the learning device is configured to receive input data from the operation device or the information processing device and to output, to the information processing device, output data corresponding to the input data.
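The data flow of claim 10 can be sketched as the information processing device passing evaluation information to the learning device and receiving a criterion back; both classes and their method names are assumptions for illustration.

```python
# Assumed wiring for claim 10: the learning device serves the processing device.
class LearningDevice:
    def infer(self, evaluation_info: list) -> float:
        # Placeholder for the trained model of claim 9.
        return 10.0 - 2.0 * sum(evaluation_info) / len(evaluation_info)

class InformationProcessingDevice:
    def __init__(self, learner: LearningDevice):
        self.learner = learner

    def update_criterion(self, evaluation_info: list) -> float:
        # Input data go to the learning device; its output comes back here.
        return self.learner.infer(evaluation_info)

ipd = InformationProcessingDevice(LearningDevice())
print(ipd.update_criterion([0.8, 0.9]))   # criterion returned for this evaluation
```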
11.  A robot system comprising: the information processing device according to any one of claims 1 to 8; the robot; and a robot controller that controls operation of the robot, wherein the information processing device is communicably connected to the robot controller.
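Claims 11 and 12 require only that the information processing device and the robot controller be communicably connected; the transport is left open. A sketch of one possible exchange, assuming hypothetical JSON-over-TCP framing:

```python
# Hypothetical request from the information processing device to the robot
# controller (claims 11-12); JSON over TCP is an assumed transport.
import json
import socket

def fetch_actual_motion_result(host: str, port: int) -> dict:
    """Ask the controller for the robot's latest actual motion result."""
    with socket.create_connection((host, port), timeout=5.0) as conn:
        conn.sendall(b'{"request": "actual_motion_result"}\n')
        reply = conn.makefile("r").readline()
    return json.loads(reply)
```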
12.  A robot system comprising: the information processing system according to claim 10; the robot; and a robot controller that controls operation of the robot, wherein the information processing device is communicably connected to the robot controller.
PCT/JP2022/006514 2021-02-26 2022-02-18 Information processing device, learning device, information processing system, and robot system WO2022181458A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-030029 2021-02-26
JP2021030029A JP2022131206A (en) 2021-02-26 2021-02-26 Information processing device, learning device, information processing system, and robot system

Publications (1)

Publication Number Publication Date
WO2022181458A1 true WO2022181458A1 (en) 2022-09-01

Family

ID=83048943

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/006514 WO2022181458A1 (en) 2021-02-26 2022-02-18 Information processing device, learning device, information processing system, and robot system

Country Status (2)

Country Link
JP (1) JP2022131206A (en)
WO (1) WO2022181458A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130209980A1 (en) * 2011-02-08 2013-08-15 The Trustees Of The University Of Pennsylvania Systems and methods for providing vibration feedback in robotic systems
WO2013035244A1 * 2011-09-06 2013-03-14 Panasonic Corporation Robotic arm control device and control method, robot, control program and integrated electronic circuit
JP2013178294A * 2012-02-28 2013-09-09 Mitsubishi Heavy Ind Ltd Operation skill level evaluation system
JP2017064831A * 2015-09-29 2017-04-06 Hitachi-GE Nuclear Energy, Ltd. Remote work support system and remote work support method
JP2019184904A * 2018-04-13 2019-10-24 FANUC Corporation Operation training system
JP2020026025A * 2018-08-10 2020-02-20 Kawasaki Heavy Industries, Ltd. Training processing device, mediation device, training system and training processing method
JP2020135362A * 2019-02-19 2020-08-31 Telexistence Inc. Management device, management method, and management system

Also Published As

Publication number Publication date
JP2022131206A (en) 2022-09-07

Similar Documents

Publication Publication Date Title
US10507577B2 (en) Methods and systems for generating instructions for a robotic system to carry out a task
US10596695B1 (en) Replacing a first robot with a second robot during performance of a task by the first robot
JP6873941B2 (en) Robot work system and control method of robot work system
US10322506B2 (en) Systems, devices, articles, and methods for using trained robots
JP2020526402A (en) Judgment and use of corrections to robot actions
JP2019501384A (en) Autonomous positioning navigation equipment, positioning navigation method and autonomous positioning navigation system
CN114728417A (en) Robot autonomous object learning triggered by a remote operator
JP2019082999A (en) Robot systems incorporating cloud services systems
US10607079B2 (en) Systems and methods for generating three dimensional skeleton representations
JP2021527889A (en) Control method of autonomous mobile robot and autonomous mobile robot
WO2020071080A1 (en) Information processing device, control method, and program
US20210232149A1 (en) Systems and methods for persistent mapping of environmental parameters using a centralized cloud server and a robotic network
CN111421554A (en) Mechanical arm intelligent control system, method and device based on edge calculation
JP2003266349A (en) Position recognition method, device thereof, program thereof, recording medium thereof, and robot device provided with position recognition device
WO2022181458A1 (en) Information processing device, learning device, information processing system, and robot system
US20230341873A1 (en) Multi-Robot Control System and Method
WO2022131335A1 (en) Control device, robot system, and learning device
US11618164B2 (en) Robot and method of controlling same
US11656923B2 (en) Systems and methods for inter-process communication within a robot
Chudoba et al. A technical solution of a robotic e-learning system in the SyRoTek project
JP7473685B2 (en) Control device, robot system and learning device
Villemure et al. SwarmUS: An open hardware and software on-board platform for swarm robotics development
Gunawardhana et al. Indoor Autonomous Multi-Robot Communication System
Ziegler et al. Robocup rescue 2018 team description paper autonohm
JP2022516913A (en) A three-way communication system including end devices, edge servers for controlling end devices, and cloud servers and how this works.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22759494

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22759494

Country of ref document: EP

Kind code of ref document: A1