WO2021019787A1 - Task distribution device, task distribution system, method, and program - Google Patents

Task distribution device, task distribution system, method, and program Download PDF

Info

Publication number
WO2021019787A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
operator
information
operation terminal
exception event
Prior art date
Application number
PCT/JP2019/030362
Other languages
French (fr)
Japanese (ja)
Inventor
征久 唐子
真司 川上
隆宏 徳
多田 有為
Original Assignee
OMRON Corporation
Priority date
Filing date
Publication date
Application filed by OMRON Corporation
Priority to JP2021536593A (patent JP7331928B2)
Priority to PCT/JP2019/030362
Priority to CN201980098534.3A (patent CN114144806A)
Publication of WO2021019787A1
Priority to JP2022085083A (patent JP2022126658A)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services

Definitions

  • the present invention relates to a task distribution technique for assigning a task for dealing with an exception event that occurs in a robot to an appropriate operator.
  • Patent Documents 1 and 2 can be mentioned as techniques for responding to an exception event by remote control.
  • Patent Document 1 discloses a server that publishes a task for which remote operation is desired and assigns the task to a remote operator in response to a bid from that operator.
  • However, this method lacks responsiveness because the task is not executed until a remote operator places a bid.
  • the autonomous traveling robot disclosed in Patent Document 2 is configured to output a signal prompting remote control when an obstacle is detected and avoid the obstacle by remote control.
  • This autonomous traveling robot is configured to store operation information when an obstacle has been avoided in the past and to refer to it when it is necessary to newly avoid the obstacle.
  • the autonomous traveling robot and the remote operator are associated with each other in advance. If the robot and the remote operator have a one-to-one correspondence, there is a problem that a large number of remote operators are required. Further, when one remote operator is associated with a plurality of robots, there is a problem that it cannot be dealt with when an event requiring remote control occurs in a plurality of robots at the same time.
  • An object of the present invention is to be able to quickly deal with an exceptional event that occurs in a robot.
  • the present invention adopts the following configuration.
  • A first aspect of the present invention is a task distribution device including: an operation terminal connection unit that manages the connection state of operation terminals owned by operators; a state reception unit that receives state information from a robot; an information control unit that, based on the state information, selects an operator to deal with an exception event of the robot from among the available operators; and a transmission unit that transmits a response request for the exception event of the robot, together with the state information of the robot, to the operation terminal of the selected operator.
  • the robot includes an arbitrary device capable of autonomously performing some movement.
  • a robot is a device that includes a sensor and an actuator and is configured to autonomously control the actuator based on information obtained from the sensor.
  • the robot may be a mobile robot capable of autonomous movement, or may be a fixed robot that cannot move autonomously.
  • the robot state information may be at least information that can grasp the current state of the robot.
  • Examples of the robot state information include sensor information obtained from sensors provided on the robot and analysis results produced by the robot from that sensor information. It is not necessary for all of the robot's state information to be transmitted to the task distribution device; at a minimum, state information from which the occurrence of an exception event can be determined should be transmitted to the task distribution device. Further, receiving the state information from the robot includes both receiving it directly from the robot and receiving it via another device.
  • An exception event is an event that the robot cannot handle automatically, or that is judged inappropriate for the robot to handle automatically.
  • an exceptional event can be regarded as an event that requires human operation or confirmation.
  • the exception event may be included in the status information, or may be determined by the task distribution device from the status information.
  • the management of the connection status of the operation terminal performed by the operation terminal connection unit may include establishing the online status of a plurality of operation terminals and monitoring the online status.
  • the online state includes a state in which the operation terminal can immediately respond to the notification from the task distribution device.
  • When an exception event occurs, the information control unit selects an operator to deal with it. Whether an exception event has occurred can be determined from the contents of the state information. However, in an embodiment in which state information is reported to the task distribution device only when an exception event occurs, the information control unit can determine that an exception event has occurred simply by receiving state information, without examining its contents. For example, the information control unit may select an operator associated with an operation terminal that is online as the operator for dealing with the received exception event. The information control unit may further require that the selected operator is not currently dealing with an exception event of another robot; however, this condition may be relaxed when no such operator is available.
  • Because this aspect has the above configuration, an operator who can respond quickly to an exception event can be selected. In other words, when an exception event occurs in the robot, that exception event can be dealt with quickly.
  • the task distribution device may further have an operator information storage unit that stores the skill of the operator.
  • The information control unit may select an operator having a skill that matches at least one of the type of the robot and the type of the exception event.
  • an appropriate operator can be selected for the exceptional event that has occurred.
  • The task distribution device may further include a receiving unit that receives an operation command for the robot from the operation terminal, an output unit that outputs the operation command to the robot, and a correspondence storage unit that stores the state of the robot and the operation contents of the operator in association with each other.
  • The receiving unit may receive the result of dealing with the exception event of the robot from the operation terminal, and the correspondence storage unit may store that result in association with the exception event of the robot and the operator.
  • The content of the response result is not particularly limited; it may indicate, for example, whether the exception event has been resolved or not, or that on-site confirmation by another operator is required.
  • The information control unit may evaluate the skill of the operator with reference to the correspondence storage unit and update the operator information storage unit.
  • The operator information storage unit may store the usage cost of the operator, and the information control unit may determine the reward to be paid according to the operator's correspondence history, with reference to the correspondence storage unit.
  • the information control unit may classify the exception events of the robot and select an operator according to the type of the exception event as an operator for dealing with the exception event. For example, a corresponding operator or a group of operators may be predetermined for each type of exception event. In this way, the same type of exception event will be dealt with by a particular operator, and these operators will become more proficient in that type of exception event.
  • the transmission unit may transmit the state information of the robot to the operation terminal of the administrator at the site where the robot is installed in response to the request from the operator. In this way, when the operator determines that the robot needs to be processed at the work site, the site manager can confirm it.
  • the state information may include an image taken by a camera mounted on the robot and an analysis result of sensor information obtained from a sensor provided on the robot.
  • the operator refers to this information, it becomes easy to deal with the exception event.
  • a second aspect of the present invention is a task distribution system including a robot, an operation terminal for remotely controlling the robot, and the above-mentioned task distribution device.
  • When the operation terminal receives a request to deal with an exception event of the robot, the operation terminal may be configured to display the state information and a graphical user interface (GUI) for operating the robot on a display unit.
  • the robot may be a fixed robot or a mobile robot.
  • the mobile robot may be a mobile robot that has a mobile device, a sensor, and a camera and can move autonomously. It may be a robot in which an arm type robot is mounted on a mobile robot.
  • the present invention can be regarded as a task distribution device or task distribution system having at least a part of the above configuration or function.
  • The present invention can also be regarded as a task distribution method including at least a part of the above processing, a program for causing a computer to execute the method, or a computer-readable recording medium on which such a program is non-transitorily recorded.
  • FIG. 1 is a diagram showing a configuration example of a task distribution system according to an embodiment.
  • FIG. 2 is a flowchart showing a process performed by the task distribution server in the embodiment.
  • FIG. 3 is a flowchart showing a process performed by the task distribution server in the embodiment.
  • FIG. 4 is a diagram illustrating a process performed by the information control unit in the embodiment.
  • FIG. 5 is a diagram showing an example of the operator information DB in the first embodiment.
  • FIG. 6 is a diagram showing an example of the correspondence information DB in the first embodiment.
  • FIG. 7 is a diagram showing an example of an image taken by the robot in the first embodiment.
  • FIG. 8 is a diagram showing an example of an image taken by the robot in the first embodiment.
  • FIG. 9 is a diagram for explaining the result of dealing with the exception event in the first embodiment.
  • FIG. 10 is a diagram showing a configuration example of the robot according to the second embodiment.
  • FIGS. 11A and 11B are diagrams showing examples of images taken by the robot in the second embodiment.
  • FIG. 12 is a diagram showing an example of the operation information DB in the second embodiment.
  • FIG. 13 is a diagram showing an example of the correspondence information DB in the second embodiment.
  • FIG. 14 is a diagram showing a graphical user interface for remote control of the robot displayed on the operator terminal in the second embodiment.
  • FIG. 1 is a block diagram showing a configuration example of a task distribution system to which the present invention is applied.
  • the task distribution system includes a task distribution server (task distribution device) 1, an operation terminal 2, and robots such as a fixed robot 3a and a mobile robot 3b.
  • the task distribution server 1 and the operation terminal 2 are connected via a wide area network such as the Internet, and the operator is located in a remote place away from the work position of the robot 3.
  • When an exception event occurs in the robot 3, the task distribution server 1 selects an operator for responding to the exception event and delivers the task of dealing with the exception event to the selected operator.
  • The server 1 may hold, as information for each operator, the coping skill for each type of robot 3 or each type of exception event, and may be configured to select an operator who has the skill to deal with the robot 3 in which the exception event occurred or with that type of exception event.
  • the operator uses the operation terminal 2 to operate the robot 3 to deal with the exception event.
  • the operation instruction to the robot 3 is transmitted to the robot 3 via the server 1, and the server 1 can store the history of the operation contents.
  • the server 1 is a computer including a CPU 11, a memory 12, a storage 13, a network interface (not shown), and the like.
  • the server 1 provides the following functions by loading the program stored in the storage 13 into the memory 12 and executing the program. That is, the server 1 has a state reception unit 101, an information control unit 102, a transmission unit 103, an operation terminal connection unit 104, a reception unit 105, an output unit 106, an operator information DB 107, and a correspondence information DB 108 as its functional units.
  • DB means a database.
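  • As a non-limiting illustration (not part of the patent disclosure itself), the following Python sketch mirrors the functional units listed above; the class and method names are assumptions introduced here purely for explanation.

```python
# Illustrative sketch only: class and method names are assumptions that mirror
# the functional units of the task distribution server 1 described above.
class TaskDistributionServer:
    def __init__(self):
        self.operator_info_db = {}      # operator information DB 107
        self.correspondence_db = []     # correspondence information DB 108
        self.online_terminals = {}      # connection state managed by unit 104

    def manage_connection(self, terminal_id, operator_id):
        """Operation terminal connection unit 104: register an online terminal."""
        self.online_terminals[terminal_id] = {"operator": operator_id, "busy": False}

    def receive_state(self, robot_id, state_info):
        """State reception unit 101: receive state information from a robot."""
        if self.is_exception_event(state_info):
            operator_id = self.select_operator(robot_id, state_info)  # information control unit 102
            self.send_request(operator_id, robot_id, state_info)      # transmission unit 103

    def is_exception_event(self, state_info):
        # In embodiments where robots report state only on exception events,
        # receiving any state information implies an exception has occurred.
        return True

    def select_operator(self, robot_id, state_info):
        raise NotImplementedError  # the matching logic is sketched separately below

    def send_request(self, operator_id, robot_id, state_info):
        print(f"request operator {operator_id} to handle robot {robot_id}: {state_info}")
```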
  • the state reception unit 101 receives state information from the fixed robot 3a and the mobile robot 3b.
  • the state receiving unit 101 receives the state information from the fixed robot 3a via the robot controller 5a, and receives the state information from the mobile robot 3b via the wireless communication device 6.
  • the state information may be transmitted from the robot 3 to the server 1 only when an exception event occurs in the robot 3, or may be transmitted from the robot 3 to the server 1 regardless of whether or not an exception event has occurred.
  • When an exception event occurs in the robot 3, the information control unit 102 selects an operator to deal with the exception event, saves the operation contents (correspondence history) sent from the operation terminal, evaluates the skill of the operator, and determines the reward to be paid to the operator. Details will be described below.
  • the transmission unit 103 transmits a response request for an exception event of the robot 3 and the state information of the robot 3 to the operation terminal 2.
  • the operation terminal connection unit 104 establishes a connection with the operation terminal 2 and monitors the online state of the operation terminal 2.
  • the online state means a state in which the operation terminal 2 can immediately communicate with the server 1 and send an operation command to the robot 3. Further, the online state not only means the connection state between the server 1 and the operation terminal 2, but may also mean that the operator can immediately respond to the request. Therefore, the operation terminal connection unit 104 may make an inquiry when, for example, the information representing the input by the operator is not received from the operation terminal 2 and it is unclear whether the operator can respond immediately.
  • The connection for the online state may be established by a push notification to the operation terminal 2.
  • The operation terminal connection unit 104 manages not only whether the operation terminal 2 is online, but also whether the operation terminal 2 is currently dealing with an exception event of some robot 3, in other words, whether it can respond to a new exception event of a robot 3.
  • Information about the state of the operation terminal 2 is stored in, for example, the memory 12.
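  • A minimal sketch of how this connection-state bookkeeping could be implemented is shown below; the field names, the heartbeat mechanism, and the 30-second timeout are assumptions, while the online/busy distinction and the inquiry after prolonged silence follow the description above.

```python
import time

# Assumed data layout: terminal_id -> {"operator": ..., "busy": ..., "last_seen": ...}
HEARTBEAT_TIMEOUT_S = 30.0  # assumed value; the source only says the server "may make an inquiry"
terminal_states = {}

def register_terminal(terminal_id, operator_id):
    """Establish the online state when a connection request arrives (steps S101/S102)."""
    terminal_states[terminal_id] = {"operator": operator_id, "busy": False,
                                    "last_seen": time.time()}

def heartbeat(terminal_id):
    """Record activity; called whenever operator input arrives from the terminal."""
    terminal_states[terminal_id]["last_seen"] = time.time()

def needs_inquiry(terminal_id):
    """True if no input has arrived for a while and the server should ask
    whether the operator can still respond immediately."""
    return time.time() - terminal_states[terminal_id]["last_seen"] > HEARTBEAT_TIMEOUT_S

def available_terminals():
    """Terminals that are online and not handling an exception event of any robot."""
    return [tid for tid, s in terminal_states.items() if not s["busy"]]
```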
  • the receiving unit 105 receives the information input by the operator into the operation terminal 2 and transmitted to the server 1.
  • the information transmitted from the operation terminal 2 includes an operation command for the robot 3 and a response result for an exception event.
  • the operation command to the robot 3 is transmitted to the robot 3 via the output unit 106, and is stored in the correspondence information DB 108 as a correspondence history.
  • the result of dealing with the exception event is also saved as a correspondence history in the correspondence information DB.
  • the operator information DB 107 stores information about the operator.
  • the correspondence information DB 108 stores the contents of the correspondence history for the exception event.
  • the output unit 106 transmits the operation command of the robot 3 input from the operation terminal 2 to the robot 3.
  • An operation command to the fixed robot 3a is transmitted to the robot controller 5a, and the robot controller 5a controls the robot 3a according to the content of the operation command.
  • the operation command to the mobile robot 3b is transmitted to the robot controller 5b in the mobile robot 3b via the wireless communication device 6, and the robot controller 5b controls the mobile robot 3b according to the content of the operation command.
  • the operation terminal 2 is an arbitrary computer that can communicate with the server 1 and can remotely control the fixed robot 3a and the mobile robot 3b.
  • Examples of the operation terminal 2 include a desktop PC, a laptop PC, a tablet terminal, and a smartphone terminal.
  • the operation terminal 2 may be a dedicated teaching pendant for remotely controlling the robot.
  • the operation terminal 2 provides the user with different types of operation GUIs depending on the type of the robot 3. This GUI may be generated by the operation terminal 2 itself, or may be provided by the server 1 or other devices.
  • the operator can grasp the state information of the robot 3 via the GUI of the operation terminal 2.
  • This state information includes sensor information obtained from the sensor of the robot 3, an analysis result of the sensor information, and an image taken by the camera of the robot 3.
  • the operation terminal 2 receives information on a state requiring processing from the server 1, displays it on the screen, and prompts the operator to operate it.
  • the operator can input and transmit an operation command for the robot 3 and the result of the operation for the robot 3 via the GUI of the operation terminal 2.
  • the input information is transmitted from the operation terminal 2 to the receiving unit 105 of the server 1.
  • Robot 3 is a device that can autonomously execute some kind of operation.
  • the fixed robot 3a has an actuator and a sensor, and executes a process by controlling the actuator according to the information obtained from the sensor.
  • the control content of the fixed robot 3a is determined by the robot controller 5.
  • Examples of the fixed robot 3a include, but are not limited to, FA robots, NC machine tools, molding machines, press machines, automatic feeding devices for breeding animals and plants, and automatic irrigation and fertilizing devices for cultivated plants.
  • the mobile robot 3b is a device that is equipped with a moving device and a robot controller 5b in addition to an actuator and a sensor, can move autonomously, and executes some operation at the moving destination.
  • Examples of the mobile robot 3b include, but are not limited to, a transport robot, a cleaning robot, a waiter robot, a security robot, a robot combined with an arm type robot, and the like.
  • both the fixed robot 3a and the mobile robot 3b have a camera (imaging means) as a sensor.
  • the camera may be any camera such as a visible light camera, an infrared camera, and a three-dimensional camera.
  • the robot controller 5 controls the robot 3 according to a predefined program based on the information obtained from the sensor of the robot 3.
  • the robot controller 5 may be configured so that the control content can be dynamically changed by online learning.
  • the robot controller 5 is configured to be able to determine whether an event has occurred in which the robot 3 cannot automatically perform processing or it is not appropriate to perform processing automatically. Such an event is referred to as an exceptional event in the present disclosure.
  • One example of an exception event is a case in which the operation instruction transmitted from the robot controller 5 to the robot 3 and the operation result returned from the robot 3 do not match, that is, the robot 3 does not operate according to the operation instruction.
  • An example of an exceptional event is the occurrence of an event that requires human confirmation. Examples of such an event include an abnormal temperature rise of the robot 3, generation of abnormal noise, and abnormality of the operation target object.
  • the operator information DB 107 stores information about the operator. For example, as shown in FIG. 5, the operator information DB 107 stores the operator ID 501, the operation terminal information 502, the skill information 503, the hourly wage 504, and the work time history 505. These are merely examples, and the operator information DB 107 may not store all of these information, or may store other information.
  • the operator ID 501 is an identifier for uniquely identifying the operator in the system.
  • The operator logs in to the system using this operator ID and a password (not shown) in order to go online.
  • the operation terminal information 502 is information about the operation terminal 2 that keeps the online state, and includes information necessary for identifying and connecting the operation terminal 2 from the server 1, for example, an IP address and a MAC address.
  • The operation terminal information 502 also includes information indicating the characteristics of the operation terminal 2 itself, such as whether it is a PC or a smartphone terminal, and information on any special user interface (UI) connected to the operation terminal; this information is used for task matching.
  • the special UI is, for example, a sensor worn on the hand to operate the finger portion of the end effector.
  • Skill information 503 includes skill possession information 503a and skill evaluation 503b.
  • Skill 1, skill 2, ..., skill N shown in the figure are skills related to the robots 3 connected to the task distribution system, and these are collectively referred to as skill information 503. If the operator possesses skill N, YES is registered in the skill possession information 503a; otherwise NO is registered. If the operator possesses skill N, an evaluation is registered as the skill evaluation 503b. The value of the skill evaluation 503b is determined by the number of times the operator has used skill N, the speed of response, the evaluation by an evaluator (administrator) in the system, and the like. For example, a value from 1 to 10 can be set, with 10 as the highest rank. Further, the server 1 may automatically update the skill evaluation of an operator based on the results of that operator's responses to exception events.
  • The hourly wage 504 is the hourly reward set for the operator's skill. From the employer's point of view, the hourly wage 504 represents the cost of using the operator. Here the hourly wage is expressed in points, so that the reward can be calculated according to the minimum wage (or another wage level) of the area where the operator lives.
  • The work time history 505 records the cumulative total of the operator's working hours. It can total the hours worked by the operator on a daily, weekly, or monthly basis and can be used for reward calculation.
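  • For illustration, one record of the operator information DB 107 could be modelled as follows; the concrete types are assumptions, while the field set follows reference numerals 501 to 505 described above.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class OperatorRecord:
    operator_id: str                      # 501
    terminal_info: Dict[str, str]         # 502: e.g. {"ip": "...", "mac": "...", "type": "PC"}
    skill_possession: Dict[str, bool]     # 503a: e.g. {"skill2": True, "skill5": False}
    skill_evaluation: Dict[str, int]      # 503b: rating 1..10 for each possessed skill
    hourly_points: int                    # 504: hourly reward expressed in points
    work_time_history: List[float] = field(default_factory=list)  # 505: hours worked
```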
  • Correspondence information DB 108 stores the contents of the correspondence history for the exception event.
  • the state of the robot 3 and the operation content from the operator are stored in association with each other for each exception event.
  • The correspondence information DB 108 stores the exception event occurrence time 601, the ID 602 of the robot in which the exception event occurred, the state information 603, the operation information 604, the operation result 605, the operator ID 606, and the response time 607.
  • These are merely examples; the correspondence information DB 108 need not store all of this information, and may store other information.
  • the exception event occurrence time 601 represents the time when the exception event occurs in the robot 3 and the server 1 (state reception unit 101) receives the status information of the exception event from the robot controller 5.
  • the robot ID 602 is an identifier for uniquely identifying the robot 3 in which the exception event has occurred in the system.
  • The state information 603 is information, such as the exceptional state and other status values, sent from the robot 3.
  • the operation information 604 is information instructing an operation to be given to the robot 3 by the operator in order to return the exceptional state to the normal state based on the sent state information.
  • the state information 603 and the operation information 604 will be described in detail when the examples are described.
  • the operation result 605 is information for recording the result of the robot 3 being operated by the operation information.
  • the operator ID 606 is an identifier for uniquely identifying the operator who has dealt with the exception event.
  • the response time 607 represents the time required to return the exceptional state to the normal state. If the normal state cannot be restored, it indicates the time until the operator determines that the problem cannot be solved by remote control alone.
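  • Similarly, one record of the correspondence information DB 108 could be sketched as follows; the types and optional defaults are assumptions, while the fields follow reference numerals 601 to 607.

```python
from dataclasses import dataclass
from typing import Any, Dict, Optional

@dataclass
class CorrespondenceRecord:
    occurrence_time: str                              # 601: when the exception state was received
    robot_id: str                                     # 602
    state_info: Dict[str, Any]                        # 603: e.g. tilt, wheel state, image reference
    operation_info: Optional[Dict[str, Any]] = None   # 604: commands sent by the operator
    operation_result: Optional[str] = None            # 605: e.g. "recovered", "on-site check needed"
    operator_id: Optional[str] = None                 # 606
    response_time_min: Optional[float] = None         # 607: time taken to return to the normal state
```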
  • [Processing] FIGS. 2 and 3 are flowcharts showing the processes performed by the server 1 in the task distribution system.
  • In step S101, the receiving unit 105 of the server 1 receives a connection request from the operation terminal 2, and in step S102 the operation terminal connection unit 104 manages the connection of the operation terminal 2, that is, establishes the connection and monitors the online state.
  • the operation terminal connection unit 104 also manages whether or not the operation terminal 2 is dealing with an exception event of any robot 3. The connection management by the operation terminal connection unit 104 continues even after step S102 until the operation terminal 2 disconnects.
  • In step S103, the state receiving unit 101 receives the state information from the robot 3 (fixed robot 3a or mobile robot 3b).
  • In the embodiment in which the robot 3 transmits state information only when an exception event occurs, it can be automatically determined that an exception event has occurred in the robot 3 when the state information is received.
  • the robot 3 may be configured to transmit state information to the server 1 regardless of the presence or absence of an exception event.
  • In that case, the information control unit 102 refers to the state information to determine whether an exception event has occurred in the robot 3, and may execute the processing from step S104 onward only when it determines that an exception event has occurred.
  • In step S104, the information control unit 102 matches the task of dealing with the exception event with an operator, that is, selects an operator to deal with the exception event. Specifically, the information control unit 102 selects the operator to whom the coping task is assigned from among the available operators in consideration of the operators' skills and the exception event. As shown in FIG. 4, the information control unit 102 takes as input the type of the robot 3 and its state (defect state) included in the state information received by the state reception unit 101, compares them with the information contained in the operator information DB 107, and extracts an appropriate operator. The first condition for matching (assigning) a task to an operator is that the operator is online.
  • Another condition is that the coping skill for the type of robot in which the exception event occurs or the coping skill for the type of exception event is at the required level. It is also good to adopt the condition that other tasks are not being executed. However, if all the operators who can handle the exception event are executing other tasks, the operators who are executing the other tasks may be matched. Further, the matching operator may be determined in consideration of the reward for the operator. For example, the information control unit 102 may determine the operator who satisfies the condition that the skill is equal to or higher than the required level and has the highest score determined based on the skill and the reward as the matching operator. Here, it is assumed that the higher the required skill, the higher the score, and the lower the reward, the higher the score.
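  • The selection in step S104 can be illustrated with the following sketch; the score weights are assumptions, since the description only states that a higher skill evaluation and a lower reward should both raise the score and that busy operators are used only as a fallback.

```python
# Minimal task-matching sketch for step S104 (weights are assumptions).
def select_operator(operators, required_skill, required_level,
                    skill_weight=1.0, cost_weight=0.01):
    online = [op for op in operators if op["online"]]
    qualified = [op for op in online
                 if op["skill_evaluation"].get(required_skill, 0) >= required_level]
    # Prefer operators not already handling another exception event,
    # but fall back to busy operators if nobody else qualifies.
    candidates = [op for op in qualified if not op["busy"]] or qualified
    if not candidates:
        return None
    def score(op):
        return (skill_weight * op["skill_evaluation"][required_skill]
                - cost_weight * op["hourly_points"])
    return max(candidates, key=score)

# Example data (hypothetical): the operator with the higher skill rating and
# the lower hourly points is selected.
ops = [
    {"id": "op1", "online": True, "busy": False,
     "skill_evaluation": {"skill2": 6}, "hourly_points": 1500},
    {"id": "op2", "online": True, "busy": False,
     "skill_evaluation": {"skill2": 8}, "hourly_points": 1200},
]
assert select_operator(ops, "skill2", required_level=5)["id"] == "op2"
```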
  • the information control unit 102 may classify the exception events of the robot into several types and assign each exception event to a specific operator according to the type of the exception event.
  • the specific operator corresponding to the exception event is typically a plurality of operators, and it is preferable that the information control unit 102 assigns the exception event to the operators in the same group according to the exception type. However, it does not exclude that a certain kind of exception event is assigned to one specific operator.
  • In step S105, the information control unit 102 identifies the operation terminal 2 of the operator selected in step S104, and transmits a request to deal with the exception event together with the state information of the robot 3 to the identified operation terminal 2.
  • the information control unit 102 refers to the operator information DB 107 and acquires the operation terminal information of the selected operator.
  • the operation terminal information is information for communicating with the operation terminal 2, and is, for example, an IP address or a MAC address.
  • the operation terminal information may include terminal type information.
  • the information control unit 102 passes the state information received from the robot 3 via the state reception unit 101 and the operation terminal information to the transmission unit 103 as transmission information.
  • the transmission unit 103 transmits the state information and the task execution request to the operation terminal 2 specified by the operation terminal information.
  • Upon receiving this information, the operation terminal 2 displays the state information and a GUI for operating the robot 3 on its display unit.
  • the operation information for the robot 3 may be a command for remotely controlling the robot 3 by real-time control, or may be a command for processing a series of procedures.
  • the operation information to the robot 3 input to the operation terminal 2 is transmitted from the operation terminal 2 to the server 1 and received by the receiving unit 105.
  • In step S106, the receiving unit 105 receives information from the operation terminal 2.
  • the reception information received by the reception unit 105 includes operation terminal information and operation information.
  • In step S107, the information control unit 102 extracts the operation information from the received information and transmits it from the output unit 106 to the target robot 3. As a result, the robot 3 can be made to perform the operation instructed by the operator.
  • The state information of the robot 3 may be transmitted to the operation terminal 2 directly from the robot 3 or via the server 1. In this way, the operator can grasp the latest state of the robot 3.
  • the operator evaluates the result of the operation, inputs it to the operation terminal 2, and transmits it to the server 1.
  • Examples of operation results include whether or not the handling of exception events has been completed, and whether or not it is necessary for the administrator to confirm on-site afterwards.
  • the completion of the countermeasure includes the case where the normal state is restored and the case where it is determined that the countermeasure is unnecessary (not abnormal).
  • Alternatively, it may be determined that the handling of the exception event is completed when the state receiving unit 101 receives state information indicating that the robot 3 has returned to normal operation in response to the operation command from the operator. In this case, the task distribution server 1 transmits the operation result to the operation terminal 2.
  • In step S108, the receiving unit 105 receives the operation result from the operation terminal 2.
  • In step S109, the information control unit 102 associates the state information received from the robot 3 with the operation information and the operation result received from the operation terminal 2, and stores them in the correspondence information DB 108.
  • the information control unit 102 may store the time required from the start to the completion of the task for dealing with the exception event by the operator in the correspondence information DB 108.
  • the working time of the operator can be grasped, and it can be used for the operator's reward calculation and the operator's skill evaluation.
  • Information such as working hours and rewards may be stored in the operator information DB 107.
  • In step S110, the information control unit 102 determines whether or not the handling of the exception event has been completed. This determination may be made based on the operation result information transmitted from the operation terminal 2.
  • If the handling is not completed (S110-NO), the process returns to step S105 and the robot 3 is operated based on further instructions from the operator. Although the same operator as before is asked to take action here, the process may instead return to step S104 to reselect the operator. In this case, the previous operator may be excluded from the candidates and another operator selected, if necessary. If the handling is completed (S110-YES), the process proceeds to step S111.
  • In step S111, the information control unit 102 evaluates the skill of the operator based on the correspondence history and the result stored in the correspondence information DB 108, and updates the skill stored in the operator information DB 107. The skill evaluation takes into account the result of dealing with the exception event, the content of the response, and the time required for the response.
  • In step S112, the information control unit 102 determines the amount of remuneration to be paid to the operator based on the operator's hourly remuneration stored in the operator information DB 107 and the time required to deal with the exception event.
  • skill evaluation (S111) and reward calculation (S112) are performed every time an exception event is dealt with, but these processes may be performed separately at an appropriate timing.
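  • A minimal sketch of the reward determination (S112) and the skill update (S111) is given below; the update rule and the target response time are assumptions, and only the dependence on hourly points, response time, and response result comes from the description.

```python
def compute_reward(hourly_points, response_time_minutes):
    """Reward = hourly reward (in points) prorated by the time spent (S112)."""
    return hourly_points * (response_time_minutes / 60.0)

def update_skill_evaluation(current_rating, resolved, response_time_minutes,
                            target_minutes=15):
    """Assumed update rule for S111: success raises the rating, quick success
    raises it further, failure lowers it; ratings stay within 1..10."""
    delta = 0
    if resolved:
        delta += 1
        if response_time_minutes <= target_minutes:
            delta += 1
    else:
        delta -= 1
    return max(1, min(10, current_rating + delta))

print(compute_reward(hourly_points=1200, response_time_minutes=12))         # 240.0 points
print(update_skill_evaluation(7, resolved=True, response_time_minutes=12))  # 9
```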
  • In the above description, the operator performs some operation on the robot 3 when an exception event occurs in the robot 3, but the operator does not necessarily have to operate the robot 3.
  • the operator may check the state of the robot 3 and confirm that no abnormality has actually occurred. In such a case, the operator may transmit from the operation terminal 2 to the server 1 that no action is required without transmitting the operation command to the robot 3.
  • As described above, the task distribution server 1 manages the connections with the operation terminals 2 and immediately selects an operator who can respond when distributing a task, so that exception events can be dealt with quickly.
  • an appropriate operator is selected according to the type of robot and the type of exception event and the task is distributed, so that appropriate exception handling is possible.
  • Further, this embodiment will contribute to solving such problems as a social infrastructure or platform that connects factories (robot work sites) with a remote labor force.
  • Since the operator does not have to work at a specific place and can work from a remote location, a comfortable working environment can be provided even for people who have difficulty commuting long distances, and even elderly people and people living in rural areas can work.
  • [Example 1] Here, the task distribution system to which the present invention is applied will be described more specifically, taking as an example a case where the mobile robot 3b is a delivery robot that delivers parcels and the like by automatic operation.
  • the delivery robot in this embodiment is a mobile robot that employs wheels as a moving mechanism.
  • an opposed two-wheel type delivery robot will be described as an example, but the moving mechanism of the delivery robot may be a wheel moving mechanism other than the opposed two-wheel type, or a multi-legged or endless track type moving mechanism.
  • the delivery robot has a storage unit for stably housing the delivered product.
  • The delivery robot has various sensors such as a position information acquisition sensor, a camera, a lidar, a millimeter wave radar, an ultrasonic sensor, and an acceleration sensor, and the built-in robot controller 5b controls the movement and other operations of the delivery robot based on the sensor information obtained from these sensors.
  • the delivery robot sends status information to the server 1 when an exception event occurs. In addition, it can operate based on an operation command transmitted from the operation terminal 2 via the server 1.
  • In this example, assume that an abnormal state occurs while the delivery robot is unmanned, that is, during delivery by automatic driving: the left wheel falls into a hole in the road and the robot stops in a derailed state. Attempting to deal with this without knowing the situation around the delivery robot can lead to an even more dangerous situation. For example, if the hole is larger than the delivery robot, there is a risk that the delivery robot will fall into the hole entirely. Therefore, when the delivery robot determines from the information of the acceleration sensor that its body is not level, it determines that an exception event has occurred and notifies the server 1 of the occurrence of the exception event together with the state information. The point at which an exception event is judged to have occurred depends on the level of the mobile robot's control technology and the required safety level; the above criterion is merely an example.
  • When the state reception unit 101 receives the state information from the mobile robot 3b (S103), this information is stored in the correspondence information DB 108. As shown in FIG. 6, the occurrence time 601 of the exception event, the ID 602 of the robot in which the exception event occurred, and the state information 603 are stored in the database.
  • the state information 603 here includes position information, vehicle body tilt, wheel state, cargo, and images taken by a camera mounted on the robot 3b.
  • the position information is the position information obtained from the GPS device (position information acquisition sensor) mounted on the mobile robot 3b.
  • The vehicle body tilt is information indicating the front/rear and left/right tilt of the vehicle body obtained from the tilt sensor (accelerometer) mounted on the mobile robot 3b. For example, when a tilt exceeding 4 degrees (absolute value) occurs, it can be determined to be abnormal. In this example, the left-right tilt is -5 degrees (the minus sign indicates a downward slope to the left), and since the absolute value of the tilt angle exceeds the threshold value, the robot 3b is determined to be in an abnormal state.
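  • On the robot side, the tilt check described above could be sketched as follows; the 4-degree threshold and the example tilt of -5 degrees come from this embodiment, while the function name and the payload format are assumptions.

```python
TILT_THRESHOLD_DEG = 4.0  # absolute tilt beyond which the state is judged abnormal

def detect_tilt_exception(roll_deg, pitch_deg):
    """Return an exception-event payload if the body tilt exceeds the threshold."""
    if max(abs(roll_deg), abs(pitch_deg)) > TILT_THRESHOLD_DEG:
        return {
            "event": "abnormal_tilt",
            "roll_deg": roll_deg,    # negative roll = downward slope to the left
            "pitch_deg": pitch_deg,
            "wheels": "stopped",     # the robot stops driving the wheels on detection
        }
    return None

print(detect_tilt_exception(roll_deg=-5.0, pitch_deg=0.5))  # triggers an exception event
```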
  • the state of the wheels is stopped. This is because the robot 3b stopped driving the wheels because it detected an abnormality.
  • The cargo information is the information input by the delivery source. The operator can determine from the cargo information what kind of operation should be performed and to what extent. For example, as in this case, if the cargo is a parcel, it can be handled somewhat roughly. On the other hand, if the cargo is food and drink, it needs to be handled carefully. Further, if the cargo is food and drink and the vehicle body tilt is a large value such as 45 degrees, it may be better to contact the delivery source than to recover from the abnormal state, because even if food or drink that has tipped over in the container of the mobile robot 3b is delivered to the destination, the delivery would be pointless.
  • the image from the robot camera is an image from the camera attached to the vehicle body of the mobile robot 3b (see FIG. 7).
  • the camera is mounted to capture at least the front in the direction of travel.
  • a plurality of robot cameras may be attached around the vehicle body to capture an around-view image taken around the vehicle body.
  • it may be an omnidirectional image (360 degree image) taken by an omnidirectional camera (360 degree camera).
  • Since the information control unit 102 of the server 1 can determine from the state information transmitted from the delivery robot (mobile robot) 3b that an exception event has occurred in the delivery robot 3b, it performs the task matching process for selecting an operator to deal with this exception event (step S104). Since the exception event in this example is a malfunction of the delivery robot 3b, the information control unit 102 selects, from among the operators who are online and are not processing other tasks, an operator who has the skill to deal with this exception event.
  • the information control unit 102 selects an operator in consideration of the evaluation of the skill of each operator and the hourly wage. For example, in this example, the operator 2 having a higher evaluation of skill 2 and a lower hourly wage is selected.
  • the information control unit 102 transmits a request for dealing with an abnormal event that has occurred in the delivery robot 3b and the state information of the delivery robot 3b to the operation terminal of the selected operator (step S105).
  • the operation terminal 2 that has received such information displays a screen including the state information of the delivery robot 3b and the GUI for operating the delivery robot 3b.
  • the operation terminal 2 is provided with a display unit for displaying such information and a user interface for inputting an operation instruction to the robot 3b.
  • the input of the operation information may be performed via the keyboard or may be performed by touching the screen.
  • In this way, the state information of the delivery robot (the information shown in the state information 603 of FIG. 6) and the image taken by the camera mounted on the delivery robot 3b (FIG. 7) are displayed on the operation terminal 2. Based on this information, the operator can first determine whether there is an abnormal state that requires some action, then whether it can be dealt with by remote control, and, if so, what kind of operation should be performed.
  • The operator can read the following from the image shown in FIG. 7 and the other sensor information:
  • - Since the entire landscape appears to slope downward, the mobile robot 3b is tilted to the left.
  • - Judging from the angle of inclination, there is a high possibility that the left wheel has dropped off the road surface.
  • - Since there is no gutter along the guardrail, there is a high possibility that the wheel has dropped into an unexpected hole.
  • - Since there are guardrails on the left and right, this is a relatively safe place in which to operate and move the mobile robot 3b.
  • - Because the guardrail on the left side is closer, it is safe to back up to the right rear within 1 meter in order to recover from the derailed state.
  • The state information and images sent when an exception event occurs are diverse; a human can grasp the above matters instantly, but this is difficult for computers (AI, etc.).
  • the operator can conclude that the left wheel of the delivery robot has fallen into the hole. Further, the operator can determine from the surrounding conditions that there is no problem even if the mobile robot 3b moves backward about 1 meter to the right in the traveling direction, and that it is necessary to move the mobile robot 3b for recovery.
  • Such situation determination and operation of the mobile robot 3b cannot be executed unless it is programmed in the mobile robot 3b. It is not realistic to perform programming that can handle all situations in advance, and it is still appropriate to let humans (operators) decide to properly deal with and operate various situations.
  • The operator determines the operation commands to be given to the mobile robot 3b as follows. First, as the first step, the robot is instructed to rotate the left wheel at -0.05 m/s while keeping the right wheel locked. Since the left wheel has fallen into the hole, the operator locks the right wheel, which is outside the hole, and instructs an operation that slowly backs the left wheel up, pivoting around the right wheel. By this operation, the left wheel catches on the edge of the hole it fell into and can be slowly pulled up.
  • Next, as the second step, once the left wheel has been lifted out of the hole, the right wheel speed is also changed to -0.05 m/s.
  • That is, the right wheel is backed up at the same speed as the left wheel, and the mobile robot 3b is moved away from the hole. If the vehicle body tilt does not return to normal even after the operation of the first step has continued for a certain period of time, an abnormal end may be instructed.
  • As the third step, the operator instructs the robot to stop the wheel operation after 10 seconds, take an image with the front camera of the robot 3b, and execute a self-diagnosis of whether the robot 3b has returned to the normal state.
  • Backing up at 0.05 m/s for 10 seconds moves the vehicle to a position about 0.5 meters away from the position where it fell into the hole.
  • This value should be set according to the surrounding situation, and it is appropriate to let a human (operator) determine the value. It is considered that the camera mounted on the robot 3b can take a picture of the hole at a distance of 0.5 meter from the hole.
  • the self-diagnosis is executed as the operation of the third step for reconfirmation. Not only is the vehicle body tilt within the normal range, but the mobile robot 3b can self-diagnose whether there is a functional problem, and if it returns to the normal state, the delivery work can be continued from there. Even if the mobile robot 3b determines that it is normal as a result of the self-diagnosis, the operator may determine that the abnormal state continues. In such a case, the determination by the operator may be prioritized.
  • The operator transmits the operation command consisting of the above first to third steps from the operation terminal 2 to the mobile robot 3b via the server 1 (steps S106 and S107).
  • In this example, the operator transmits the instructions of the first to third steps to the mobile robot 3b collectively.
  • the instruction may be transmitted in the unit of each step, or the instruction may be transmitted in a finer unit.
  • the operator may remotely control the mobile robot in real time.
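  • One possible way to encode the batched first-to-third-step command described above is sketched below; the JSON-like field names and the abort timeout are assumptions, while the wheel speeds, the 10-second stop, the image capture, and the self-diagnosis come from the example.

```python
# Hypothetical encoding of the recovery sequence sent in steps S106/S107.
recovery_sequence = [
    {"step": 1, "left_wheel_mps": -0.05, "right_wheel_mps": 0.0,    # right wheel locked
     "until": "tilt_normal", "abort_after_s": 30},                  # abort timeout is an assumption
    {"step": 2, "left_wheel_mps": -0.05, "right_wheel_mps": -0.05,  # back both wheels away
     "until": "elapsed", "duration_s": 10},                         # stop wheels after 10 seconds
    {"step": 3, "actions": ["stop_wheels", "capture_front_image", "run_self_diagnosis"]},
]
```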
  • The mobile robot 3b operates in accordance with the above operation commands and transmits the resulting information to the operation terminal 2 via the server 1 or directly. As a result of the first to third steps, the mobile robot 3b can recover from the derailed state, retreat from the hole, and acquire an image of the hole as shown in FIG. 8.
  • The operator can read the following from the image shown in FIG. 8:
  • - Since the entire landscape is level, the mobile robot 3b is horizontal.
  • - As instructed in the operation information, it has moved to the right rear of the stop position.
  • - A hole can be recognized in the image, and it was the left wheel dropping into this hole that caused the abnormal state.
  • - From the above, the mobile robot 3b has returned to the normal state.
  • In addition, the server 1 can record, together with the GPS position information, that there is a hole in the road. As a result, as shown in FIG. 9, the position 901 of the hole in the road can be grasped, and mobile robots 3b traveling on this road in the future will determine their traveling route 902 with reference to the position 901 of the hole and will be able to avoid falling into it.
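  • A rough sketch of how the recorded hole position could be used is shown below; the clearance distance and the flat-distance approximation are assumptions, and only the idea of recording the hazard with its GPS position and consulting it when planning future routes comes from the description.

```python
import math

road_hazards = []  # (lat, lon) positions of holes reported by robots

def record_hazard(lat, lon):
    road_hazards.append((lat, lon))

def route_is_clear(waypoints, clearance_m=2.0):
    """Rough check that no waypoint passes too close to a recorded hazard.
    Uses an approximate metres-per-degree conversion valid near mid-latitudes."""
    for wlat, wlon in waypoints:
        for hlat, hlon in road_hazards:
            d = math.hypot((wlat - hlat) * 111_000.0, (wlon - hlon) * 90_000.0)
            if d < clearance_m:
                return False
    return True
```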
  • the operator refers to the state information transmitted from the mobile robot 3b, inputs the result of the operation, and transmits it to the server 1 (step S108).
  • an operation result indicating that the normal state has been restored is input.
  • the server 1 stores the transmitted operation result in the correspondence information DB 108 together with the operation content performed, the operator ID, and the time required for the correspondence (step S109).
  • the operation result information may be manually input to the operation terminal 2 by the operator, or may be caused by the mobile robot 3b to make a self-judgment.
  • the server 1 may continuously request the same operator to take action, but may select a new operator and have another operator take action on the abnormal event.
  • [Example 2] The mobile robot 3b in the present embodiment can at least move autonomously to patrol a field or a greenhouse, and has functions for taking images and transferring them.
  • The mobile robot 3b includes a main body (traveling unit) 1010 that includes a moving mechanism 1011, a robot controller 1012, a position information acquisition device 1013, a camera 1014, and a power supply (not shown), and an arm 1020 that includes a pesticide storage unit 1021, a pesticide spraying unit 1022, an illumination unit 1023, and a camera 1024.
  • one arm 1020 is provided with the functions of image capturing and pesticide spraying, but different arms may be provided with each function.
  • the arm 1020 can harvest fruits and scrape and pick up leaves.
  • The mobile robot 3b may be provided with other sensors, such as a distance sensor and an acceleration sensor, in addition to those shown in the figure.
  • the moving mechanism 1011 is a moving mechanism such as a wheel type or an endless track type.
  • the robot controller 1012 is the same as that described above.
  • the position information acquisition device 1013 is a device for identifying the position of the robot in a field or a vinyl house.
  • the position information acquisition device 1013 may be a GPS device, or may be a device that reads a barcode or RFID attached to a field, a greenhouse, or a stock.
  • the camera 1014 captures the surroundings of the robot in a relatively wide range.
  • The mobile robot 3b has a pesticide spraying function implemented by the pesticide storage unit 1021 and the pesticide spraying unit 1022, and can spray pesticides automatically when the robot determines that pesticide spraying is needed.
  • the pesticide is sprayed, for example, from the arm.
  • the type and pathological condition of the pest can be identified from the image of the arm camera 1024, and the type of pesticide effective for the identified pest or pathological condition can be identified. For example, in the presence of Henosepilachna vigintioctopunctata, it can be determined that an aqueous solution of clothianidin is effective.
  • The pesticide storage unit 1021 may be loaded with a plurality of types of pesticides so as to deal with a plurality of pests and disease conditions. It can be seen from the image obtained from the camera 1024 that an abnormality may have occurred, but it can be difficult for the robot to determine automatically whether an abnormality has actually occurred and what to do if it has. For example, if the leaves are discolored, the causes are various, such as pests, molds, and viruses, and a different pesticide should be used depending on the cause.
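  • The pest-to-pesticide selection described above could be sketched as a simple lookup; only the Henosepilachna vigintioctopunctata to clothianidin aqueous solution entry is taken from the description, and the table layout and the fallback to operator judgment when no loaded pesticide matches are assumptions consistent with the surrounding text.

```python
# Hypothetical lookup table; only the first entry comes from the description.
PESTICIDE_TABLE = {
    "Henosepilachna vigintioctopunctata": "clothianidin aqueous solution",
    # further pest/disease -> pesticide entries would be registered per the loaded tanks
}

def choose_pesticide(identified_pest, loaded_pesticides):
    pesticide = PESTICIDE_TABLE.get(identified_pest)
    if pesticide in loaded_pesticides:
        return pesticide
    return None  # cannot decide automatically: treat as an exception event for an operator

print(choose_pesticide("Henosepilachna vigintioctopunctata",
                       ["clothianidin aqueous solution"]))
```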
  • the robot arm 1020 can be moved back and forth and left and right, and can be inserted into thick leaves.
  • a camera 1024 and an illumination unit 1023 are mounted on the tip of the robot arm 1020, and the direction of photography and illumination by the camera 1024 and the illumination unit 1023 can be changed by rotating the arm 1020. For example, by pointing the camera 1024 and the illumination unit 1023 upward, it is possible to take a picture of the back side of the leaf (see FIG. 11A).
  • FIG. 11A is a diagram showing an example of the image 1101 taken by the main body camera 1014 of the mobile robot 3b.
  • Image 1101 shows the arm 1020 in an extended state.
  • a discolored portion 1102 is observed on a part of the leaf surface of the tomato.
  • the robot controller 1012 determines that the cause of the discoloration cannot be identified only from the image of the discolored portion 1102 on the leaf surface, and determines that it is necessary to operate the arm 1020 to take an image of the leaf back.
  • the robot controller 1012 extends the arm to the position of the back of the leaf with the arm camera 1024 and the lighting unit 1023 facing upward, and takes a picture with the camera 1024.
  • The robot 3b may be controlled so as to inspect all of the plants (stocks) even if no discolored portion is observed on the leaves. Further, if there is no abnormality, the image need not be sent to the operation terminal.
  • FIG. 11B is a diagram showing an image 1103 of the back of a leaf taken by an arm camera 1024.
  • The image processing may be performed by the robot controller 1012 or another processing device incorporated in the robot 3b, or by an external device to which the image is transmitted from the robot 3b. From the result of this image processing, the robot controller 1012 determines that there is a high possibility of an abnormality but that it is unclear how to deal with it. It therefore determines that an event requiring the judgment of a human (operator), that is, an exception event, has occurred, and transmits the state information to the server 1.
  • FIG. 12 shows an example of the correspondence information DB 108 in this embodiment. Since the table format itself is the same as that of the first embodiment (FIG. 6), detailed description thereof will be omitted. At this point, the date and time when the exception event occurred 1201, the robot ID 1202 where the exception event occurred, and the state information 1203 are stored in the database 108.
  • The state information 1203 includes position information, leaf surface state, leaf back state, leaf area, fruit diameter, fruit color, and an image.
  • the position information includes latitude / longitude information, a vinyl house number, and a stock number acquired by the position information acquisition device 1013.
  • the position information also includes height information; this height information represents the height of the position imaged by the camera 1014 mounted on the robot main body 1010, or the height of the arm camera 1024 (or of the arm 1020) when the image is taken by the arm camera 1024.
  • the height of the arm camera 1024 or arm 1020 can be obtained from the encoder provided on the arm 1020.
  • the height information represents the height of the fruit or leaf of interest.
  • the leaf surface information indicates the result of determining the state of the leaf surface by image recognition technology.
  • examples of leaf surface information include "normal", "discolored part", "worm-eaten part", and "insect".
  • image recognition can be realized by a machine learning method such as deep learning, and since it is known, detailed description thereof will be omitted.
  • the leaf back information indicates the result of determining the leaf back state by the image recognition technology.
  • the mobile robot 3b takes an image of the leaf back and obtains an image recognition result.
  • the information on the back of the leaf is the recognition result, and in this example, the recognition result that "there is a discolored part” and "there is an insect” is obtained.
  • when the state of the leaf surface is "normal", the photography of the leaf back and the determination of its state may be omitted.
  • however, even if the state of the leaf surface is "normal", there may be an abnormality on the leaf back, so that both the leaf surface and the leaf back may be photographed to check the state.
  • the area of the leaf, the diameter of the fruit, and the color of the fruit are calculated based on the image taken by the main camera 1014 and the distance to the object (leaf or fruit) by the distance sensor. Since this information can be calculated using a known technique, detailed description thereof will be omitted.
  • the image from the camera 1014 mounted on the robot body and the image from the camera 1024 mounted on the robot arm are shown in FIGS. 11A and 11B, respectively.
  • the information control unit 102 of the server 1 can determine, from the state information transmitted from the mobile robot 3b, that an exception event has occurred, and therefore performs a task matching process for selecting an operator to deal with this exception event (step S104). Since the exception event in this example relates to the disease diagnosis of tomato, the information control unit 102 selects, from among the operators who are online and are not processing other tasks, an operator who has the skills for dealing with this exception event.
  • FIG. 13 shows an example of the operator information DB 107 in this embodiment. Since the content of the operator information DB 107 is basically the same as that of the first embodiment, duplicate description will be omitted. Since this embodiment is an application to agricultural work, the skills of each operator related to agricultural work are stored. In the example shown in FIG. 13, "inspection of tomato leaf disease and pests" is set as one of the skills. This skill means knowledge of the diseases and pests that occur in tomatoes and knowledge of the appropriate treatment for the diseases and pests found; appropriate treatment here means the ability to select an effective pesticide against the disease or pest, or to decide on follow-up action. As another skill, "harvesting tomatoes" is also set. This skill allows the operator to look at the image sent from the robot, determine whether or not to harvest, and remotely control the arm to harvest.
  • the information control unit 102 transmits a request for dealing with an exception event and the state information of the mobile robot 3b to the operation terminal of the selected operator (step S105).
  • the operation terminal 2 that has received such information displays a screen including the state information of the mobile robot 3b and a GUI for operating the robot.
  • FIG. 14 shows an example of the GUI 1400 displayed on the operation terminal 2.
  • the GUI 1400 includes an image 1401 of the main camera 1014, an image 1402 of the arm camera 1024, and state information 1405 other than the image.
  • the GUI 1400 also includes an input unit 1403 for operating (moving) the robot body 1010, an input unit 1404 for operating (moving) the arm 1020, an input unit 1406 for inputting a series of operation commands, an input unit 1407 for inputting the operation result, and a transmission button 1408 for transmitting the input contents to the server 1.
  • the operator inputs an operation command for the mobile robot 3b using the GUI 1400 and sends it to the server 1.
  • the first step is a leaf pruning process. From the images 1401, 1402 and the state information 1405, the operator confirms that the leaves have insects, and as a countermeasure instructs the robot to prune the affected leaves and store them in a closed box so as to prevent further spread of the damage caused by the insects.
  • the closed box is carried by the robot, and it is assumed that the robot can automatically perform the pruning of the target leaf with the robot arm and the storage of the leaf in the closed box.
  • the second step is the spraying treatment of pesticides.
  • the operator determines that the insect is likely to be an aphid, inputs "YES" in the item for spraying pesticides, and then selects and inputs acephate, a pesticide that is effective against aphids.
  • it is assumed that the robot is equipped with a plurality of types of pesticides and that the pesticide spraying process can be performed automatically by the robot.
  • the operation command input by the operator is transmitted to the mobile robot 3b via the server 1 (steps S106 and S107); a hypothetical sketch of such a command payload is shown after this list.
  • the mobile robot 3b operates in accordance with the above operation command, and transmits the information obtained as a result to the operation terminal 2 via the server 1 or directly.
  • the operator can determine whether the leaf pruning treatment or the pesticide spraying treatment has been completed normally based on the image and the state information transmitted from the mobile robot 3b. If it is completed normally, enter "Processing completed" as the operation result.
  • since the judgment by the operator is based on images, the judgment may be wrong, and insects or diseases may have spread to places that cannot be found from the robot's images alone.
  • therefore, the operator inputs "necessary" for post-confirmation, as a matter to be communicated to personnel who can check directly on the farm.
  • as a judgment result, the operator may also input "aphid" as the type of insect found.
  • the information on these operation results is transmitted from the operation terminal 2 to the server 1 (step S108).
  • the server 1 stores the transmitted operation result in the correspondence information DB 108 together with the operation command content, the operator ID, and the time required for the correspondence (step S109).
  • in this way, the operation information 1204, the operation result 1205, the operator ID 1206, and the correspondence time 1207 in the correspondence information DB 108 are updated.
  • the information control unit 102 of the server 1 transmits, to the operation terminal 2 of the operator at the site (farm), information that notifies the operator of the exception event that occurred and the action that was taken, and requests after-the-fact confirmation.
  • the operator at the site examines the plant to which the insects were attached and confirms whether any insects remain around it, and examines the leaves pruned by the mobile robot 3b to confirm whether the insect is actually an aphid. If the remote operator has made a mistake, the field operator takes another action or registers the correct result in the server 1.
  • the mobile robot 3b may work on the farm during the night time.
  • the mobile robot 3b in this embodiment is not intended to be used in a fully robotized or automated farm. The farm workers and the mobile robot 3b therefore work in the same place, but it is more efficient for them to work in different time zones than to work mixed together. For this reason, the work by the mobile robot 3b may be performed at night. Exceptional events then need to be dealt with at night, but remote operators can respond to them wherever they are connected via a network. For example, it is also preferable to assign the task of dealing with exceptional events to remote workers in foreign countries with a large time difference.
  • Example 3 A case where the mobile robot 3b is a robot that manages products in a store will be briefly described.
  • the mobile robot 3b autonomously performs operations such as product replenishment, inspection, confirmation of missing items, and confirmation of price tags. Therefore, the mobile robot 3b is equipped with a manipulator and a camera.
  • Another example of an exceptional event is the case where a product is dropped during handling. If the robot 3b drops a product during the automatic operation of product delivery and the product rolls in an unexpected place, it is difficult to find out where the product went. Therefore, it is assumed that the remote operator is requested to search for and collect the dropped products. In addition, even if the product can be collected, the product may be damaged or dented, and the value as a product for sale may be impaired. Since it is difficult for the robot 3b to automatically make this determination, it is assumed that a remote operator is requested to perform the confirmation work.
  • Another example of an exceptional event is the case where the price tag cannot be read with high accuracy.
  • a specific product may be sold at a special price on a limited sale date, and the special price and the regular price tag may be exchanged.
  • when the price tag is replaced, it is assumed that an image is taken by the camera mounted on the robot 3b and the price tag is confirmed by image recognition.
  • the robot may not be able to read the price tag with high accuracy.
  • a remote worker is requested to read the price tag.
  • the remote operator may instruct not only to read the price tag from the sent image but also to retake the price tag from a direction in which the reflection of the lighting does not occur.
  • Example 4 A case where the mobile robot 3b is a cleaning robot will be briefly described.
  • the mobile robot 3b has a moving mechanism, a floor cleaning function, a camera, and a wireless communication function, and performs self-propelling and floor cleaning based on map data of a work place.
  • the work place is a large place such as a building, a station, or an airport.
  • the mobile robot 3b also has a manipulator that pushes a button on the elevator.
  • the mobile robot 3b is also equipped with a microphone and a speaker, and has a function of interacting with a human.
  • Such a mobile robot 3b self-propells at the place specified by the map data and cleans the floor. When an obstacle that does not exist in the map data is detected, the mobile robot 3b needs to automatically avoid it.
  • an example of an exceptional event is when dirt on the floor is detected but it cannot be determined whether the robot 3b can clean it.
  • an image is sent to the operation terminal 2 and a remote operator is requested to input an instruction to the operation terminal to have the robot clean or dispatch a cleaner.
  • in this case, the robot 3b transmits the size, type, and the like of the dirt on the floor surface to the operation terminal. It is also important to decide whether the robot automatically continues the subsequent cleaning when liquid is detected by the sensor. This is because, in the case of liquid stains, the stains may be spread by the cleaning brush or wheels of the robot 3b. Therefore, it may be desirable to have the robot 3b continue cleaning other parts without cleaning the liquid, and to have the cleaning staff clean up the liquid.
  • another example of an exceptional event is the case where the robot cannot move to its destination.
  • the robot 3b moves between floors.
  • the robot 3b uses an elevator to move between floors, and presses an operation button using a manipulator.
  • the operation buttons of elevators are diverse, and in some cases the floor indications may be faint and difficult to read.
  • when the robot cannot automatically determine which operation button should be pressed in this way, it is assumed that the destination (floor) and an image are sent to the operation terminal and the remote operator is asked to make the determination.
  • another example of an exceptional event is when the robot should communicate with humans. There may be people around the robot 3b, and when a person is complaining to the robot, it is preferable that a person (the operator) responds directly. Even when the robot 3b is automated to hold a conversation with a person, it may be determined that an exception event has occurred when the robot 3b determines that the communication is not proceeding smoothly. In such a case, it is assumed that the voice received by the robot is transmitted to the operation terminal and the remote operator is asked to respond. The voice a person directs at the robot 3b is picked up by the robot's microphone and output from the speaker of the operation terminal, and the voice of the remote operator is picked up by the microphone of the operation terminal and output from the robot's speaker. As a result, a conversation between the remote operator and the person at the site can be realized, and, for example, a complaint can be dealt with.
  • Example 5 In the above embodiment, it is assumed that the remote operator handles the exception event that occurs in the mobile robot 3b, but the target robot may be the fixed robot 3a.
  • An example of such a fixed robot 3a is an aquaculture cage management robot.
  • the cage management robot 3a manages aquaculture by sensing values such as water temperature, air temperature, and amount of solar radiation.
  • One of the tasks that makes it difficult for the robot 3a to realize automatic processing is determining the amount of feed. It is necessary to feed an appropriate amount according to the condition of the fish on that day, and it is not enough to simply feed a large amount due to the problems of environmental pollution and cost.
  • the cage management robot 3a of this embodiment is equipped with a feeding device, a camera, a wireless communication device, and various sensors (water temperature, air temperature, amount of solar radiation), and transmits the data acquired by the sensors by wireless communication.
  • the feeding timing is always regarded as an exception event. Therefore, at the feeding timing, the image taken by the cage management robot 3a and the data acquired by the sensor are transmitted to the operation terminal, and the remote operator is requested to control the feeding device.
  • the remote operator can control the feeding device while observing the image and adjust the amount of feeding while observing the state of the fish on the surface of the water, that is, the degree of biting into the food.
  • A task distribution device (1) comprising: an operation terminal connection unit (104) that manages the connection status of an operation terminal (2) owned by an operator; a status reception unit (101) that receives status information from robots (3a, 3b); an information control unit (102) that, when it is determined based on the state information that an exception event has occurred in the robot, selects an operator for dealing with the exception event from among the available operators; and a transmission unit (103) that transmits a response request for the exception event of the robot, together with the state information of the robot, to the operation terminal (2) of the selected operator.
  • 1: Task distribution server, 2: Operation terminal, 3a: Fixed robot, 3b: Mobile robot, 101: Status reception unit, 102: Information control unit, 103: Transmission unit, 104: Operation terminal connection unit, 105: Reception unit, 106: Output unit, 107: Operator information DB, 108: Correspondence information DB
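As a non-limiting illustration of the two-step operation command described in the Example 2 walkthrough above (leaf pruning followed by pesticide spraying), the following sketch shows one possible payload format. The field names and values are hypothetical and are not defined by this disclosure.

```python
# Hypothetical payload for the operator's two-step command in Example 2.
# Field names are illustrative only; the disclosure does not fix a format.
operation_command = {
    "robot_id": "3b-001",          # assumed identifier of the mobile robot 3b
    "steps": [
        {
            "action": "prune_leaf",        # step 1: prune the affected leaf
            "target": {"house": 2, "stock": 15, "height_mm": 850},
            "store_in_closed_box": True,   # store the pruned leaf in the closed box
        },
        {
            "action": "spray_pesticide",   # step 2: spray a pesticide
            "spray": "YES",
            "pesticide": "acephate",       # selected as effective against aphids
        },
    ],
}

# The operation terminal 2 would send this to the server 1, which forwards it
# to the robot controller 5b (steps S106 and S107).
```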

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Development Economics (AREA)
  • Game Theory and Decision Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Manipulator (AREA)

Abstract

This task distribution device comprises: an operation terminal connection unit that manages a connection state of an operation terminal owned by an operator; a state reception unit that receives state information from a robot; an information control unit that selects, from among available operators, an operator for coping with an exceptional event when it is determined on the basis of the state information that the exceptional event has occurred in the robot; and a transmission unit that transmits, to the operation terminal of the selected operator, a request for coping with the exceptional event of the robot together with the state information on the robot.

Description

タスク配信装置、タスク配信システム、方法、およびプログラムTask distribution device, task distribution system, method, and program
 本発明は、ロボットに生じた例外事象に対処するタスクを適切な操作者に割り当てるためのタスク配信技術に関する。 The present invention relates to a task distribution technique for assigning a task for dealing with an exception event that occurs in a robot to an appropriate operator.
 近年の情報技術の進展に伴い、ロボットが周囲の状況を判断して自律的に処理を行えるようになっている。しかしながら、ロボットはあらゆる状況に対応できるようには自動化されていないため、ロボットが対応できないような例外的な状況に陥った場合には、人間がこの例外事象に対処しなければならない。 With the progress of information technology in recent years, robots can judge the surrounding situation and perform processing autonomously. However, since robots are not automated to handle all situations, humans must deal with this exceptional event in the event of an exceptional situation that the robot cannot handle.
 ロボットは遠隔操作可能であることから、遠隔地にいるいずれかの操作者に遠隔操作によって例外事象に対処するよう依頼することが考えられる。このように例外事象に対して遠隔操作によって対応する技術として特許文献1,2が挙げられる。 Since the robot can be operated remotely, it is conceivable to ask any operator in a remote location to handle the exception event by remote control. Patent Documents 1 and 2 can be mentioned as a technique for responding to an exception event by remote control.
 特許文献1は、サーバが、遠隔操作してもらうことを希望するタスクを公開し、遠隔操作者からの入札に応じて当該タスクを依頼する遠隔操作者を決定することを開示する。しかしながら、この手法では、遠隔操作者からの入札がなければタスクが実行されないため、即応性に欠けるという問題がある。 Patent Document 1 discloses that the server discloses a task desired to be remotely controlled and determines a remote operator who requests the task in response to a bid from the remote operator. However, this method has a problem of lacking responsiveness because the task is not executed without a bid from the remote operator.
 特許文献2に開示される自律走行ロボットは、障害物が検出されたときに遠隔操作を促す信号を出力し、遠隔操作によって障害物を回避するように構成される。この自律走行ロボットは、過去に障害物を回避したときの操作情報を記憶しておき、新たに障害物を回避する必要があるときに参照可能に構成される。特許文献2では、自律走行ロボットと遠隔操作者があらかじめ対応づけられていることが想定されている。ロボットと遠隔操作者が1対1に対応させた場合は、多数の遠隔操作者が必要となるという問題がある。また一人の遠隔操作者が複数のロボットに対応づけられる場合は、遠隔操作が必要な事象が複数のロボットに同時に発生したときに対処できないという問題がある。 The autonomous traveling robot disclosed in Patent Document 2 is configured to output a signal prompting remote control when an obstacle is detected and avoid the obstacle by remote control. This autonomous traveling robot is configured to store operation information when an obstacle has been avoided in the past and to refer to it when it is necessary to newly avoid the obstacle. In Patent Document 2, it is assumed that the autonomous traveling robot and the remote operator are associated with each other in advance. If the robot and the remote operator have a one-to-one correspondence, there is a problem that a large number of remote operators are required. Further, when one remote operator is associated with a plurality of robots, there is a problem that it cannot be dealt with when an event requiring remote control occurs in a plurality of robots at the same time.
International Publication No. 2008/140011; Japanese Unexamined Patent Publication No. 2013-206237
An object of the present invention is to make it possible to quickly deal with an exceptional event that occurs in a robot.
In order to achieve the above object, the present invention adopts the following configuration.
Specifically, a first aspect of the present invention is a task distribution device including: an operation terminal connection unit that manages the connection state of an operation terminal owned by an operator; a state reception unit that receives state information from a robot; an information control unit that, when it is determined based on the state information that an exception event has occurred in the robot, selects an operator for dealing with the exception event from among the available operators; and a transmission unit that transmits a response request for the exception event of the robot, together with the state information of the robot, to the operation terminal of the selected operator.
In the present disclosure, the robot includes any device capable of autonomously performing some kind of operation. Typically, a robot is a device that includes sensors and actuators and is configured to autonomously control the actuators based on information obtained from the sensors. The robot may be a mobile robot capable of autonomous movement, or a fixed robot that cannot move autonomously.
The robot state information may be at least information from which the current state of the robot can be grasped. Examples of the robot state information include sensor information obtained from sensors provided on the robot and analysis results derived by the robot from the sensor information. It is not necessary for all of the robot's state information to be transmitted to the task distribution device; it suffices that at least state information from which it can be determined that an exception event has occurred is transmitted. Further, receiving the state information from the robot includes both receiving the state information directly from the robot and receiving it via another device.
An exceptional event is an event that the robot cannot process automatically, or that is judged inappropriate for the robot to process automatically. Alternatively, an exceptional event can be regarded as an event that requires human operation or confirmation. The exception event may be included in the state information, or may be determined by the task distribution device from the state information.
The management of the connection status of the operation terminals performed by the operation terminal connection unit may include establishing the online state of a plurality of operation terminals and monitoring the online state. The online state includes a state in which the operation terminal can immediately respond to a notification from the task distribution device.
When it is determined that an exception event has occurred in the robot, the information control unit selects an operator to deal with this exception event. Whether or not an exception event has occurred can be determined from the contents of the state information. However, in an embodiment in which the state information is notified to the task distribution device only when an exception event occurs, the information control unit can determine, upon receiving the state information, that an exception event has occurred without referring to its contents. The information control unit may, for example, select an operator associated with an operation terminal that is online as the operator for dealing with the received exception event. The information control unit may further select an operator who satisfies the condition of not currently dealing with an exception event of another robot. However, this does not apply if there is no operator otherwise able to handle the event.
Since this aspect has the above configuration, it is possible to select an operator who can quickly respond to an exception event. In other words, when an exception event occurs in a robot, this exception event can be dealt with quickly.
The task distribution device according to this aspect may further include an operator information storage unit that stores the skills of the operators. In addition, the information control unit may select an operator having a skill that matches at least one of the robot type and the exception state.
According to such a configuration, an appropriate operator can be selected for the exceptional event that has occurred.
The task distribution device according to this aspect may further include a receiving unit that receives an operation command for the robot from the operation terminal, an output unit that outputs the operation command to the robot, and a correspondence storage unit that stores the state of the robot and the operation contents from the operator in association with each other.
In this aspect, the receiving unit may receive, from the operation terminal, the result of dealing with the exception event of the robot, and the correspondence storage unit may store that result in association with the exception event of the robot and the operator. The content of the result is not particularly limited; for example, it may indicate whether or not the exception event has been resolved, or it may request on-site confirmation by another operator.
In this aspect, the information control unit may evaluate the skill of the operator with reference to the correspondence information storage unit and update the operator information storage unit.
In this aspect, the operator information storage unit may store the usage cost of each operator, and the information control unit may determine the reward to be paid according to the operator's correspondence history, with reference to the correspondence storage unit.
In this aspect, the information control unit may classify the exception events of the robot and select an operator corresponding to the type of the exception event as the operator for dealing with the exception event. For example, a corresponding operator or group of operators may be determined in advance for each type of exception event. In this way, the same type of exception event will be dealt with by particular operators, and these operators will become more proficient in that type of exception event.
In this aspect, the transmission unit may transmit the state information of the robot to the operation terminal of an administrator at the site where the robot is installed, in response to a request from the operator. In this way, when the operator determines that processing at the robot's work site is necessary, the administrator at the site can perform the confirmation.
In this aspect, the state information may include an image taken by a camera mounted on the robot and an analysis result of sensor information obtained from a sensor provided on the robot. By referring to this information, the operator can more easily deal with the exception event.
A second aspect of the present invention is a task distribution system including a robot, an operation terminal for remotely controlling the robot, and the above-mentioned task distribution device.
In this aspect, when the operation terminal receives a request for coping with an exception event of the robot, the operation terminal may be configured to display the state information and a graphical user interface (GUI) for operating the robot on a display unit.
In this aspect, the robot may be a fixed robot or a mobile robot. The mobile robot may be one that has a moving device, a sensor, and a camera and can move autonomously. It may also be a robot in which an arm type robot is mounted on a mobile robot.
The present invention can be regarded as a task distribution device or a task distribution system having at least a part of the above configurations or functions. The present invention can also be regarded as a task distribution method including at least a part of the above processing, a program for causing a computer to execute the method, or a computer-readable recording medium on which such a program is recorded non-temporarily. The above configurations and processes can be combined with each other to constitute the present invention as long as no technical contradiction arises.
According to the present invention, it is possible to quickly deal with an exceptional event that occurs in a robot.
FIG. 1 is a diagram showing a configuration example of the task distribution system according to an embodiment.
FIG. 2 is a flowchart showing a process performed by the task distribution server in the embodiment.
FIG. 3 is a flowchart showing a process performed by the task distribution server in the embodiment.
FIG. 4 is a diagram illustrating a process performed by the information control unit in the embodiment.
FIG. 5 is a diagram showing an example of the operator information DB in the first embodiment.
FIG. 6 is a diagram showing an example of the correspondence information DB in the first embodiment.
FIG. 7 is a diagram showing an example of an image taken by the robot in the first embodiment.
FIG. 8 is a diagram showing an example of an image taken by the robot in the first embodiment.
FIG. 9 is a diagram explaining the result of dealing with the exception event in the first embodiment.
FIG. 10 is a diagram showing a configuration example of the robot according to the second embodiment.
FIGS. 11A and 11B are diagrams showing examples of images taken by the robot in the second embodiment.
FIG. 12 is a diagram showing an example of the operation information DB in the second embodiment.
FIG. 13 is a diagram showing an example of the correspondence information DB in the second embodiment.
FIG. 14 is a diagram showing a graphical user interface for remote control of the robot displayed on the operator terminal in the second embodiment.
<Application example>
FIG. 1 is a block diagram showing a configuration example of a task distribution system to which the present invention is applied. The task distribution system includes a task distribution server (task distribution device) 1, an operation terminal 2, and robots such as a fixed robot 3a and a mobile robot 3b. Hereinafter, when it is not necessary to distinguish between the fixed robot 3a and the mobile robot 3b, they may be simply referred to as the robot 3. The task distribution server 1 and the operation terminal 2 are connected via a wide area network such as the Internet, and the operator is located in a remote place away from the work position of the robot 3.
In this system, when an exception event occurs in the robot 3, the task distribution server 1 (hereinafter also simply referred to as the server 1) selects an operator for responding to the exception event and delivers the task of dealing with the exception event to the selected operator. The server 1 may hold, for each operator, coping skills for each type of robot 3 or each type of exception event as information, and may be configured to select an operator who has the skill to deal with the robot 3 in which the exception event occurred or with that exception event.
The operator uses the operation terminal 2 to operate the robot 3 and deal with the exception event. The operation instruction to the robot 3 is transmitted to the robot 3 via the server 1, and the server 1 can store a history of the operation contents.
[Configuration]
The server 1 is a computer including a CPU 11, a memory 12, a storage 13, a network interface (not shown), and the like. The server 1 provides the following functions by the CPU 11 loading the program stored in the storage 13 into the memory 12 and executing it. That is, the server 1 has a state reception unit 101, an information control unit 102, a transmission unit 103, an operation terminal connection unit 104, a reception unit 105, an output unit 106, an operator information DB 107, and a correspondence information DB 108 as its functional units. DB means database.
The state reception unit 101 receives state information from the fixed robot 3a and the mobile robot 3b. The state reception unit 101 receives the state information from the fixed robot 3a via the robot controller 5a, and from the mobile robot 3b via the wireless communication device 6. The state information may be transmitted from the robot 3 to the server 1 only when an exception event occurs in the robot 3, or may be transmitted from the robot 3 to the server 1 regardless of whether or not an exception event has occurred.
The information control unit 102 performs processing such as selecting an operator to deal with an exception event when the exception event occurs in the robot 3, saving the operation contents (correspondence history) from the operation terminal, evaluating the skills of the operators, and determining the rewards to be paid to the operators. Details will be described below.
The transmission unit 103 transmits a request for dealing with an exception event of the robot 3 and the state information of the robot 3 to the operation terminal 2.
The operation terminal connection unit 104 establishes connections with the operation terminals 2 and monitors the online state of the operation terminals 2. Here, the online state means a state in which the operation terminal 2 can immediately communicate with the server 1 and send an operation command to the robot 3. The online state not only means the connection state between the server 1 and the operation terminal 2 but may also mean that the operator can immediately respond to a request. Therefore, the operation terminal connection unit 104 may, for example, make an inquiry when no information representing input by the operator has been received from the operation terminal 2 and it is unclear whether the operator can respond immediately. The online state may also be established by a push notification to the operation terminal 2. Further, the operation terminal connection unit 104 manages not only the online state of the operation terminal 2 but also whether or not the operation terminal 2 is currently dealing with an exception event of any robot 3, in other words, whether or not it can respond to a new exception event of the robot 3. Information about the state of the operation terminal 2 is stored in, for example, the memory 12.
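As a non-limiting illustration, the bookkeeping performed by the operation terminal connection unit 104 could be sketched as follows, assuming a heartbeat-style online check; the class and method names are hypothetical and not part of the disclosure.

```python
import time

class OperationTerminalConnectionUnit:
    """Tracks which operation terminals are online and whether they are busy."""

    def __init__(self, heartbeat_timeout_s: float = 30.0):
        self.heartbeat_timeout_s = heartbeat_timeout_s
        self.last_heartbeat = {}   # operator_id -> last heartbeat time
        self.busy = {}             # operator_id -> True while handling an exception event

    def register(self, operator_id: str) -> None:
        """Called when the terminal establishes a connection (steps S101/S102)."""
        self.last_heartbeat[operator_id] = time.time()
        self.busy[operator_id] = False

    def heartbeat(self, operator_id: str) -> None:
        """Called periodically by the terminal (or via push-notification acknowledgement)."""
        self.last_heartbeat[operator_id] = time.time()

    def is_online(self, operator_id: str) -> bool:
        ts = self.last_heartbeat.get(operator_id)
        return ts is not None and (time.time() - ts) < self.heartbeat_timeout_s

    def is_available(self, operator_id: str) -> bool:
        """Online and not currently dealing with another robot's exception event."""
        return self.is_online(operator_id) and not self.busy.get(operator_id, False)
```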
The reception unit 105 receives information that the operator inputs into the operation terminal 2 and transmits to the server 1. The information transmitted from the operation terminal 2 includes operation commands for the robot 3 and the results of dealing with exception events. An operation command to the robot 3 is transmitted to the robot 3 via the output unit 106 and is stored in the correspondence information DB 108 as a correspondence history. The result of dealing with an exception event is also stored in the correspondence information DB as a correspondence history.
The operator information DB 107 stores information about the operators. The correspondence information DB 108 stores the contents of the correspondence history for exception events. These databases will be described below with reference to FIGS. 5 and 6.
The output unit 106 transmits the operation command for the robot 3 input from the operation terminal 2 to the robot 3. An operation command to the fixed robot 3a is transmitted to the robot controller 5a, and the robot controller 5a controls the robot 3a according to the contents of the operation command. An operation command to the mobile robot 3b is transmitted to the robot controller 5b in the mobile robot 3b via the wireless communication device 6, and the robot controller 5b controls the mobile robot 3b according to the contents of the operation command. Hereinafter, when it is not necessary to distinguish between the robot controller 5a and the robot controller 5b, they may be simply referred to as the controller 5.
The operation terminal 2 is an arbitrary computer that can communicate with the server 1 and can remotely operate the fixed robot 3a and the mobile robot 3b. Examples of the operation terminal 2 include a desktop PC, a laptop PC, a tablet terminal, and a smartphone terminal. The operation terminal 2 may be a dedicated teaching pendant for remotely controlling a robot. The operation terminal 2 provides the user with a different type of operation GUI depending on the type of the robot 3. This GUI may be generated by the operation terminal 2 itself, or may be provided by the server 1 or another device. The operator can grasp the state information of the robot 3 via the GUI of the operation terminal 2. This state information includes sensor information obtained from the sensors of the robot 3, the analysis results of the sensor information, and images taken by the camera of the robot 3. The operation terminal 2 receives information on a state requiring processing from the server 1, displays it on the screen, and prompts the operator to operate. The operator can input and transmit operation commands for the robot 3 and the results of operations on the robot 3 via the GUI of the operation terminal 2. The input information is transmitted from the operation terminal 2 to the reception unit 105 of the server 1.
The robot 3 is a device that can autonomously execute some kind of operation. The fixed robot 3a has actuators and sensors, and executes processing by controlling the actuators according to information obtained from the sensors. The control contents of the fixed robot 3a are determined by the robot controller 5. Examples of the fixed robot 3a include, but are not limited to, FA robots, NC machine tools, molding machines, press machines, automatic feeding devices for raised animals and plants, and automatic irrigation and fertilizing devices for cultivated plants.
The mobile robot 3b is a device that is equipped with a moving device and a robot controller 5b in addition to actuators and sensors, can move autonomously, and executes some operation at the destination. Examples of the mobile robot 3b include, but are not limited to, transport robots, cleaning robots, serving robots, security robots, and robots combined with arm type robots.
It is also preferable that both the fixed robot 3a and the mobile robot 3b have a camera (imaging means) as a sensor. The camera may be any camera such as a visible light camera, an infrared camera, or a three-dimensional camera.
The robot controller 5 controls the robot 3 according to a predefined program based on the information obtained from the sensors of the robot 3. The robot controller 5 may be configured so that the control contents can be dynamically changed by online learning.
The robot controller 5 is configured to be able to determine whether an event has occurred that the robot 3 cannot process automatically or that is not appropriate to process automatically. Such an event is referred to as an exception event in the present disclosure. As an example of an exceptional event, there is an event in which the operation instruction transmitted from the robot controller 5 to the robot 3 and the operation result returned from the robot 3 do not match, so that the robot 3 does not operate according to the operation instruction. Another example of an exceptional event is the occurrence of an event that requires human confirmation. Examples of such events include an abnormal temperature rise of the robot 3, the generation of abnormal noise, and an abnormality of the operation target object.
[Database]
The operator information DB 107 stores information about the operators. For example, as shown in FIG. 5, the operator information DB 107 stores an operator ID 501, operation terminal information 502, skill information 503, an hourly wage 504, and a work time history 505. These are merely examples; the operator information DB 107 does not have to store all of this information, and may store other information.
The operator ID 501 is an identifier for uniquely identifying the operator in the system. The operator logs in to the system using this operator ID and a password (not shown) and stays online. The operation terminal information 502 is information about the operation terminal 2 that is kept online, and includes information necessary for the server 1 to identify and connect to the operation terminal 2, for example, an IP address or a MAC address. The operation terminal information 502 also includes information indicating the characteristics of the operation terminal 2 itself, such as whether it is a PC or a smartphone terminal, and information on any special user interface (UI) connected to the operation terminal; this information is used for task matching. A special UI is, for example, a sensor worn on the hand for operating the finger portion of an end effector.
The skill information 503 includes skill possession information 503a and skill evaluations 503b. Skill 1, skill 2, ..., skill N shown in the figure are skills related to the robots 3 connected to the task distribution system, and these are collectively referred to as the skill information 503. If the operator has the ability to perform operations corresponding to skill N, YES is registered in the skill possession information 503a; otherwise, NO is registered. If the operator has skill N, its evaluation is registered as the skill evaluation 503b. The value of the skill evaluation 503b is determined by, for example, the number of times the operator has exercised skill N, the speed of response, and evaluations by evaluators (administrators) in the system. For example, a value from 1 to 10 can be set, with 10 as the highest rank. The server 1 may also automatically update the skill evaluation of an operator based on the results of the operator's responses to exception events.
The hourly wage 504 is the reward per hour set for the skills and the like of the operator. From the employer's point of view, the hourly wage 504 represents the cost of using the operator. Here, the unit of the hourly wage is points, so that the reward can be calculated according to the minimum wage (or other wage level) of the region where the operator lives. The work time history 505 records the cumulative hours the operator has worked. The work time history 505 can aggregate the hours worked by the operator per day, per week, per month, and so on, and can be used for reward calculation.
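As a non-limiting illustration, one record of the operator information DB 107 described above could be modeled roughly as follows; the layout is an assumption based on FIG. 5, not a normative schema.

```python
from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class OperatorRecord:
    operator_id: str                       # 501: unique ID used to log in
    terminal_info: Dict[str, str]          # 502: e.g. {"ip": "...", "mac": "...", "type": "PC"}
    # 503: skill name -> evaluation 1..10, or None if the skill is not held
    skills: Dict[str, Optional[int]] = field(default_factory=dict)
    hourly_rate_points: int = 0            # 504: reward per hour, in points
    work_hours_total: float = 0.0          # 505: cumulative working time

    def has_skill(self, skill: str, required_level: int = 1) -> bool:
        level = self.skills.get(skill)
        return level is not None and level >= required_level
```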
The correspondence information DB 108 stores the contents of the correspondence history for exception events. In the correspondence information DB 108, the state of the robot 3 and the operation contents from the operator are stored in association with each other for each exception event. For example, as shown in FIG. 6, the correspondence information DB 108 stores the occurrence time 601 of the exception event, the robot ID 602 of the robot in which the exception event occurred, state information 603, operation information 604, an operation result 605, an operator ID 606, and a correspondence time 607. These are merely examples; the correspondence information DB 108 does not have to store all of this information, and may store other information.
The occurrence time 601 of the exception event represents the time at which the exception event occurred in the robot 3 and the server 1 (state reception unit 101) received the state information of the exception event from the robot controller 5. The robot ID 602 is an identifier for uniquely identifying, within the system, the robot 3 in which the exception event occurred.
The state information 603 is information such as the exception state and status sent from the robot 3. The operation information 604 is information instructing the operation to be given to the robot 3 by the operator, based on the state information that has been sent, in order to return the exception state to the normal state. The state information 603 and the operation information 604 will be described in detail in the description of the examples.
The operation result 605 is information for recording the result of the robot 3 being operated according to the operation information. The operator ID 606 is an identifier for uniquely identifying the operator who dealt with the exception event. The correspondence time 607 represents the time required to return the exception state to the normal state. If the normal state could not be restored, it represents the time until the operator determined that the problem could not be solved by remote control alone.
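Likewise, as a non-limiting illustration, one entry of the correspondence information DB 108 (FIG. 6) might be modeled as the following record; the field names are illustrative only.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict, Optional

@dataclass
class CorrespondenceRecord:
    occurred_at: datetime                     # 601: time the exception state was received
    robot_id: str                             # 602: robot in which the exception event occurred
    state_info: Dict[str, Any]                # 603: exception state / status sent by the robot
    operation_info: Optional[str] = None      # 604: operation given to return to the normal state
    operation_result: Optional[str] = None    # 605: result of the operation
    operator_id: Optional[str] = None         # 606: operator who handled the event
    response_time_s: Optional[float] = None   # 607: time taken to resolve (or to give up)
```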
[Processing]
FIGS. 2 and 3 are flowcharts showing the processing performed by the server 1 in the task distribution system.
In step S101, when the reception unit 105 of the server 1 receives a connection request from the operation terminal 2, the operation terminal connection unit 104 performs connection management of the operation terminal 2 in step S102, that is, establishes the connection and monitors the online state. The operation terminal connection unit 104 also manages whether or not the operation terminal 2 is dealing with an exception event of any robot 3. Connection management by the operation terminal connection unit 104 continues after step S102 until the operation terminal 2 disconnects.
In step S103, the state reception unit 101 receives state information from the robot 3 (the fixed robot 3a or the mobile robot 3b). Here, it is assumed that state information is transmitted from the robot 3 to the server 1 only when an exception event occurs, so when state information is received it can automatically be determined that an exception event has occurred in the robot 3. However, the robot 3 may be configured to transmit state information to the server 1 regardless of the presence or absence of an exception event; in this case, the information control unit 102 may refer to the state information to determine whether an exception event has occurred in the robot 3, and execute the processing from step S104 onward when it is determined that an exception event has occurred.
In step S104, the information control unit 102 matches the task of dealing with the exception event with an operator, that is, selects an operator to deal with the exception event. Specifically, the information control unit 102 selects, from among the available operators, an operator to whom the coping task is assigned, in consideration of the operators' skills and the exception event. As shown in FIG. 4, the information control unit 102 compares the type of the robot 3 and its state (the nature of the problem) included in the state information received by the state reception unit 101 with the information contained in the operator information DB 107, and extracts an appropriate operator. The first condition for an operator to be matched with (assigned) the task is being online. Another condition is that the operator's coping skill for the type of robot in which the exception event has occurred, or for the type of exception event, is at the required level. It is also preferable to adopt the condition that the operator is not executing another task; however, if all operators capable of dealing with the exception event are executing other tasks, an operator who is executing another task may be matched. Further, the operator to be matched may be determined in consideration of the reward for the operator. For example, the information control unit 102 may determine, as the operator to be matched, the operator who satisfies the condition that the skill is at or above the required level and who has the highest score determined based on the skill and the reward. Here, it is assumed that the score is determined to be higher the higher the skill and the lower the reward.
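As a non-limiting illustration of the matching in step S104 under the conditions described above (online, not busy, skill at or above the required level, with a preference for higher skill and lower reward), a minimal sketch is shown below. It reuses the OperatorRecord and OperationTerminalConnectionUnit sketches given earlier, and the concrete scoring formula is an assumption, since this disclosure only states the general tendency.

```python
from typing import Iterable, Optional

def match_operator(
    operators: Iterable[OperatorRecord],
    connection_unit: OperationTerminalConnectionUnit,
    required_skill: str,
    required_level: int,
) -> Optional[OperatorRecord]:
    """Return the operator to whom the coping task is assigned, or None if nobody qualifies."""
    # Primary candidates: online, not busy, and skilled enough.
    candidates = [
        op for op in operators
        if connection_unit.is_available(op.operator_id)
        and op.has_skill(required_skill, required_level)
    ]
    if not candidates:
        # Fall back to operators who are busy with another task but otherwise qualify.
        candidates = [
            op for op in operators
            if connection_unit.is_online(op.operator_id)
            and op.has_skill(required_skill, required_level)
        ]
    if not candidates:
        return None

    def score(op: OperatorRecord) -> float:
        # Higher skill raises the score, higher hourly cost lowers it
        # (one possible concrete choice; the disclosure leaves the formula open).
        return op.skills[required_skill] / (1 + op.hourly_rate_points)

    return max(candidates, key=score)
```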
 Note that the information control unit 102 may classify the robot exception events into several types and assign each exception event to specific operators according to its type. Here, the specific operators corresponding to an exception event are typically a plurality of operators, and the information control unit 102 preferably assigns the exception event to operators in the same group corresponding to the exception type. However, this does not exclude assigning a certain type of exception event to one specific operator.
 In step S105, the information control unit 102 identifies the operation terminal 2 of the operator selected in step S104, and transmits a request to handle the exception event and the state information of the robot 3 to the identified operation terminal 2. As shown in FIG. 4, the information control unit 102 refers to the operator information DB 107 and acquires the operation terminal information of the selected operator. The operation terminal information is information for communicating with the operation terminal 2, for example an IP address or a MAC address. The operation terminal information may also include terminal type information. The information control unit 102 passes the state information received from the robot 3 via the state reception unit 101 and the operation terminal information to the transmission unit 103 as transmission information. The transmission unit 103 transmits the state information and the task execution request to the operation terminal 2 identified by the operation terminal information.
 When the operation terminal 2 receives the task of handling the exception event and the state information of the robot 3, it displays a GUI for operating this robot 3 on its display unit. With this GUI, the operator can check the state information of the robot 3, input operation information (operation commands) for the robot 3, input the handling result, and so on. The operation information for the robot 3 may be commands for remotely operating the robot 3 under real-time control, or commands that cause it to execute a series of procedures. The operation information for the robot 3 entered on the operation terminal 2 is transmitted from the operation terminal 2 to the server 1 and received by the receiving unit 105.
 In step S106, the receiving unit 105 receives information from the operation terminal 2. The received information includes operation terminal information and operation information.
 In step S107, the information control unit 102 extracts the operation information from the received information and transmits it from the output unit 106 to the target robot 3. As a result, the robot 3 can be made to perform the operation instructed by the operator.
 Although not shown in the flowcharts of FIGS. 2 and 3, when the state of the robot 3 is updated by an operation command or the like from the operation terminal 2, the state information of the robot 3 may be transmitted to the operation terminal 2 directly from the robot 3 or via the server 1. In this way, the operator can grasp the latest state of the robot 3.
 After the robot 3 has operated according to the operation command from the operator, the operator evaluates the result of the operation, enters it on the operation terminal 2, and transmits it to the server 1. Examples of operation results include whether the handling of the exception event has been completed and whether on-site follow-up confirmation by an administrator is required. Completion of handling includes both the case where the robot has returned to the normal state and the case where it is determined that no handling is necessary (i.e., there is no abnormality). Alternatively, the handling of the exception event may be regarded as completed when the state reception unit 101 receives state information indicating that the robot 3 has returned to normal operation in accordance with the operation command from the operator. In this case, the task distribution server 1 transmits the operation result to the operation terminal 2.
 In step S108, the receiving unit 105 receives the operation result from the operation terminal 2.
 In step S109, the information control unit 102 associates the state information received from the robot 3 with the operation information and operation result received from the operation terminal 2, and stores them in the correspondence information DB 108. This leaves a history of what operation the operator performed when the robot 3 was in a given state. The information control unit 102 may also store, in the correspondence information DB 108, the time required from the start to the completion of the operator's task of handling the exception event. This makes the operator's working time available for calculating the operator's remuneration and evaluating the operator's skill. Information such as working time and remuneration may instead be stored in the operator information DB 107.
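 As one illustration, a record in the correspondence information DB 108 might associate these items as in the sketch below; the field names are hypothetical and only show the kind of association described here.

    # Hypothetical structure of one entry in the correspondence information DB 108.
    import datetime

    def make_correspondence_record(robot_id, state_info, operation_info, result,
                                   operator_id, started_at, finished_at):
        return {
            "occurred_at": started_at.isoformat(),
            "robot_id": robot_id,
            "state_info": state_info,          # as received from the robot (step S103)
            "operation_info": operation_info,  # as received from the operation terminal (step S106)
            "operation_result": result,        # e.g. "completed" / "not completed"
            "operator_id": operator_id,
            "handling_time_s": (finished_at - started_at).total_seconds(),
        }

    record = make_correspondence_record(
        robot_id="robot-3b-001",
        state_info={"tilt_deg": -5, "wheels": "stopped"},
        operation_info=["lock right wheel", "left wheel -0.05 m/s"],
        result="completed",
        operator_id=2,
        started_at=datetime.datetime(2019, 7, 1, 10, 0, 0),
        finished_at=datetime.datetime(2019, 7, 1, 10, 12, 0),
    )
    print(record["handling_time_s"])  # -> 720.0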
 In step S110, the information control unit 102 determines whether the handling of the exception event has been completed. This determination may be made based on the operation result information transmitted from the operation terminal 2. If handling is not complete (S110-NO), the process returns to step S105 and the robot 3 is operated based on instructions from the operator. Although the same operator as before is asked to handle the event here, the process may instead return to step S104 to reselect the operator. In this case, the previous operator may be excluded from the candidates and another operator may be selected, if necessary. If handling is complete (S110-YES), the process proceeds to step S111.
 In step S111, the information control unit 102 evaluates the operator's skill based on the handling history and results stored in the correspondence information DB 108, and updates the skill stored in the operator information DB 107. The skill evaluation takes into account the result of handling the exception event, the content of the handling, the time required, and so on.
 In step S112, the information control unit 102 determines the amount of remuneration to be paid to the operator based on the operator's hourly remuneration stored in the operator information DB 107 and the time required to handle the exception event.
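 A minimal sketch of this calculation, assuming a simple hourly-rate-times-handling-time formula (the embodiment does not prescribe an exact formula):

    # Illustrative remuneration calculation for step S112:
    # hourly rate (operator information DB 107) x handling time (correspondence information DB 108).

    def calculate_remuneration(hourly_rate, handling_time_s):
        return hourly_rate * handling_time_s / 3600.0

    print(calculate_remuneration(hourly_rate=15.0, handling_time_s=720))  # -> 3.0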
 Here, skill evaluation (S111) and remuneration calculation (S112) are performed each time an exception event is handled, but these processes may instead be performed separately at an appropriate time.
 In the above example, the operator performs some operation on the robot 3 when an exception event occurs in the robot 3, but in practice the operator need not operate the robot 3 at all. For example, even when an exception event has occurred in the robot 3, the operator may check the state of the robot 3 and confirm that no abnormality has actually occurred. In such a case, the operator may transmit a notification from the operation terminal 2 to the server 1 that no handling is required, without transmitting any operation command to the robot 3.
[Advantageous effect]
 According to the present embodiment, an exception event that occurs in the robot 3 can be handled by an operator at a remote location. Here, the task distribution server 1 manages the connections with the operation terminals 2 and distributes the task by selecting an operator who can respond immediately, so a quick response to exceptions is possible. In addition, since an appropriate operator is selected according to the type of robot and the type of exception event in consideration of the operators' skills, appropriate handling of exceptions is possible.
 There is no need to have operators stand by at the robots' work sites such as factories, and an operator can operate robots at multiple sites from a remote location. Therefore, introducing this system eliminates the need to keep personnel for handling robot exceptions on standby at the company's own factory, which makes cost reduction possible.
 In recent years, problems such as labor shortages caused by the declining birthrate and aging population, and the lack of employment in rural areas, have been pointed out. The present embodiment provides a social infrastructure or platform that connects factories (robot work sites) with labor in remote locations, and thus contributes to solving these problems. With this system, operators no longer need to work at a specific location and can work from remote places. This provides a working environment accessible even to people who find long commutes difficult, so elderly people and people living in rural areas can also work.
 In addition, foreign labor and immigration to Japan are being discussed as ways to resolve Japan's labor shortage. Operators in this system do not need to be located in Japan and may be located abroad. If the interface of the operation terminal is provided in a foreign language, foreign workers can operate robots from abroad, which has the effect of making it possible to import labor without requiring immigration.
<Example 1>
 Here, the task distribution system to which the present invention is applied will be described more specifically, taking as an example the case where the mobile robot 3b is a delivery robot that delivers parcels and the like by autonomous driving.
 The delivery robot in this example is a mobile robot that uses wheels as its movement mechanism. Here, a delivery robot with two opposed wheels is described as an example, but the movement mechanism of the delivery robot may be a wheeled mechanism other than the opposed two-wheel type, or a multi-legged or endless-track mechanism. The delivery robot has a storage section for holding the delivered items stably. The delivery robot also has various sensors such as a position information acquisition sensor, a camera, LIDAR, a millimeter-wave radar, an ultrasonic sensor, and an acceleration sensor, and the built-in robot controller 5 controls the movement and other motion of the delivery robot based on the sensor information obtained from these sensors.
 The delivery robot transmits state information to the server 1 when an exception event occurs. It can also operate based on operation commands transmitted from the operation terminal 2 via the server 1.
 Here, it is assumed that an abnormal state (exception event) has occurred in which the delivery robot, while delivering unmanned, that is, by autonomous driving, has dropped its left wheel into a hole in the road and stopped in a derailed state. Attempting to deal with the situation without grasping the conditions around the delivery robot could lead to an even more dangerous state. For example, if the hole is larger than the delivery robot, there is a risk that the delivery robot will fall into it. Therefore, when the delivery robot determines from the acceleration sensor information that its body is no longer level, it determines that an exception event has occurred and notifies the server 1 of the occurrence of the exception event together with the state information. The point at which an exception event is judged to have occurred depends on the level of the mobile robot's control technology and the required safety level, and the above criterion is merely one example.
 When the state reception unit 101 receives the state information from the mobile robot 3b (S103), this information is stored in the correspondence information DB 108. As shown in FIG. 6, the occurrence time 601 of the exception event, the ID 602 of the robot in which the exception event occurred, and the state information 603 are stored in the database.
 The state information 603 here includes position information, vehicle body tilt, wheel state, cargo, and images taken by the camera mounted on the robot 3b. The position information is obtained from the GPS device (position information acquisition sensor) mounted on the mobile robot 3b. The vehicle body tilt is information representing the front-rear and left-right tilt of the body, obtained from the tilt sensor (acceleration sensor) mounted on the mobile robot 3b. For example, a tilt exceeding 4 degrees (in absolute value) can be judged to be abnormal. In this example, the left-right tilt is -5 degrees (a negative value indicates tilting down to the left), and since the absolute value of the tilt angle exceeds the threshold, the robot 3b determines that it is in an abnormal state.
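 A minimal sketch of this tilt-based check, using the 4-degree absolute-value threshold from this example; the field names and the structure of the state information are illustrative assumptions.

    # Illustrative exception detection from the tilt sensor, using the 4-degree
    # absolute-value threshold mentioned in this example.

    TILT_THRESHOLD_DEG = 4.0

    def is_exception(roll_deg, pitch_deg):
        # Negative roll means tilting down to the left, as in the example (-5 degrees).
        return abs(roll_deg) > TILT_THRESHOLD_DEG or abs(pitch_deg) > TILT_THRESHOLD_DEG

    def build_state_info(roll_deg, pitch_deg, position, wheels, cargo):
        return {
            "position": position,
            "tilt": {"roll_deg": roll_deg, "pitch_deg": pitch_deg},
            "wheels": wheels,
            "cargo": cargo,
        }

    if is_exception(roll_deg=-5.0, pitch_deg=0.0):
        state = build_state_info(-5.0, 0.0, position=(35.0, 135.0), wheels="stopped", cargo="parcel")
        print("notify server 1 with state info:", state)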
 The wheel state is "stopped". This is because the robot 3b stopped driving the wheels when it detected the abnormality. The cargo information is entered by the sender. Based on the cargo information, the operator can judge what kind of operations may be performed and to what extent. For example, if the cargo is a parcel as in this example, handling it somewhat roughly is considered acceptable. On the other hand, if the cargo is food or drink, it must clearly be handled carefully. Furthermore, if the cargo is food or drink and the body tilt is as large as 45 degrees, it may be better to contact the sender than to attempt recovery from the abnormal state, because even if the food or drink were delivered after tipping over inside the container of the mobile robot 3b, it would no longer be of any use.
 The image from the robot camera is taken by a camera attached to the body of the mobile robot 3b (see FIG. 7). The camera is mounted so as to capture at least the area ahead in the direction of travel. Multiple robot cameras may be attached around the body to capture an around-view image of the robot's surroundings. Alternatively, the image may be an omnidirectional image (360-degree image) taken by an omnidirectional (360-degree) camera.
 The information control unit 102 of the server 1 can determine from the state information transmitted from the delivery robot (mobile robot) 3b that an exception event has occurred in the delivery robot 3b, and therefore performs the task matching process of selecting an operator to handle this exception event (step S104). Since the exception event in this example is a malfunction of the delivery robot 3b, the information control unit 102 selects, from among the operators who are online and not processing other tasks, an operator who has the skills to handle this exception event.
 Consider the case where operator 1 and operator 2 shown in FIG. 5 are online and not processing other tasks. Both operators 1 and 2 have the skill "mobile robot, delivery, troubleshooting". The information control unit 102 selects an operator in consideration of each operator's rating for that skill and hourly wage. In this example, operator 2, who has a higher rating for skill 2 and a lower hourly wage, is selected.
 The information control unit 102 transmits a request to handle the abnormal event that has occurred in the delivery robot 3b and the state information of the delivery robot 3b to the operation terminal of the selected operator (step S105). The operation terminal 2 that receives this information displays a screen including the state information of the delivery robot 3b and a GUI for operating the delivery robot 3b. The operation terminal 2 is provided with a display unit for displaying this information and a user interface for inputting operation instructions to the robot 3b. Operation information may be entered via a keyboard or by touching the screen. Another example of a user interface is one for entering simple instructions (a program or command sequence) for moving the mobile robot 3b.
 The operation terminal 2 displays, for example, the state information of the delivery robot (the information shown in the state information 603 of FIG. 6) and the image taken by the camera mounted on the delivery robot 3b (FIG. 7). Based on this information, the operator judges whether the state is an abnormal state requiring some handling, whether the state can be handled by remote operation if it is abnormal, and what operation should be performed if remote handling is possible.
 In this example, the operator can read the following from the image shown in FIG. 7 and other sensor information.
- Since the entire scene in the image slopes down to the right, the mobile robot 3b is tilted to the left.
- From the angle of the tilt, the left wheel has most likely gone off the road surface.
- Since there is no gutter along the guardrail, the wheel has most likely dropped into an unexpected hole.
- Since there are guardrails on both sides, the location is relatively safe for operating and moving the mobile robot 3b.
- When changing the direction of the mobile robot 3b and moving it backward or forward, it must be operated so as not to touch the guardrails on either side.
- Since the guardrail on the left side is closer, it is safe to back up to the right rear within 1 meter in order to recover from the derailed state.
 The state information and images sent when an exception event occurs vary enormously. A human can grasp the points above instantly, but this is difficult for a computer (AI, etc.). Based on the image from the mobile robot 3b and the other state information, the operator can conclude that the delivery robot's left wheel has fallen into a hole. Furthermore, the operator can judge from the surrounding conditions that backing up about 1 meter to the right rear relative to the direction of travel poses no problem, and that moving the mobile robot 3b is necessary for recovery. Such situation assessment and operation of the mobile robot 3b cannot be executed unless they have been programmed into the mobile robot 3b. It is not realistic to program for every possible situation in advance, and it is still appropriate to have a human (the operator) judge how to deal with and handle such widely varying situations.
 Based on this judgment, the operator determines the operation commands to give to the mobile robot 3b as follows. First, as step 1, the operator instructs the robot to rotate the left wheel at -0.05 m/s while keeping the right wheel locked. Since the left wheel has fallen into the hole, the operator locks the right wheel, which is outside the hole, and instructs a motion that slowly reverses the left wheel using the right wheel as a pivot. With this motion, the left wheel catches on the edge of the hole and can be slowly pulled up.
 As step 2, the operator instructs the robot to change the right wheel speed to -0.05 m/s once the body tilt reaches 0±2 degrees (the normal state). When the body tilt is clearly within the normal range, it can be judged that the left wheel has been lifted out of the hole. After that judgment, the right wheel is reversed at the same speed as the left wheel, moving the mobile robot 3b away from the hole. The operator may also instruct the robot to terminate abnormally if the body tilt does not return to the normal state even after the step 1 motion has continued for a certain time.
 As step 3, the operator instructs the robot to stop the wheel motion after 10 seconds, take an image with its front camera, and run a self-diagnosis to check whether it has returned to the normal state. Continuing the reverse motion of step 2 for 10 seconds moves the robot to a position 0.5 meters away from the point where it fell into the hole. This value should be set according to the surrounding conditions, and it is appropriate to have a human (the operator) decide it as well. By moving 0.5 meters away from the hole, the camera mounted on the robot 3b is expected to be able to photograph the hole.
 The self-diagnosis in step 3 is performed for reconfirmation. In addition to the body tilt being within the normal range, the mobile robot 3b is made to self-diagnose whether it is functioning properly; if it has returned to the normal state, it can continue the delivery work from there. Even if the mobile robot 3b judges itself to be normal as a result of the self-diagnosis, the operator may still judge that the abnormal state continues, and in such a case the operator's judgment may be given priority.
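 The three-step sequence above could be expressed, for example, as a small batch of commands sent to the robot. The command format below is a hypothetical sketch; only the step parameters (-0.05 m/s, 0±2 degrees, 10 seconds) come from this example, and the 30-second abort time is an assumed value.

    # Hypothetical representation of the operator's three-step recovery command
    # sequence for the derailed delivery robot (step parameters as in this example).

    recovery_commands = [
        {   # Step 1: lock the right wheel and slowly reverse the left wheel.
            "step": 1,
            "right_wheel_mps": 0.0, "right_wheel_locked": True,
            "left_wheel_mps": -0.05,
            "until": {"body_tilt_within_deg": 2.0},   # proceed when tilt is 0 +/- 2 degrees
            "abort_after_s": 30,                       # abort if the tilt never normalizes
        },
        {   # Step 2: once level, reverse both wheels at the same speed, away from the hole.
            "step": 2,
            "right_wheel_mps": -0.05, "right_wheel_locked": False,
            "left_wheel_mps": -0.05,
            "until": {"elapsed_s": 10},                # about 0.5 m at 0.05 m/s
        },
        {   # Step 3: stop, photograph the hole with the front camera, and self-diagnose.
            "step": 3,
            "right_wheel_mps": 0.0, "left_wheel_mps": 0.0,
            "actions": ["capture_front_image", "run_self_diagnosis"],
        },
    ]

    # The operation terminal 2 would send this batch to the server 1, which forwards
    # it to the mobile robot 3b (steps S106 and S107).
    print(len(recovery_commands), "steps queued")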
 The operator transmits the operation commands consisting of steps 1 to 3 above from the operation terminal 2 to the mobile robot 3b via the server 1 (steps S106 and S107). In this example, the operator is assumed to transmit the instructions of steps 1 to 3 to the mobile robot 3b all at once. However, the instructions may instead be transmitted step by step, or in even finer units. Also, when state information from the mobile robot is transmitted in real time, the operator may remotely operate the mobile robot in real time.
 The mobile robot 3b operates according to the above operation commands and transmits the resulting information to the operation terminal 2 via the server 1 or directly. As a result of steps 1 to 3, the mobile robot 3b recovers from the derailed state, backs away from the hole, and can acquire an image of the hole as shown in FIG. 8.
 In this example, the operator can read the following from the image shown in FIG. 8.
- From the scene as a whole, the mobile robot 3b is level.
- As instructed by the operation information, the robot moved to the right rear from the position where it had stopped.
- The hole can be recognized in the image, and the abnormal state was caused by the left wheel having dropped into it.
- From the above, the mobile robot 3b has returned to the normal state.
 The server 1 can record that there is a hole in the road together with the GPS position information. As shown in FIG. 9, this makes the position 901 of the hole in the road known, and mobile robots 3b traveling on this road in the future can determine their travel route 902 with reference to the hole position 901, so that falling into the hole can be avoided more reliably.
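 A minimal sketch of such hazard recording and a simple route check against the recorded hazards; the record format, the planar distance approximation, and the 1-meter clearance are assumptions for illustration.

    # Illustrative recording of a road hazard (the hole) with its GPS position,
    # and a simple check that future routes keep clear of recorded hazards.
    import math

    hazards = []  # stand-in for server-side storage of hazard positions

    def record_hazard(lat, lon, kind):
        hazards.append({"lat": lat, "lon": lon, "kind": kind})

    def distance_m(a, b):
        # Rough planar approximation, adequate for distances of a few meters.
        dlat = (a[0] - b[0]) * 111_000
        dlon = (a[1] - b[1]) * 111_000 * math.cos(math.radians(a[0]))
        return math.hypot(dlat, dlon)

    def route_is_clear(waypoints, clearance_m=1.0):
        return all(
            distance_m(wp, (h["lat"], h["lon"])) > clearance_m
            for wp in waypoints for h in hazards
        )

    record_hazard(35.68951, 139.69171, "hole")
    print(route_is_clear([(35.68951, 139.69172)]))  # waypoint too close to the hole -> False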
 In addition to the image shown in FIG. 8, information indicating, for example, that the body tilt is 0 degrees and that the self-diagnosis result is normal is also transmitted to and displayed on the operation terminal 2.
 The operator refers to the state information transmitted from the mobile robot 3b, enters the result of the operation, and transmits it to the server 1 (step S108). In this example, an operation result indicating that the robot has returned to the normal state is entered. The server 1 stores the transmitted operation result in the correspondence information DB 108 together with the operations performed, the operator ID, and the time required for handling (step S109). By storing the state information, the operation information, and the resulting outcome in association with each other in this way, it is possible to accumulate know-how about what kind of operation restores the normal state in what kind of situation. The operation result information may be entered manually by the operator on the operation terminal 2, or the mobile robot 3b may be made to judge it by itself.
 It is also conceivable that the robot does not return to the normal state with a single operation and that trial and error is repeated to restore it. In this case, operation commands are entered by the operator again. The server 1 may continue to ask the same operator to handle the event, or it may select a new operator and have a different operator handle the abnormal event.
<Example 2>
 Next, the task distribution system to which the present invention is applied will be described specifically, taking as an example the case where the mobile robot 3b is an agricultural robot that manages the cultivation of tomatoes.
 The mobile robot 3b in this example can at least move autonomously and patrol a field or greenhouse, and has functions for taking and transferring photographs. As shown in FIG. 10, the mobile robot 3b includes a main body (traveling unit) containing a movement mechanism 1011, a robot controller 1012, a position information acquisition device 1013, a camera 1014, and a power supply (not shown), and an arm 1020 containing a pesticide storage unit 1021, a pesticide spraying unit 1022, an illumination unit 1023, and a camera 1024. Here, the single arm 1020 has both the image-capturing and pesticide-spraying functions, but these functions may be given to separate arms. The arm 1020 can also harvest fruit and part or pick up leaves. In addition to what is shown, the mobile robot 3b may also be equipped with other sensors such as a distance sensor and an acceleration sensor.
 The movement mechanism 1011 is, for example, a wheeled or endless-track mechanism. The robot controller 1012 is the same as described above. The position information acquisition device 1013 is a device for identifying the robot's position in the field or greenhouse. The position information acquisition device 1013 may be a GPS device, or a device that reads barcodes or RFID tags attached to the field, the greenhouse, or the plants. The camera 1014 captures a relatively wide area around the robot.
 The mobile robot 3b has a pesticide spraying function provided by the pesticide storage unit 1021 and the pesticide spraying unit 1022, and can spray pesticide automatically when the robot determines that spraying is necessary. The pesticide is sprayed from the arm, for example. When pests or disease are present, the type of pest or the symptoms can be identified from the image from the arm camera 1024, and the type of pesticide effective against the identified pest or disease can be determined. For example, if a twenty-eight-spotted ladybird beetle (Henosepilachna vigintioctopunctata) is present, an aqueous solution of clothianidin can be judged to be effective. The pesticide storage unit 1021 may carry multiple types of pesticide so that multiple pests and diseases can be dealt with. Although the image obtained from the camera 1024 may indicate that an abnormality has possibly occurred, it can be difficult for the robot to automatically determine whether there really is an abnormality and, if so, how to deal with it. For example, when a leaf is discolored, the cause may be any of various things such as pests, mold, or a virus, and a different pesticide should be used depending on the cause.
 The robot arm 1020 can move back and forth and left and right, and can be inserted into dense foliage. The camera 1024 and the illumination unit 1023 are mounted at the tip of the robot arm 1020, and the direction of imaging and illumination by the camera 1024 and the illumination unit 1023 can be changed by rotating the arm 1020. For example, by pointing the camera 1024 and the illumination unit 1023 upward, the underside of a leaf can be photographed (see FIG. 11A).
 FIG. 11A shows an example of an image 1101 taken by the main body camera 1014 of the mobile robot 3b. In the image 1101, the arm 1020 is extended and visible. As shown, a discolored area 1102 is observed on part of the upper surface of a tomato leaf. Assume that the robot controller 1012 has determined that the cause of the discoloration cannot be identified from the image of the discolored area 1102 on the upper leaf surface alone, and that it is therefore necessary to operate the arm 1020 to take an image of the underside of the leaf. The robot controller 1012, with the arm camera 1024 and the illumination unit 1023 facing upward, extends the arm to a position under the leaf and takes an image with the camera 1024. This series of operations is merely an example; the robot 3b may instead be controlled so as to inspect every plant even when no discolored area is found on the leaves. Also, if there is no abnormality, the image need not be sent to the operation terminal.
 FIG. 11B shows an image 1103 of the underside of the leaf taken by the arm camera 1024. By applying image processing to the image 1103, it can be recognized that there is also a discolored area 1104 on the underside of the leaf, and that around it there are multiple small spots 1105 of similar size. This image processing may be performed by the robot controller 1012 or another processing device built into the robot 3b, or by an external device to which the image is transmitted from the robot 3b. From the result of this image processing, the robot controller 1012 determines that an abnormality has very likely occurred but that it is unclear how to deal with it; in other words, an event requiring the judgment of an operator (a human), that is, an exception event, has occurred. The robot controller 1012 therefore transmits the state information to the server 1.
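 A minimal sketch of this escalation logic: the robot acts automatically only when both the cause and the treatment are clear, and otherwise raises an exception event for a human operator. The classifier interface, the confidence threshold, and the treatment table are hypothetical; only the beetle-to-clothianidin pairing comes from the text above.

    # Illustrative escalation logic for the agricultural robot: handle the case
    # automatically only when the cause and treatment are clear, otherwise raise
    # an exception event for a human operator (hypothetical classifier interface).

    KNOWN_TREATMENTS = {
        # pest/disease -> pesticide known to be effective (example from the text)
        "henosepilachna_vigintioctopunctata": "clothianidin solution",
    }
    CONFIDENCE_THRESHOLD = 0.9  # assumed value

    def decide_action(classification):
        """classification: {"label": str, "confidence": float} from image processing."""
        label = classification["label"]
        if classification["confidence"] >= CONFIDENCE_THRESHOLD and label in KNOWN_TREATMENTS:
            return {"action": "spray", "pesticide": KNOWN_TREATMENTS[label]}
        # Cause or treatment unclear: this is an exception event for the operator.
        return {"action": "notify_server", "reason": "cause/treatment unclear"}

    print(decide_action({"label": "discoloration_with_spots", "confidence": 0.55}))
    # -> {'action': 'notify_server', 'reason': 'cause/treatment unclear'}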
 When the state reception unit 101 of the server 1 receives the state information from the mobile robot 3b (S103), this information is stored in the correspondence information DB 108. FIG. 12 shows an example of the correspondence information DB 108 in this example. The table format itself is the same as in Example 1 (FIG. 6), so a detailed description is omitted. At this point, the date and time 1201 of the exception event, the ID 1202 of the robot in which the exception event occurred, and the state information 1203 are stored in the database 108.
 In this example, the state information 1203 includes position information, the state of the upper leaf surface, the state of the leaf underside, leaf area, fruit diameter, fruit color, and images. The position information includes the latitude and longitude, the greenhouse number, and the plant number acquired by the position information acquisition device 1013. The position information also includes height information, which is the position imaged by the camera 1014 mounted on the robot body 1010 or, when the image is taken by the arm camera 1024, the height of the arm camera 1024 or the arm 1020. The height of the arm camera 1024 or the arm 1020 can be obtained from an encoder provided on the arm 1020. The height information represents the height of the fruit or leaf of interest.
 The upper leaf surface information indicates the result of determining the state of the upper leaf surface by image recognition. Examples of this information include "normal", "discolored area present", "insect-eaten area present", and "insects present". Such image recognition can be realized by machine learning techniques such as deep learning; since these are well known, a detailed description is omitted.
 The leaf underside information indicates the result of determining the state of the underside of the leaf by image recognition. In this example, since the state of the upper leaf surface was "discolored area present", the mobile robot 3b took an image of the underside of the leaf and obtained an image recognition result. The leaf underside information is this recognition result; in this example, the recognition result is "discolored area present" and "insects present". When the state of the upper leaf surface is "normal", photographing the underside of the leaf and judging its state may be omitted. However, since there may be an abnormality on the underside of a leaf even when the upper surface is "normal", both the upper surface and the underside may be photographed to check their states.
 The leaf area, fruit diameter, and fruit color are calculated based on the image taken by the main body camera 1014 and the distance to the object (the leaf or fruit) measured by the distance sensor. This information can be calculated using well-known techniques, so a detailed description is omitted.
 The image from the camera 1014 mounted on the robot body and the image from the camera 1024 mounted on the robot arm are shown in FIGS. 11A and 11B, respectively.
 The information control unit 102 of the server 1 can determine from the state information transmitted from the mobile robot 3b that an exception event has occurred, and therefore performs the task matching process of selecting an operator to handle this exception event (step S104). Since the exception event in this example concerns the diagnosis of tomato disease, the information control unit 102 selects, from among the operators who are online and not processing other tasks, an operator who has the skills to handle this exception event.
 FIG. 13 shows an example of the operator information DB 107 in this example. Its contents are basically the same as in Example 1, so a duplicate description is omitted. Since this example is an application to agricultural work, the skills of each operator related to agricultural work are stored. In the example shown in FIG. 13, "inspection of tomato leaf diseases and pests" is set as one of the skills. This skill means having knowledge of the diseases and pests that affect tomatoes and of the appropriate treatment for the diseases and pests found. Appropriate treatment means the ability to select an effective pesticide for the disease or pest, or to decide that follow-up observation is sufficient. Another skill, "tomato harvesting", is also set. This is the skill of judging from the images sent by the robot whether the fruit may be harvested, and of harvesting it by remotely operating the arm.
 Consider the case where operator 1 and operator 2 shown in FIG. 13 are online and not processing other tasks. Since the exception event in this example concerns the inspection of tomato leaf diseases and pests, only operator 1 can handle it. Therefore, the information control unit of the server 1 matches this exception event with operator 1 (step S104).
 The information control unit 102 transmits a request to handle the exception event and the state information of the mobile robot 3b to the operation terminal of the selected operator (step S105). The operation terminal 2 that receives this information displays a screen including the state information of the mobile robot 3b and a GUI for operating it.
 FIG. 14 shows an example of the GUI 1400 displayed on the operation terminal 2. The GUI 1400 includes an image 1401 from the main body camera 1014, an image 1402 from the arm camera 1024, and state information 1405 other than images. The GUI 1400 also includes an input section 1403 for operating (moving) the robot body 1010, an input section 1404 for operating (moving) the arm 1020, an input section 1406 for entering a series of operation commands, an input section 1407 for entering the operation result, and a send button 1408 for transmitting the entered contents to the server 1.
 The operator uses the GUI 1400 to enter operation commands for the mobile robot 3b and transmits them to the server 1. For example, the following two-step operation command could be entered. The first step is leaf pruning. From the images 1401 and 1402 and the state information 1405, the operator confirms that there are insects on the leaf and, as a countermeasure, instructs the robot to prune the leaf and store it in a sealed box so that the damage caused by the insects does not spread further. The robot is assumed to have the sealed box and to be able to automatically perform the pruning of the target leaf with the robot arm and its storage in the sealed box.
 The second step is pesticide spraying. The operator judges that the insects are most likely aphids, enters "YES" in the pesticide spraying field, and then selects and enters acephate, a pesticide effective against aphids. Here, the robot is assumed to carry multiple types of pesticide and to be able to perform the spraying process automatically.
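 As in Example 1, this two-step instruction could be sent as a small batch of commands. The command format and target fields below are hypothetical; the handling itself (prune the affected leaf into the sealed box, then spray acephate, flag on-site follow-up) is the one chosen by the operator in this example.

    # Hypothetical two-step operation command for the agricultural robot:
    # step 1 prunes the affected leaf into the sealed box, step 2 sprays acephate.

    tomato_commands = [
        {
            "step": 1,
            "action": "prune_leaf",
            "target": {"greenhouse": 1, "plant_no": 23, "height_mm": 850},  # illustrative target fields
            "store_in_sealed_box": True,
        },
        {
            "step": 2,
            "action": "spray_pesticide",
            "spray": "YES",
            "pesticide": "acephate",   # judged effective against aphids by the operator
        },
    ]

    operation_result = {
        "result": "processing completed",
        "follow_up_on_site": "required",
        "pest_identified": "aphid",
    }
    print(tomato_commands[1]["pesticide"], operation_result["follow_up_on_site"])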
 The operation commands entered by the operator are transmitted to the mobile robot 3b via the server 1 (steps S106 and S107). The mobile robot 3b operates according to the operation commands and transmits the resulting information to the operation terminal 2 via the server 1 or directly. Based on the images and state information transmitted from the mobile robot 3b, the operator can judge whether the leaf pruning and pesticide spraying were completed normally. If so, the operator enters "processing completed" as the operation result. Also, since the operator's judgment is based on images, it may be wrong, and insects or disease may have spread to places that cannot be found from the robot's images alone. As a note for someone who can check directly on the farm, the operator therefore enters "required" for follow-up confirmation. As part of the judgment result, the operator may also enter "aphids" as the type of insect found. This operation result information is transmitted from the operation terminal 2 to the server 1 (step S108).
 The server 1 stores the transmitted operation result in the correspondence information DB 108 together with the contents of the operation commands, the operator ID, and the time required for handling (step S109). Through this process, as shown in FIG. 12, the operation information 1204, operation result 1205, operator ID 1206, and handling time 1207 in the correspondence information DB 108 are updated.
 In this example, since follow-up confirmation is required as part of the operation result, the information control unit 102 of the server 1 transmits the handling information for the exception event to the operation terminal 2 of an operator at the site (the farm), informing them of what event occurred and what measures were taken, and requesting follow-up confirmation. In response, the operator at the site examines the plant on which the insects were found to check whether any insects remain nearby, and examines the leaf pruned by the mobile robot 3b to confirm whether the insects are actually aphids. If the remote operator's judgment was wrong, the on-site operator takes other measures or registers the correct result in the server 1.
 In this example, the mobile robot 3b may do its work on the farm during the night. The mobile robot 3b in this example is not assumed to be used on a completely robotized or automated farm. Farm workers and the mobile robot 3b therefore work in the same place, but it is more efficient to divide the work into separate time periods than to have the farm workers and the robot work side by side. For this reason, the work by the mobile robot 3b may be performed at night. Exception events then need to be handled at night, but since remote operators can respond from anywhere as long as they are connected to the network, it is also preferable, for example, to assign the tasks of handling exception events to remote workers in foreign countries with a large time difference.
 When the agricultural robot takes photographs at night, the lighting mounted on the robot body or robot arm is used. By working at night, when the lighting conditions are constant, rather than during the day, when they vary, images can be taken under identical conditions. Since remote workers always work under the same conditions, their skills can be expected to improve more quickly.
<Example 3>
 A case where the mobile robot 3b is a robot that manages products in a store will be briefly described. The mobile robot 3b autonomously performs tasks such as restocking products, inspecting them, checking for missing items, and checking price tags. For this purpose, the mobile robot 3b is equipped with a manipulator and a camera.
 Exception events assumed in this example are illustrated below. First, there are cases where object recognition cannot be performed with high accuracy from an image. Products in irregularly shaped, highly reflective packages, aluminum-laminated packages, or highly transparent packages may not be recognizable with high accuracy, or at all. In such cases, it is assumed that a remote worker is asked to identify the product.
 Next, as another example of an exception event, there are cases where the handling position for grasping an object cannot be determined. For products with irregular shapes or an uneven weight balance, the mobile robot 3b may not be able to determine which part of the product to grasp. In such cases, it is assumed that a remote worker is asked to teach which part of the product should be grasped for a stable grip.
 Another example of an exception event is dropping a product during handling. If the robot 3b drops a product during automated shelf-stocking work and the product rolls to an unexpected place, it is difficult for the robot to find where it went. It is therefore assumed that a remote operator is asked to search for and retrieve the dropped product. Even if the product can be retrieved, it may be scratched or dented so that its value as merchandise is impaired. Since it is difficult for the robot 3b to make this judgment automatically, it is also assumed that a remote worker is asked to perform the check.
 A further example of an exception event is the case where a price tag cannot be read with sufficient accuracy. A store may sell a specific product at a special price on certain days and swap the regular price tag for a special-price tag. After a tag is replaced, the camera mounted on the robot 3b captures an image and the tag is checked by image recognition. Depending on the condition of the tag, the robot may not be able to read it accurately, and in that case a remote operator is asked to read it. The remote operator may not only read the tag from the transmitted image but also instruct the robot to re-photograph it from a direction that avoids reflections from the lighting.
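 The reply from the operator terminal could therefore carry either the read price or a re-capture instruction. The sketch below shows one possible shape for such a reply; the field names (price, retake, retake_angle_deg) are assumptions introduced only for illustration.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class OperatorResponse:
        """Reply an operator terminal might return for a price-tag exception."""
        price: Optional[int] = None               # read price in yen, if legible
        retake: bool = False                      # ask the robot to re-photograph the tag
        retake_angle_deg: Optional[float] = None  # suggested camera angle to avoid glare

    def apply_response(resp: OperatorResponse) -> None:
        if resp.retake:
            # Robot re-captures from the suggested angle and re-runs recognition.
            print(f"re-photographing price tag at {resp.retake_angle_deg} degrees")
        elif resp.price is not None:
            print(f"price confirmed by operator: {resp.price} yen")
        else:
            print("operator could not read the tag; escalating to on-site staff")

    apply_response(OperatorResponse(retake=True, retake_angle_deg=30.0))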
<Example 4>
 A case where the mobile robot 3b is a cleaning robot will be described briefly. The mobile robot 3b has a locomotion mechanism, a floor-cleaning function, a camera, and a wireless communication function, and drives itself and cleans floors based on map data of the work site. The work site is a large space such as a building, a station, or an airport. The mobile robot 3b also has a manipulator for pressing elevator buttons, and is equipped with a microphone and a speaker so that it can converse with people.
 Such a mobile robot 3b drives itself along the route specified by the map data and cleans the floor. When it detects an obstacle that does not exist in the map data, the mobile robot 3b needs to avoid it automatically.
 Exception events assumed in this example are illustrated below. First, the robot may detect an obstacle that is not in the map data but be unable to determine a safe way around it. In such a case, it is expected to send an image to the operation terminal and ask the remote operator to perform a safe avoidance maneuver.
 Another example of an exception event is detecting dirt on the floor but being unable to judge whether the robot 3b can clean it. In such a case, an image is sent to the operation terminal 2 and the remote operator is asked to enter an instruction on the terminal as to whether the robot should clean it or a cleaner should be dispatched. For example, the robot 3b transmits the size and type of the debris on the floor to the operation terminal. When a sensor detects liquid, whether to continue automatic cleaning afterwards is also important, because the robot 3b's cleaning brush or wheels could spread a liquid stain. It may therefore be desirable to have the robot 3b skip the liquid, continue cleaning the other areas, and leave the liquid to a human cleaner.
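 As an illustration of the decision just described, the sketch below separates liquid stains, oversized debris, and ordinary dirt; the categories, the 10 cm limit, and the returned flags are assumptions, not values from the embodiment.

    def plan_cleaning(dirt_type: str, dirt_size_cm: float) -> dict:
        """Decide, for one detected spot, whether the robot cleans it itself,
        skips it and keeps cleaning elsewhere, or asks for a human cleaner."""
        if dirt_type == "liquid":
            # Brushes or wheels could spread a liquid stain: skip this spot,
            # continue with the rest of the route, and request a cleaner.
            return {"clean_here": False, "continue_route": True, "dispatch_cleaner": True}
        if dirt_size_cm > 10.0:
            # Too large for the robot; leave the decision to the remote operator.
            return {"clean_here": False, "continue_route": True, "dispatch_cleaner": None}
        return {"clean_here": True, "continue_route": True, "dispatch_cleaner": False}

    print(plan_cleaning("liquid", 4.0))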
 Another example of an exception event is being unable to move. When cleaning a building, the robot 3b is expected to move between floors. It uses an elevator and presses the buttons with its manipulator. However, elevator buttons vary widely, and in some cases the floor labels are faded and hard to read. When the robot cannot automatically determine which button to press, it is expected to send the destination floor and an image to the operation terminal and ask the remote operator to decide.
 Yet another example of an exception event is a situation in which the robot should communicate with a person. There may be people around the robot 3b, and if someone is complaining to the robot, it is preferable for a human (the operator) to speak with them directly. Even when the robot 3b is automated to hold conversations, it may treat a conversation that is not going smoothly as an exception event. In such cases, the audio received by the robot is transmitted to the operation terminal and the remote operator is asked to respond. Speech directed at the robot 3b is picked up by the robot's microphone and output from the terminal's speaker, and the remote operator's speech is picked up by the terminal's microphone and output from the robot's speaker. This enables a conversation between the remote operator and the person on site, for example to handle a complaint.
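 The embodiment only states that audio is relayed between the robot and the operation terminal; it does not specify a transport. The sketch below assumes, purely for illustration, that each side is reachable over a plain TCP connection carrying raw audio bytes, and bridges the two streams in both directions.

    import socket
    import threading

    def pump(src: socket.socket, dst: socket.socket) -> None:
        """Copy raw audio bytes from one connection to the other until it closes."""
        while True:
            chunk = src.recv(4096)
            if not chunk:
                break
            dst.sendall(chunk)

    def relay_audio(robot_conn: socket.socket, terminal_conn: socket.socket) -> None:
        """Bridge robot <-> operator terminal audio so the remote operator can
        talk directly with a person standing near the robot."""
        to_terminal = threading.Thread(target=pump, args=(robot_conn, terminal_conn), daemon=True)
        to_robot = threading.Thread(target=pump, args=(terminal_conn, robot_conn), daemon=True)
        to_terminal.start()
        to_robot.start()
        to_terminal.join()
        to_robot.join()

 In practice the connections would come from whatever audio channel the system actually uses; the symmetric two-thread copy is the only point being illustrated.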
<Example 5>
 The examples above assumed that a remote operator deals with exception events occurring in the mobile robot 3b, but the target robot may also be the fixed robot 3a. One example of such a fixed robot 3a is a management robot for an aquaculture cage.
 The cage management robot 3a senses values such as water temperature, air temperature, and solar radiation, and manages the aquaculture operation. One task that is difficult to automate with the robot 3a is deciding the feed amount. The appropriate amount must be given according to the condition of the fish on that day, and simply feeding a large amount is not acceptable because of environmental pollution and cost.
 The cage management robot 3a of this example is equipped with a feeder, a camera, a wireless communication device, and various sensors (water temperature, air temperature, solar radiation), and transmits the sensor data by wireless communication. Because deciding the feed amount requires human judgement, every feeding time is treated as an exception event. At feeding time, the image captured by the cage management robot 3a and the sensor data are transmitted to the operation terminal, and the remote operator is asked to control the feeder. The remote operator controls the feeder while watching the image and can adjust the amount of feed while observing the fish at the surface, that is, how eagerly they take the feed.
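 A minimal sketch of how the scheduled feeding time could always be raised as an exception event follows; the feeding hours and field names (FEEDING_HOURS, image_ref) are assumptions the embodiment does not specify.

    import datetime
    from dataclasses import dataclass

    FEEDING_HOURS = (7, 17)  # assumed feeding times; the embodiment only says feeding is operator-driven

    @dataclass
    class SensorReadings:
        water_temp_c: float
        air_temp_c: float
        solar_radiation_wm2: float

    def is_feeding_time(now: datetime.datetime) -> bool:
        return now.hour in FEEDING_HOURS and now.minute == 0

    def feeding_exception_payload(robot_id: str, image_ref: str, readings: SensorReadings) -> dict:
        """Build the state information accompanying the always-raised feeding-time
        exception event, so the operator can drive the feeder remotely."""
        return {
            "robot_id": robot_id,
            "event_type": "feeding_time",
            "image_ref": image_ref,
            "sensors": vars(readings),
        }

    now = datetime.datetime(2019, 8, 1, 7, 0)
    if is_feeding_time(now):
        print(feeding_exception_payload("cage-3a", "surface_0700.jpg", SensorReadings(24.5, 28.0, 610.0)))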
 The same approach is equally effective for controlling a feeder mounted not on the cage management robot 3a but on a boat fitted with an autonomous navigation system.
(Other)
 The embodiments described above are merely illustrations of the present invention. The present invention is not limited to these specific forms, and various modifications are possible within the scope of its technical idea.
(Appendix 1)
 A task distribution device (1) comprising:
 an operation terminal connection unit (104) that manages the connection state of operation terminals (2) owned by operators;
 a state reception unit (101) that receives state information from robots (3a, 3b);
 an information control unit (102) that, when it is determined based on the state information that an exception event has occurred in a robot, selects an operator for dealing with the exception event from among available operators; and
 a transmission unit (103) that transmits, to the operation terminal (2) of the selected operator, a request to deal with the exception event of the robot together with the state information of the robot.
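 As a reading aid only, the following sketch mirrors the units of Appendix 1 in a single Python class; the class, method, and field names (TaskDistributionDevice, set_online, select_operator, handle_state, skills) are assumptions introduced here and are not part of the disclosure.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Operator:
        operator_id: str
        terminal_id: str
        skills: set = field(default_factory=set)  # e.g. {"store", "farm"}
        online: bool = False

    class TaskDistributionDevice:
        """Minimal sketch of Appendix 1: terminal connection management (104),
        state reception (101), operator selection (102), and transmission (103)."""

        def __init__(self, operators):
            self.operators = {op.operator_id: op for op in operators}

        def set_online(self, operator_id: str, online: bool) -> None:
            # Operation terminal connection unit: track which terminals are connected.
            self.operators[operator_id].online = online

        def select_operator(self, required_skill: str) -> Optional[Operator]:
            # Information control unit: pick an available operator whose skill
            # matches the robot type or the exception event.
            for op in self.operators.values():
                if op.online and required_skill in op.skills:
                    return op
            return None

        def handle_state(self, robot_id: str, state: dict) -> Optional[dict]:
            # State reception + transmission: if the state carries an exception
            # event, build the request sent to the selected operator's terminal.
            if "exception" not in state:
                return None
            op = self.select_operator(state["exception"].get("skill", ""))
            if op is None:
                return None
            return {"terminal": op.terminal_id, "robot": robot_id, "state": state}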
(Appendix 2)
 A method executed by a computer, comprising:
 a step (S102) of managing the connection state of operation terminals (2) owned by operators;
 a step (S103) of receiving state information from robots (3a, 3b);
 a step (S104) of, when it is determined based on the state information that an exception event has occurred in a robot, selecting an operator for dealing with the exception event from among available operators; and
 a step (S105) of transmitting, to the operation terminal (2) of the selected operator, a request to deal with the exception event of the robot together with the state information of the robot.
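 Similarly, the sketch below walks through steps S102 to S105 of Appendix 2 as one plain function; the data layout (dictionaries keyed by operator id, a "skill" field inside the exception) is an assumption chosen only to make the flow concrete.

    from typing import Optional

    def distribute_task(terminals: dict, operators: dict, robot_state: dict) -> Optional[dict]:
        """Procedural sketch of Appendix 2. `terminals` maps operator id -> online
        flag; `operators` maps operator id -> set of skills. Both are illustrative."""
        # S102: manage terminal connection state (here, just read the online flags).
        online = {oid for oid, up in terminals.items() if up}
        # S103: receive state information from the robot.
        exception = robot_state.get("exception")
        if exception is None:
            return None
        # S104: select an available operator who can handle the exception.
        candidates = [oid for oid in online if exception["skill"] in operators.get(oid, set())]
        if not candidates:
            return None
        # S105: send the handling request together with the robot state information.
        return {"operator": candidates[0], "request": {"robot_state": robot_state}}

    print(distribute_task(
        {"op-1": True, "op-2": False},
        {"op-1": {"store"}, "op-2": {"farm"}},
        {"robot_id": "3b", "exception": {"type": "price_tag", "skill": "store"}},
    ))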
1: Task distribution server   2: Operation terminal
3a: Fixed robot   3b: Mobile robot
101: State reception unit   102: Information control unit   103: Transmission unit
104: Operation terminal connection unit   105: Reception unit   106: Output unit
107: Operator information DB   108: Correspondence information DB

Claims (16)

  1.  A task distribution device comprising:
      an operation terminal connection unit that manages the connection state of operation terminals owned by operators;
      a state reception unit that receives state information from a robot;
      an information control unit that, when it is determined based on the state information that an exception event has occurred in the robot, selects an operator for dealing with the exception event from among available operators; and
      a transmission unit that transmits, to the operation terminal of the selected operator, a request to deal with the exception event of the robot together with the state information of the robot.
  2.  The task distribution device according to claim 1, wherein
      the operation terminal connection unit establishes and monitors the online state of a plurality of the operation terminals, and
      the information control unit selects an operator associated with an operation terminal that is online as the operator for dealing with the exception event.
  3.  The task distribution device according to claim 1, further comprising an operator information storage unit that stores skills of operators, wherein
      the information control unit selects an operator having a skill that matches at least one of the type of the robot and the exception event.
  4.  The task distribution device according to claim 3, further comprising:
      a reception unit that receives an operation command for the robot from the operation terminal;
      an output unit that outputs the operation command to the robot; and
      a correspondence information storage unit that stores the state of the robot in association with the operation performed by the operator.
  5.  The task distribution device according to claim 4, wherein
      the reception unit receives, from the operation terminal, the result of dealing with the exception event of the robot, and
      the correspondence information storage unit stores the result in association with the exception event of the robot and the operator.
  6.  The task distribution device according to claim 4, wherein the information control unit evaluates the skill of the operator with reference to the correspondence information storage unit and updates the operator information storage unit.
  7.  The task distribution device according to claim 4, wherein
      the operator information storage unit stores the usage fee of the operator, and
      the information control unit determines the remuneration to be paid according to the operator's response history with reference to the correspondence information storage unit.
  8.  The task distribution device according to claim 1, wherein the information control unit classifies the exception event of the robot and selects an operator suited to the type of the exception event as the operator for dealing with the exception event.
  9.  The task distribution device according to claim 1, wherein, in response to a request from the operator, the transmission unit transmits the state information of the robot to the operation terminal of an administrator at the site where the robot is installed.
  10.  The task distribution device according to claim 1, wherein the state information includes an image captured by a camera provided on the robot and an analysis result of sensor information obtained from a sensor provided on the robot.
  11.  A task distribution system comprising:
      a robot;
      an operation terminal for remotely operating the robot; and
      the task distribution device according to claim 1.
  12.  The task distribution system according to claim 11, wherein the operation terminal is configured to display, upon receiving a request to deal with an exception event of the robot, the state information and a graphical user interface for operating the robot on a display unit.
  13.  The task distribution system according to claim 11, wherein the robot is a mobile robot that has a locomotion device, a sensor, and a camera and is capable of autonomous movement.
  14.  A method executed by a computer, comprising:
      a step of managing the connection state of operation terminals owned by operators;
      a step of receiving state information from a robot;
      a step of, when it is determined based on the state information that an exception event has occurred in the robot, selecting an operator for dealing with the exception event from among available operators; and
      a step of transmitting, to the operation terminal of the selected operator, a request to deal with the exception event of the robot together with the state information of the robot.
  15.  A program for causing a computer to function as each unit of the task distribution device according to claim 1.
  16.  A program for causing a computer to execute each step of the method according to claim 14.
PCT/JP2019/030362 2019-08-01 2019-08-01 Task distribution device, task distribution system, method, and program WO2021019787A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2021536593A JP7331928B2 (en) 2019-08-01 2019-08-01 Task distribution device, task distribution system, method, and program
PCT/JP2019/030362 WO2021019787A1 (en) 2019-08-01 2019-08-01 Task distribution device, task distribution system, method, and program
CN201980098534.3A CN114144806A (en) 2019-08-01 2019-08-01 Task allocation device, task allocation system, method, and program
JP2022085083A JP2022126658A (en) 2019-08-01 2022-05-25 Apparatus, system, method, and program for cultivation management of crops

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/030362 WO2021019787A1 (en) 2019-08-01 2019-08-01 Task distribution device, task distribution system, method, and program

Publications (1)

Publication Number Publication Date
WO2021019787A1 true WO2021019787A1 (en) 2021-02-04

Family

ID=74230670

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/030362 WO2021019787A1 (en) 2019-08-01 2019-08-01 Task distribution device, task distribution system, method, and program

Country Status (3)

Country Link
JP (2) JP7331928B2 (en)
CN (1) CN114144806A (en)
WO (1) WO2021019787A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023053520A1 (en) * 2021-09-30 2023-04-06 ソニーグループ株式会社 Information processing system, information processing method, and program

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115297025A (en) * 2022-07-26 2022-11-04 来也科技(北京)有限公司 IA robot monitoring method and device based on RPA and AI

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002354551A (en) * 2001-05-25 2002-12-06 Mitsubishi Heavy Ind Ltd Robot service providing method and system thereof
JP2007183332A (en) * 2006-01-05 2007-07-19 Advanced Telecommunication Research Institute International Operation training device
JP2007190659A (en) * 2006-01-20 2007-08-02 Advanced Telecommunication Research Institute International Robot remote operation system
JP2018067785A (en) * 2016-10-19 2018-04-26 前川 博文 Communication robot system
JP2018142280A (en) * 2017-02-28 2018-09-13 国立大学法人東北大学 Interaction support apparatus and interactive apparatus
JP2019067211A (en) * 2017-10-02 2019-04-25 株式会社オカムラ Management system, management system control method, and program

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5810494B2 (en) * 2010-09-07 2015-11-11 株式会社ニコン Plant cultivation system, plant cultivation plant, harvesting device, and plant cultivation method
JP5720398B2 (en) * 2011-04-25 2015-05-20 ソニー株式会社 Evaluation apparatus and method, service providing system, and computer program
US9265187B2 (en) * 2013-11-20 2016-02-23 Rowbot Systems Llc Robotic platform and method for performing multiple functions in agricultural systems
WO2019003042A1 (en) * 2017-06-27 2019-01-03 株式会社半導体エネルギー研究所 Semiconductor device, and manufacturing method for semiconductor device
JP6760654B2 (en) * 2017-08-04 2020-09-23 H2L株式会社 Remote control system and management server

Also Published As

Publication number Publication date
JPWO2021019787A1 (en) 2021-02-04
CN114144806A (en) 2022-03-04
JP2022126658A (en) 2022-08-30
JP7331928B2 (en) 2023-08-23

Similar Documents

Publication Publication Date Title
Vasconez et al. Human–robot interaction in agriculture: A survey and current challenges
US20200333782A1 (en) Method and system for agriculture
US10875752B2 (en) Systems, devices and methods of providing customer support in locating products
JP2022126658A (en) Apparatus, system, method, and program for cultivation management of crops
JP2019095937A (en) Farm crops growth supporting system, information collector, growth supporting server, and farm crops sales supporting system
JP2021078447A (en) Floor feeding poultry house management system
CN115485641A (en) Service robot system, robot and method for operating service robot
US11265516B2 (en) Autonomously mobile outdoor device with a surveillance module
KR20210006785A (en) Robot system for preventing harmful insects using artificial intelligence
Lefranc Trends in Robotics Management and Business Automation
Yerebakan et al. Human–Robot Collaboration in Modern Agriculture: A Review of the Current Research Landscape
US20220133114A1 (en) Autonomous Cleaning Robot
JP2022064123A (en) Robot system
US20240127164A1 (en) Fetching and guarding packages using drones
US20240094712A1 (en) Robot staging area management
Bucher et al. Adaptive Robotic Chassis (ARC): RoboCrop a smart agricultural robot toolset
Edan Human-robot collaboration in agricultural robots
Salah et al. Cooperative Robotic Systems in Agriculture
CA3137337A1 (en) Autonomous cleaning robot
George Humanoid Robots as Poultry Partners: Enhancing Welfare Through Collaboration on the Farm
IT202000004561A1 (en) INTERACTIVE SYSTEM FOR THE PREDICTION AND TREATMENT OF PHYTOPATHOLOGIES
JP2023140945A (en) Inspection device and inspection method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19939865

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021536593

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19939865

Country of ref document: EP

Kind code of ref document: A1