WO2022080006A1 - Robot system, control method therefor, program, and system - Google Patents

Robot system, control method therefor, program, and system

Info

Publication number
WO2022080006A1
WO2022080006A1, PCT/JP2021/030432, JP2021030432W
Authority
WO
WIPO (PCT)
Prior art keywords
support
target
robot system
information
work
Prior art date
Application number
PCT/JP2021/030432
Other languages
English (en)
Japanese (ja)
Inventor
亮輔 川西 (Ryosuke Kawanishi)
Original Assignee
東京ロボティクス株式会社 (Tokyo Robotics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 東京ロボティクス株式会社 (Tokyo Robotics Inc.)
Publication of WO2022080006A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/06: Control stands, e.g. consoles, switchboards

Definitions

  • The present invention relates to a semi-autonomous robot system that involves remote control by an operator.
  • Automation using robots is being promoted at production sites and in distribution warehouses.
  • Patent Document 1 is an example of such a remotely operated robot.
  • The document discloses a technique in which an operator remotely controls a robot so that the robot machine-learns the remotely controlled movement and acquires new capabilities.
  • However, when the operator executes the target motion itself as a substitute, there is a risk of placing an excessive burden on the operator who performs the remote control.
  • For example, when the target motion is a piece-picking motion for gripping an object, the motion requires precise operation, so the operator performing the remote control must maintain a high level of concentration.
  • Moreover, remote control of a robot having multiple joints may be difficult in the first place, in which case the operator needs considerable operating skill.
  • The present invention has been made in view of the above technical background, and an object thereof is to provide a semi-autonomous robot system or the like that imposes only a small support burden on the operator.
  • The robot system according to the present invention comprises: a target motion execution unit that executes a predetermined target motion; a target motion determination unit that determines whether the target motion has been executed normally or is expected to be executed normally; a support request unit that issues a support request to an operator terminal when the target motion has not been executed normally or is not expected to be executed normally; a support information receiving unit that receives, from the operator terminal that received the support request, support information including first support information, which is information for causing execution of an operation that changes the execution conditions of the target motion; and a support control unit that performs control corresponding to the support information.
  • According to this configuration, the operator causes the robot system to execute an operation that changes the execution conditions of the target motion, rather than the target motion itself, so only indirect support is required and the support burden is small.
  • That is, it is possible to provide a semi-autonomous robot system in which the support burden on the operator is small.
  • Here, support means operating or instructing the semi-autonomous robot for the purpose of assisting the execution of the target motion.
  • A support request means information requesting such assistance from the operator.
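  • For orientation only, the relationship between these claimed units can be sketched as a minimal control skeleton. The publication contains no code; every class, method, and field name below is a hypothetical illustration.

```python
# Hypothetical sketch of the claimed units; all names are illustrative,
# not taken from the publication.
from dataclasses import dataclass


@dataclass
class SupportInfo:
    kind: int      # 1 = change execution conditions (first support information), 2 = substitute, ...
    payload: dict  # operation or command information sent by the operator terminal


class SemiAutonomousRobotSystem:
    """Skeleton showing how the claimed units relate to one another."""

    def execute_target_motion(self) -> None:
        """Target motion execution unit: runs the predetermined target motion."""
        raise NotImplementedError

    def target_motion_ok(self) -> bool:
        """Target motion determination unit: was the motion executed normally,
        or is it expected to be executed normally?"""
        raise NotImplementedError

    def request_support(self) -> SupportInfo:
        """Support request unit and support information receiving unit: ask the
        operator terminal for help and wait for support information."""
        raise NotImplementedError

    def apply_support(self, info: SupportInfo) -> None:
        """Support control unit: perform control corresponding to the support information."""
        raise NotImplementedError

    def step(self) -> None:
        # If the target motion fails (or is expected to fail), obtain and apply
        # support, then try the target motion again.
        if not self.target_motion_ok():
            self.apply_support(self.request_support())
        self.execute_target_motion()
```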
  • The support information may further include second support information, which causes the robot system to execute a substitute operation for the target motion.
  • According to this configuration, the operator can also have the robot system perform a substitute for the target motion, so the robot system can be assisted directly as well.
  • The first support information may be information for remotely controlling the robot system from the operator terminal.
  • According to this configuration, the operator can make the robot system execute, by remote control, an operation that changes the execution conditions of the target motion.
  • The target motion may be an operation on a predetermined workpiece by the robot system, and the first support information may cause an operation that changes the environment of the workpiece.
  • According to this configuration, the success of the target motion in the robot system can be assisted indirectly by changing the environment, which is one of the execution conditions of the target motion.
  • The task may be a workpiece-manipulation task in which the robot system manipulates workpieces, and the operation of changing the environment may be an operation of changing the position and/or posture of a workpiece.
  • The workpieces may include a target workpiece to be manipulated and non-target workpieces, which are workpieces other than the target workpiece.
  • The target motion may be a motion of recognizing and manipulating a workpiece, and when recognition of the workpiece fails, it may be determined that the target motion has not been executed normally or is not expected to be executed normally.
  • According to this configuration, a support request is made when recognition of the workpiece fails, so the operator can have the robot change the execution conditions of the target motion, including changing the recognition setup itself and changing the environment.
  • The target motion may be a motion of recognizing and manipulating workpieces that include a target workpiece to be manipulated and non-target workpieces other than the target workpiece; the target motion determination unit may determine, when recognition of the workpieces fails, that the target motion has not been executed normally or is not expected to be executed normally; and the support information may be information that causes the robot system to grasp a non-target workpiece and move it out of the manipulation target area.
  • According to this configuration, the execution conditions of the target motion are changed by moving non-target workpieces, including obstacles, out of the manipulation target area, which indirectly assists the success of the target motion.
  • The robot system may further include a sensor for observing the environment in which the target motion is performed, and the first support information may include command information for changing a sensing condition of the sensor.
  • According to this configuration, the success of the target motion can be assisted indirectly by changing a sensing condition, which is one of the execution conditions of the target motion.
  • The change of the sensing condition may be realized by moving the sensor, switching the sensor to be used, changing a parameter of the sensor, or a combination thereof.
  • According to this configuration, the sensing conditions are changed by moving or switching the sensor or changing its parameters, so appropriate recognition by the sensor can be ensured by various means.
  • The sensor may be a camera, and the parameters may include optical-system parameters of the camera and signal-processing parameters for the information captured by the camera.
  • According to this configuration, the recognition accuracy can be adjusted, indirectly assisting the success of the target motion.
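  • As a rough illustration of what changing the sensing conditions could look like in software, the sketch below groups the camera parameters named above and offers sensor-switching and retuning operations. The structure and all names are assumptions, not taken from the publication.

```python
# Hypothetical sketch of "changing the sensing conditions"; illustrative only.
from dataclasses import dataclass, field


@dataclass
class CameraParams:
    # Optical-system parameters
    angle_of_view_deg: float = 60.0
    focal_length_mm: float = 8.0
    # Signal-processing parameters
    exposure_time_ms: float = 10.0
    gain_db: float = 0.0
    white_balance_k: int = 5000
    gamma: float = 1.0


@dataclass
class SensingCondition:
    active_sensor_id: int = 0  # which fixed 3D camera is currently used
    params: CameraParams = field(default_factory=CameraParams)

    def switch_sensor(self, sensor_id: int) -> None:
        """Switch to a camera that observes the box from a different angle."""
        self.active_sensor_id = sensor_id

    def retune(self, **changes: float) -> None:
        """Change individual camera parameters, e.g. retune(exposure_time_ms=20.0)."""
        for name, value in changes.items():
            setattr(self.params, name, value)
```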
  • The support information may further include third support information, which is instruction information for causing the robot system to execute a recovery operation preset in the robot system.
  • The target motion may be a workpiece-manipulation operation in which the robot system manipulates a workpiece, and the recovery operation may be an operation in which the robot system changes the position and/or posture of the workpiece.
  • According to this configuration, the robot system itself changes the environment that constitutes the execution conditions of the target motion, so the success of the target motion can be assisted easily and indirectly.
  • The workpieces may include a target workpiece to be manipulated and non-target workpieces, and the recovery operation may be an operation of rearranging the target workpiece so that it can be grasped at a desired position and/or posture.
  • Alternatively, the recovery operation may be an operation of changing the position and/or posture of a non-target workpiece.
  • According to this configuration, the robot system itself changes the position and/or posture of a non-target workpiece such as an obstacle, making it easy to assist the success of the target motion.
  • The recovery operation may also be an operation of moving a non-target workpiece to a temporary storage place or a disposal place.
  • According to this configuration, the non-target workpiece can be placed outside the recognized environment, indirectly assisting the success of the target motion.
  • The robot system may further include an alarm unit, and the support information may further include fourth support information, which is instruction information for causing the robot system to issue an alarm from the alarm unit.
  • The robot system may further include: a recognition unit that recognizes a situation; a self-support information generation unit that generates self-support information, which is information for supporting itself, based on the recognition result of the recognition unit and a trained model obtained by machine learning on recognition results and support information; and a self-support control unit that performs control corresponding to the self-support information.
  • According to this configuration, self-support information can be generated from a trained model in which the content of support from the operator terminal has been machine-learned, so manual support can be reduced gradually. This brings the robot system closer to a fully autonomous system.
  • The robot system may further include a transmission unit that transmits the support information, together with the recognition results before and after the support, to a machine-learning server, and a model receiving unit that receives the trained model generated there.
  • According to this configuration, support information is additionally machine-learned, so that, along with data accumulation, the robot system can be brought closer to a fully autonomous system.
  • According to a configuration in which an evaluation value is generated for the self-support information, the quality of the self-support can be judged in advance.
  • The self-support control unit may further include an execution determination unit that decides, based on the evaluation value, whether or not to perform control based on the self-support information.
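  • A minimal sketch of this evaluation gate, assuming a trained model that proposes a support action and a simulator that scores it, might look as follows; the threshold value and every interface here are assumptions.

```python
# Hypothetical sketch of the evaluation gate for self-support; the threshold,
# model interface, and simulator are assumptions, not taken from the publication.
from typing import Callable

EVAL_THRESHOLD = 0.7  # assumed value; the text only requires "a predetermined threshold"


def decide_self_support(
    recognition_result: dict,
    infer_support_action: Callable[[dict], dict],  # trained model
    simulate: Callable[[dict, dict], float],       # returns an evaluation value
) -> tuple[bool, dict]:
    """Infer a candidate support action, score it by simulation, and allow
    self-support only when the evaluation value clears the threshold."""
    action = infer_support_action(recognition_result)
    score = simulate(recognition_result, action)
    return score >= EVAL_THRESHOLD, action
```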
  • The present invention can also be regarded as a method. That is, the method according to the present invention comprises: a target motion execution step of executing a predetermined target motion; a target motion determination step of determining whether the target motion has been executed normally or is expected to be executed normally; a support request step of requesting support from an operator terminal when the target motion has not been executed normally or is not expected to be executed normally; a support information receiving step of receiving, from the operator terminal that received the support request, support information including first support information, which is information for causing execution of an operation that changes the execution conditions of the target motion; and a support control step of performing control corresponding to the support information.
  • The present invention can also be regarded as a computer program. That is, the program according to the present invention comprises: a target motion execution step of executing a predetermined target motion; a target motion determination step of determining whether the target motion has been executed normally or is expected to be executed normally; a support request step of requesting support from an operator terminal when the target motion has not been executed normally or is not expected to be executed normally; a support information receiving step of receiving, from the operator terminal that received the support request, support information including first support information, which is information for causing execution of an operation that changes the execution conditions of the target motion; and a support control step of performing control corresponding to the support information.
  • The present invention can also be regarded as a system that includes a robot system. That is, the system according to the present invention comprises: a target motion execution unit that executes a predetermined target motion; a target motion determination unit that determines whether the target motion has been executed normally or is expected to be executed normally; a support request unit that issues a support request to an operator terminal when the target motion has not been executed normally or is not expected to be executed normally; a support information receiving unit that receives, from the operator terminal that received the support request, support information including first support information, which is information for causing execution of an operation that changes the execution conditions of the target motion; and a support control unit that performs control corresponding to the support information.
  • FIG. 1 is an overall configuration diagram of the system.
  • FIG. 2 is a functional block diagram of the system.
  • FIG. 3 is a flowchart relating to the operation of the information processing apparatus.
  • FIG. 4 is a flowchart relating to the operation of the operator terminal.
  • FIG. 5 is a detailed flowchart of the first support process.
  • FIG. 6 is a display example of the image in the box before and after the support.
  • FIG. 7 is a detailed flowchart of the second support process.
  • FIG. 8 is a detailed flowchart of the third support process.
  • FIG. 9 is a detailed flowchart of the fourth support process.
  • FIG. 10 is an explanatory diagram of another situation in which a piece-picking task is performed and support is required.
  • Although the robot system is described in the present embodiment as having a robot arm, various hardware other than a robot arm can be adopted.
  • Although a piece-picking task is exemplified as the task of the robot system, the task is not limited thereto; the invention can be applied to various other tasks such as boxing and transportation.
  • FIG. 1 is an overall configuration diagram of the system 200 according to the present embodiment.
  • The system 200 includes a plurality of robot systems 1, a management server 3, and an operator terminal 5, which are connected to one another by wire or wirelessly via a LAN (local area network).
  • Although the devices are connected by a LAN in this embodiment, they may instead be connected via the Internet.
  • Each robot system 1 includes an information processing device 11 that performs the various information processing described later, a plurality of sensors 15 connected to the information processing device 11, and an articulated robot arm 17 likewise connected to the information processing device 11.
  • Although the information processing device 11, the sensors 15, and the robot arm 17 are described in this embodiment as physically separate components of one system, some or all of them may be integrated.
  • The information processing device 11 includes a control unit, composed of a CPU and the like, that executes various programs, and a storage unit composed of ROM, RAM, a hard disk, and the like.
  • The sensors 15 are three-dimensional cameras fixed in the environment so that they can image the inside of a box containing a large number of workpieces targeted by the piece-picking task, each photographing the inside of the box from a different angle. That is, they are arranged so that recognition can be performed from different angles by switching sensors.
  • The information acquired from the sensors is transmitted to the information processing device 11.
  • Because the three-dimensional cameras serving as the sensors 15 are fixed in the environment and, as described later, are shared between recognition and remote control, the same viewpoint can be used for both, which simplifies the configuration and is advantageous in cost.
  • The mounting of the sensors 15 is not limited to fixing them in the environment in this way; a camera attached to the robot arm 17 or the like may be used, or an existing surveillance camera may be repurposed.
  • The articulated robot arm 17 has a gripper (not shown) at its tip and performs the task of gripping a product in the box and placing it at a predetermined position outside the box, for example on a belt conveyor. The articulated robot arm 17 is also provided with an alarm device that emits a predetermined sound and light, so that nearby workers and others can be alerted.
  • The management server 3 performs machine-learning processing based on the various information described later and generates a trained model. The generated trained model is provided to the robot systems 1 as appropriate.
  • The operator terminal 5 is an information processing device operated by an operator, for example a PC (personal computer). As shown in the figure, the operator terminal 5 is connected to a head-mounted display 58, which has a display unit and presents video information to the operator, and to a wearable input device 59 for remote control. The operator can operate the wearable input device 59 to perform various inputs while viewing the video information presented on the head-mounted display 58.
  • The display device and the input device are not limited to the examples of this embodiment; various known devices can be used. For example, the display device may be an ordinary display, and the input device may use a stick lever, a keyboard, or the like.
  • FIG. 2 is a functional block diagram of the system 200. As is clear from the figure, the robot system 1 exchanges information with the server 3 and the operator terminal 5 via the respective transmission/reception units.
  • The robot system 1 includes the information processing device 11, the sensors 15, and the articulated robot arm 17.
  • The information processing device 11 includes a sensor information acquisition unit 111 that acquires detection information from the sensors 15, and a recognition processing unit 112 that performs predetermined recognition processing based on the sensor information.
  • The recognition result from the recognition processing unit 112 is provided to the determination unit 114.
  • The determination unit 114 determines, based on the recognition result, whether to make the support request described later, and provides the determination result to the motion generation unit 118 and the evaluation unit 115.
  • The motion generation unit 118 generates motions, and the motion command unit 119 issues the corresponding motion commands to the robot arm 17.
  • The evaluation unit 115 evaluates, by the method described later, whether self-support is appropriate, and provides the evaluation result to the support request unit 116 or the self-support operation generation unit 120.
  • The support request unit 116 provides the support request signal and other information to the transmission/reception unit 127.
  • The self-support motion generation unit 120 generates a self-support motion based on the stored trained model and the like, and issues commands to the robot arm 17 via the motion command unit 119.
  • When the support process execution unit 121 receives support information from the transmission/reception unit, it executes various support processes according to that information; for example, it controls the operation of the robot arm 17 and issues commands to the alarm unit 171.
  • The results of operations performed by self-support or by the operator's support are passed to the operation result transmission unit 125, which transmits the information to the server 3.
  • The server 3 includes a learning processing unit 33, a trained model providing unit 35, and a transmission/reception unit 51.
  • The learning processing unit 33 performs machine-learning processing based on the recognition result obtained via the sensor 15, the content of the support operation described later, and the recognition result after the support operation, and generates a trained model that infers, from a recognition result, the support operation expected to improve it.
  • The trained model providing unit 35 provides the generated trained model to the information processing device 11.
  • Various other methodologies can be adopted for the machine-learning setup. For example, the success or failure of the target motion after the support operation may also be taken into account. As a learning method, not only supervised learning but also reinforcement learning under a predetermined evaluation standard may be used.
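  • As an illustration of what one training record on such a machine-learning server might contain, the sketch below pairs the pre-support recognition result with the support action as a supervised example; the record layout and all names are assumptions.

```python
# Hypothetical sketch of one training record on the machine-learning server.
# The text states that learning uses the recognition result before support,
# the content of the support operation, and the recognition result after
# support; the record layout below is an assumption.
from dataclasses import dataclass


@dataclass
class SupportTrainingRecord:
    recognition_before: dict  # recognition result when support was requested
    support_action: dict      # operation commanded by the operator (or self-support)
    recognition_after: dict   # recognition result once the support finished
    target_motion_succeeded: bool | None = None  # optional extra signal (see text)


def to_supervised_pair(rec: SupportTrainingRecord) -> tuple[dict, dict]:
    """Supervised view: from a pre-support recognition result, predict the
    support action that improved recognition."""
    return rec.recognition_before, rec.support_action
```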
  • The operator terminal 5 includes a sensor information acquisition unit 55, an operation information generation unit 54, a support information generation unit 52, a transmission/reception unit 51, and a display processing unit 57.
  • The sensor information acquisition unit 55 acquires sensor detection information from sensors provided in the wearable input device 59 for remote control.
  • The operation information generation unit 54 generates operation information for the robot arm 17 based on the sensor detection information.
  • The support information generation unit 52 generates various kinds of support information from the operation information and the operator's input, and passes them to the transmission/reception unit 51.
  • The display processing unit 57 performs predetermined display processing related to the robot system 1 on the display unit of the head-mounted display 58, based on information from the transmission/reception unit 51.
  • The system configuration in this embodiment is an example, and the number and configuration of the devices connected to the network can be changed freely.
  • For example, the server 3 may monitor various information of the robot system 1 and the operator terminal 5, and mediate the exchange of information, such as support requests, between the robot system 1 and the operator terminal 5.
  • FIG. 3 is a flowchart of the operation of the information processing device 11 constituting the robot system 1.
  • When the process starts, the sensor information acquisition unit 111 first acquires the sensor detection information from the sensors 15; in the present embodiment, this is image information obtained by photographing the inside of the box (S1).
  • Next, the recognition processing unit 112 performs a process of recognizing a workpiece, such as a product to be gripped by the robot arm 17, based on the sensor information (S2).
  • In the present embodiment, recognition is performed based on image information, but the invention is not limited to this configuration; for example, three-dimensional information acquired by the three-dimensional cameras may be used instead of, or together with, the images.
  • Next, the determination unit 114 determines whether support is necessary (S4). More specifically, the determination unit 114 determines that support is not needed when recognition by the recognition processing unit 112 is performed normally, for example when the workpiece to be gripped can be recognized with sufficient accuracy; if recognition is not performed normally, it determines that support is required.
  • In the present embodiment, the necessity of support is determined by whether normal recognition is possible, but the invention is not limited to this configuration, and various determination methods can be adopted. For example, the determination may be based on a criterion such as whether a grasping solution for the object to be gripped can be computed.
  • If support is not needed, the motion generation unit 118 generates a target motion, and the operation command unit 119 issues an operation command to the robot arm 17 to execute the generated target motion (S6).
  • After execution, a process of evaluating whether the target motion succeeded is performed based on information from the sensors 15 (S8).
  • If the target motion succeeds, the end determination is performed (S9): if a predetermined end signal or the like is detected, the process ends; otherwise (S9 NO), the series of processes is repeated from the sensor information acquisition process (S1).
  • If the target motion fails, it is treated as requiring support, and the evaluation process described later is executed (S11).
  • The evaluation process is performed by the evaluation unit 115 (S11). More specifically, the evaluation unit 115 runs the trained model with the recognition result as input to infer a support operation expected to improve the recognition result, and simulates the resulting state if the inferred support operation were executed. An evaluation value is generated by judging the quality of this simulation result.
  • The evaluation method is not limited to that of this embodiment, as long as it judges the quality of the self-support operation; various methods can be used.
  • Next, it is determined whether the evaluation value is equal to or higher than a predetermined threshold, that is, whether self-support is possible (S12).
  • If self-support is possible, the self-support motion generation unit 120, in cooperation with the motion command unit 119, performs the process of executing the self-support operation (S16); that is, the robot arm 17 is made to execute the self-support operation.
  • After the self-support operation is completed, the operation result transmission unit 125 transmits the content of the support operation and the post-support recognition result to the server 3 (S18).
  • the learning processing unit 33 of the server 3 additionally performs machine learning processing based on the transmitted information.
  • The trained model providing unit 35 then transmits the generated trained model to the robot system 1. This trained model is used for generating self-support motions, for the evaluation process, and so on.
  • If self-support is not possible, the support request unit 116 requests support from the operator terminal 5 (S21).
  • After making the support request, the information processing device 11 waits until it receives support information from the operator terminal 5 (S22 NO).
  • When support information is received, the support process execution unit 121 executes processing according to the support information (S24). After that, a predetermined end determination is performed; if the end signal is not detected, the device returns to the reception standby state (S22 NO) and the series of processes is repeated.
  • When the support is finished, the operation result transmission unit 125 transmits various information on the recognition results and the support content to the server 3 (S27). After this transmission process (S27), the end determination process (S9) described above is performed.
  • As described above, according to this configuration, the robot system 1 machine-learns, via the server 3, the content of the support instructed from the operator terminal 5 and gradually comes to perform self-support operations, so operator support can be reduced. That is, it eventually becomes possible to aim for a fully autonomous system.
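  • A rough sketch of the FIG. 3 flow, assuming a robot object that exposes one method per flowchart step, is given below; step numbers in the comments follow the flowchart, and all method names are hypothetical.

```python
# Hypothetical sketch of the information-processing loop in FIG. 3.
# Every function on `robot` is an assumption; step numbers follow the flowchart.
def information_processing_loop(robot) -> None:
    while True:
        sensor_info = robot.acquire_sensor_info()                 # S1
        recognition = robot.recognize(sensor_info)                # S2
        if not robot.support_needed(recognition):                 # S4
            motion = robot.generate_target_motion(recognition)    # S6
            robot.command(motion)                                 # S6
            if robot.target_motion_succeeded():                   # S8
                if robot.end_requested():                         # S9
                    break
                continue                                          # S9 NO: back to S1
        # Target motion failed or support was needed from the start.
        ok, action = robot.evaluate_self_support(recognition)     # S11 / S12
        if ok:
            robot.command(action)                                 # S16
            after = robot.recognize(robot.acquire_sensor_info())
            robot.report(action, after)                           # S18
        else:
            support = robot.request_support()                     # S21 / S22
            robot.execute_support(support)                        # S24
            robot.report(support, recognition)                    # S27
        if robot.end_requested():                                 # S9
            break
```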
  • FIG. 4 is a flowchart relating to the operation of the operator terminal 5.
  • When the process starts, the operator terminal 5 waits until a support request is received from the robot system 1 (S51 NO).
  • When a support request is received in this state (S51 YES), the display processing unit 57 displays on the display unit the recognition result at the time the support request was made.
  • After the display process, the operator terminal 5 accepts the operator's input specifying the support means (S54 NO); that is, the operator inputs the support means into the operator terminal 5 while checking the display unit.
  • When an input is made, one of the first to fourth support processes is executed according to the selected support means (S55, S57, S59, S61), as sketched in the dispatch loop below. This series of processes is repeated until a predetermined end signal is detected.
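  • The dispatch loop of FIG. 4 could be sketched as follows, assuming a terminal object with one handler per support process; all names are hypothetical.

```python
# Hypothetical sketch of the operator-terminal loop in FIG. 4; handler names assumed.
def operator_terminal_loop(terminal) -> None:
    handlers = {
        1: terminal.first_support,   # S55: change execution conditions by remote control
        2: terminal.second_support,  # S57: substitute execution of the target motion
        3: terminal.third_support,   # S59: command a preset recovery operation
        4: terminal.fourth_support,  # S61: command an alarm to nearby workers
    }
    while not terminal.end_requested():
        request = terminal.wait_for_support_request()   # S51
        terminal.display_recognition_result(request)    # display process
        choice = terminal.read_support_means()          # S54: operator input, 1..4
        handlers[choice]()
```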
  • FIG. 5 is a detailed flowchart of the first support process.
  • The first support process causes execution of an operation that changes the execution conditions of the target motion.
  • When the process starts, the display unit displays the sensor information transmitted from the robot system 1 (S551).
  • FIG. 6A is a display example of the image inside the box 19 before support, as shown on the display unit.
  • A large number of cylindrical workpieces 191 are randomly placed in the box 19.
  • The target workpiece 191a to be gripped is tilted with its bottom surface facing upward.
  • After the display process, the sensor information acquisition unit 55 acquires input information, including the sensor information, from the wearable input device for remote control (S552). At this time, the operator wearing the wearable input device performs not a substitute for the target motion of the robot arm 17, but an operation for executing a support operation such as changing the execution conditions of the target motion.
  • The operation information generation unit 54 generates operation information based on the input information (S554). After the operation information generation process, a process of generating a sensor information acquisition request is performed (S555).
  • Upon receiving this transmitted information, the information processing device 11 executes processing corresponding to the first support information (S24). That is, the information processing device 11 recognizes that the information is first support information, sets the first support mode, operates the robot arm 17 according to the operation information, and transmits the post-operation sensor information back to the operator terminal 5.
  • Thereafter, an end determination is performed; when the support end signal input by the operator is detected, it is determined that there is an end command (S558 YES), and the end process of the first support process is performed (S560).
  • FIG. 6B is a display example of the image inside the box 19 after support, displayed on the display unit after the series of processes for the first support has been repeated.
  • As a result of the first support by the operator, the target workpiece 191a, which had been tilted with its bottom surface facing upward, now lies flat on the floor surface, parallel to a side of the box 19.
  • The workpiece posture thus becomes suitable for gripping, and the robot arm 17 can normally recognize and grip the target workpiece and continue the piece-picking motion, which is the target motion.
  • In this way, the operator causes the robot system 1 to execute an operation that changes the execution conditions of the target motion, rather than the target motion itself, so only indirect support is required and the support burden is small. That is, it is possible to provide a semi-autonomous robot system in which the support burden on the operator is small.
  • Here, the execution conditions of the target motion are the conditions that are prerequisites for executing the target motion. They include environmental conditions, such as the scene or situation recognized by the sensor 15; hardware conditions of the robot system 1 itself, such as which sensor 15 is used; and software conditions, such as sensor parameters.
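  • For orientation, these three groups of conditions might be represented as below; the grouping follows the paragraph above, while the field layout is an assumption.

```python
# Hypothetical grouping of the execution conditions named above; illustrative only.
from dataclasses import dataclass


@dataclass
class ExecutionConditions:
    environment: dict  # e.g. scene/situation observed by sensor 15 (workpiece poses, box state)
    hardware: dict     # e.g. which sensor 15 is used, robot configuration
    software: dict     # e.g. sensor parameters such as exposure or gain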
  • FIG. 7 is a detailed flowchart of the second support process.
  • The second support process is a process in which the operator executes the target motion itself as a substitute.
  • When the process starts, the display unit displays the sensor information transmitted from the robot system 1 (S571).
  • Next, the sensor information acquisition unit 55 acquires input information, including sensor information, from the wearable input device for remote control (S572).
  • While viewing the image presented by the head-mounted display 58, the operator wearing the wearable input device 59 performs an operation that makes the robot arm 17 execute a substitute for the target motion, or a support operation substantially identical to the target motion. For example, when the target motion is a piece-picking motion, the operator operates the robot arm 17 to perform the piece picking.
  • The operation information generation unit 54 generates operation information based on the input information (S574). After the operation information generation process, a process of generating a sensor information acquisition request is performed (S575).
  • Upon receiving this transmitted information, the information processing device 11 executes processing corresponding to the second support information (S24). That is, the information processing device 11 recognizes that the information is second support information, sets the second support mode, operates the robot arm 17 according to the operation information, and transmits the post-operation sensor information back to the operator terminal 5.
  • Thereafter, an end determination is performed; when the support end signal input by the operator is detected, it is determined that there is an end command (S578 YES), and the end process of the second support process is performed (S580).
  • According to this configuration, the operator can make the robot system execute a substitute for the target motion, so the robot system 1 can be assisted by a variety of means.
  • FIG. 8 is a detailed flowchart of the third support process.
  • In the third support process, the operator does not control each movement of the robot system 1 individually; instead, the operator sends only a recovery operation instruction and has the robot system 1 autonomously perform a predetermined operation.
  • Here, a recovery operation means an operation that is stored in or acquired by the robot system 1 in advance and by which the robot system 1 autonomously changes the execution conditions of the target motion.
  • Upon receiving this transmitted information, the information processing device 11 executes processing corresponding to the third support information (S24). That is, the information processing device 11 recognizes that the information is third support information, sets the third support mode, and then causes the robot arm 17 to execute the predetermined recovery operation.
  • In the present embodiment, the recovery operation is an operation of gripping a non-target workpiece that is not the gripping target, randomly changing its position and posture, and returning it.
  • The content of the recovery operation is not limited to this example; for instance, the target workpiece to be gripped may be gripped and handled in the same way, or workpieces may be taken out and temporarily placed outside the box or discarded.
  • The recovery operation is also not limited to physical operations; hardware conditions, such as which sensor 15 is used, may be changed.
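  • As an illustration only, a preset recovery operation like the one in this embodiment (grip a non-target workpiece, randomly change its position and posture, and return it) could be sketched as follows; the arm interface and the recognition-result layout are assumptions.

```python
# Hypothetical sketch of this embodiment's recovery operation; the `arm` methods
# and the recognition-result layout are assumptions, not taken from the text.
import random


def recovery_operation(arm, recognition: dict) -> None:
    """Grip a non-target workpiece, randomly change its position and posture
    inside the box, and put it back."""
    workpiece = recognition["non_target_workpieces"][0]  # assumed layout
    arm.grasp(workpiece)
    new_pose = {
        "x": workpiece["x"] + random.uniform(-0.05, 0.05),  # metres, assumed units
        "y": workpiece["y"] + random.uniform(-0.05, 0.05),
        "yaw": random.uniform(-3.14, 3.14),                 # radians
    }
    arm.place(new_pose)
```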
  • FIG. 9 is a detailed flowchart of the fourth support process.
  • In the fourth support process, the operator gives an alarm instruction to the robot system 1 and thereby asks workers and others around the robot system 1 to assist it.
  • When the process starts, the operator terminal 5 transmits the alarm instruction (S611).
  • Upon receiving this transmitted information, the information processing device 11 executes processing corresponding to the fourth support information (S24). That is, the information processing device 11 recognizes that the information is fourth support information, sets the fourth support mode, and then causes the alarm unit 171 to execute a predetermined alarm operation. This alarm operation attracts the attention of surrounding workers with sound and light, and the workers are expected to provide predetermined assistance.
  • FIG. 10 is an explanatory diagram of another situation that requires support while a piece-picking task is performed.
  • Part (a) of the figure shows the inside of the box before the support, and part (b) shows the state inside the box after the support.
  • In this situation, the information processing device 11 performs recognition processing of the target workpiece and computation of a grasping solution, and determines that support is necessary because no grasping solution can be computed (S4). Then, after the predetermined evaluation process (S11), the information processing device 11 performs an operation of changing the position and posture of the obstacle 191c, either by self-support (S16) or by processing based on the operator's support (S24).
  • This position-and-posture changing operation may be, for example, an operation of grasping the obstacle and placing it elsewhere, or an operation of pushing it aside with the tip of the robot arm 17 without grasping it.
  • In this case as well, the operator causes the robot system 1 to execute an operation that changes the execution conditions of the target motion, rather than the target motion itself, so only indirect support is required and the support burden is small. That is, it is possible to provide a semi-autonomous robot system with a small operator support burden.
  • The content of the support operation is not limited to that of the above embodiment, that is, to the first to fourth support processes. For example, in order to change the recognized scene, operations such as having the robot system 1 shake the box, collapse stacked workpieces, or stir the workpieces in the box may be executed through operator support or self-support.
  • The hardware conditions may also be changed; for example, a command may be given to switch the sensor 15 used for recognition and acquire information from another sensor 15. With such a configuration, recognition can be performed by a camera or the like from a more appropriate angle, further improving the success probability of the target motion.
  • Alternatively, the parameters of the sensor 15 used for recognition, in particular of the camera, may be changed. These parameters include optical-system parameters of the camera (angle of view, focal length, focus, etc.) and signal-processing parameters for the information captured by the camera (exposure time, gain, white balance, gamma correction, etc.).
  • Further, the box, which is a prerequisite of the task, may be changed, or an operation requesting replacement of the box may be executed.
  • the present invention can be used at least in an industry that manufactures robot systems and the like.
  • 1 Robot system, 11 Information processing device, 15 Sensor, 17 Robot arm, 3 Server, 5 Operator terminal, 58 Head-mounted display, 59 Wearable input device

Abstract

The invention concerns a robot system comprising: a target action execution unit that executes a prescribed target action; a target action determination unit that determines whether the target action has been executed normally or is expected to be executed normally; a support request unit that, if the target action has not been executed normally or is not expected to be executed normally, issues a support request to an operator terminal; a support information receiving unit that receives, from the operator terminal having received the support request, support information including first support information, which is information causing execution of an action for changing the execution conditions of the target action; and a support control unit that performs control corresponding to the support information.
PCT/JP2021/030432 2020-10-12 2021-08-19 Robot system, control method therefor, program, and system WO2022080006A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-172089 2020-10-12
JP2020172089A JP2022063707A (ja) 2020-10-12 2020-10-12 Robot system, control method and program therefor, and system (ロボットシステム、その制御方法及びプログラム並びにシステム)

Publications (1)

Publication Number Publication Date
WO2022080006A1 (fr)

Family

ID=81207929

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/030432 WO2022080006A1 (fr) Robot system, control method therefor, program, and system

Country Status (2)

Country Link
JP (1) JP2022063707A (fr)
WO (1) WO2022080006A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007190641A * 2006-01-19 2007-08-02 Advanced Telecommunication Research Institute International Communication robot
JP2007213190A * 2006-02-08 2007-08-23 Advanced Telecommunication Research Institute International Communication robot improvement system
WO2010092981A1 * 2009-02-12 2010-08-19 Mitsubishi Electric Corporation (三菱電機株式会社) Industrial robot system
JP2010207989A * 2009-03-11 2010-09-24 Honda Motor Co Ltd Object grasping system and interference detection method in the system
JP2011000685A * 2009-06-19 2011-01-06 Denso Wave Inc Bin picking system
JP2018149669A * 2017-03-14 2018-09-27 Omron Corporation (オムロン株式会社) Learning device and learning method
JP2021070140A * 2019-11-01 2021-05-06 Preferred Networks, Inc. (株式会社Preferred Networks) Remotely operated device, remote operation system, remote operation support method, program, and non-transitory computer-readable medium

Also Published As

Publication number Publication date
JP2022063707A (ja) 2022-04-22

Legal Events

121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21879748; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 21879748; Country of ref document: EP; Kind code of ref document: A1)