WO2021220679A1 - Robot control device, method, and program (Dispositif, procédé et programme de commande de robot) - Google Patents

Robot control device, method, and program

Info

Publication number: WO2021220679A1
Application number: PCT/JP2021/012351
Authority: WIPO (PCT)
Prior art keywords: obstacle, person, mobile robot, predetermined, movement
Other languages: English (en), Japanese (ja)
Inventors: 義弘 坂本 (Yoshihiro Sakamoto), 史朗 佐久間 (Shiro Sakuma)
Applicant / original assignee: 東京ロボティクス株式会社 (Tokyo Robotics Inc.)

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 5/00: Manipulators mounted on wheels or on carriages
    • B25J 13/00: Controls for manipulators
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/06: Safety devices

Definitions

  • The present invention relates to, for example, a robot control device and the like.
  • Patent Document 1 discloses a human-cooperative mobile robot that autonomously travels in a work area shared with humans in a factory.
  • The present invention has been made to solve the above-mentioned technical problem, and an object of the present invention is to provide a robot control device and the like that do not reduce work efficiency even when there is an obstacle on the movement path.
  • The robot control device according to the present invention is a robot control device that controls a mobile robot, and includes a movement path generation unit that generates a movement path from the current location to a destination, a movement control unit that moves the mobile robot along the generated movement path, an obstacle detection unit that detects an obstacle on the movement path within a predetermined distance, and an obstacle response unit that executes a predetermined response for removing the obstacle from the movement path.
  • The obstacle detection unit detects whether or not the obstacle is a person, and the obstacle response unit selects and executes one or more predetermined responses depending on whether or not the obstacle is a person.
  • With this configuration, the predetermined response for removing the obstacle from the movement route is selected depending on whether or not the obstacle is a person. It is therefore possible to select a more appropriate response according to the obstacle and to remove the obstacle from the movement route efficiently. As a result, even when an obstacle is detected on the movement path, a decrease in the work efficiency of the mobile robot can be suppressed. In addition, since a plurality of the predetermined responses described later can be executed in combination, more detailed and diverse responses can be realized, and obstacles can be removed even more efficiently.
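The selection logic described above can be sketched in Python. The response names, the flags, and the grouping of responses into combinations are illustrative assumptions for this sketch, not part of the publication:

```python
def select_responses(obstacle_is_person: bool, nearby_person_exists: bool) -> list:
    """Pick one or more predetermined responses depending on whether the
    detected obstacle is a person (illustrative sketch; names are assumed)."""
    if obstacle_is_person:
        # Appeal directly to the person blocking the path.
        return ["face_toward_person", "speak_to_person"]
    if nearby_person_exists:
        # An object is in the way: ask a nearby person to remove it.
        return ["face_toward_nearby_person", "point_at_obstacle", "speak_request"]
    # No one around to help: hold the stopped state.
    return ["hold_stopped_state"]
```

A caller would execute the returned responses in combination, which matches the idea of selecting "one or a plurality" of predetermined responses.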
  • The mobile robot may further include a head whose front surface is a face portion and a driving means for driving the head. When the detected obstacle is a person, the obstacle detection unit detects the position of the person's head, and the obstacle response unit controls the driving means to direct the face portion toward the person's head as the predetermined response.
  • With this configuration, the face portion is directed toward the person's head, so that the person can be actively encouraged to move out of the movement route.
  • When the detected obstacle is a person and there are a plurality of such persons, an identification unit that identifies the person closest to the mobile robot may further be provided, and when directing the face portion toward a person's head as the predetermined response, the obstacle response unit may direct the face portion toward the head of the person identified by the identification unit.
  • The mobile robot may further include a head whose front surface is a face portion and a driving means for driving the head. When the detected obstacle is not a person, the obstacle detection unit detects the position of the head of a person in the vicinity of the mobile robot, and the obstacle response unit controls the driving means to direct the face portion toward the head of that nearby person as the predetermined response.
  • With this configuration, the face portion is directed toward the head of a nearby person, so that the robot can actively appeal to that person to remove the obstacle from the movement path.
  • A notification means for performing a predetermined notification may further be provided.
  • The mobile robot may further include a speaker as the notification means, and the obstacle response unit may use the speaker to output, as the predetermined response, a predetermined sound selected according to whether or not the obstacle is a person.
  • The predetermined sound may include a plurality of types of sounds having different strengths of appeal, and the obstacle response unit may change the type of the predetermined sound at predetermined time intervals so that the appeal becomes gradually stronger.
  • With this configuration, the strength of the appeal can be increased gradually, so that it is possible to appeal with an appropriate strength matched to the person's behavior, without making the person uncomfortable by suddenly using strong words.
  • The obstacle response unit may increase the volume of the predetermined sound at predetermined time intervals.
  • At least one of the type, volume, number of outputs, and output duration of the predetermined sound output by the obstacle response unit may be configured to be preset.
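The escalation behaviour described above (a stronger sound type at predetermined time intervals, with type and volume preset) can be sketched as follows; the level table, interval, and function name are illustrative assumptions:

```python
def escalation_schedule(levels, interval_s, elapsed_s):
    """Return (sound_id, volume) for the current elapsed time, stepping to a
    stronger level every interval_s seconds and holding at the strongest
    level thereafter (illustrative sketch)."""
    step = min(int(elapsed_s // interval_s), len(levels) - 1)
    return levels[step]

# Example preset: both the sound type and the volume grow stronger over time.
LEVELS = [("chime", 0.4), ("excuse_me", 0.6), ("please_move", 0.9)]
```

For example, with a 5-second interval the robot would play a soft chime first, then an "excuse me" phrase, then a firmer request, mirroring the gradual increase in strength of appeal.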
  • the speaker may be arranged so that the predetermined sound is output from the head.
  • With this configuration, the predetermined sound can be output from a position roughly corresponding to that of a human head, so that the person listening does not feel uncomfortable and the sound can be conveyed more effectively.
  • The obstacle response unit may change the strength of the appeal by the notification means at predetermined time intervals.
  • When the obstacle is detected, the movement route generation unit may attempt to generate a movement route that avoids the obstacle from the current location and reaches the destination, and the obstacle response unit may execute the predetermined response when such a route cannot be generated.
  • With this configuration, a movement route that avoids the obstacle and reaches the destination is regenerated first, and only when such a route cannot be regenerated is the predetermined response executed to remove the obstacle from the movement route. The mobile robot can therefore actively secure a movement route to the destination and improve its work efficiency.
  • The movement route generation unit may generate a movement route that avoids the obstacle from the current location and reaches the destination when the obstacle is not removed from the movement route even after a predetermined time has elapsed since the obstacle response unit executed the predetermined response.
  • With this configuration, when the obstacle is not removed from the movement route even after the predetermined response has been executed and the predetermined time has elapsed, a movement route that avoids the obstacle and reaches the destination is regenerated. The mobile robot can therefore actively secure a movement route to the destination and enhance its work efficiency.
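The replan-first, respond-second, replan-again control flow described in the preceding bullets can be sketched as follows; the callback-based structure and all names are illustrative assumptions:

```python
def handle_blocked_path(plan_avoiding, respond, wait_for_clear, timeout_s):
    """Control flow when an obstacle blocks the movement route (sketch):
    1) try to replan a route that avoids the obstacle;
    2) otherwise execute the predetermined response and wait;
    3) if the obstacle remains after timeout_s, try replanning again."""
    route = plan_avoiding()
    if route is not None:
        return route                      # an avoidance route was found
    respond()                             # e.g. face the person, play a sound
    if wait_for_clear(timeout_s):
        return "resume_original_route"    # obstacle removed within the time
    return plan_avoiding()                # last resort: replan once more
```

The stub callbacks stand in for the movement route generation unit, the obstacle response unit, and the obstacle detection unit, respectively.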
  • The movement control unit may decelerate or stop the mobile robot when the obstacle is detected during movement.
  • The obstacle response unit may maintain the stopped state of the mobile robot as the predetermined response.
  • The mobile robot may further include one or more arm portions having a manipulator mechanism.
  • When the detected obstacle is not a person, the obstacle response unit may point at the obstacle using the arm portion as the predetermined response.
  • The robot control method according to the present invention is a robot control method for controlling a mobile robot, and includes a movement route generation step of generating a movement route from a current location to a destination, a movement control step of moving the mobile robot along the generated movement route, an obstacle detection step of detecting an obstacle on the movement path, and an obstacle response step of executing a predetermined response for removing the obstacle from the movement path.
  • The robot control program according to the present invention is a robot control program for controlling a mobile robot, and includes a movement route generation step of generating a movement route from a current location to a destination, a movement control step of moving the mobile robot along the generated movement route, an obstacle detection step of detecting an obstacle on the movement path, and an obstacle response step of executing a predetermined response for removing the obstacle from the movement path.
  • FIG. 1 is a schematic configuration diagram of a mobile robot to which a robot control device is applied.
  • FIG. 2 is a schematic configuration diagram showing the appearance of the mobile robot.
  • FIG. 3 is a diagram illustrating a head of a mobile robot.
  • FIG. 4 is a flowchart illustrating movement control.
  • FIG. 5 is a flowchart illustrating the movement process.
  • FIG. 6 is a diagram illustrating a method of detecting an obstacle on a movement path.
  • FIG. 7 is a flowchart illustrating the obstacle handling process.
  • FIG. 8 is a diagram illustrating a person identification process.
  • FIG. 9 is a diagram illustrating the operation of the human correspondence process.
  • FIG. 10 is a diagram illustrating the operation of the object handling process.
  • FIGS. 1 to 3 are diagrams illustrating a configuration of a mobile robot 100 to which the robot control device (controller 1) according to the embodiment of the present invention is applied.
  • The mobile robot 100 is a robot that is assumed to move in a work area shared with humans. It may be applied, for example, as a mobile manipulator having a function of picking parts at a factory or the like and transporting them to a predetermined place, or as a service robot that guides travelers at a facility such as an airport.
  • In the present embodiment, a roughly humanoid mobile manipulator including a head 30 and an arm portion 40 having a manipulator mechanism, as shown in FIG. 2, will be described.
  • The mobile robot 100 of the present embodiment includes a controller 1 that functions as a robot control device controlling the operation of the mobile robot 100, a moving mechanism driving means 2, a head driving means 3, a detection means 4, and a notification means 5.
  • The controller 1 has functional units such as a movement route generation unit 11, a movement control unit 12, an obstacle detection unit 13, and an obstacle response unit 14.
  • The controller 1 includes, for example, a central processing unit (CPU) as a processor, a read-only memory (ROM), a random-access memory (RAM), and an input/output (I/O) interface.
  • The ROM included in the controller 1 stores programs (control programs) for executing the functions of each of the above-mentioned functional units. That is, the controller 1 realizes the functions of the movement path generation unit 11, the movement control unit 12, the obstacle detection unit 13, and the obstacle response unit 14 by executing the various programs stored in the storage medium.
  • The above-mentioned processor and storage media constituting the controller 1 are examples, and a GPU, a flash memory, a hard disk, a storage, and the like may be included in addition to or in place of these. Further, the functions of the above-mentioned functional units do not necessarily have to be realized by the controller 1 alone, and may be realized by a plurality of controllers appropriately selected for each functional unit or cooperating with each other. The functions of each functional unit of the controller 1 will be described below.
  • The movement route generation unit 11 generates a route (movement route) along which the mobile robot 100 is to move. Specifically, the movement route generation unit 11 executes path planning to generate a route from a predetermined position to a destination based on map information (map data) of the work area. The generated route is set as the movement route of the mobile robot 100.
  • The predetermined position here may be a position determined in advance, or may be the current position of the mobile robot 100.
  • The specific method of path planning executed by the movement route generation unit 11 is not particularly limited, and a known method used in mobile robots such as existing mobile manipulators may be adopted as appropriate.
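Since the publication leaves the concrete path-planning method open, one minimal known method, breadth-first search on an occupancy grid, can serve as an illustrative sketch; the grid representation and function name are assumptions:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first path planning on an occupancy grid (one known method;
    the publication does not prescribe a planner). grid[r][c] == 1 marks an
    obstacle cell. Returns a list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:   # walk the predecessor chain back
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None   # no route to the destination exists
```

Returning None when no route exists corresponds to the case where an avoidance route "cannot be generated" and the predetermined response is executed instead.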
  • The movement control unit 12 controls the moving mechanism driving means 2 in order to move the mobile robot 100 along the movement path generated by the movement route generation unit 11.
  • The mobile robot 100 of the present embodiment is configured as a so-called autonomous traveling type mobile robot that autonomously moves along the generated movement path.
  • The moving mechanism driving means 2 will be described later with reference to FIG. 2.
  • The detection means 4 detects obstacles existing in the surrounding environment of the mobile robot 100. Obstacles here include not only objects but also people. Further, the detection means 4 is not limited to use as a means for detecting obstacles, and may also be configured as a means for recognizing the surrounding environment so that the mobile robot 100 can travel autonomously.
  • The detection means 4 includes, for example, at least one of a sensor (an image sensor, a lidar, a laser scanner, etc.), a stereo camera, and the like.
  • The detection means 4 of the present embodiment is composed of an image sensor; hereinafter, the detection means 4 is also referred to as the sensor 4.
  • The image sensor is a sensor having an imaging element that converts light detected through an optical system into an electric signal, and its type is not particularly limited.
  • The sensor 4 is configured to be able to detect obstacles existing in the surrounding environment including at least the area in front of the robot, and may be provided, for example, in the portions corresponding to human eyes formed in the face portion 31 shown in FIG. 3A. Alternatively, a sensor 4 capable of detecting all directions, i.e., a so-called 360-degree camera, may be provided on the crown of the head 30.
  • The sensor 4 of the present embodiment detects the position of an obstacle existing in the surroundings including at least the area in front of the mobile robot 100, acquires image data from which the position and posture of a person can be recognized when the obstacle is a person, and transmits the detection data (image data) to the controller 1 (obstacle detection unit 13).
  • The obstacle detection unit 13 detects an obstacle on the movement path based on the detection data (image data) acquired by the sensor 4. In addition, the obstacle detection unit 13 identifies whether or not the obstacle is a person, and if it is a person, detects the person's position and posture. Further, the obstacle detection unit 13 may be configured to detect not only obstacles on the movement path but also the position and posture of a person existing in the vicinity of the mobile robot 100. The posture detected by the obstacle detection unit 13 need not necessarily be the posture of the whole body; it is sufficient if at least the position of the person's head can be detected.
  • Various known methods can be adopted for detecting the position and posture of a person based on the image data; in the present embodiment, a method (bone model) that detects the person's skeleton (bones) from the image data and thereby detects the person's position and posture may be adopted.
  • With this method, the obstacle detection unit 13 can easily detect the position of the head of a person on the movement path or around the mobile robot 100.
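A sketch of extracting a head position from skeleton (bone) keypoints is shown below; the keypoint names follow common pose-estimation conventions and are assumptions, not taken from the publication:

```python
def head_position(keypoints):
    """Extract a 2D head position from detected skeleton keypoints
    (illustrative sketch; keypoint names are assumed conventions).
    Falls back to the mean of eye/ear points when the nose is occluded."""
    if "nose" in keypoints:
        return keypoints["nose"]
    candidates = [keypoints[k]
                  for k in ("left_eye", "right_eye", "left_ear", "right_ear")
                  if k in keypoints]
    if not candidates:
        return None   # no head keypoints detected
    xs = [p[0] for p in candidates]
    ys = [p[1] for p in candidates]
    return (sum(xs) / len(candidates), sum(ys) / len(candidates))
```

In practice the keypoints would come from a bone-model detector run on the image data from the sensor 4; this sketch only shows how a single head position could be derived from them.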
  • The "obstacles on the movement path" detected by the obstacle detection unit 13 may include obstacles that the mobile robot 100 may contact or collide with when moving along the movement path.
  • The "movement path" here may be conceived not only as a simple line drawn along the path the mobile robot 100 plans to move, but also as a band having a width corresponding to the horizontal width of the mobile robot 100.
  • A specific example of a method for detecting an "obstacle on the movement path" will be described later with reference to FIG. 6.
  • The obstacle response unit 14 executes a predetermined response to the obstacle.
  • The predetermined response here includes, for example, a response such as driving the head 30 or performing a predetermined notification from the notification means 5.
  • The predetermined response may be appropriately selected depending on the type of obstacle (for example, whether or not it is a person) and the situation. Details of the predetermined response will be described later with reference to FIGS. 9 and 10.
  • The head direction control unit 14a controls the head driving means 3 in order to drive the head 30 (see FIG. 2) provided in the mobile robot 100. Specifically, the head direction control unit 14a controls the direction in which the front of the head 30 faces via the head driving means 3. The details of the head driving means 3 will be described later with reference to FIG. 2.
  • In the present embodiment, the head direction control unit 14a is described as a functional unit of the obstacle response unit 14.
  • The notification control unit 14b performs a predetermined notification using the notification means 5.
  • The notification means 5 of the present embodiment is a speaker; hereinafter, the notification means 5 is also referred to as the speaker 5. That is, the notification control unit 14b outputs a predetermined sound using the speaker 5. The details of the predetermined sound will be described later.
  • In the present embodiment, the notification control unit 14b is described as a functional unit of the obstacle response unit 14.
  • FIG. 2 is a diagram for explaining the configuration of the mobile robot 100, and mainly shows an example of the appearance of the mobile robot 100.
  • The mobile robot 100 of the present embodiment has a substantially humanoid shape and is composed of a robot main body 10, a moving carriage 20 that supports the robot main body 10, a head 30 provided at the upper end of the robot main body 10, and an arm portion 40 extending from the front surface of the robot main body 10.
  • The moving carriage 20 that supports the robot main body 10 mainly has the function of moving over the surface (floor surface) on which the mobile robot 100 is placed.
  • The type of the moving carriage 20 is not particularly limited; in the present embodiment, an omnidirectional moving carriage is adopted.
  • The omnidirectional moving carriage is a carriage configured to be able to move in all directions, for example, by being provided with a plurality of omni wheels as driving wheels.
  • The moving carriage 20 of the present embodiment may be composed of three driving wheels 22 and a moving mechanism driving means 2 that drives the driving wheels 22 via a belt or the like, housed inside the skirt 21 that appears in the external view.
  • The moving mechanism driving means 2 of the present embodiment is composed of one or a plurality of motors; hereinafter, the moving mechanism driving means 2 is also referred to as the motor 2.
  • The head 30 may be provided at the upper end of the robot main body 10 as a configuration corresponding to a human head (the portion above the neck) in the substantially humanoid mobile robot 100. Further, it is desirable that the head 30 be configured so that a person can recognize its front surface as a face.
  • The head 30 is configured so that the direction in which its front faces can be controlled.
  • The mobile robot 100 of the present embodiment includes one or a plurality of head driving means 3 (actuators 3), and is configured to be able to control the direction of the front of the head by moving the head via the head driving means 3.
  • Hereinafter, the front surface of the head 30 is referred to as the face portion 31.
  • The mobile robot 100 of the present embodiment has a head driving means 3 (servo motor 3a) that has a vertical rotation axis for rotating the head 30 horizontally with respect to the floor surface, enabling the head 30 to be rotationally driven in the left-right direction. As a result, the mobile robot 100 can rotationally drive only the head 30 left and right about the vertical axis.
  • The arrangement of the servo motor 3a is not limited to the illustrated arrangement, and may be changed as appropriate as long as the direction of the face portion 31 can be moved in the left-right direction.
  • For example, the servo motor 3a may be provided at the lower end of the robot main body 10 so as to rotationally drive the robot main body 10 left and right with respect to the moving carriage 20.
  • In this case, the servo motor 3a can change the direction of the face portion 31 by rotationally driving the head 30 together with the robot main body 10 left and right about the vertical axis.
  • Further, the mobile robot 100 may include a head driving means 3 (servo motor 3b) that has a horizontal rotation axis for driving the head 30 in the vertical direction with respect to the floor surface, enabling the head 30 to look up and down. As a result, the mobile robot 100 can also move the direction of the face portion 31 in the vertical direction.
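The pan (servo motor 3a) and tilt (servo motor 3b) angles needed to direct the face portion 31 toward a detected head position can be computed as in the following sketch; the coordinate convention (x forward, y left, z up, in the robot's frame) is an assumption:

```python
import math

def face_direction(robot_head, person_head):
    """Compute pan (rotation about the vertical axis, servo 3a) and tilt
    (rotation about a horizontal axis, servo 3b) in radians so that the
    face portion points at a person's head. Points are (x, y, z) in an
    assumed robot frame: x forward, y left, z up."""
    dx = person_head[0] - robot_head[0]
    dy = person_head[1] - robot_head[1]
    dz = person_head[2] - robot_head[2]
    pan = math.atan2(dy, dx)                    # left-right rotation
    tilt = math.atan2(dz, math.hypot(dx, dy))   # looking up or down
    return pan, tilt
```

A head at the same height directly ahead yields pan = tilt = 0, and a head above the robot's eye level yields a positive tilt (looking up), matching the up-down motion enabled by servo motor 3b.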
  • FIG. 3 is a diagram for more specifically explaining a configuration example of the head 30 of the present embodiment.
  • The shape of the head 30 is not particularly limited as long as the front surface can be recognized as the face portion 31.
  • It is desirable that the face portion 31 has a structural feature visible only on the front of the head 30 so that the front surface can easily be recognized as the face portion 31.
  • For example, as shown in FIG. 3A, by providing a configuration corresponding to two eyes imitating a human face only on the front surface of the head 30, the front surface can easily be recognized as the face portion 31.
  • Alternatively, a characteristic portion made of a transparent member (the portion indicated by the thick arrow) may be provided only on the front surface of the head 30.
  • Alternatively, only the front surface may be made flat.
  • The head 30 may be provided on the robot main body 10 via a configuration corresponding to a human neck, or may be provided directly on the upper end of the robot main body 10. That is, the servo motors 3a and 3b may be provided on the neck or on the head 30. Further, the detection means 4 may be provided on the head 30 so as to detect the external environment from, for example, the portions corresponding to the eyes shown in FIG. 3A or the portion made of a transparent member shown in FIG. 3B. Further, the speaker 5 may be provided so as to output sound from the head 30, for example, from the face portion 31.
  • The above is a configuration example of the head 30. The description now returns to FIG. 2.
  • The mobile robot 100 of the present embodiment includes an arm portion 40 on the front surface of the robot main body 10.
  • The arm portion 40 is provided with a manipulator mechanism, and a gripping mechanism 41 for gripping a member to be transported or the like is provided at the tip corresponding to its free end.
  • The shape, number, and arrangement of the arm portions 40 are not limited to the illustrated modes and may be changed as appropriate according to the purpose.
  • For example, the mobile robot 100 may be configured as a dual-arm robot having arm portions 40 on both side surfaces of the robot main body 10.
  • The above is a configuration example of the mobile robot 100 of the present embodiment.
  • Although omitted in FIG. 2, the mobile robot 100 may be provided with other configurations required for controlling its operation, for example, other actuators for driving the arm portion 40 and the like, and a battery or the like as a power source.
  • FIG. 4 is a flowchart illustrating the movement control performed by the mobile robot 100 of the present embodiment.
  • The storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • In step S11, the controller 1 executes a task acquisition process.
  • The task acquisition process is a process by which the mobile robot 100 acquires the information necessary for generating a movement route in the movement route generation process (S12) described later.
  • In the task acquisition process, for example, identification information of the part to be picked (part ID), position information of the destination (for example, position coordinates), and the like are acquired.
  • The acquisition method is not limited; the information may be acquired using, for example, a known wireless communication technique, or may be acquired via an information input means (not shown), such as a touch panel.
  • In step S12, the controller 1 (movement route generation unit 11) generates a movement route from the current location to the destination.
  • The destination may be set according to the information acquired by the task acquisition process. For example, when a part ID is acquired by the task acquisition process, the movement route generation unit 11 sets the location where the part specified by the part ID is placed as the destination, and generates a route from the current location to that destination. The generated route is set as the movement route of the mobile robot 100. When a plurality of candidate routes are generated, setting conditions, such as giving priority to the route that can reach the destination earliest or to the route with the shortest distance to the destination, may be determined in advance. When the movement route to the destination is set, the movement process for moving the mobile robot 100 to the destination is executed (S13).
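The candidate-route priority rules mentioned above (earliest arrival or shortest distance) can be sketched as a simple selection function; the `(length_m, est_arrival_s)` tuple format and function name are assumptions:

```python
def choose_route(candidates, prefer="earliest"):
    """Pick one route from the candidate routes (illustrative sketch).
    Each candidate is an assumed (length_m, est_arrival_s) pair.
    'earliest' prefers the soonest arrival; 'shortest' the smallest
    distance to the destination."""
    if not candidates:
        return None   # no candidate route was generated
    key = (lambda c: c[1]) if prefer == "earliest" else (lambda c: c[0])
    return min(candidates, key=key)
```

The `prefer` argument corresponds to the preset setting condition; a longer route can still win under "earliest" if, for example, it avoids congested areas.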
  • In step S13, the controller 1 (movement control unit 12) executes the movement process for moving the mobile robot 100 to the destination.
  • Specifically, the movement control unit 12 controls the motor 2 to move the mobile robot 100 to the destination along the set movement path.
  • The specific method of the movement process executed in this step is not particularly limited, and various known methods may be adopted as appropriate.
  • For example, the mobile robot 100 may be configured to realize autonomous movement (autonomous driving) to the destination by controlling the motor 2 while performing known self-position estimation such as odometry and scan matching, based on the route planned by path planning and the map data of the work area stored in advance.
  • The details of the movement process in step S13 will be described with reference to FIG. 5.
  • FIG. 5 is a flowchart illustrating a movement process performed by the mobile robot 100 of the present embodiment.
  • The movement process described in this flowchart is executed continuously while the mobile robot 100 is moving to the destination along the set movement route.
  • The storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • In step S131, the controller 1 (obstacle detection unit 13) detects an obstacle on the movement path within a predetermined distance.
  • FIG. 6 is a diagram illustrating a method in which the obstacle detection unit 13 of the present embodiment detects an obstacle on the movement path.
  • FIG. 6 shows the mobile robot 100 and an obstacle 300 detected by the detection means 4.
  • The dotted arrow extending from the mobile robot 100 indicates the generated movement path.
  • The solid arrow extending from the mobile robot 100 indicates the nearest movement direction (the movement direction from the current location), that is, the direction from the current location toward the first turning point (waypoint) on the movement route.
  • The detection of an obstacle on the movement path may be performed by regarding the mobile robot 100 and the obstacle 300 as simple two-dimensional figures projected onto the floor surface from above, and determining whether or not these two-dimensional figures interfere with each other.
  • In this case, the mobile robot 100 is represented by, for example, a series of circles projected corresponding to each link of the moving carriage 20 and the arm portion 40.
  • The obstacle (person) 300 is represented by a circle that takes into account the region the person can expand into horizontally, for example when an arm is extended sideways. When the obstacle is not a person, it may be represented by a two-dimensional figure appropriately selected according to the obstacle's outer shape.
  • The outer edge of the projected two-dimensional figure, for example the diameter of a circle, does not necessarily have to match the maximum horizontal width of the mobile robot 100 or of the obstacle 300 as viewed from above.
  • The projected two-dimensional figure may be adjusted as appropriate, for example by widening it slightly to allow a margin or narrowing it in consideration of the realistic movable range.
  • The circles corresponding to the mobile robot 100 are generated along the movement path over a predetermined distance or a predetermined time of movement.
  • In FIG. 6, the circles corresponding to the mobile robot 100 moving within the predetermined distance indicated by the dotted-line circle are shown in chronological order: the circle corresponding to the mobile robot 100 at the current location is shown as (t), and the circles generated in time series along the movement path are shown as (t+1) and (t+2).
  • the obstacle detection unit 13 calculates whether or not the circle corresponding to the mobile robot 100 and the circle corresponding to the obstacle 300 come into contact (interfere) within the predetermined distance.
  • as shown in the drawing, the obstacle detection unit 13 can thereby easily detect that the mobile robot 100 may contact or collide with the obstacle 300 at (t + 2), that is, that the movement path is obstructed by the obstacle 300 at the position (t + 2).
  • the predetermined distance indicated by the dotted circle in the figure, that is, the distance range for detecting an obstacle on the movement route with reference to the current location, may be set as appropriate.
  • the predetermined distance here is a distance, such as 5 m, set mainly with the intention of excluding from detection obstacles located too far from the mobile robot 100; it may be set as appropriate according to the moving speed of the mobile robot 100, the size of the work area, and the like.
  • the figure shows an example of detecting an obstacle that may cause a collision when the mobile robot 100 moves in the range up to the waypoint, that is, in the latest movement direction, but the present invention is not limited to this.
  • the predetermined distance for detecting an obstacle may be set as appropriate; in principle, it may be set anywhere within the range detectable by the detecting means 4, and may even extend beyond the waypoint.
  • the above-described collision determination method, which generates two-dimensional figures as collision verification figures, is an example, and the present invention is not limited to this.
  • the obstacle detection unit 13 may generate, for example, a band having a width corresponding to the horizontal width of the mobile robot 100 along the movement path as the collision verification figure. Further, the obstacle detection unit 13 may adopt another known collision determination method that generates three-dimensional figures. In this case, the mobile robot 100 and the obstacle 300 may be represented by a series of three-dimensional spheres or a polygon mesh as collision verification figures.
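The circle-based interference check described above can be sketched as follows. This is a minimal illustration rather than the patented implementation; the radii, positions, and the fixed time-step sampling of the movement path are hypothetical values chosen for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class Circle:
    x: float   # center position on the floor plane (m)
    y: float
    r: float   # radius of the projected figure (m)

def interferes(a: Circle, b: Circle) -> bool:
    """Two projected circles interfere when the distance between their
    centers is smaller than the sum of their radii."""
    return math.hypot(a.x - b.x, a.y - b.y) < a.r + b.r

def first_contact_step(robot_circles, obstacle: Circle):
    """Return the first time step (t), (t+1), ... at which the robot's
    projected circle touches the obstacle's circle, or None if the
    path is clear within the predetermined distance."""
    for t, c in enumerate(robot_circles):
        if interferes(c, obstacle):
            return t
    return None

# Robot circles generated along the movement path at (t), (t+1), (t+2)
path = [Circle(0.0, 0.0, 0.4), Circle(1.0, 0.0, 0.4), Circle(2.0, 0.0, 0.4)]
person = Circle(2.3, 0.0, 0.5)   # obstacle (person) near the path
print(first_contact_step(path, person))  # -> 2, i.e. contact at (t+2)
```

The same loop works unchanged for band- or sphere-based verification figures, as long as `interferes` is swapped for the corresponding intersection test.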
  • step S131: if no obstacle is detected on the movement path (NO determination), the processes of steps S131 to S132 are repeatedly executed until the destination is reached. When an obstacle is detected on the movement path (YES determination), the process of step S133 is executed.
  • step S133: the controller 1 determines whether or not to generate another movement route for avoiding the obstacle, based on a preset setting. If it is preset to regenerate the movement route when an obstacle is detected (YES determination), the process proceeds to step S134. If regeneration of the movement route is not set in advance (NO determination), the process proceeds to the obstacle handling process of step S136.
  • the setting method for regenerating the movement route is not limited; it may be configured to be set via, for example, a known wireless communication technique, or via an information input means (for example, a touch panel) (not shown). In addition, the regeneration of the movement route need not be set in advance; in that case, the process of step S133 may be deleted, and the controller 1 executes the process of step S134 when an obstacle is detected on the movement path (S131, YES determination).
  • step S134: the controller 1 (movement route generation unit 11) tries to generate a movement route (avoidance route) that avoids the obstacle detected in step S131 and can reach the destination from the current location.
  • an upper limit of the distance or time required to reach the destination may be set in advance. For example, suppose an upper limit distance of 100 m is set, an obstacle is detected on the passage to the destination, and the obstacle cannot be avoided in view of the width of the passage; if the distance from the current location to the destination via the only available detour exceeds 100 m, an avoidance route is not generated and a new movement route is not set. Similar to the movement route generation process in step S12, when a plurality of candidate routes are generated, it may be predetermined that the route that can reach the destination earliest, or the route with the shortest distance to the destination, is given priority.
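The upper-limit check and shortest-route priority can be sketched as follows. The 100 m limit comes from the example above; the candidate representation as (route, distance) pairs is an illustrative assumption.

```python
def select_avoidance_route(candidates, distance_limit=100.0):
    """candidates: list of (route, total_distance_m) pairs.
    Routes whose distance to the destination exceeds the preset upper
    limit are discarded; among the remaining candidates the shortest
    route is given priority. Returns None when no candidate satisfies
    the limit, in which case the obstacle handling process follows."""
    feasible = [(route, d) for route, d in candidates if d <= distance_limit]
    if not feasible:
        return None
    return min(feasible, key=lambda c: c[1])[0]

routes = [("detour A", 120.0), ("detour B", 80.0), ("detour C", 95.0)]
print(select_avoidance_route(routes))                  # -> detour B
print(select_avoidance_route([("detour A", 120.0)]))   # -> None
```

A time-based upper limit would work identically, with estimated travel time in place of distance as the second element of each pair.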
  • when an avoidance route is generated, the movement route resetting process for resetting the generated avoidance route as the movement route is executed (S135), and the processes of steps S131 to S132 are repeatedly executed along the reset movement route until the destination is reached. During that time, the mobile robot 100 moves along the movement path, and when it is determined that the destination has been reached (S132, YES determination), the movement process ends.
  • in step S134, if there is no route that can reach the destination from the current location while avoiding the obstacle, or if there is no route that satisfies the above upper limit or setting conditions, the avoidance route is not generated (S134, NO determination), and the process proceeds to the obstacle handling process of step S136 in order to execute a predetermined response for removing the obstacle.
  • when an obstacle is detected, the controller 1 may execute a process of decelerating or stopping the movement of the mobile robot 100 (deceleration process).
  • this ensures that the obstacle handling process (S136) or the movement route resetting process (S135) can be executed safely before the mobile robot 100 collides with the obstacle.
  • the degree of deceleration and the stop position (distance to the obstacle) may be set in advance, or may be adjusted as appropriate according to the moving speed of the mobile robot 100 at the time when the obstacle is detected.
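Adjusting the stop position to the detection-time speed can be sketched with a simple constant-deceleration model; the deceleration value and safety margin below are illustrative assumptions, not values from the embodiment.

```python
def stop_distance(speed_mps: float, decel_mps2: float = 0.5,
                  margin_m: float = 0.3) -> float:
    """Distance needed to brake to a stop from `speed_mps` at a
    constant deceleration, plus a safety margin kept to the obstacle."""
    return speed_mps ** 2 / (2.0 * decel_mps2) + margin_m

# The faster the robot is moving when the obstacle is detected,
# the earlier braking must begin.
print(stop_distance(0.5))  # -> 0.55
print(stop_distance(1.0))  # -> 1.3
```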
  • FIG. 7 is a flowchart illustrating the obstacle handling process (S136) performed by the mobile robot 100 of the present embodiment.
  • the storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • step S1361: the controller 1 (obstacle detection unit 13) determines whether or not the obstacle is a person. Whether or not the obstacle is a person can be easily determined by analyzing the detection data (image data) acquired by the detection means 4.
  • the detection means 4 may be configured to be able to detect infrared rays radiated from the obstacle, in order to determine whether or not the obstacle is a person from its temperature.
  • if the obstacle is a person (YES determination), the process of step S1362 is executed to determine whether or not there are a plurality of detected obstacles (persons). If the obstacle is not a person (NO determination), the process proceeds to step S1365 to execute the object handling process for the non-human obstacle. The details of the object handling process will be described later with reference to FIG. 10.
  • step S1362: the controller 1 determines whether or not there are a plurality of detected persons. If the number of detected persons is one rather than a plurality (NO determination), the process of step S1364 is performed in order to execute the person correspondence process for the detected person. The details of the person correspondence process will be described later with reference to FIG. 9. On the other hand, if there are a plurality of detected persons (YES determination), the person identification process (S1363) is executed in order to specify the target of the person correspondence process. The details of the person identification process will be described with reference to FIG. 8.
  • FIG. 8 is a diagram illustrating a person identification process.
  • FIG. 8 is a view, from above, of the positional relationship between the mobile robot 100 and persons; it shows the mobile robot 100, the movement path indicated by the dotted arrow extending from the mobile robot 100, the latest movement direction indicated by the solid arrow extending from the mobile robot 100, a predetermined distance from the mobile robot 100 indicated by a dotted circle, and a plurality of persons including person A and person B.
  • person A and person B shown in FIG. 8 are both persons detected as obstacles on the movement path within the predetermined distance.
  • in the person identification process, the controller 1 identifies the person predicted to be contacted or collided with first when the mobile robot 100 moves. Specifically, the person closest to the mobile robot 100 (person A) is identified as the person predicted to collide first. However, the moving speed and the moving direction of each person may also be taken into consideration in identifying the person expected to collide first. In this case, for example, when person A is slowly moving away from the mobile robot 100 while person B is quickly approaching the mobile robot 100, person B may be identified as the person predicted to collide first.
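One way to take speed and direction into account, as in the person A / person B example above, is to rank people by an estimated closing time instead of raw distance. The sketch below is a hedged interpretation: the closing-time formula and the assumed robot speed are illustrative, not the embodiment's actual criterion.

```python
import math

def time_to_contact(robot_pos, person_pos, person_vel, robot_speed=1.0):
    """Rough estimate: current distance divided by the rate at which the
    robot and the person close on each other. A person moving away
    faster than the robot approaches yields infinity (never contacted)."""
    dx = person_pos[0] - robot_pos[0]
    dy = person_pos[1] - robot_pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return 0.0
    # component of the person's velocity directed toward the robot
    approach = -(person_vel[0] * dx + person_vel[1] * dy) / dist
    closing = robot_speed + approach
    return dist / closing if closing > 0.0 else math.inf

def identify_person(robot_pos, people, robot_speed=1.0):
    """people: list of (label, position, velocity). Returns the label of
    the person predicted to be contacted first."""
    return min(people,
               key=lambda p: time_to_contact(robot_pos, p[1], p[2], robot_speed))[0]

people = [
    ("A", (1.0, 0.0), (0.5, 0.0)),    # nearer, but slowly moving away
    ("B", (3.0, 0.0), (-2.0, 0.0)),   # farther, but approaching quickly
]
print(identify_person((0.0, 0.0), people))  # -> B
```

With all velocities set to zero this reduces to the default nearest-person rule.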
  • the person correspondence process in step S1364 is executed.
  • step S1364: since it has been determined that the obstacle is a person (S1361, YES determination), one or a plurality of predetermined responses selected for the case where the target is a person are executed for the detected obstacle. Specific examples will be described with reference to FIG. 9.
  • FIG. 9 is a diagram for explaining the human correspondence process.
  • the human response process of the present embodiment includes at least one of the following three responses: stopping the movement of the mobile robot 100 (human correspondence 1), directing the face 31 toward the person's head (human correspondence 2), and outputting a predetermined sound (human correspondence 3). Any one of these, or a combination of two or more, is executed.
  • the dotted arrow in the figure indicates the movement path of the mobile robot 100.
  • FIG. 9A is a diagram illustrating a state in which the mobile robot 100 is executing one or both of human correspondence 1 and human correspondence 3.
  • the mobile robot 100 may stand still in front of a person who is an obstacle (human correspondence 1).
  • the stopped state is maintained.
  • such a response gives priority to human behavior; in principle, the robot quietly waits for the person to notice the existence of the mobile robot 100 and move off the movement route.
  • the mobile robot 100 may output a predetermined sound via the speaker 5 in the state of FIG. 9A (human correspondence 3).
  • the predetermined sound may be a warning sound such as a buzzer, or a voice such as "Excuse me, please let me pass" or "Coming through".
  • these predetermined sound types may be appropriately selected from a plurality of patterns stored in advance, according to prior settings and the like. Not only the type of sound but also the volume, the number of times the sound is repeated (number of outputs), the duration of sound output, and the like may be set in advance.
  • the mobile robot 100 can output an appropriate sound according to the scene in which the mobile robot 100 is used and the request of the user who manages the mobile robot 100.
  • the output timing may also be set as appropriate. For example, it may be set to output a sound when a person does not move away even after the stationary state has continued for 5 seconds.
  • the type, volume, number of repetitions, and the like of the output sound may be changed according to the duration of the person handling. For example, multiple types of sounds with different degrees of urging may be prepared according to the strength (meaning) of the words, such as "Excuse me", "Please move", and "Move!", so that the degree of urging gradually increases.
  • the type of sound to be output may be changed to stronger words at predetermined time intervals.
  • the predetermined time here may be set as appropriate.
  • the time interval for stepwise change does not have to be equal.
  • in this way, the mobile robot 100 can urge the person to move away with an appropriate strength that changes stepwise according to the person's behavior, without making the person uncomfortable or startled by suddenly uttering strong words at a loud volume.
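The stepwise escalation described above can be expressed as a threshold schedule with unequal intervals. The phrases and time thresholds below are placeholders for illustration, not the embodiment's stored patterns.

```python
def select_utterance(elapsed_s, schedule=((0, "Excuse me"),
                                          (5, "Please let me pass"),
                                          (15, "Please move!"))):
    """Return the phrase whose time threshold was most recently passed.
    The intervals need not be equal, so the urging strengthens
    gradually rather than jumping straight to the strongest words."""
    chosen = schedule[0][1]
    for threshold, phrase in schedule:
        if elapsed_s >= threshold:
            chosen = phrase
    return chosen

for t in (0, 6, 20):
    print(t, select_utterance(t))
```

Volume or repetition count could be escalated with the same schedule structure by storing them alongside each phrase.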
  • FIG. 9B is a diagram illustrating a state in which the mobile robot 100 is executing a response (human response 2) in which the face portion 31 is directed to the human head.
  • the obstacle handling unit 14 (head drive control unit 14a) may turn the face portion 31 toward the person's head as a human handling process.
  • this makes it possible to urge a person on the movement path to move away with a strength that is stronger than person correspondence 1, which only stands still, and softer than person correspondence 3, which outputs a sound.
  • it may be executed in combination with at least one of person correspondence 1 and person correspondence 3, and the timing of the combination may be set as appropriate.
  • for example, the face 31 may be directed toward the person's head after a predetermined time has elapsed from the stationary state, and a sound may be output when a further predetermined time has passed. This makes it possible to act more finely depending on the situation.
  • the obstacle response unit 14 is configured to perform one or a plurality of predetermined responses selected from the above-mentioned person responses 1 to 3 for the person.
  • FIG. 10 is a diagram for explaining the object handling process.
  • the object handling process is a process performed when it is determined that the obstacle is not a person (S1361, NO determination), and includes at least one of the following two responses: after stopping the movement of the mobile robot 100, directing the face 31 toward the heads of nearby people (object handling 1), and outputting a sound to nearby people (object handling 2); either one or a combination of both is executed. In addition, similarly to the above-mentioned person correspondence 1, object handling 3, in which the robot stands still in front of the obstacle, may be executed. The dotted arrow in the figure indicates the movement path of the mobile robot 100.
  • FIG. 10A shows a state in which the mobile robot 100 is stopped in front of an obstacle (object).
  • the obstacle detection unit 13 detects the position of a person around the mobile robot 100, particularly the position of the head of a person around the mobile robot 100.
  • the obstacle handling unit 14 may turn the face portion 31 toward the head of a nearby person as an object handling process (object handling 1).
  • thereby, the mobile robot 100 can actively appeal to the people around it that the movement route is obstructed by an obstacle.
  • the type of sound output at this time may differ from that used in the person handling process, and may be, for example, the following voice.
  • the obstacle handling unit 14 (notification control unit 14b) may output, as an object handling process (object handling 2), a voice requesting that the obstacle be removed from the movement route, such as "Please move the object", to nearby people.
  • thereby, the mobile robot 100 can more accurately appeal to the surrounding people to remove the obstacle from the movement path in order to secure the movement route.
  • a so-called directional speaker may be adopted as the speaker 5 that outputs sound. This makes it possible to ask for help only from a specific person in the vicinity, for example, even in a field where sound cannot be emitted unnecessarily.
  • the object correspondence 2 may be executed in the stationary state (object correspondence 3) shown in FIG. 10 (a).
  • the mobile robot 100 may output a sound such as "Who is it?" Toward the periphery without limiting the target for outputting the sound.
  • the sound type, volume, number of repetitions, and sound output duration may be changed as appropriate, in the same manner as described above for the person handling process.
  • the obstacle handling unit 14 may use the arm 40 to point at the obstacle to be removed, as an object handling process (object handling 4).
  • the mobile robot 100 can more accurately convey the obstacles to be removed from the movement path to the surrounding people.
  • Various movements can be adopted as the movements for pointing to obstacles.
  • for example, the mobile robot 100 may perform an action of directing the tip of the arm 40 toward the obstacle, in the way a person points at an object, and may further move the tip of the arm 40 back and forth toward the obstacle to emphasize which obstacle it wants removed.
  • the object correspondence 4 may also be executed in combination with any one or more of the above-mentioned object correspondences 1 to 3.
  • the obstacle handling unit 14 is configured to perform one or a plurality of predetermined responses selected from the above-mentioned object handling 1 to 4 to the surrounding people.
  • the flowcharts of FIGS. 4, 5 and 7 are examples for explaining the movement control of the present embodiment; the processing need not necessarily be executed exactly as shown in the drawings and may be changed as appropriate as long as the above-mentioned technical effects are obtained.
  • step S133 shown in FIG. 5 may be deleted.
  • that is, the mobile robot 100 may be configured to try to generate an avoidance route when an obstacle is detected, and to execute the obstacle handling process when an appropriate avoidance route cannot be generated.
  • conversely, when the mobile robot 100 detects an obstacle (S131, YES determination) and the obstacle is not removed from the movement route even after a predetermined time has elapsed since executing the above-mentioned predetermined response as the obstacle handling process, an attempt may be made to generate a movement route (avoidance route) that avoids the obstacle from the current location and reaches the destination (S134).
  • that is, when the mobile robot 100 detects an obstacle, it may first execute the obstacle handling process, and then generate an avoidance route if the obstacle is not removed from the movement route even after a predetermined time has elapsed.
  • the predetermined time here may be set as appropriate and is not particularly limited.
  • as described above, when an obstacle is detected, the mobile robot 100 regenerates a movement path that can avoid the obstacle and reach the destination, and if such a route cannot be regenerated, takes a predetermined action to remove the obstacle from the movement route.
  • alternatively, the mobile robot 100 executes a predetermined response for removing the obstacle from the movement path, and if the obstacle is not removed even after the predetermined response has been executed, it avoids the obstacle and regenerates a movement route that can reach the destination.
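The two orderings just described — avoid first, or respond first — can be summarized in one control routine with stubbed-in behaviors. The callback decomposition is an assumption made for illustration; the step labels in the comments refer to the flowchart above.

```python
def on_obstacle(obstacle, try_avoid, respond, obstacle_removed,
                avoid_first=True):
    """try_avoid(obstacle) returns an avoidance route or None;
    respond(obstacle) performs the predetermined response;
    obstacle_removed(obstacle) reports whether the obstacle left the
    route within the predetermined time.
    Returns the route to follow, or None when the robot should keep
    waiting/responding on the original route."""
    if avoid_first:
        route = try_avoid(obstacle)
        if route is not None:
            return route          # reset the movement route (S135)
        respond(obstacle)         # fall back to obstacle handling (S136)
        return None
    respond(obstacle)             # respond first (S136)
    if obstacle_removed(obstacle):
        return None               # original route is clear again
    return try_avoid(obstacle)    # then regenerate an avoidance route (S134)
```

For example, `on_obstacle("box", plan_detour, speak_request, still_blocked, avoid_first=False)` implements the respond-first ordering with whatever planning and notification functions the robot provides.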
  • in either case, the mobile robot 100 can actively secure the movement route and move to the destination even when the movement route is blocked by an obstacle, so that a decrease in work efficiency can be suppressed.
  • the notification means 5 does not necessarily have to be a speaker, and may be a lamp, a laser, a display, or the like that appeals to human vision using light.
  • the obstacle handling unit 14 may likewise change the strength of its action at predetermined time intervals when performing a predetermined notification using the notification means 5 configured in this way in the obstacle handling process.
  • for example, the obstacle handling unit 14 may change the content displayed on the display so that its strength increases at predetermined time intervals, in the same manner as the predetermined notification using the speaker described above.
  • the configuration of the mobile robot 100 shown in FIG. 2 may be appropriately changed as long as it includes at least the head 30 and is configured to be movable along the movement path.
  • the mobile robot 100 does not have to include the robot main body 10.
  • the head 30 may be configured to be provided directly on the moving carriage 20.
  • the present invention can be used at least in an industry that manufactures robot control devices and the like.

Abstract

The present invention provides a robot control device whose work efficiency is not reduced even when an obstacle is present on the movement route. The robot control device comprises: a movement route generation unit that generates a movement route from a current location to a destination; a movement control unit that causes a mobile robot to move along the generated movement route; an obstacle detection unit that detects an obstacle on the movement route within a predetermined distance; and an obstacle handling unit that executes a predetermined response for removing the obstacle from the movement route, wherein the obstacle detection unit detects whether or not the obstacle is a person, and the obstacle handling unit executes at least one predetermined response selected according to whether or not the obstacle is a person.
PCT/JP2021/012351 2020-04-30 2021-03-24 Dispositif, procédé et programme de commande de robot WO2021220679A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-080586 2020-04-30
JP2020080586A JP2021171905A (ja) 2020-04-30 2020-04-30 ロボット制御装置、方法、及びプログラム

Publications (1)

Publication Number Publication Date
WO2021220679A1 true WO2021220679A1 (fr) 2021-11-04

Family

ID=78281162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/012351 WO2021220679A1 (fr) 2020-04-30 2021-03-24 Dispositif, procédé et programme de commande de robot

Country Status (2)

Country Link
JP (1) JP2021171905A (fr)
WO (1) WO2021220679A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115958575A (zh) * 2023-03-16 2023-04-14 中国科学院自动化研究所 类人灵巧操作移动机器人

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023089817A1 (fr) * 2021-11-22 2023-05-25 三菱電機株式会社 Dispositif de traitement d'informations, système de simulation, procédé de simulation, et programme de simulation
JP7434634B1 (ja) 2023-03-28 2024-02-20 Kddi株式会社 情報処理装置、情報処理方法及びプログラム
JP7434635B1 (ja) 2023-03-28 2024-02-20 Kddi株式会社 情報処理装置、情報処理方法及びプログラム
JP7397228B1 (ja) 2023-03-31 2023-12-12 Kddi株式会社 情報処理装置、情報処理方法、プログラム及び情報処理システム

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001129787A (ja) * 1999-11-02 2001-05-15 Atr Media Integration & Communications Res Lab 自律移動ロボット
JP2011204145A (ja) * 2010-03-26 2011-10-13 Sony Corp 移動装置、移動方法およびプログラム
JP2013537487A (ja) * 2010-05-20 2013-10-03 アイロボット コーポレイション 移動式ヒューマンインターフェースロボット
WO2015147149A1 (fr) * 2014-03-28 2015-10-01 ヤンマー株式会社 Véhicule de chantier à déplacement autonome
JP2016081403A (ja) * 2014-10-21 2016-05-16 株式会社Ihiエアロスペース 無人移動体とその経路生成方法



Also Published As

Publication number Publication date
JP2021171905A (ja) 2021-11-01

Similar Documents

Publication Publication Date Title
WO2021220679A1 (fr) Dispositif, procédé et programme de commande de robot
US10293484B2 (en) Robot device and method of controlling the robot device
JP6936081B2 (ja) ロボット
KR101986919B1 (ko) 충돌 회피 및 궤도 복원 기능을 가진 휴머노이드 로봇
US9020682B2 (en) Autonomous mobile body
JP5987842B2 (ja) コミュニケーション引き込みシステム、コミュニケーション引き込み方法およびコミュニケーション引き込みプログラム
US9079307B2 (en) Autonomous locomotion apparatus, autonomous locomotion method, and program for autonomous locomotion apparatus
WO2016010614A1 (fr) Cages de sécurité virtuelles pour dispositifs robotiques
US11154991B2 (en) Interactive autonomous robot configured for programmatic interpretation of social cues
JP2009113190A (ja) 自律動作型ロボットおよび自律動作型ロボットの動作制御方法
JP2017528779A (ja) 受動的追跡要素を用いる乗り物車両追跡及び制御システム
JP2016151897A (ja) 移動体制御装置および移動体制御方法
US11241790B2 (en) Autonomous moving body and control program for autonomous moving body
WO2019244644A1 (fr) Dispositif de commande de corps mobile, procédé de commande de corps mobile et programme
KR20190105214A (ko) 인공 지능을 통해 구속 상황을 회피하는 로봇 청소기 및 그의 동작 방법
JP2003140747A (ja) 自律移動装置及び自律移動装置運用システム
JP2013188815A (ja) 制御装置及び制御方法、並びにコンピューター・プログラム
JP5552710B2 (ja) ロボットの移動制御システム、ロボットの移動制御プログラムおよびロボットの移動制御方法
KR20210026595A (ko) 로봇이 관리자 모드로 이동하는 방법 및 이를 구현하는 로봇
WO2021220678A1 (fr) Robot mobile, procédé de commande de robot mobile et programme de commande
JP2020021351A (ja) ロボット、ロボット制御プログラムおよびロボット制御方法
JP2009151382A (ja) 移動体
JP7258438B2 (ja) ロボット、ロボット制御プログラムおよびロボット制御方法
JP7360792B2 (ja) 移動体、学習器、及び学習器製造方法
WO2021235141A1 (fr) Corps mobile et dispositif de commande, et procédé et programme associés

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21795466

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21795466

Country of ref document: EP

Kind code of ref document: A1