WO2021220679A1 - Robot control device, method, and program - Google Patents

Robot control device, method, and program

Info

Publication number
WO2021220679A1
WO2021220679A1 (PCT/JP2021/012351)
Authority
WO
WIPO (PCT)
Prior art keywords
obstacle
person
mobile robot
predetermined
movement
Prior art date
Application number
PCT/JP2021/012351
Other languages
French (fr)
Japanese (ja)
Inventor
Yoshihiro Sakamoto
Shiro Sakuma
Original Assignee
Tokyo Robotics Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tokyo Robotics Inc.
Publication of WO2021220679A1 publication Critical patent/WO2021220679A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/06: Safety devices
    • B25J5/00: Manipulators mounted on wheels or on carriages

Definitions

  • the present invention relates to, for example, a robot control device and the like.
  • Patent Document 1 discloses a human-cooperative mobile robot that autonomously travels in a work area shared with humans in a factory.
  • the present invention has been made to solve the above-mentioned technical problem, and an object of the present invention is to provide a robot control device or the like that does not reduce work efficiency even when there is an obstacle on the movement path.
  • the robot control device according to the present invention is a robot control device that controls a mobile robot, and includes: a movement path generation unit that generates a movement path from the current location to a destination; a movement control unit that moves the mobile robot along the generated movement path; an obstacle detection unit that detects an obstacle on the movement path within a predetermined distance; and an obstacle response unit that executes a predetermined response for removing the obstacle from the movement path.
  • the obstacle detection unit detects whether or not the obstacle is a person, and the obstacle response unit selects and executes one or more of the predetermined responses depending on whether or not the obstacle is a person.
  • a predetermined action to be taken to remove the obstacle from the movement route is selected depending on whether the obstacle is a person or not. Therefore, it is possible to select a more appropriate action according to the obstacle and efficiently remove the obstacle from the movement route. As a result, even when an obstacle is detected on the movement path, it is possible to suppress a decrease in the work efficiency of the mobile robot. In addition, since it is possible to execute a plurality of predetermined measures described later in combination, it is possible to realize more detailed and diverse measures, and it is possible to remove obstacles more efficiently.
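The selection described above can be sketched as a simple dispatch on the obstacle type; a minimal illustration, in which the response names are hypothetical placeholders and not taken from the publication:

```python
def select_responses(obstacle_is_person: bool) -> list:
    """Select one or more predetermined responses for clearing an obstacle.

    The response names are illustrative placeholders, not from the publication.
    """
    if obstacle_is_person:
        # Appeal directly to the person standing on the movement path.
        return ["direct_face_at_person", "announce_to_person"]
    # For an object, appeal to people nearby and indicate the obstacle.
    return ["direct_face_at_nearby_person", "announce_to_surroundings",
            "point_at_obstacle"]
```

Because the result is a list, several responses can be executed in combination, matching the "one or a plurality" wording above.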
  • preferably, the mobile robot further includes a head whose front surface serves as a face portion and a driving means for driving the head; when the detected obstacle is a person, the obstacle detection unit detects the position of the person's head, and the obstacle response unit, as the predetermined response, may control the driving means to direct the face portion toward the person's head.
  • according to this configuration, the face portion is directed toward the person's head, so the person can be actively encouraged to move out of the movement path.
  • when the detected obstacle is a person and there are a plurality of such persons, an identification unit that identifies the person closest to the mobile robot may further be provided, and the obstacle response unit, when directing the face portion toward a person's head as the predetermined response, may direct the face portion toward the head of the person identified by the identification unit.
  • the mobile robot further includes a head whose front surface serves as a face portion and a driving means for driving the head, and when the detected obstacle is not a person, the obstacle detection unit detects the position of the head of a person in the vicinity of the mobile robot, and the obstacle response unit may, as the predetermined response, control the driving means to direct the face portion toward the head of that nearby person.
  • according to this configuration, the face portion is directed toward the head of a nearby person, so it is possible to actively appeal to that person to remove the obstacle from the movement path.
  • a notification means for performing a predetermined notification may be further provided.
  • the mobile robot further includes a speaker as the notification means, and the obstacle response unit may, as the predetermined response, use the speaker to output a predetermined sound selected according to whether or not the obstacle is a person.
  • the predetermined sound may include a plurality of types of sounds having different strengths of appeal, and the obstacle response unit may change the type of the predetermined sound at predetermined time intervals so that the strength of the appeal increases.
  • according to this configuration, the strength of the appeal can be increased gradually, so it is possible to escalate appropriately according to the person's behavior without making the person uncomfortable by suddenly using strong words.
  • the obstacle handling unit may increase the volume of the predetermined sound at predetermined time intervals.
  • at least one of the type, volume, number of outputs, and output duration of the predetermined sound output by the obstacle handling unit may be configured to be preset.
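One possible reading of this escalation scheme steps the sound type and volume up at fixed intervals; a minimal sketch, in which the phrases, volumes, and 10-second interval are all assumptions:

```python
def select_announcement(elapsed_s: float, interval_s: float = 10.0):
    """Return (phrase, volume) for the current escalation level.

    The level advances every `interval_s` seconds and saturates at the
    strongest entry; phrases and volumes are illustrative assumptions.
    """
    levels = [
        ("Excuse me.", 0.5),                  # mild appeal
        ("Could you please make way?", 0.7),
        ("Please clear the path.", 0.9),      # strongest appeal
    ]
    step = min(int(elapsed_s // interval_s), len(levels) - 1)
    return levels[step]
```

Clamping the step with `min` keeps the strongest phrase in use once the last level is reached, rather than indexing past the table.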
  • the speaker may be arranged so that the predetermined sound is output from the head.
  • according to this configuration, the predetermined sound can be output from a position roughly corresponding to that of a human head, so the person hearing the sound does not feel uncomfortable and the sound can be conveyed more effectively.
  • the obstacle handling unit may change the strength of the action by the notification means at predetermined time intervals.
  • when the obstacle is detected, the movement route generation unit generates a movement route that avoids the obstacle from the current location and reaches the destination, and when such a route cannot be generated, the obstacle response unit may execute the predetermined response.
  • according to this configuration, when an obstacle is detected, a movement route that avoids the obstacle and reaches the destination is regenerated, and when such a route cannot be regenerated, a predetermined response for removing the obstacle from the movement route is executed; therefore, the mobile robot can actively secure a movement route to the destination and improve work efficiency.
  • even when the obstacle is detected, if the obstacle has not been removed from the movement route after a predetermined time has elapsed since the obstacle response unit executed the predetermined response, the movement route generation unit may generate a movement route that avoids the obstacle from the current location and reaches the destination.
  • according to this configuration, after the predetermined response for removing the obstacle from the movement route is executed, if the obstacle has still not been removed from the movement route once the predetermined time has elapsed, a movement route that avoids the obstacle and reaches the destination is regenerated; therefore, the mobile robot actively secures a movement route to the destination and can improve work efficiency.
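The avoid-first flow described in the two passages above can be sketched with the route generator, response, clock, and obstacle check injected as stand-ins for the units in the text; the 15-second timeout and the return values are assumptions for illustration:

```python
def handle_obstacle(replan, respond, obstacle_cleared, now, timeout_s=15.0):
    """Try to replan around the obstacle; if no avoiding route exists,
    execute the predetermined response, and if the obstacle is still not
    removed after `timeout_s`, attempt the avoiding replan again.

    `replan()` returns a route or None; `now()` returns seconds.
    """
    route = replan()
    if route is not None:
        return route                 # an avoiding route exists: take it
    respond()                        # appeal for removal of the obstacle
    deadline = now() + timeout_s
    while now() < deadline:
        if obstacle_cleared():
            return "original_route"  # obstacle removed: keep the old route
    return replan()                  # timed out: try the avoiding replan again
```

Injecting the clock and the route generator keeps the control flow testable without real hardware, which is why they appear as parameters here.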
  • the movement control unit may decelerate or stop the mobile robot when the obstacle is detected during movement.
  • the obstacle handling unit may maintain the stopped state of the mobile robot as the predetermined response.
  • the mobile robot may further include one or more arms having a manipulator mechanism.
  • when the detected obstacle is not a person, the obstacle handling unit may, as the predetermined response, point at the obstacle using the arm portion.
  • the robot control method according to the present invention is a robot control method for controlling a mobile robot, and includes: a movement route generation step of generating a movement route from the current location to a destination; a movement control step of moving the mobile robot along the generated movement route; an obstacle detection step of detecting an obstacle on the movement route; and an obstacle response step of executing a predetermined response for removing the obstacle from the movement route.
  • the robot control program according to the present invention is a robot control program for controlling a mobile robot, and includes: a movement route generation step of generating a movement route from the current location to a destination; a movement control step of moving the mobile robot along the generated movement route; an obstacle detection step of detecting an obstacle on the movement route; and an obstacle response step of executing a predetermined response for removing the obstacle from the movement route.
  • FIG. 1 is a schematic configuration diagram of a mobile robot to which a robot control device is applied.
  • FIG. 2 is a schematic configuration diagram showing the appearance of the mobile robot.
  • FIG. 3 is a diagram illustrating a head of a mobile robot.
  • FIG. 4 is a flowchart illustrating movement control.
  • FIG. 5 is a flowchart illustrating the movement process.
  • FIG. 6 is a diagram illustrating a method of detecting an obstacle on a movement path.
  • FIG. 7 is a flowchart illustrating the obstacle handling process.
  • FIG. 8 is a diagram illustrating a person identification process.
  • FIG. 9 is a diagram illustrating the operation of the person handling process.
  • FIG. 10 is a diagram illustrating the operation of the object handling process.
  • FIGS. 1 to 3 are diagrams illustrating a configuration of a mobile robot 100 to which the robot control device (controller 1) according to the embodiment of the present invention is applied.
  • the mobile robot 100 is a robot that is assumed to move in a work area shared with humans; it may be applied, for example, to a mobile manipulator having a function of picking parts at a factory or the like and transporting them to a predetermined place, or to a service robot or the like that guides travelers at a facility such as an airport.
  • in the present embodiment, a roughly humanoid mobile manipulator including a head 30 and an arm portion 40 having a manipulator mechanism, as shown in FIG. 2, will be described.
  • the mobile robot 100 of the present embodiment includes a controller 1 that functions as a robot control device controlling the operation of the mobile robot 100, a moving mechanism driving means 2, a head driving means 3, a detection means 4, and a notification means 5.
  • the controller 1 has functional units such as a movement route generation unit 11, a movement control unit 12, an obstacle detection unit 13, and an obstacle response unit 14.
  • the controller 1 includes, for example, a central processing unit (CPU) as a processor, as well as a read-only memory (ROM), a random access memory (RAM), and an input/output (I/O) interface.
  • the ROM included in the controller 1 stores a program (control program) for executing each function of each of the above-mentioned functional units. That is, the controller 1 executes the functions of the movement path generation unit 11, the movement control unit 12, the obstacle detection unit 13, and the obstacle response unit 14 by executing various programs stored in the storage medium.
  • the above-mentioned configurations of the processor and the storage medium constituting the controller 1 are examples, and in addition to or in place of these, a GPU, a flash memory, a hard disk, a storage, and the like may be included. Further, the functions of the above-mentioned functional units do not necessarily have to be realized by the controller 1 alone, and may be realized by a plurality of controllers appropriately selected for each functional unit or cooperating with each other. The functions of each functional unit of the controller 1 will be described below.
  • the movement route generation unit 11 generates a route (movement route) on which the mobile robot 100 intends to move. Specifically, the movement route generation unit 11 executes path planning to generate a route from a predetermined position to a destination based on map information (map data) of a work area. The generated route is set as the movement route of the mobile robot 100.
  • the predetermined position here may be a position determined in advance or may be the current position of the mobile robot 100.
  • the specific method of path planning executed by the movement route generation unit 11 is not particularly limited, and a known method used in a mobile robot such as an existing mobile manipulator may be appropriately adopted.
  • the movement control unit 12 controls the movement mechanism driving means 2 in order to move the mobile robot 100 along the movement path generated by the movement route generation unit 11.
  • the mobile robot 100 of the present embodiment is configured as a so-called autonomous traveling type mobile robot that autonomously moves along the generated movement path.
  • the moving mechanism driving means 2 will be described later with reference to FIG.
  • the detection means 4 detects an obstacle existing in the surrounding environment of the mobile robot 100. Obstacles here include not only objects but also people. Further, the detection means 4 is not only used as a means for detecting an obstacle, but may also be configured as a means for recognizing the surrounding environment of the mobile robot 100 so that the mobile robot 100 can travel autonomously.
  • the detection means 4 includes, for example, at least one of a sensor (image sensor, lidar, laser scanner, etc.), a stereo camera, and the like.
  • the detection means 4 of the present embodiment is composed of an image sensor, and the detection means 4 is also referred to as a sensor 4 below.
  • the image sensor is a sensor having an image sensor that converts light detected through an optical system into an electric signal, and the type thereof is not particularly limited.
  • the sensor 4 is configured to be able to detect an obstacle existing in the surrounding environment including at least the front, and may be provided, for example, in the portions corresponding to the human eyes formed in the face portion 31 shown in FIG. 3A. Further, a sensor 4 capable of detecting all directions, a so-called 360-degree camera, may be provided on the crown of the head 30, for example.
  • the sensor 4 of the present embodiment acquires image data from which the position of an obstacle existing in the vicinity including at least the front of the mobile robot 100 can be detected and, when the obstacle is a person, the position and posture of the person can be recognized, and transmits the detection data (image data) to the controller 1 (obstacle detection unit 13).
  • the obstacle detection unit 13 detects an obstacle on the movement path based on the detection data (image data) acquired by the sensor 4. In addition, the obstacle detection unit 13 identifies whether or not the obstacle is a person, and if the obstacle is a person, detects the position and posture of the person. Further, the obstacle detection unit 13 may be configured to detect not only obstacles on the movement path but also the position and posture of a person existing in the vicinity of the mobile robot 100. However, the posture of the person detected by the obstacle detection unit 13 does not necessarily have to be the posture of the whole body; it is sufficient if at least the position of the person's head can be detected.
  • various known methods can be adopted for detecting the position and posture of a person based on the image data; in the present embodiment, a method (bone model) of detecting the skeleton (bones) of a person from the image data and thereby detecting the position and posture of the person may be adopted.
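Given skeleton keypoints from such a bone-model detector, the head position needed by the later head-direction control can be read off directly or approximated; a sketch under the assumption of common pose-estimation joint names, which are not specified in the publication:

```python
def head_position(keypoints):
    """Estimate a person's head position from skeleton keypoints.

    `keypoints` maps joint names to (x, y) coordinates. Prefers an explicit
    head/nose keypoint; otherwise falls back to the shoulder midpoint
    shifted upward by half the shoulder span (a rough heuristic).
    """
    for name in ("head", "nose"):
        if name in keypoints:
            return keypoints[name]
    ls = keypoints["left_shoulder"]
    rs = keypoints["right_shoulder"]
    cx = (ls[0] + rs[0]) / 2.0
    cy = (ls[1] + rs[1]) / 2.0
    span = abs(ls[0] - rs[0])
    return (cx, cy - 0.5 * span)  # image y grows downward, so "up" is -y
```

The fallback matters because, as noted above, the whole-body posture need not be detected as long as the head position can be obtained.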
  • the obstacle detection unit 13 can easily detect the position of the head of a person on the movement path and around the mobile robot 100.
  • the "obstacle on the movement path" detected by the obstacle detection unit 13 may include any obstacle that the mobile robot 100 may contact or collide with when moving along the movement path.
  • the "movement path” here is not only conceived as a simple line drawn corresponding to the path that the mobile robot 100 plans to move, but also has a width corresponding to the horizontal width of the mobile robot 100. It may be thought of as a belt to have.
  • a specific example of a method for detecting an "obstacle on a moving path" will be described later with reference to FIG.
  • the obstacle response unit 14 executes a predetermined response to the obstacle.
  • the predetermined response here includes, for example, a response such as driving the head 30 or performing a predetermined notification from the notification means 5.
  • the predetermined response may be appropriately selected depending on the type of obstacle (for example, whether or not it is a person) and the situation. Details of the predetermined correspondence will be described later with reference to FIGS. 9 and 10.
  • the head direction control unit 14a controls the head driving means 3 in order to drive the head 30 (see FIG. 2) provided in the mobile robot 100. Specifically, the head direction control unit 14a controls the front facing direction of the head 30 via the head driving means 3. The details of the head driving means 3 will be described later with reference to FIG.
  • the head direction control unit 14a of the present embodiment is described as a functional unit of the obstacle handling unit 14.
  • the notification control unit 14b uses the notification means 5 to perform a predetermined notification.
  • the notification means 5 of the present embodiment is a speaker, and the notification means 5 is also referred to as a speaker 5 below. That is, the notification control unit 14b outputs a predetermined sound using the speaker 5. The details of the predetermined sound will be described later.
  • the notification control unit 14b of the present embodiment is described as a functional unit of the obstacle handling unit 14.
  • FIG. 2 is a diagram for explaining the configuration of the mobile robot 100, and mainly shows an example of the appearance of the mobile robot 100.
  • the mobile robot 100 of the present embodiment has a substantially humanoid shape, and is composed of the robot main body 10, the moving carriage 20 that supports the robot main body 10, the head 30 provided at the upper end of the robot main body 10, and the arm portion 40 extending from the front surface of the robot main body 10.
  • the moving carriage 20 that supports the robot main body 10 mainly has the function of moving on the surface (floor surface) on which the mobile robot 100 is placed.
  • the type of the moving carriage 20 is not particularly limited, but in the present embodiment, an omnidirectional moving carriage is adopted.
  • the omnidirectional moving carriage is a carriage configured to be able to move in all directions, for example, by providing a plurality of omni wheels as driving wheels.
  • the moving carriage 20 of the present embodiment may be composed of three driving wheels 22 and the moving mechanism driving means 2 that drives the driving wheels 22 via a belt or the like, housed inside the skirt 21 shown in the external view.
  • the moving mechanism driving means 2 of the present embodiment is composed of one or a plurality of motors, and hereinafter, the moving mechanism driving means 2 is also referred to as a motor 2.
  • the head 30 may be provided at the upper end of the robot main body 10 as a configuration corresponding to a human head (a portion above the neck) in the substantially humanoid mobile robot 100. Further, it is desirable that the head 30 is configured so that a person can recognize the front surface as a face.
  • the head 30 is configured to be able to control the direction in which the front faces.
  • the mobile robot 100 of the present embodiment includes one or a plurality of head driving means 3 (actuators 3), and controls the direction of the front surface thereof by moving the head via the head driving means 3. It is configured so that it can be done.
  • the front surface of the head 30 will be referred to as a face portion 31.
  • the mobile robot 100 of the present embodiment has a head driving means 3 (servomotor 3a) that has a rotation axis perpendicular to the floor surface and enables the head 30 to be rotationally driven in the left-right (horizontal) direction. As a result, the mobile robot 100 can rotationally drive only the head 30 left and right about the vertical axis.
  • the arrangement of the servomotor 3a is not limited to the arrangement shown in the drawing, and may be appropriately changed as long as the direction of the face portion 31 can be moved in the left-right direction.
  • the servomotor 3a may be provided at the lower end of the robot main body 10 so as to rotationally drive the robot main body 10 to the left and right with respect to the moving carriage 20.
  • in this case, the servomotor 3a can change the direction of the face portion 31 by rotationally driving the head 30 together with the robot main body 10 left and right about the vertical axis.
  • the mobile robot 100 may also include a head driving means 3 (servomotor 3b) that has a rotation axis horizontal to the floor surface and enables the head 30 to perform an up-and-down looking motion. As a result, the mobile robot 100 can also move the direction of the face portion 31 in the vertical direction.
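Directing the face portion 31 at a detected head with these two servomotors amounts to computing a pan angle (servomotor 3a) and a tilt angle (servomotor 3b); a minimal sketch, in which the world-frame coordinate conventions are assumptions for illustration:

```python
import math

def face_direction(head_30_xyz, target_head_xyz):
    """Compute (pan, tilt) in radians to point the face at a target head.

    Both arguments are (x, y, z) positions in a common world frame with z
    vertical; pan rotates about the vertical axis (servomotor 3a), tilt
    about a horizontal axis (servomotor 3b).
    """
    dx = target_head_xyz[0] - head_30_xyz[0]
    dy = target_head_xyz[1] - head_30_xyz[1]
    dz = target_head_xyz[2] - head_30_xyz[2]
    pan = math.atan2(dy, dx)                   # left-right rotation
    tilt = math.atan2(dz, math.hypot(dx, dy))  # up-down (look up/down)
    return pan, tilt
```

`atan2` is used rather than `atan` so that the full 360-degree pan range is handled without sign ambiguities.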
  • FIG. 3 is a diagram for more specifically explaining a configuration example of the head 30 of the present embodiment.
  • the shape of the head 30 is not particularly limited as long as the front surface can be recognized as the face portion 31.
  • preferably, the face portion 31 has a structural feature visible only on the front portion of the head 30 so that the front surface can easily be recognized as the face portion 31.
  • as shown in FIG. 3A, by providing a configuration corresponding to two eyes imitating a human face only on the front surface of the head 30, the front surface can be easily recognized as the face portion 31.
  • alternatively, as shown in FIG. 3B, a characteristic portion (the portion indicated by the thick arrow) made of a transparent member may be provided only on the front surface of the head 30.
  • only the front surface may be made flat.
  • the head 30 may be provided on the robot main body 10 via a configuration (neck) corresponding to a human neck, or may be provided directly on the upper end of the robot main body 10. That is, the servomotors 3a and 3b may be provided on the neck or in the head 30. Further, the detection means 4 may be provided on the head 30 so as to detect the external environment from, for example, the portions corresponding to the eyes shown in FIG. 3A or the portion made of a transparent member shown in FIG. 3B. Further, the speaker 5 may be provided so as to output sound from the head 30, for example, from the face portion 31.
  • the above is a configuration example of the head 30. The description now returns to FIG. 2.
  • the mobile robot 100 of the present embodiment includes an arm portion 40 on the front surface of the robot main body portion 10.
  • the arm portion 40 is provided with a manipulator mechanism, and a gripping mechanism 41 for gripping a member to be transported or the like is provided at a tip corresponding to a free end.
  • the shape, number, and arrangement of the arm portions 40 are not limited to the illustrated modes and may be appropriately changed according to the purpose.
  • the mobile robot 100 may be configured as a dual-arm robot having arm portions 40 on both side surfaces of the robot main body portion 10.
  • the above is a configuration example of the mobile robot 100 of this embodiment.
  • although omitted in FIG. 2, the mobile robot 100 may include other configurations required for controlling the operation of the mobile robot 100, for example, other actuators for driving the arm portion 40 and the like, and a battery or the like as a power source.
  • FIG. 4 is a flowchart illustrating the movement control performed by the mobile robot 100 of the present embodiment.
  • the storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • in step S11, the controller 1 executes the task acquisition process.
  • the task acquisition process is a process for the mobile robot 100 to acquire information necessary for generating a movement route by a movement route generation process (S12) described later.
  • in the task acquisition process, for example, identification information (part ID) of the part to be picked, position information of the destination (for example, position coordinates), and the like are acquired.
  • the acquisition method is not limited, and may be acquired by using, for example, a known wireless communication technique, or may be configured to be acquired via an information input means (for example, a touch panel or the like) (not shown).
  • in step S12, the controller 1 (movement route generation unit 11) generates a movement route from the current location to the destination.
  • the destination may be set according to the information acquired by the task acquisition process. For example, when the part ID is acquired by the task acquisition process, the movement route generation unit 11 sets the location where the part specified by the part ID is placed as the destination, and generates a route from the current location to the destination. The generated route is set as the movement route of the mobile robot 100. At that time, when a plurality of candidate routes are generated, setting conditions such as giving priority to the route that can reach the destination earliest or the route with the shortest distance to the destination may be predetermined. When the movement route to the destination is set, the movement process for moving the mobile robot 100 to the destination is executed (S13).
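The priority rule among candidate routes can be sketched as a simple key function; a minimal illustration assuming each candidate carries a hypothetical length and estimated arrival time (field names not from the publication):

```python
def choose_route(candidates):
    """Pick the preferred candidate route under a preset condition:
    shortest distance first, earliest arrival as a tie-breaker.

    Each candidate is a dict with assumed keys 'length_m' and 'eta_s'.
    """
    return min(candidates, key=lambda r: (r["length_m"], r["eta_s"]))
```

Swapping the tuple order in the key would instead prioritize earliest arrival, the other setting condition mentioned above.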
  • in step S13, the controller 1 (movement control unit 12) executes a movement process for moving the mobile robot 100 to the destination.
  • the movement control unit 12 controls the motor 2 to move the mobile robot 100 to the destination along the set movement path.
  • the specific method of the movement process executed in this step is not particularly limited, and various known methods may be appropriately adopted.
  • the mobile robot 100 may be configured to realize autonomous movement (autonomous traveling) to the destination by controlling the motor 2 while performing known self-position estimation such as odometry and scan matching, based on the route planned by path planning and the map data of the work area stored in advance.
  • the details of the movement process in step S13 will be described with reference to FIG. 5.
  • FIG. 5 is a flowchart illustrating a movement process performed by the mobile robot 100 of the present embodiment.
  • the movement process described in this flowchart is always executed while the mobile robot 100 is moving to the destination along the set movement route.
  • the storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • in step S131, the controller 1 (obstacle detection unit 13) detects an obstacle on the movement path within a predetermined distance.
  • FIG. 6 is a diagram illustrating a method in which the obstacle detection unit 13 of the present embodiment detects an obstacle on the movement path.
  • the mobile robot 100 and the obstacle 300 detected by the detecting means 4 are shown.
  • the dotted arrow extending from the mobile robot 100 indicates the generated movement path.
  • the solid arrow extending from the mobile robot 100 indicates the most immediate direction of movement (the movement direction from the current location), that is, the direction from the current location toward the first turning point (waypoint) on the movement route.
  • the detection of an obstacle on the movement path may be performed by regarding the mobile robot 100 and the obstacle 300 as simple two-dimensional figures projected onto the floor surface from above and determining whether or not the two-dimensional figures interfere with each other.
  • the mobile robot 100 is represented by, for example, a series of circles projected corresponding to each link of the mobile carriage 20 and the arm 40.
  • the obstacle (person) 300 is represented by a circle that takes into account the region occupied when the arms are extended horizontally. When the obstacle is not a person, it may be represented by a two-dimensional figure appropriately selected according to the outer shape of the obstacle.
  • the outer edge of the projected two-dimensional figure, for example, the diameter of a circle, does not necessarily have to match the maximum horizontal width of the mobile robot 100 or the maximum horizontal width of the obstacle 300 when viewed from above.
  • the projected two-dimensional figure may be appropriately adjusted by making it slightly wider to allow a margin or narrowing it in consideration of a realistic movable range.
  • the circle corresponding to the mobile robot 100 is generated along the movement path while the mobile robot 100 moves for a predetermined distance or a predetermined time.
  • the circles corresponding to the mobile robot 100 that moves within the predetermined distance indicated by the dotted line circles are shown in chronological order.
  • the circle corresponding to the mobile robot 100 at the current location is shown as (t).
  • the circles generated in time series along the moving path are shown as (t + 1) and (t + 2).
  • the obstacle detection unit 13 calculates whether or not the circle corresponding to the mobile robot 100 and the circle corresponding to the obstacle 300 come into contact (interference) within a predetermined distance.
  • as shown in the drawing, the obstacle detection unit 13 can thereby easily detect that the mobile robot 100 may contact or collide with the obstacle 300 at (t + 2), that is, that the movement path is obstructed by the obstacle 300 at position (t + 2).
  • the predetermined distance indicated by the dotted circle in the figure, that is, the distance range within which obstacles on the movement route are detected relative to the current location, may be set as appropriate.
  • the predetermined distance here, for example 5 m, is set mainly with the intention of excluding detection of obstacles located too far from the mobile robot 100, and may be set as appropriate according to the moving speed of the mobile robot 100, the size of the work area, and the like.
  • the figure shows an example of detecting an obstacle that may cause a collision when the mobile robot 100 moves within the range up to the waypoint, that is, along the current movement direction, but the present invention is not limited to this.
  • the predetermined distance for detecting an obstacle may be set as appropriate; in principle, it may be set anywhere within the range detectable by the detecting means 4, including a range beyond the waypoint.
  • the collision determination method described above, which generates two-dimensional figures as collision verification figures, is merely an example, and the method is not limited to this.
  • the obstacle detection unit 13 may, for example, generate as a collision verification figure a band whose width corresponds to the horizontal width of the mobile robot 100 along the movement path. Further, the obstacle detection unit 13 may adopt another known collision determination method that generates three-dimensional figures. In this case, the mobile robot 100 and the obstacle 300 may be represented as collision verification figures by a series of three-dimensional spheres or by a polygon mesh.
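The circle-based interference check described in the bullets above can be sketched as follows. This is a minimal illustration, not the patented implementation: the robot and obstacle radii, the 5 m detection limit, and the sampling step along the path are all assumed values.

```python
import math

def circles_interfere(c1, r1, c2, r2):
    # Two projected circles interfere when their centers are closer
    # than the sum of their radii.
    return math.dist(c1, c2) < r1 + r2

def first_interference_distance(waypoints, robot_radius,
                                obstacle_center, obstacle_radius,
                                max_distance=5.0, step=0.1):
    """Sample robot circles along the path (t, t+1, t+2, ...) up to
    max_distance and return the distance along the path at which the
    robot circle first interferes with the obstacle circle, or None."""
    travelled = 0.0
    pos = waypoints[0]
    for nxt in waypoints[1:]:
        seg = math.dist(pos, nxt)
        n = max(1, int(seg / step))
        for i in range(1, n + 1):
            t = i / n
            sample = (pos[0] + (nxt[0] - pos[0]) * t,
                      pos[1] + (nxt[1] - pos[1]) * t)
            d = travelled + seg * t
            if d > max_distance:
                return None  # only obstacles within the predetermined distance count
            if circles_interfere(sample, robot_radius,
                                 obstacle_center, obstacle_radius):
                return d
        travelled += seg
        pos = nxt
    return None
```

An obstacle beyond the predetermined distance is deliberately ignored, mirroring the 5 m limit discussed above.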
  • In step S131, if no obstacle is detected on the movement path (NO determination), the processes of steps S131 to S132 are repeatedly executed until the destination is reached. If an obstacle is detected on the movement path (YES determination), the process of step S133 is executed.
  • In step S133, the controller 1 determines, based on a preset setting, whether or not to generate another movement route for avoiding the obstacle. If it is preset to regenerate the movement route when an obstacle is detected (YES determination), the process proceeds to step S134. If regeneration of the movement route upon obstacle detection is not set in advance (NO determination), the process proceeds to the obstacle handling process of step S136.
  • the method of setting whether to regenerate the movement route is not limited; for example, it may be set via a known wireless communication technique, or via an information input means such as a touch panel (not shown). It is also not necessary to make any advance setting regarding regeneration of the movement route; in that case, the process of step S133 may be omitted, and the controller 1 executes the process of step S134 whenever an obstacle is detected on the movement path (S131, YES determination).
  • In step S134, the controller 1 (movement route generation unit 11) attempts to generate a movement route (avoidance route) that can reach the destination from the current location while avoiding the obstacle detected in step S131.
  • An upper limit on the distance or time required to reach the destination may be set in advance. For example, suppose an upper limit of 100 m is set, an obstacle is detected on the passage to the destination, and the obstacle cannot be avoided given the width of the passage; if the distance from the current location to the destination along the only available detour exceeds 100 m, no avoidance route is generated and the movement route is not reset. As in the movement route generation process of step S12, when a plurality of candidate routes are generated, it may be predetermined that the route that can reach the destination earliest, or the route with the shortest distance to the destination, is prioritized.
  • When an avoidance route is generated, the movement route resetting process for setting the generated avoidance route as the new movement route is executed (S135).
  • The processes of steps S131 to S132 are then repeated along the reset movement route until the destination is reached. During that time, the mobile robot 100 moves along the movement path, and when it is determined that the destination has been reached (S132, YES determination), the movement process ends.
  • In step S134, if there is no route that can reach the destination from the current location while avoiding the obstacle, or no route that satisfies the upper limit or other conditions described above, no avoidance route is generated (S134, NO determination), and the process proceeds to the obstacle handling process of step S136 in order to execute a predetermined response for removing the obstacle.
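The overall flow of steps S131 to S136 can be expressed as a simple loop. The sketch below is only an illustration of that flow under assumed interfaces: the callable names (detect, move_step, generate_avoidance, and so on) are hypothetical and not part of the disclosure.

```python
def movement_process(detect, move_step, at_destination,
                     generate_avoidance, reset_route, handle_obstacle,
                     regenerate=True, max_iterations=1000):
    """Sketch of the movement loop of FIG. 5.

    detect()             -> obstacle on the path within the predetermined
                            distance, or None                      (S131)
    at_destination()     -> True when the destination is reached    (S132)
    generate_avoidance() -> an avoidance route, or None             (S134)
    reset_route(route)   -> adopt the avoidance route               (S135)
    handle_obstacle(obs) -> predetermined response to the obstacle  (S136)
    """
    for _ in range(max_iterations):
        obstacle = detect()                          # S131
        if obstacle is None:
            move_step()
            if at_destination():                     # S132
                return "arrived"
            continue
        if regenerate:                               # S133 (preset choice)
            route = generate_avoidance(obstacle)     # S134
            if route is not None:
                reset_route(route)                   # S135
                continue
        handle_obstacle(obstacle)                    # S136
    return "not reached"
```

Setting `regenerate=False` corresponds to the configuration in which step S133 is omitted and the robot goes straight to the obstacle handling process when no route can be found.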
  • When an obstacle is detected while the controller 1 is moving the mobile robot 100, a process of decelerating or stopping the movement (deceleration process) may be executed.
  • This allows the obstacle handling process (S136) or the movement route resetting process (S135) to be executed safely before the robot collides with the obstacle.
  • the degree of deceleration and the stop position (distance to the obstacle) may be set in advance, or may be adjusted as appropriate according to the moving speed of the mobile robot 100 at the time the obstacle is detected.
  • FIG. 7 is a flowchart illustrating the obstacle handling process (S136) performed by the mobile robot 100 of the present embodiment.
  • the storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
  • In step S1361, the controller 1 (obstacle detection unit 13) determines whether or not the obstacle is a person. Whether the obstacle is a person can be readily determined by analyzing the detection data (image data) acquired by the detection means 4.
  • the detection means 4 may also be configured to detect infrared rays radiated from the obstacle, so that whether or not the obstacle is a person can be determined from its temperature.
  • If the obstacle is a person (YES determination), the process of step S1362 is executed to determine whether or not a plurality of persons have been detected. If the obstacle is not a person (NO determination), the process proceeds to step S1365 to execute the object handling process for the non-human obstacle. The details of the object handling process will be described later with reference to FIG.
  • In step S1362, the controller 1 determines whether or not a plurality of persons have been detected. If only one person has been detected (NO determination), the process proceeds to step S1364 in order to execute the person handling process for the detected person. The details of the person handling process will be described later with reference to FIG. On the other hand, if a plurality of persons have been detected (YES determination), the person identification process (S1363) is executed in order to specify the target of the person handling process. The details of the person identification process will be described with reference to FIG.
  • FIG. 8 is a diagram illustrating a person identification process.
  • FIG. 8 is a top view of the positional relationship between the mobile robot 100 and persons, showing the mobile robot 100, the movement path indicated by the dotted arrow extending from the mobile robot 100, the current movement direction indicated by the solid arrow extending from the mobile robot 100, the predetermined distance from the mobile robot 100 indicated by the dotted circle, and a plurality of persons including person A and person B.
  • Person A and person B shown in FIG. 8 are both persons detected as obstacles on the movement path within the predetermined distance.
  • The controller 1 identifies the person who is predicted to contact or collide with the mobile robot 100 first as it moves. Specifically, the person closest to the mobile robot 100 (person A) is identified as the person predicted to collide first. However, the moving speed and moving direction of each person may also be taken into consideration when identifying the person expected to collide first. In that case, for example, when person A is slowly moving away from the mobile robot 100 while person B is quickly approaching it, person B may be identified as the person predicted to collide first.
  • The person handling process of step S1364 is then executed for the identified person.
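The identification rule described above (nearest person first, optionally refined by each person's speed and direction) can be sketched as follows. The time-to-collision heuristic is an illustrative assumption, as are the 'pos' and 'vel' keys; the disclosure only requires that speed and direction may be taken into account.

```python
import math

def identify_first_collision_person(robot_pos, persons):
    """Return the person predicted to contact or collide first.

    Each person is a dict with hypothetical keys 'pos' and 'vel'.
    Primary key: approximate time to reach the robot (distance divided
    by the closing speed toward the robot). A person moving away never
    closes, so such persons fall back to being ranked by distance only,
    matching the basic rule of choosing the nearest person.
    """
    def ranking(person):
        px, py = person["pos"]
        vx, vy = person["vel"]
        dx, dy = robot_pos[0] - px, robot_pos[1] - py
        dist = math.hypot(dx, dy)
        if dist == 0:
            return (0.0, 0.0)
        closing = (vx * dx + vy * dy) / dist  # speed component toward the robot
        ttc = dist / closing if closing > 0 else math.inf
        return (ttc, dist)

    return min(persons, key=ranking)
```

With static persons, the distance tie-break reduces this to "identify the person closest to the mobile robot", the default behavior in the text above.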
  • In step S1364, since the obstacle has been determined to be a person (S1361, YES determination), one or a plurality of predetermined responses selected for the case where the target is a person are executed for the detected obstacle. Specific examples will be described with reference to FIG. 9.
  • FIG. 9 is a diagram for explaining the human correspondence process.
  • The human handling process of the present embodiment includes at least one of the following three responses: stopping the movement of the mobile robot 100 (human correspondence 1), directing the face portion 31 toward the person's head (human correspondence 2), and outputting a predetermined sound (human correspondence 3); any one of these, or a combination of two or more, is executed.
  • the dotted arrow in the figure indicates the movement path of the mobile robot 100.
  • FIG. 9A is a diagram illustrating how the mobile robot 100 is executing either one or both of the human correspondence 1 and the human correspondence 3.
  • the mobile robot 100 may stand still in front of a person who is an obstacle (human correspondence 1).
  • The stopped state is maintained while the person remains on the movement path.
  • Such a response gives priority to human behavior; in principle, the robot quietly waits for the person to notice the existence of the mobile robot 100 and move off the movement route.
  • the mobile robot 100 may output a predetermined sound via the speaker 5 in the state of FIG. 9A (human correspondence 3).
  • the predetermined sound may be a warning sound such as a buzzer, or a voice such as "Excuse me, please move" or "Please let me pass".
  • These predetermined sound types may be selected as appropriate from a plurality of patterns stored in advance, according to prior settings and the like. Not only the type of sound but also the volume, the number of times the sound is repeated (number of outputs), the duration of sound output, and the like may be set in advance.
  • the mobile robot 100 can output an appropriate sound according to the scene in which the mobile robot 100 is used and the request of the user who manages the mobile robot 100.
  • the output timing may also be set as appropriate. For example, it may be set so that a sound is output when a person has not moved away even after the stationary state has continued for 5 seconds.
  • the type, volume, number of repetitions, and the like of the output sound may be changed according to the duration of the person handling process. For example, multiple types of sounds whose strength of appeal differs according to the strength (meaning) of the words, such as "Excuse me", "Please move", and "Move!", may be prepared so that the degree of urging increases gradually.
  • the type of sound to be output may then be changed to stronger words at predetermined time intervals.
  • the predetermined time here may be set as appropriate.
  • the time intervals for the stepwise changes do not have to be equal.
  • the mobile robot 100 can thus urge evacuation with an appropriate, stepwise-increasing strength according to the person's behavior, without making the person uncomfortable or startled by suddenly uttering strong words at a loud volume.
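The stepwise escalation described above can be sketched as a simple lookup table keyed by elapsed time. The phrases, thresholds, and volumes below are illustrative assumptions; the embodiment only requires that the strength of the appeal increase stepwise, at intervals that need not be equal.

```python
# (seconds elapsed since the response began, message, relative volume 0..1)
ESCALATION_STEPS = [
    (0.0,  "Excuse me.",          0.4),
    (5.0,  "Please let me pass.", 0.6),
    (12.0, "Please move!",        0.9),
]

def select_utterance(elapsed_seconds):
    """Return the (message, volume) of the strongest escalation step
    whose time threshold has already been reached."""
    current = ESCALATION_STEPS[0]
    for step in ESCALATION_STEPS:
        if elapsed_seconds >= step[0]:
            current = step
    return current[1], current[2]
```

The unequal intervals (0 s, 5 s, 12 s) mirror the note above that the stepwise changes need not be evenly spaced.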
  • FIG. 9B is a diagram illustrating a state in which the mobile robot 100 is executing a response (human response 2) in which the face portion 31 is directed to the human head.
  • the obstacle handling unit 14 (head drive control unit 14a) may turn the face portion 31 toward the person's head as a person handling process.
  • This makes it possible to urge a person on the movement path to move aside in a manner stronger than human correspondence 1, which merely stands still, and softer than human correspondence 3, which outputs a sound.
  • Human correspondence 2 may be executed in combination with at least one of human correspondence 1 and human correspondence 3, and the timing of the combination may be set as appropriate.
  • For example, the face 31 may be directed toward the person's head after a predetermined time has elapsed from the stationary state, and a sound may be output after a further predetermined time has passed. This enables finer-grained action depending on the situation.
  • the obstacle response unit 14 is configured to perform one or a plurality of predetermined responses selected from the above-mentioned person responses 1 to 3 for the person.
  • FIG. 10 is a diagram for explaining the object handling process.
  • The object handling process is performed when it is determined that the obstacle is not a person (S1361, NO determination), and includes at least one of the following two responses: after stopping the movement of the mobile robot 100, directing the face 31 toward the heads of nearby people (object correspondence 1), and outputting a sound to nearby people (object correspondence 2); either one or a combination of the two is executed. In addition, similarly to human correspondence 1 described above, object correspondence 3, in which the robot stands still in front of the obstacle, may be executed. The dotted arrow in the figure indicates the movement path of the mobile robot 100.
  • FIG. 10A shows a state in which the mobile robot 100 is stopped in front of an obstacle (object).
  • the obstacle detection unit 13 detects the position of a person around the mobile robot 100, particularly the position of the head of a person around the mobile robot 100.
  • the obstacle handling unit 14 may turn the face portion 31 toward the head of a nearby person as an object handling process (object correspondence 1).
  • the mobile robot 100 can positively appeal to the people around it that the movement route is obstructed by an obstacle.
  • the type of sound output at this time may differ from that used in the person handling process, and may be, for example, the following voice.
  • as the object handling process (object correspondence 2), the obstacle handling unit 14 (notification control unit 14b) may output to nearby people a voice requesting that the obstacle be removed from the movement route, such as "Please move the object".
  • the mobile robot 100 can more accurately appeal to the surrounding people to remove the obstacle from the moving path in order to secure the moving path.
  • a so-called directional speaker may be adopted as the speaker 5 that outputs sound. This makes it possible to ask for help from only a specific nearby person, for example even in an environment where sound cannot be emitted indiscriminately.
  • the object correspondence 2 may be executed in the stationary state (object correspondence 3) shown in FIG. 10 (a).
  • the mobile robot 100 may output a sound such as "Who is it?" toward its surroundings without limiting the target of the sound output.
  • the sound type, volume, number of repetitions, and output duration may be changed as appropriate in the same manner as described above for the person handling process.
  • the obstacle handling unit 14 may point at the obstacle to be removed by using the arm 40 as an object handling process (object correspondence 4).
  • This allows the mobile robot 100 to convey more precisely to the surrounding people which obstacle should be removed from the movement path.
  • Various movements can be adopted as the movements for pointing to obstacles.
  • For example, the mobile robot 100 may perform an action of pointing the tip of the arm 40 toward the obstacle, in the way a person points at an object, and may further move the tip of the arm 40 back and forth toward the obstacle to emphasize and indicate the obstacle to be removed.
  • Object correspondence 4 may also be executed in combination with any one or more of object correspondences 1 to 3 described above.
  • the obstacle handling unit 14 is configured to perform one or a plurality of predetermined responses selected from the above-mentioned object handling 1 to 4 to the surrounding people.
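Putting the two branches together, the selection of responses in FIG. 7 can be sketched as a dispatch on whether the obstacle is a person. The action names and data shapes below are hypothetical, and the sketch picks a fixed set of correspondences for illustration; in a real configuration any subset or combination of the listed correspondences may be selected.

```python
def plan_obstacle_responses(obstacle_is_person, persons_on_path, nearby_people):
    """Choose the predetermined responses (S1361 onward).

    persons_on_path: persons detected as obstacles, each a dict with a
    hypothetical 'distance' key; nearby_people: people around the robot
    who can be asked to remove a non-human obstacle.
    """
    if obstacle_is_person:
        # S1362/S1363: with several persons, target the one predicted
        # to collide first (here simply the nearest).
        target = min(persons_on_path, key=lambda p: p["distance"])
        return [("stop", None),                        # human correspondence 1
                ("face_head", target),                 # human correspondence 2
                ("speak", "Excuse me, please move.")]  # human correspondence 3
    # S1365: object handling, appealing to nearby people.
    actions = [("stop", None)]                         # object correspondence 3
    for person in nearby_people:
        actions.append(("face_head", person))          # object correspondence 1
        actions.append(("speak", "Please move the object."))  # object correspondence 2
    return actions
```

The person branch directs all actions at the single identified person, while the object branch addresses the surrounding people, matching the distinction drawn between the person handling and object handling processes above.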
  • The flowcharts of FIGS. 4, 5, and 7 are examples for explaining the movement control of the present embodiment; execution is not necessarily limited to what is shown in the drawings, and the processes may be changed as appropriate as long as the above-described technical effects are obtained.
  • step S133 shown in FIG. 5 may be deleted.
  • That is, the mobile robot 100 may be configured to attempt to generate an avoidance route when an obstacle is detected, and to execute the obstacle handling process when no appropriate avoidance route can be generated.
  • When the mobile robot 100 detects an obstacle (S131, YES determination) and the obstacle is not removed from the movement route even after a predetermined time has elapsed since the above-described predetermined response was executed as the obstacle handling process,
  • an attempt may be made to generate a movement route (avoidance route) that avoids the obstacle from the current location and reaches the destination (S134).
  • That is, when the mobile robot 100 detects an obstacle, it may first execute the obstacle handling process, and then generate an avoidance route when the obstacle is not removed from the movement route even after the predetermined time has elapsed.
  • the predetermined time here may be set as appropriate and is not particularly limited.
  • In one configuration, when an obstacle is detected, the mobile robot 100 regenerates a movement path that avoids the obstacle and reaches the destination, and if such a route cannot be regenerated, it executes a predetermined response for removing the obstacle from the movement route.
  • In another configuration, when an obstacle is detected, the mobile robot 100 first executes a predetermined response for removing the obstacle from the movement path, and if the obstacle is not removed even after the predetermined response has been executed, it avoids the obstacle and regenerates a movement route that can reach the destination.
  • In either case, the mobile robot 100 can actively secure the movement route and move to the destination even when the movement route is blocked by an obstacle, so that a decrease in work efficiency can be suppressed.
  • the notification means 5 does not necessarily have to be a speaker, and may be a lamp, a laser, a display, or the like that appeals to human vision using light.
  • the obstacle handling unit 14 may change the strength of the appeal at predetermined time intervals even when performing a predetermined notification using the notification means 5 configured in this way in the obstacle handling process.
  • For example, the obstacle handling unit 14 may change the content displayed on the display so that the strength of the appeal increases at predetermined time intervals, in the same manner as the predetermined notification using the speaker described above.
  • the configuration of the mobile robot 100 shown in FIG. 2 may be appropriately changed as long as it includes at least the head 30 and is configured to be movable along the movement path.
  • the mobile robot 100 does not have to include the robot main body 10.
  • the head 30 may be configured to be provided directly on the moving carriage 20.
  • the present invention can be used at least in an industry that manufactures robot control devices and the like.

Abstract

[Problem] To provide a robot control device in which work efficiency is not reduced even if there is an obstacle on a movement route. [Solution] Provided is a robot control device comprising a movement route generation unit that generates a movement route from a current location to a destination, a movement control unit that causes a mobile robot to move along the generated movement route, an obstacle detection unit that detects an obstacle on the movement route within a prescribed distance, and an obstacle countermeasure unit that executes a prescribed countermeasure for removing the obstacle from the movement route, wherein: the obstacle detection unit detects whether or not the obstacle is a person; and the obstacle countermeasure unit executes one or a plurality of the prescribed countermeasures, selected in accordance with whether or not the obstacle is a person.

Description

Robot control device, method, and program
The present invention relates to, for example, a robot control device and the like.
In recent years, robots have been applied not only to factories and warehouses but also to various places such as commercial buildings, airports, and nursing care facilities. For example, Patent Document 1 discloses a human-cooperative mobile robot that autonomously travels in a work area shared with humans in a factory.
Japanese Patent No. 6412179
When a mobile robot moves along a planned movement route to a destination, an obstacle may block the movement route. The mobile robot of Patent Document 1 is configured so that, when it comes into contact with an obstacle blocking the movement path, it detects the force applied at the time of contact and stops its movement.
When there is an obstacle on the movement path of such a mobile robot, the movement path cannot be secured until a nearby person notices the obstacle on the movement path and removes it, so that work efficiency is reduced.
The present invention has been made to solve the above-described technical problem, and an object thereof is to provide a robot control device or the like that does not reduce work efficiency even when there is an obstacle on the movement path.
The above technical problem can be solved by a robot control device or the like having the following configuration.
That is, a robot control device according to the present invention is a robot control device that controls a mobile robot, comprising: a movement route generation unit that generates a movement route from a current location to a destination; a movement control unit that moves the mobile robot along the generated movement route; an obstacle detection unit that detects an obstacle on the movement route within a predetermined distance; and an obstacle handling unit that executes a predetermined response for removing the obstacle from the movement route, wherein the obstacle detection unit detects whether or not the obstacle is a person, and the obstacle handling unit executes one or a plurality of the predetermined responses selected according to whether or not the obstacle is a person.
According to such a configuration, when an obstacle is detected on the movement route, the predetermined response executed to remove the obstacle from the movement route can be selected according to whether or not the obstacle is a person, so that a more appropriate action can be chosen for the obstacle and the obstacle can be removed from the movement route efficiently. As a result, even when an obstacle is detected on the movement route, a decrease in the work efficiency of the mobile robot can be suppressed. In addition, since a plurality of the predetermined responses described later can be executed in combination, finer-grained and more diverse responses can be realized, and obstacles can be removed more efficiently.
The mobile robot may further include a head whose front surface is a face portion, and driving means for driving the head; when the detected obstacle is a person, the obstacle detection unit detects the position of the person's head, and the obstacle handling unit, as the predetermined response, controls the driving means to direct the face portion toward the person's head.
According to such a configuration, when the obstacle is a person, the face portion is directed toward the person's head, so that the person can be actively encouraged to move off the movement route.
The robot control device may further include an identification unit that, when the detected obstacle is a person and there are a plurality of such persons, identifies the person closest to the mobile robot; when directing the face portion toward a person's head as the predetermined response, the obstacle handling unit directs the face portion toward the head of the person identified by the identification unit.
According to such a configuration, even when a plurality of persons are detected as obstacles, the persons identified in an order consistent with ordinary human behavior can be urged to move aside one after another without causing discomfort.
The mobile robot may further include a head whose front surface is a face portion, and driving means for driving the head; when the detected obstacle is not a person, the obstacle detection unit detects the positions of the heads of people around the mobile robot, and the obstacle handling unit, as the predetermined response, controls the driving means to direct the face portion toward the head of a nearby person.
According to such a configuration, when the obstacle is an object, the face portion is directed toward the heads of nearby people, so that the robot can actively appeal to them to remove the obstacle from the movement path.
The robot control device may further include notification means for performing a predetermined notification as the predetermined response.
According to such a configuration, the notification means can be used to appeal more clearly to nearby people to remove the obstacle from the movement route.
The mobile robot may further include a speaker as the notification means, and the obstacle handling unit may, as the predetermined response, output through the speaker a predetermined sound selected according to whether or not the obstacle is a person.
According to such a configuration, the speaker can appeal to human hearing, so that the request to remove the obstacle from the movement route can be conveyed more strongly and clearly. Further, since the output sound is selected according to whether or not the obstacle is a person, a more appropriate sound can be chosen and the obstacle can be removed from the movement path efficiently.
The predetermined sound may include a plurality of types of voices with different degrees of urging, and the obstacle handling unit may change the type of the predetermined sound at predetermined time intervals so that the degree of urging becomes stronger.
According to such a configuration, the strength of the appeal can be increased gradually, so that an appropriate, stepwise-increasing strength of appeal can be realized according to the person's behavior, without making the person uncomfortable by suddenly uttering strong words.
The obstacle handling unit may increase the volume of the predetermined sound at predetermined time intervals.
According to such a configuration, an appropriate, stepwise-increasing volume can be realized according to the person's behavior, without startling the person by suddenly emitting a loud sound.
At least one of the type, volume, number of outputs, and output duration of the predetermined sound output by the obstacle handling unit may be configured to be settable in advance.
According to such a configuration, an appropriate sound can be set according to the place where the mobile robot is applied and the like.
The speaker may be arranged so that the predetermined sound is output from the head.
According to such a configuration, the predetermined sound can be output from substantially the same position as a human voice would be, so that the sound can be conveyed more effectively without causing discomfort to the listener.
The obstacle handling unit may change the strength of the appeal by the notification means at predetermined time intervals.
According to such a configuration, the notification means can realize an appeal with an appropriate strength that increases stepwise according to the person's behavior.
 前記移動経路生成部は、前記障害物が検出された場合に、現在地から当該障害物を回避して前記目的地に到達する移動経路を生成し、前記障害物対応部は、現在地から前記障害物を回避して前記目的地に到達する移動経路が生成できなかった場合に、前記所定の対応を実行する、ものであってよい。 When the obstacle is detected, the movement route generation unit generates a movement route that avoids the obstacle from the current location and reaches the destination, and the obstacle response unit generates the obstacle from the current location. When the movement route to reach the destination cannot be generated by avoiding the above, the predetermined response may be executed.
 このような構成によれば、移動経路上に障害物が検出された場合には、当該障害物を回避して目的地まで到達可能な移動経路が再生成され、又、そのような経路が再生成されなかった場合には、障害物を移動経路から除くための所定の対応が実行されるので、移動ロボットが目的地までの移動経路を積極的に確保して、作業効率を高めることができる。 According to such a configuration, when an obstacle is detected on the movement route, a movement route that avoids the obstacle and can reach the destination is regenerated; when no such route can be regenerated, the predetermined response for removing the obstacle from the movement route is executed. The mobile robot therefore actively secures a movement route to the destination, improving work efficiency.
 前記移動経路生成部は、前記障害物が検出された場合であって、かつ前記障害物対応部が前記所定の対応を実行してから所定時間経過しても前記障害物が前記移動経路から除かれない場合に、現在地から当該障害物を回避して前記目的地に到達する移動経路を生成する、ものであってよい。 The movement route generation unit may generate a movement route that avoids the obstacle and reaches the destination from the current location when the obstacle is detected and the obstacle is still not removed from the movement route even after a predetermined time has elapsed since the obstacle handling unit executed the predetermined response.
 このような構成によれば、移動経路上に障害物が検出された場合には、障害物を移動経路から除くための所定の対応が実行され、その後所定時間経過しても障害物が移動経路から取り除かれない場合には、当該障害物を回避して目的地まで到達可能な移動経路が再生成されるので、移動ロボットが目的地までの移動経路を積極的に確保して、作業効率を高めることができる。 According to such a configuration, when an obstacle is detected on the movement route, the predetermined response for removing the obstacle from the movement route is executed first; if the obstacle is still not removed after a predetermined time has elapsed, a movement route that avoids the obstacle and can reach the destination is regenerated. The mobile robot therefore actively secures a movement route to the destination, improving work efficiency.
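The respond-first, replan-later order of this variant can be sketched as a small decision routine. All four callables (`replan`, `respond`, `obstacle_gone`, `wait`) are hypothetical stand-ins for functionality the disclosure describes only abstractly.

```python
def handle_blocked_path(replan, respond, obstacle_gone, wait):
    """Fallback order from this embodiment variant: ask the obstacle
    to move first, and only replan around it if it is still there
    after a predetermined wait. Returns a label describing how the
    robot proceeds; the labels are illustrative."""
    respond()                  # e.g. sound or gesture toward the obstacle
    wait()                     # predetermined time for it to move away
    if obstacle_gone():
        return "resume-original-route"
    route = replan()           # avoid the obstacle from the current pose
    return "detour" if route is not None else "stay-stopped"
```

The opposite variant (replan first, respond only if no detour exists) is obtained by swapping the order of the `replan` and `respond` branches.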
 前記移動制御部は、移動中に前記障害物が検出された場合は、前記移動ロボットを減速又は停止させる、ものであってよい。 The movement control unit may decelerate or stop the mobile robot when the obstacle is detected during movement.
 このような構成によれば、移動経路の再設定、又は障害物への所定の対応を、障害物に接触または衝突する前に、安全に実行することができる。 According to such a configuration, it is possible to safely perform the resetting of the movement route or the predetermined response to the obstacle before contacting or colliding with the obstacle.
 障害物対応部は、前記障害物が検出された場合に前記移動ロボットが停止した場合には、前記所定の対応として、前記移動ロボットの停止状態を維持する、ものであってよい。 When the mobile robot has stopped upon detection of the obstacle, the obstacle handling unit may maintain the stopped state of the mobile robot as the predetermined response.
 このような構成によれば、例えば、移動ロボットの作業を停止させてでも静かに運用したい状況にも対応できる移動ロボットを提供することができる。 According to such a configuration, it is possible to provide a mobile robot that can also handle situations in which quiet operation is desired even at the cost of stopping the mobile robot's work.
 前記移動ロボットは、マニピュレータ機構を有する一又は複数の腕部をさらに備える、ものであってよい。 The mobile robot may further include one or more arms having a manipulator mechanism.
 このような構成によれば、上述の各機能を備える移動マニピュレータを実現することができる。 With such a configuration, it is possible to realize a mobile manipulator having each of the above-mentioned functions.
 前記障害物対応部は、検出した前記障害物が人ではない場合に、前記所定の対応として前記腕部を用いて前記障害物を指し示す、ものであってよい。 The obstacle handling unit may be one that points to the obstacle by using the arm portion as the predetermined response when the detected obstacle is not a person.
 このような構成によれば、移動経路から取り除いてほしい障害物を周囲の人により的確に伝えることができる。 According to such a configuration, the obstacle that should be removed from the movement route can be communicated more precisely to nearby people.
 本発明は、方法として観念することもできる。すなわち、本発明にかかるロボット制御方法は、移動ロボットを制御するロボット制御方法であって、現在地から目的地までの移動経路を生成する移動経路生成ステップと、生成された前記移動経路に沿って前記移動ロボットを移動させる移動制御ステップと、前記移動経路上の障害物を検出する障害物検出ステップと、前記障害物を前記移動経路から除くための所定の対応を実行する障害物対応ステップと、を含み、前記障害物検出ステップでは、前記障害物が人か否かを検出し、前記障害物対応ステップでは、前記障害物が人か否かに応じて選択した一又は複数の前記所定の対応を実行する。 The present invention can also be conceived as a method. That is, the robot control method according to the present invention is a robot control method for controlling a mobile robot, comprising: a movement route generation step of generating a movement route from a current location to a destination; a movement control step of moving the mobile robot along the generated movement route; an obstacle detection step of detecting an obstacle on the movement route; and an obstacle handling step of executing a predetermined response for removing the obstacle from the movement route, wherein the obstacle detection step detects whether or not the obstacle is a person, and the obstacle handling step executes one or more predetermined responses selected according to whether or not the obstacle is a person.
 本発明は、コンピュータプログラムとして観念することもできる。すなわち、本発明に係るロボット制御プログラムは、移動ロボットを制御するロボット制御プログラムであって、現在地から目的地までの移動経路を生成する移動経路生成ステップと、生成された前記移動経路に沿って前記移動ロボットを移動させる移動制御ステップと、前記移動経路上の障害物を検出する障害物検出ステップと、前記障害物を前記移動経路から除くための所定の対応を実行する障害物対応ステップと、を含み、前記障害物検出ステップでは、前記障害物が人か否かを検出し、前記障害物対応ステップでは、前記障害物が人か否かに応じて選択した一又は複数の前記所定の対応を実行する。 The present invention can also be conceived as a computer program. That is, the robot control program according to the present invention is a robot control program for controlling a mobile robot, causing a computer to execute: a movement route generation step of generating a movement route from a current location to a destination; a movement control step of moving the mobile robot along the generated movement route; an obstacle detection step of detecting an obstacle on the movement route; and an obstacle handling step of executing a predetermined response for removing the obstacle from the movement route, wherein the obstacle detection step detects whether or not the obstacle is a person, and the obstacle handling step executes one or more predetermined responses selected according to whether or not the obstacle is a person.
 本発明によれば、移動経路上に障害物がある場合でも作業効率を低下させないロボット制御装置等を提供することができる。 According to the present invention, it is possible to provide a robot control device or the like that does not reduce work efficiency even when there is an obstacle on the movement path.
図1は、ロボット制御装置が適用される移動ロボットの概略構成図である。FIG. 1 is a schematic configuration diagram of a mobile robot to which a robot control device is applied.
図2は、移動ロボットの外観を示す概略構成図である。FIG. 2 is a schematic configuration diagram showing the appearance of the mobile robot.
図3は、移動ロボットの頭部を説明する図である。FIG. 3 is a diagram illustrating the head of the mobile robot.
図4は、移動制御を説明するフローチャートである。FIG. 4 is a flowchart illustrating movement control.
図5は、移動処理を説明するフローチャートである。FIG. 5 is a flowchart illustrating the movement process.
図6は、移動経路上の障害物を検出する方法を説明する図である。FIG. 6 is a diagram illustrating a method of detecting an obstacle on a movement route.
図7は、障害物対応処理を説明するフローチャートである。FIG. 7 is a flowchart illustrating the obstacle handling process.
図8は、人特定処理を説明する図である。FIG. 8 is a diagram illustrating the person identification process.
図9は、人対応処理の動作を説明する図である。FIG. 9 is a diagram illustrating the operation of the person handling process.
図10は、物対応処理の動作を説明する図である。FIG. 10 is a diagram illustrating the operation of the object handling process.
 [実施形態]
  図1から図3は、本発明の一実施形態に係るロボット制御装置(コントローラ1)が適用される移動ロボット100の構成を説明する図である。移動ロボット100は、人と共有の作業領域内を移動することが想定されるロボットであって、例えば、工場等で部品をピックして所定場所に運搬する機能等を有する移動マニピュレータや、空港等の施設で旅行者等を案内するサービスロボット等に適用されてよい。本実施形態では、移動ロボット100が、図2に示すような頭部30とマニピュレータ機構を有する腕部40とを備える略人型の移動マニピュレータに適用された例について説明する。
[Embodiment]
FIGS. 1 to 3 are diagrams illustrating the configuration of a mobile robot 100 to which the robot control device (controller 1) according to an embodiment of the present invention is applied. The mobile robot 100 is a robot assumed to move within a work area shared with humans, and may be applied to, for example, a mobile manipulator having a function of picking parts in a factory or the like and transporting them to a predetermined place, or a service robot that guides travelers in a facility such as an airport. In the present embodiment, an example in which the mobile robot 100 is applied to a substantially humanoid mobile manipulator including a head 30 and an arm portion 40 having a manipulator mechanism as shown in FIG. 2 will be described.
 図1に示すとおり、本実施形態の移動ロボット100は、移動ロボット100の動作を制御するロボット制御装置として機能するコントローラ1と、移動機構駆動手段2と、頭部駆動手段3と、検出手段4と、報知手段5と、を含んで構成される。 As shown in FIG. 1, the mobile robot 100 of the present embodiment includes a controller 1 that functions as a robot control device for controlling the operation of the mobile robot 100, a movement mechanism driving means 2, a head driving means 3, a detection means 4, and a notification means 5.
 コントローラ1は、移動経路生成部11、移動制御部12、障害物検出部13、および障害物対応部14等の機能部を有する。コントローラ1は、例えば、プロセッサとしての中央演算装置(CPU)、記憶媒体としての読み出し専用メモリ(ROM)およびランダムアクセスメモリ(RAM)、入出力インタフェース(I/Oインタフェース)等がバスを介して接続されて構成される情報処理装置である。また、コントローラ1が備えるROMには、前述の各機能部がそれぞれに有する各機能を実行するためのプログラム(制御プログラム)が格納されている。すなわち、コントローラ1は、記憶媒体に格納された各種プログラムを実行することによって、移動経路生成部11、移動制御部12、障害物検出部13、および障害物対応部14の各機能部の機能を実現するように構成される。なお、コントローラ1を構成するプロセッサおよび記憶媒体として上述した構成は例示であって、これらに加えて、或いは代えて、GPU、フラッシュメモリ、ハードディスク、ストレージ等を含んでもよい。また、上述の各機能部の機能は、必ずしもコントローラ1のみによって実現される必要はなく、機能部毎に適宜選択された複数のコントローラがそれぞれ、或いは協調することによって実現されるように構成されてもよい。コントローラ1の各機能部の機能について以下説明する。 The controller 1 has functional units such as a movement route generation unit 11, a movement control unit 12, an obstacle detection unit 13, and an obstacle handling unit 14. The controller 1 is, for example, an information processing device in which a central processing unit (CPU) as a processor, a read-only memory (ROM) and a random access memory (RAM) as storage media, an input/output interface (I/O interface), and the like are connected via a bus. The ROM included in the controller 1 stores programs (control programs) for executing the functions of each of the above-mentioned functional units. That is, the controller 1 is configured to realize the functions of the movement route generation unit 11, the movement control unit 12, the obstacle detection unit 13, and the obstacle handling unit 14 by executing the various programs stored in the storage media. The processor and storage media described above as constituting the controller 1 are examples; in addition to or instead of these, a GPU, a flash memory, a hard disk, storage, and the like may be included. Further, the functions of the above-mentioned functional units do not necessarily have to be realized by the controller 1 alone, and may be realized by a plurality of controllers appropriately selected for each functional unit, individually or in coordination. The functions of each functional unit of the controller 1 will be described below.
 移動経路生成部11は、移動ロボット100が移動しようとする経路(移動経路)を生成する。具体的には、移動経路生成部11は、作業領域の地図情報(マップデータ)に基づいて所定位置から目的地までの経路を生成するパスプランニングを実行する。生成された経路は移動ロボット100の移動経路として設定される。ここでの所定位置は、予め定められた所定の位置であってもよいし、移動ロボット100の現在位置であってもよい。移動経路生成部11が実行するパスプランニングの具体的な手法は特に限定されず、既存の移動マニピュレータ等の移動ロボットに用いられている公知の手法が適宜採用されてよい。 The movement route generation unit 11 generates a route (movement route) on which the mobile robot 100 intends to move. Specifically, the movement route generation unit 11 executes path planning to generate a route from a predetermined position to a destination based on map information (map data) of a work area. The generated route is set as the movement route of the mobile robot 100. The predetermined position here may be a predetermined predetermined position or may be the current position of the mobile robot 100. The specific method of path planning executed by the movement route generation unit 11 is not particularly limited, and a known method used in a mobile robot such as an existing mobile manipulator may be appropriately adopted.
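The disclosure leaves the path-planning method open, so as a minimal stand-in the route generation can be illustrated with breadth-first search over an occupancy-grid map; the grid encoding (0 = free, 1 = obstacle) and 4-connected movement are assumptions of this sketch, not part of the patent text.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Minimal grid path planner (BFS). `grid` is a 2-D occupancy map
    where 0 is free and 1 is an obstacle; `start`/`goal` are (row, col)
    cells. Returns the shortest list of cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                       # reconstruct the route
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in parent:
                parent[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                                # no route to the goal
```

A production planner would typically use A* or a sampling-based method on the same map data, but the interface (map in, waypoint list out) is the same.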
 移動制御部12は、移動経路生成部11が生成した移動経路に沿って移動ロボット100を移動させるために、移動機構駆動手段2を制御する。このように、本実施形態の移動ロボット100は、生成した移動経路に沿って自律的に移動するいわゆる自律走行型の移動ロボットとして構成される。移動機構駆動手段2については図2を参照して後述する。 The movement control unit 12 controls the movement mechanism driving means 2 in order to move the mobile robot 100 along the movement path generated by the movement route generation unit 11. As described above, the mobile robot 100 of the present embodiment is configured as a so-called autonomous traveling type mobile robot that autonomously moves along the generated movement path. The moving mechanism driving means 2 will be described later with reference to FIG.
 検出手段4は、移動ロボット100の周囲環境に存在する障害物を検出する。ここでの障害物は、物だけでなく人も含む。また、検出手段4は、障害物を検出する手段として用いられるだけでなく、移動ロボット100が自律走行するために移動ロボット100の周辺環境を認識するための手段としても構成されてよい。検出手段4は、例えば、センサ(イメージセンサ、ライダ、レーザスキャナ等)、ステレオカメラ等の少なくとも一つを含む。本実施形態の検出手段4は、イメージセンサから構成されるものとし、検出手段4を以下ではセンサ4とも称する。なお、イメージセンサは、光学系を通して検出した光を電気信号に変換する撮像素子を有するセンサであってその種類については特に制限されない。センサ4は、少なくとも前方を含む周囲環境に存在する障害物を検出可能に構成され、例えば、図3(a)で示す顔部31に構成された人の目に相当する部分等に設けられてよい。また、いわゆる360度カメラと言われるような全方位を検出できるセンサ4が例えば頭部30の頭頂部に設けられてもよい。本実施形態のセンサ4は、移動ロボット100の少なくとも前方を含む周辺に存在する障害物の位置、及び障害物が人である場合には当該人の位置と姿勢を認識可能な画像データを検出して、検出データ(画像データ)をコントローラ1(障害物検出部13)に送信する。 The detection means 4 detects obstacles existing in the surrounding environment of the mobile robot 100. Obstacles here include not only objects but also people. The detection means 4 may be configured not only as a means for detecting obstacles but also as a means for recognizing the surrounding environment of the mobile robot 100 for autonomous travel. The detection means 4 includes, for example, at least one of a sensor (image sensor, lidar, laser scanner, etc.), a stereo camera, and the like. The detection means 4 of the present embodiment is assumed to be an image sensor, and is hereinafter also referred to as the sensor 4. The image sensor is a sensor having an imaging element that converts light detected through an optical system into an electric signal, and its type is not particularly limited. The sensor 4 is configured to be able to detect obstacles existing in the surrounding environment including at least the front, and may be provided, for example, at the portions corresponding to human eyes formed on the face portion 31 shown in FIG. 3(a). Alternatively, a sensor 4 capable of detecting all directions, such as a so-called 360-degree camera, may be provided on the top of the head 30. The sensor 4 of the present embodiment detects image data from which the positions of obstacles existing at least in front of the mobile robot 100 can be recognized and, when an obstacle is a person, the position and posture of that person, and transmits the detection data (image data) to the controller 1 (obstacle detection unit 13).
 障害物検出部13は、センサ4が取得した検出データ(画像データ)に基づいて、移動経路上の障害物を検出する。また、障害物検出部13は、障害物が人であるか否かを識別し、障害物が人である場合には当該人の位置と姿勢を検出する。さらに、障害物検出部13は、移動経路上の障害物だけでなく、移動ロボット100の周辺に存在する人の位置と姿勢を検出するように構成されてよい。ただし、障害物検出部13が検出する人の姿勢は、必ずしも体全体の構えではなく、少なくとも人の頭の位置が検出可能であればよい。なお、画像データに基づいて人の位置と姿勢とを検出する手法として、公知の種々の手法を採用し得るが、本実施形態では、画像データから人の骨格(ボーン)を検出して人の位置と姿勢とを検出する手法(ボーンモデル)が採用されてよい。これにより、障害物検出部13は、移動経路上、および移動ロボット100の周辺にいる人の頭の位置を容易に検出することができる。 The obstacle detection unit 13 detects obstacles on the movement route based on the detection data (image data) acquired by the sensor 4. The obstacle detection unit 13 also identifies whether or not an obstacle is a person, and, if it is a person, detects the position and posture of that person. Further, the obstacle detection unit 13 may be configured to detect not only obstacles on the movement route but also the positions and postures of people existing around the mobile robot 100. However, the posture detected by the obstacle detection unit 13 does not necessarily have to be the stance of the whole body; it suffices if at least the position of the person's head can be detected. Various known methods can be adopted for detecting the position and posture of a person from image data; in the present embodiment, a method of detecting the person's skeleton (bones) from the image data to obtain the position and posture (a bone model) may be adopted. This allows the obstacle detection unit 13 to easily detect the head positions of people on the movement route and around the mobile robot 100.
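Extracting a head position from bone-model output can be sketched as a small selection function. The joint names, the confidence threshold, and the shoulder-midpoint fallback below are hypothetical choices for illustration; the patent does not specify a particular pose estimator or keypoint set.

```python
def head_position(keypoints):
    """Pick a person's head position out of detected skeleton keypoints.

    `keypoints` maps joint names to (x, y, score) tuples, mimicking the
    output of a generic bone-model pose estimator. Falls back to the
    midpoint of the shoulders when no head-related joint was detected
    confidently. Returns (x, y) or None."""
    for joint in ("head", "nose", "eye_l", "eye_r"):   # preferred order
        if joint in keypoints:
            x, y, score = keypoints[joint]
            if score >= 0.5:                 # assumed confidence cutoff
                return (x, y)
    left = keypoints.get("shoulder_l")
    right = keypoints.get("shoulder_r")
    if left and right and left[2] >= 0.5 and right[2] >= 0.5:
        return ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
    return None
```

Only the head position is required by this embodiment, which is why a full-body pose is not reconstructed here.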
 また、障害物検出部13が検出する「移動経路上にある障害物」とは、移動ロボット100が移動経路上を移動した際に、障害物に接触あるいは衝突するおそれのある障害物を含んでもよい。換言すれば、ここでの「移動経路」は、移動ロボット100が移動する予定の経路に対応して引かれる単なる線として観念されるだけでなく、移動ロボット100の水平方向幅に対応する幅を持つ帯として観念されてよい。「移動経路上にある障害物」の検出方法の具体例については図6を参照して後述する。 The "obstacle on the movement route" detected by the obstacle detection unit 13 may include any obstacle that the mobile robot 100 could touch or collide with when moving along the movement route. In other words, the "movement route" here is conceived not merely as a line drawn along the path the mobile robot 100 plans to travel, but as a band whose width corresponds to the horizontal width of the mobile robot 100. A specific example of a method for detecting an "obstacle on the movement route" will be described later with reference to FIG. 6.
 障害物対応部14は、障害物検出部13が障害物を検出した場合に、当該障害物に対する所定の対応を実行する。ここでの所定の対応は、例えば、頭部30を駆動する、報知手段5から所定の報知を行う、等の対応を含む。所定の対応は、障害物の種類(例えば人か否か)や状況に応じて適宜選択されてよい。所定の対応の詳細については図9および図10を参照して後述する。 When the obstacle detection unit 13 detects an obstacle, the obstacle response unit 14 executes a predetermined response to the obstacle. The predetermined response here includes, for example, a response such as driving the head 30 or performing a predetermined notification from the notification means 5. The predetermined response may be appropriately selected depending on the type of obstacle (for example, whether or not it is a person) and the situation. Details of the predetermined correspondence will be described later with reference to FIGS. 9 and 10.
 頭部方向制御部14aは、移動ロボット100に備わる頭部30(図2参照)を駆動するために、頭部駆動手段3を制御する。具体的には、頭部方向制御部14aは、頭部駆動手段3を介して頭部30の正面の向く方向を制御する。頭部駆動手段3の詳細については図2を参照して後述する。なお、本実施形態の頭部方向制御部14aは、障害物対応部14の一機能部として説明される。 The head direction control unit 14a controls the head driving means 3 in order to drive the head 30 (see FIG. 2) provided in the mobile robot 100. Specifically, the head direction control unit 14a controls the front facing direction of the head 30 via the head driving means 3. The details of the head driving means 3 will be described later with reference to FIG. The head direction control unit 14a of the present embodiment is described as a functional unit of the obstacle handling unit 14.
 報知制御部14bは、報知手段5を用いて所定の報知を行う。本実施形態の報知手段5はスピーカとし、報知手段5を以下ではスピーカ5とも称する。すなわち、報知制御部14bは、スピーカ5を用いて所定の音を出力する。所定の音の詳細については後述する。なお、本実施形態の報知制御部14bは、障害物対応部14の一機能部として説明される。 The notification control unit 14b uses the notification means 5 to perform a predetermined notification. The notification means 5 of the present embodiment is a speaker, and the notification means 5 is also referred to as a speaker 5 below. That is, the notification control unit 14b outputs a predetermined sound using the speaker 5. The details of the predetermined sound will be described later. The notification control unit 14b of the present embodiment is described as a functional unit of the obstacle handling unit 14.
 図2は、移動ロボット100の構成を説明する図であって、主に移動ロボット100の外観例を示す。 FIG. 2 is a diagram for explaining the configuration of the mobile robot 100, and mainly shows an example of the appearance of the mobile robot 100.
 図示するように、本実施形態の移動ロボット100は、略人型の形状を有しており、ロボット本体部10と、ロボット本体部10を支持する移動台車20と、ロボット本体部10の上端に設けられた頭部30と、ロボット本体部10の前面から延びる腕部40とから構成されている。 As shown in the figure, the mobile robot 100 of the present embodiment has a substantially humanoid shape and is composed of a robot main body 10, a mobile carriage 20 that supports the robot main body 10, a head 30 provided at the upper end of the robot main body 10, and an arm portion 40 extending from the front surface of the robot main body 10.
 ロボット本体部10を支持する移動台車20は、主に、移動ロボット100が置かれた面(床面)の上を移動する機能を有する。移動台車20の種類は特に制限されないが、本実施形態では全方位移動台車が採用される。全方位移動台車とは、例えば駆動輪として複数のオムニホイールを備える等して、全方向に移動することができるように構成された台車である。本実施形態の移動台車20は、図2で示されるように、外観に表されるスカート21の内側に、三つの駆動輪22と、駆動輪22をベルト等を介して駆動するための移動機構駆動手段2とから構成されてよい。本実施形態の移動機構駆動手段2は一または複数のモータから構成されるものとし、以下では、移動機構駆動手段2をモータ2とも称する。 The mobile carriage 20 that supports the robot main body 10 mainly has the function of moving over the surface (floor) on which the mobile robot 100 is placed. The type of the mobile carriage 20 is not particularly limited, but an omnidirectional mobile carriage is adopted in the present embodiment. An omnidirectional mobile carriage is a carriage configured to be able to move in all directions, for example by having a plurality of omni wheels as drive wheels. As shown in FIG. 2, the mobile carriage 20 of the present embodiment may be composed of three drive wheels 22 and a movement mechanism driving means 2 for driving the drive wheels 22 via belts or the like, arranged inside the skirt 21 that forms its outer appearance. The movement mechanism driving means 2 of the present embodiment is assumed to be composed of one or more motors, and is hereinafter also referred to as the motor 2.
 頭部30は、略人型の移動ロボット100において、人の頭(首より上の部分)に相当する構成としてロボット本体部10の上端に設けられてよい。また、頭部30は、人がその正面を顔として認識できるように構成されるのが望ましい。頭部30は、正面が向く方向を制御可能に構成される。具体的には、本実施形態の移動ロボット100は、一又は複数の頭部駆動手段3(アクチュエータ3)を備え、頭部駆動手段3を介して頭部を動かすことによりその正面の向きを制御できるように構成される。なお、以下では、頭部30の正面を顔部31と称する。 The head 30 may be provided at the upper end of the robot main body 10 as a component corresponding to a human head (the portion above the neck) in the substantially humanoid mobile robot 100. It is desirable that the head 30 be configured so that a person can recognize its front surface as a face. The head 30 is configured so that the direction its front faces can be controlled. Specifically, the mobile robot 100 of the present embodiment includes one or more head driving means 3 (actuators 3) and is configured so that the orientation of the front surface can be controlled by moving the head via the head driving means 3. Hereinafter, the front surface of the head 30 is referred to as the face portion 31.
 本実施形態の移動ロボット100は、頭部30を床面に対して水平方向(左右方向)に回転させる鉛直方向の回転軸を有し頭部30の左右方向への回転駆動を可能とする頭部駆動手段3(サーボモータ3a)を有する。これにより、移動ロボット100は、頭部30のみを鉛直方向に対して左右へ回転駆動させることができる。ただし、サーボモータ3aの配置は、図示する配置に制限されず、顔部31の向きを左右方向に動かすことができる限り適宜変更されてよい。例えば、サーボモータ3aは、ロボット本体部10を移動台車20に対して左右へ回転駆動させるようにロボット本体部10の下端に設けられてもよい。この場合、サーボモータ3aは、頭部30をロボット本体部10とともに垂直方向に対して左右へ回転駆動させることにより顔部31の向きを変えることができる。 The mobile robot 100 of the present embodiment has a head driving means 3 (servo motor 3a) that has a vertical rotation axis for rotating the head 30 horizontally (left and right) relative to the floor surface, enabling the head 30 to be rotationally driven in the left-right direction. This allows the mobile robot 100 to rotate only the head 30 left and right about the vertical axis. However, the arrangement of the servo motor 3a is not limited to the illustrated arrangement and may be changed as appropriate as long as the face portion 31 can be turned in the left-right direction. For example, the servo motor 3a may be provided at the lower end of the robot main body 10 so as to rotationally drive the robot main body 10 left and right relative to the mobile carriage 20. In this case, the servo motor 3a can change the direction of the face portion 31 by rotationally driving the head 30 together with the robot main body 10 left and right about the vertical axis.
 また、移動ロボット100は、頭部30を床面に対して上下方向に駆動させる水平方向の回転軸を有し頭部30の上下を仰ぎ見る動作を可能とする頭部駆動手段3(サーボモータ3b)を有してもよい。これにより、移動ロボット100は、顔部31の向きを上下方向にも動かすことができる。 Further, the mobile robot 100 may have a head driving means 3 (servo motor 3b) that has a horizontal rotation axis for driving the head 30 up and down relative to the floor surface, enabling the head 30 to look up and down. This allows the mobile robot 100 to move the direction of the face portion 31 in the vertical direction as well.
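Turning the face toward a detected person reduces to computing a yaw command for the neck servo. The sketch below assumes a planar robot pose (x, y, heading) and a +/-90 degree joint limit; the limit value is an assumption for illustration, not a figure from the disclosure.

```python
import math

def head_yaw_to_target(robot_pose, target_xy, max_yaw=math.radians(90)):
    """Yaw command (radians, positive = left) for the neck servo so the
    face portion turns toward a target point on the floor plane.
    `robot_pose` is (x, y, heading); the joint limit is an assumed
    value. The result is clamped to the servo's reachable range."""
    rx, ry, heading = robot_pose
    bearing = math.atan2(target_xy[1] - ry, target_xy[0] - rx)
    # wrap the relative angle into (-pi, pi]
    yaw = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
    return max(-max_yaw, min(max_yaw, yaw))
```

When the clamped value saturates, the body (servo at the base of the robot main body 10) would have to rotate instead, matching the alternative servo placement described above.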
 図3は、本実施形態の頭部30の構成例をより具体的に説明する図である。頭部30の形状は、正面を顔部31として認識できる限り特に制限されない。例えば、顔部31は、正面を顔部31として容易に認識されるように、頭部30の正面部分にのみ視認される構造的な特徴を有していることが望ましい。例えば、図3(a)のように、人の顔を模した二つの目に相当する構成を頭部30の正面にのみ設けることにより、正面が顔部31として容易に認識され得る。また、図3(b)に示すように、例えば透過部材からなる特徴的な部分(太矢印で示す部分)が頭部30の正面にのみ設けられてもよい。或いは、図示しないが、例えば略長球形等の頭部30において、正面のみを平面にする等してもよい。 FIG. 3 is a diagram for more specifically explaining a configuration example of the head 30 of the present embodiment. The shape of the head 30 is not particularly limited as long as the front surface can be recognized as the face portion 31. For example, it is desirable that the face portion 31 has a structural feature that can be visually recognized only on the front portion of the head portion 30 so that the front surface can be easily recognized as the face portion 31. For example, as shown in FIG. 3A, by providing a configuration corresponding to two eyes imitating a human face only on the front surface of the head 30, the front surface can be easily recognized as the face portion 31. Further, as shown in FIG. 3B, for example, a characteristic portion (a portion indicated by a thick arrow) made of a transparent member may be provided only on the front surface of the head 30. Alternatively, although not shown, for example, in the head 30 having a substantially long spherical shape, only the front surface may be made flat.
 また、頭部30は、図示するように、人間の首に相当する構成(頸部)を介してロボット本体部10に設けられてもよいし、ロボット本体部10の上端に直接設けられてもよい。すなわち、サーボモータ3a、3bは、頸部に設けられてもよいし、頭部30に設けられてもよい。また、検出手段4は、頭部30において、例えば、図3(a)に示す目に相当する部分や、図3(b)に示す透過部材からなる部分から外部環境を検出するように設けられてもよい。また、スピーカ5は、頭部30、例えば顔部31から音を出力するように設けられてよい。以上が頭部30の構成例である。図2に戻って説明を続ける。 Further, as illustrated, the head 30 may be provided on the robot main body 10 via a component corresponding to a human neck, or may be provided directly on the upper end of the robot main body 10. That is, the servo motors 3a and 3b may be provided on the neck or on the head 30. The detection means 4 may be provided on the head 30 so as to detect the external environment through, for example, the portions corresponding to the eyes shown in FIG. 3(a) or the portion made of a transparent member shown in FIG. 3(b). The speaker 5 may be provided so as to output sound from the head 30, for example from the face portion 31. The above is a configuration example of the head 30. The description now returns to FIG. 2.
 図2に示すとおり、本実施形態の移動ロボット100はロボット本体部10の前面に腕部40を備える。腕部40は、マニピュレータ機構を備え、運搬する部材等を把持するための把持機構41が自由端に相当する先端に設けられている。ただし、腕部40の形状、数、および配置は図示する態様に制限されず目的に応じて適宜変更されてよい。例えば、移動ロボット100は、ロボット本体部10の両側面に腕部40を備える双腕ロボットとして構成されてもよい。 As shown in FIG. 2, the mobile robot 100 of the present embodiment includes an arm portion 40 on the front surface of the robot main body portion 10. The arm portion 40 is provided with a manipulator mechanism, and a gripping mechanism 41 for gripping a member to be transported or the like is provided at a tip corresponding to a free end. However, the shape, number, and arrangement of the arm portions 40 are not limited to the illustrated modes and may be appropriately changed according to the purpose. For example, the mobile robot 100 may be configured as a dual-arm robot having arm portions 40 on both side surfaces of the robot main body portion 10.
 以上が、本実施形態の移動ロボット100の構成例である。なお、移動ロボット100は、図2では省略されているが、移動ロボット100の動作を制御するのに必要となる他の構成、例えば、腕部40等を駆動する他のアクチュエータや、電力源となるバッテリ等を備えてもよい。 The above is a configuration example of the mobile robot 100 of the present embodiment. Although omitted in FIG. 2, the mobile robot 100 may further include other components required for controlling its operation, such as other actuators for driving the arm portion 40 and the like, and a battery as a power source.
 次に、図4から図10を参照して、移動ロボット100の動作について説明する。 Next, the operation of the mobile robot 100 will be described with reference to FIGS. 4 to 10.
 図4は、本実施形態の移動ロボット100が行う移動制御を説明するフローチャートである。コントローラ1が備える記憶媒体には、図示のフローチャートを参照して以下に説明する処理を実行する制御プログラムが格納されている。 FIG. 4 is a flowchart illustrating the movement control performed by the mobile robot 100 of the present embodiment. The storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
 ステップS11では、コントローラ1はタスク取得処理を実行する。タスク取得処理は、移動ロボット100が後述の移動経路生成処理(S12)によって移動経路を生成するために必要な情報を取得するための処理である。タスク取得処理では、例えば、ピックする部品の識別情報(部品ID)や目的地の位置情報(例えば位置座標)等が取得される。取得方法に制限はなく、例えば公知の無線通信技術を利用して取得してもよいし、不図示の情報入力手段(例えばタッチパネル等)を介して取得するように構成されてもよい。 In step S11, the controller 1 executes the task acquisition process. The task acquisition process is a process for the mobile robot 100 to acquire information necessary for generating a movement route by a movement route generation process (S12) described later. In the task acquisition process, for example, identification information (part ID) of the part to be picked, position information of the destination (for example, position coordinates), and the like are acquired. The acquisition method is not limited, and may be acquired by using, for example, a known wireless communication technique, or may be configured to be acquired via an information input means (for example, a touch panel or the like) (not shown).
 ステップS12では、コントローラ1(移動経路生成部11)は、現在地から目的地までの移動経路を生成する。目的地は、タスク取得処理によって取得した情報に応じて設定されてよい。例えば、タスク取得処理によって部品IDを取得した場合には、移動経路生成部11は、当該部品IDにより特定される部品が置かれている場所を目的地とし、現在地から当該目的地までの経路を生成する。生成された経路は、移動ロボット100の移動経路として設定される。その際、候補となる経路が複数生成された場合には目的地まで最も早く到達できる経路または目的地までの距離が最も短い経路を優先する等の設定条件が予め定められていてもよい。目的地までの移動経路が設定されると、移動ロボット100を目的地まで移動させるための移動処理が実行される(S13)。 In step S12, the controller 1 (movement route generation unit 11) generates a movement route from the current location to the destination. The destination may be set according to the information acquired by the task acquisition process. For example, when the component ID is acquired by the task acquisition process, the movement route generation unit 11 sets the location where the component specified by the component ID is placed as the destination, and sets the route from the current location to the destination. Generate. The generated route is set as the movement route of the mobile robot 100. At that time, when a plurality of candidate routes are generated, setting conditions such as giving priority to the route that can reach the destination earliest or the route that has the shortest distance to the destination may be predetermined. When the movement route to the destination is set, the movement process for moving the mobile robot 100 to the destination is executed (S13).
 ステップS13では、コントローラ1(移動制御部12)は、移動ロボット100を目的地まで移動させる移動処理を実行する。移動処理では、移動制御部12がモータ2を制御して、移動ロボット100を設定された移動経路に沿って目的地まで移動させる。本ステップで実行される移動処理の具体的な手法は特に制限されず、公知の種々の手法が適宜採用されてよい。例えば、移動ロボット100は、パスプランニングにより計画された経路と、予め記憶された作業領域のマップデータとに基づいて、オドメトリ、スキャンマッチング等の公知の自己位置推定を行いながらモータ2を制御して、目的地までの自律的な移動(自律走行)を実現するように構成されてよい。 In step S13, the controller 1 (movement control unit 12) executes a movement process for moving the mobile robot 100 to the destination. In the movement process, the movement control unit 12 controls the motor 2 to move the mobile robot 100 to the destination along the set movement route. The specific method of the movement process executed in this step is not particularly limited, and various known methods may be adopted as appropriate. For example, the mobile robot 100 may be configured to realize autonomous movement (autonomous travel) to the destination by controlling the motor 2 while performing known self-position estimation such as odometry and scan matching, based on the route planned by path planning and the map data of the work area stored in advance.
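The odometry mentioned above can be illustrated with a single dead-reckoning update. This is a generic planar motion model (constant linear velocity v and angular velocity w over a time step), not a method specified by the disclosure.

```python
import math

def integrate_odometry(pose, v, w, dt):
    """One odometry update for a planar robot: advance pose
    (x, y, theta) given linear velocity v [m/s] and angular velocity
    w [rad/s] over dt seconds, using exact arc integration."""
    x, y, theta = pose
    if abs(w) < 1e-9:                          # straight-line motion
        return (x + v * dt * math.cos(theta),
                y + v * dt * math.sin(theta),
                theta)
    r = v / w                                  # turning radius
    return (x + r * (math.sin(theta + w * dt) - math.sin(theta)),
            y - r * (math.cos(theta + w * dt) - math.cos(theta)),
            theta + w * dt)
```

In practice the drift of pure odometry is corrected by scan matching against the stored map, as the paragraph above notes.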
 ステップS13の移動処理の詳細について、図5を参照して説明する。 The details of the movement process in step S13 will be described with reference to FIG.
 図5は、本実施形態の移動ロボット100が行う移動処理を説明するフローチャートである。本フローチャートで説明する移動処理は、移動ロボット100が設定された移動経路に沿って目的地まで移動している間、常時実行される。なお、コントローラ1が備える記憶媒体には、図示のフローチャートを参照して以下に説明する処理を実行する制御プログラムが格納されている。 FIG. 5 is a flowchart illustrating a movement process performed by the mobile robot 100 of the present embodiment. The movement process described in this flowchart is always executed while the mobile robot 100 is moving to the destination along the set movement route. The storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
 In step S131, the controller 1 (obstacle detection unit 13) detects an obstacle on the movement route within a predetermined distance. A specific example of how the obstacle detection unit 13 detects an obstacle on the movement route will be described with reference to FIG. 6.
 FIG. 6 is a diagram illustrating how the obstacle detection unit 13 of the present embodiment detects an obstacle on the movement route. FIG. 6 shows the mobile robot 100 and an obstacle 300 detected by the detection means 4. The dotted arrow extending from the mobile robot 100 indicates the generated movement route. The solid arrow extending from the mobile robot 100 indicates the robot's immediate movement direction (the direction of movement from the current location), that is, the direction from the current location toward the first turning point (waypoint) on the movement route.
 Obstacle detection by the obstacle detection unit 13 of the present embodiment may be performed by regarding the mobile robot 100 and the obstacle 300 as simple two-dimensional figures projected onto the floor surface from above and determining whether those two-dimensional figures interfere with each other. As illustrated in FIG. 6, the mobile robot 100 is represented, for example, by a series of circles projected for each link of the mobile carriage 20 and the arm 40. The obstacle (person) 300 is represented by a circle that accounts for the area the person occupies horizontally when the arms are extended. When the obstacle is not a person, it may be represented by a two-dimensional figure appropriately selected according to the obstacle's outer shape. The outer edge of a projected two-dimensional figure, for example the diameter of a circle, need not exactly match the maximum horizontal width of the mobile robot 100 or of the obstacle 300 as viewed from above; a projected figure may be adjusted as appropriate, for example widened slightly for a safety margin or narrowed in view of the realistic movable range.
 The circles corresponding to the mobile robot 100 are generated along the movement route over which the mobile robot 100 moves a predetermined distance or for a predetermined time. In FIG. 6, the circles corresponding to the mobile robot 100 moving within the predetermined distance indicated by the dotted circle are shown in time series. In the figure, the circle corresponding to the mobile robot 100 at its current location is denoted (t), and the circles generated in time series along the movement route are denoted (t+1) and (t+2). The obstacle detection unit 13 then calculates whether a circle corresponding to the mobile robot 100 and the circle corresponding to the obstacle 300 come into contact (interfere) within the predetermined distance. Whether they make contact can be easily calculated from, for example, the distance between the centers of the circles and their respective radii. Thus, as illustrated, the obstacle detection unit 13 can easily detect that the mobile robot 100 may contact or collide with the obstacle 300 at (t+2), that is, that the movement route is blocked by the obstacle 300 at the position (t+2).
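The circle-based interference test described above reduces to a standard circle-circle intersection check applied at successive positions along the route. The following sketch shows that check under assumed names; the single-circle robot footprint is a simplification of the multi-circle representation in the text.

```python
def circles_intersect(c1, r1, c2, r2):
    """True if two circles touch or overlap (center distance <= sum of radii).
    Compared in squared form to avoid an unnecessary square root."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    return dx * dx + dy * dy <= (r1 + r2) ** 2

def first_collision(robot_positions, robot_radius, obstacle_center, obstacle_radius):
    """Walk the robot circle along the time-series positions (t), (t+1), ...
    and return the index of the first position that interferes with the
    obstacle circle, or None if the route is clear within the checked range."""
    for t, center in enumerate(robot_positions):
        if circles_intersect(center, robot_radius, obstacle_center, obstacle_radius):
            return t
    return None
```

For example, with robot positions at (0,0), (1,0), (2,0) and radius 0.5, an obstacle of radius 0.5 at (2.5, 0) first interferes at index 2, matching the (t+2) collision illustrated in FIG. 6.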
 The predetermined distance indicated by the dotted circle in the figure, that is, the range from the current location within which obstacles on the movement route are detected, may be set as appropriate. The predetermined distance here is mainly intended to exclude the detection of obstacles that are too far from the mobile robot 100, and may be set appropriately, for example to 5 m, according to the movement speed of the mobile robot 100, the size of the work area, and the like. The figure shows an example of detecting an obstacle within a range closer than the waypoint, that is, an obstacle with which the mobile robot 100 could collide when moving in its immediate movement direction, but the present invention is not limited to this. The predetermined distance within which obstacles are detected may be set as appropriate and, in principle within the range detectable by the detection means 4, may be set to extend beyond the waypoint.
 The collision determination method that generates two-dimensional figures as collision verification figures as described above is merely an example, and the present invention is not limited to it. The obstacle detection unit 13 may instead generate, as a collision verification figure, a band along the movement route having a width corresponding to the horizontal width of the mobile robot 100. The obstacle detection unit 13 may also adopt another known collision determination method that generates three-dimensional figures. In that case, the mobile robot 100 and the obstacle 300 may be represented as collision verification figures by a series of three-dimensional spheres or by polygon meshes.
 Returning to FIG. 5, the description continues. If no obstacle is detected on the movement route in step S131 (NO), steps S131 and S132 are repeatedly executed until the destination is reached. If an obstacle is detected on the movement route (YES), step S133 is executed.
 In step S133, the controller 1 determines, based on a prior setting, whether to generate another movement route that avoids the obstacle. If it has been set in advance that the movement route is to be regenerated when an obstacle is detected (YES), the process proceeds to step S134. If it has not been set in advance that the movement route is to be regenerated when an obstacle is detected (NO), the process proceeds to the obstacle handling process in step S136. The method of making the setting regarding movement route regeneration is not limited; for example, it may be made via known wireless communication technology or via information input means not shown (for example, a touch panel). The setting regarding movement route regeneration may also be omitted; in that case, step S133 may be deleted, and the controller 1 executes step S134 whenever an obstacle is detected on the movement route (S131, YES).
 In step S134, the controller 1 (movement route generation unit 11) attempts to generate a movement route (avoidance route) that avoids the obstacle detected in step S131 and can reach the destination from the current location. An upper limit on the distance or time required to reach the destination may be set in advance as a condition for generating the avoidance route. For example, suppose a distance limit of 100 m is set, an obstacle is detected on the passage to the destination, the obstacle cannot be passed given the width of the passage, and the only detour makes the distance from the current location to the destination exceed 100 m. In that case, no avoidance route is generated and no new movement route is set. As in the movement route generation process of step S12, when a plurality of candidate routes are generated, a condition such as giving priority to the route that reaches the destination earliest or to the route with the shortest distance may be predetermined.
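The avoidance-route condition in step S134 (return the best detour only if it stays within a preset upper limit, otherwise give up) can be sketched as follows. The function name, waypoint representation, and the 100 m default are assumptions for illustration only; the 100 m figure merely mirrors the example in the text.

```python
def generate_avoidance_route(candidates, max_length=100.0):
    """Among candidate detours (lists of (x, y) waypoints) that avoid the
    obstacle, return the shortest one whose total length is within the
    preset upper limit. Return None if no candidate qualifies, in which
    case no new movement route is set and the process falls through to
    the obstacle handling process (S136)."""
    def length(route):
        return sum(
            ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(route, route[1:])
        )
    feasible = [r for r in candidates if length(r) <= max_length]
    return min(feasible, key=length) if feasible else None
```

A time-based limit would work the same way with an estimated travel time in place of `length`.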
 When an avoidance route is generated (S134, YES), a movement route resetting process is executed that sets the generated avoidance route as the new movement route (S135), and steps S131 and S132 are repeatedly executed until the destination is reached along the reset movement route. Meanwhile, the mobile robot 100 moves along the movement route, and when it is determined that the destination has been reached (S132, YES), the movement process ends.
 On the other hand, if in step S134 no avoidance route is generated, either because no route exists that avoids the obstacle and can reach the destination from the current location or because no route satisfies the above-described upper limit or setting condition (S134, NO), the process proceeds to the obstacle handling process in step S136 in order to execute a predetermined response for removing the obstacle.
 From the time an obstacle is detected in step S131 until the obstacle handling process (S136) or the movement route resetting process (S135) is executed, the controller 1 (movement control unit 12) may execute a process that decelerates or stops the movement of the mobile robot 100 (deceleration process). As a result, when an obstacle is detected during movement, the obstacle handling process (S136) or the movement route resetting process (S135) can be executed safely before the robot collides with the obstacle. The degree of deceleration and the stopping position (distance to the obstacle) may be set in advance, or may be adjusted as appropriate according to the movement speed of the mobile robot 100 at the time the obstacle is detected.
 The above are the details of the movement process performed by the mobile robot 100 of the present embodiment. Next, the details of the obstacle handling process will be described with reference to FIG. 7.
 FIG. 7 is a flowchart illustrating the obstacle handling process (S136) performed by the mobile robot 100 of the present embodiment. The storage medium included in the controller 1 stores a control program that executes the processes described below with reference to the illustrated flowchart.
 In step S1361, the controller 1 (obstacle detection unit 13) determines whether the obstacle is a person. Whether the obstacle is a person can easily be determined by analyzing the detection data (image data) acquired by the detection means 4. The detection means 4 may also be configured to detect infrared radiation emitted from the obstacle so as to determine from its temperature whether it is a person. If the obstacle is a person (YES), step S1362 is executed to determine whether a plurality of obstacles (persons) have been detected. If the obstacle is not a person (NO), the process proceeds to step S1365 to execute an object handling process for the non-human obstacle. The details of the object handling process will be described later with reference to FIG. 10.
 In step S1362, the controller 1 determines whether a plurality of persons have been detected. If only one person has been detected rather than a plurality (NO), the process proceeds to step S1364 to execute a person handling process for the detected person. The details of the person handling process will be described later with reference to FIG. 9. On the other hand, if a plurality of persons have been detected (YES), a person identification process (S1363) is executed to identify the target of the person handling process. The details of the person identification process will be described with reference to FIG. 8.
 FIG. 8 is a diagram illustrating the person identification process. FIG. 8 is a top view of the positional relationship between the mobile robot 100 and persons, showing the mobile robot 100, the movement route indicated by the dotted arrow extending from the mobile robot 100, the immediate movement direction indicated by the solid arrow extending from the mobile robot 100, the predetermined distance from the mobile robot 100 indicated by the dotted circle, and a plurality of persons including person A and person B.
 Person A and person B shown in FIG. 8 are both persons detected as obstacles on the movement route within the predetermined distance. When a plurality of persons are detected in this way as obstacles on the movement route within the predetermined distance, the controller 1 identifies the person with whom the mobile robot 100 is predicted to first make contact or collide if it moves. Specifically, the person closest to the mobile robot 100 (person A) is identified as the person with whom a collision is first predicted. However, the movement speed and movement direction of each person may also be taken into account in identifying the person with whom a collision is first predicted. In that case, for example, if person A is slowly moving away from the mobile robot 100 while person B is rapidly approaching it, person B may be identified as the person with whom a collision is first predicted. When the target of the person handling process has been identified by the person identification process, the person handling process of step S1364 is executed.
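One simple way to take speed and direction into account, as the person identification process above allows, is to project each person's position a short horizon into the future and pick whoever will then be closest to the robot. The sketch below is a hypothetical illustration; the patent does not specify this formula, and all names and the one-second horizon are assumptions.

```python
def predicted_distance(robot_pos, person_pos, person_vel, horizon=1.0):
    """Distance from the robot to the person's position `horizon` seconds ahead,
    assuming constant velocity (a linear extrapolation)."""
    px = person_pos[0] + person_vel[0] * horizon
    py = person_pos[1] + person_vel[1] * horizon
    return ((px - robot_pos[0]) ** 2 + (py - robot_pos[1]) ** 2) ** 0.5

def identify_first_collision_person(robot_pos, people, horizon=1.0):
    """people maps a label to (position, velocity). Returns the label of the
    person expected to be encountered first (smallest predicted distance)."""
    return min(
        people,
        key=lambda name: predicted_distance(robot_pos, *people[name], horizon),
    )
```

With the FIG. 8 scenario (person A slowly receding, person B rapidly approaching), this rule selects person B even though person A is currently nearer.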
 In step S1364, since the obstacle has been determined to be a person (S1361, YES), one or more predetermined responses selected for the case where the target is a person are executed with respect to the detected obstacle. This will be described specifically with reference to FIG. 9.
 FIG. 9 is a diagram illustrating the person handling process. The person handling process of the present embodiment includes at least one of the following three responses. Specifically, the person handling process executes any one of, or a combination of two or more of: stopping the movement of the mobile robot 100 (person response 1), directing the face 31 toward the person's head (person response 2), and outputting a predetermined sound (person response 3). The dotted arrow in the figure indicates the movement route of the mobile robot 100.
 FIG. 9(a) is a diagram illustrating the mobile robot 100 executing one or both of person response 1 and person response 3. As illustrated, the mobile robot 100 may stand still in front of the person who is the obstacle (person response 1). For example, if it has stopped short of the obstacle through the deceleration process described above, it maintains that stopped state. This response gives priority to the person's behavior: in principle, the robot quietly waits for the person to notice its presence and step aside from the movement route.
 The mobile robot 100 may also output a predetermined sound via the speaker 5 in the state of FIG. 9(a) (person response 3). The predetermined sound may be a warning sound such as a buzzer, or speech such as "Excuse me, please step aside" or "Coming through". The type of the predetermined sound may be appropriately selected, according to prior settings or the like, from a plurality of patterns stored in advance. Not only the type of sound but also the volume, the number of repetitions (number of outputs), the output duration, and the like may be configurable in advance. This allows the mobile robot 100 to output a sound appropriate to the scene in which it is used and to the wishes of the user who manages it.
 When a predetermined sound is output in combination with standing still (person response 1), the output timing may also be set as appropriate. For example, the robot may be set to output the sound when the person has not moved aside even after the stationary state has continued for 5 seconds.
 The type of output sound, its volume, the number of repetitions, and so on may also be varied according to how long the person handling process has continued. For example, a plurality of sound types whose strength of appeal differs according to the strength (connotation) of the words, such as "Excuse me", "Please step aside", and "Out of the way!", may be prepared, and the type of output sound may be changed to stronger wording at every predetermined time interval so that the degree of appeal gradually increases. The predetermined time here may be set as appropriate, and the time intervals for the stepwise changes need not be equal.
 In addition to the type of sound, the volume of the predetermined sound may be gradually raised at every predetermined time interval. In this way, in person response 3, the strength of the appeal (the degree of urging) can be raised in stages by gradually increasing the volume and gradually strengthening the type of sound (the content of the words). This allows the mobile robot 100 to urge a person to step aside with an appropriate strength that changes in stages according to the person's behavior, without discomforting or startling the person by suddenly uttering strong words at high volume.
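The staged escalation described above (stronger wording and higher volume at each predetermined interval, both capped at a maximum) can be sketched as a simple schedule function. Everything here, including the example phrases, the 5-second step, and the volume scale, is a hypothetical illustration rather than the disclosed implementation.

```python
# Escalating prompts, weakest to strongest; the wording is illustrative only.
PROMPTS = ["Excuse me.", "Please step aside.", "Out of the way!"]

def prompt_for_elapsed(elapsed_s, step_s=5.0, base_volume=0.3, volume_step=0.2):
    """Return (message, volume) for the time elapsed since the robot stopped.
    Every `step_s` seconds the wording strengthens and the volume rises,
    each capped at its maximum (last prompt, volume 1.0). The intervals
    could also be made non-uniform, as the text notes."""
    stage = min(int(elapsed_s // step_s), len(PROMPTS) - 1)
    volume = min(base_volume + volume_step * stage, 1.0)
    return PROMPTS[stage], volume
```

For instance, the prompt stays at the mildest phrasing for the first interval and reaches the strongest phrasing only after two intervals have elapsed.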
 FIG. 9(b) is a diagram illustrating the mobile robot 100 executing the response of directing the face 31 toward the person's head (person response 2). As illustrated, the obstacle handling unit 14 (head drive control unit 14a) may direct the face 31 toward the person's head as a person handling process. This urges the person on the movement route to step aside more strongly than person response 1, which merely stands still, and more gently than person response 3, which outputs a sound. Person response 2 may be executed in combination with at least one of person response 1 and person response 3, and the timing of the combination may be set as appropriate. For example, the robot may be configured to direct the face 31 toward the person's head after a predetermined time has elapsed in the stationary state, and to output a sound when a further predetermined time has elapsed. This enables finer-grained urging according to the situation. The above are the details of the person handling process of the present embodiment. When the obstacle is a person, the obstacle handling unit 14 is configured to execute, with respect to that person, one or more predetermined responses selected from person responses 1 to 3 described above.
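Directing the face 31 at a detected head position, as in person response 2, amounts to computing pan (yaw) and tilt (pitch) angles from the robot's head to the person's head. The geometry below is a generic sketch under assumed names and a simple two-axis head model; the patent does not disclose the head's kinematics.

```python
import math

def head_pan_tilt(robot_head_pos, person_head_pos):
    """Pan and tilt angles (radians) that point the robot's face at a person's
    head, given both positions as (x, y, z) in a common frame. Pan is measured
    in the horizontal plane; tilt is the elevation above that plane."""
    dx = person_head_pos[0] - robot_head_pos[0]
    dy = person_head_pos[1] - robot_head_pos[1]
    dz = person_head_pos[2] - robot_head_pos[2]
    pan = math.atan2(dy, dx)                      # horizontal heading
    tilt = math.atan2(dz, math.hypot(dx, dy))     # elevation angle
    return pan, tilt
```

The head drive control unit 14a would then command the head joints toward these angles, subject to joint limits.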
 Next, the details of the object handling process in step S1365 will be described with reference to FIG. 10.
 FIG. 10 is a diagram illustrating the object handling process. The object handling process is performed when it is determined that the obstacle is not a person (S1361, NO), and includes at least one or both of the following two responses. Specifically, the object handling process, after stopping the movement of the mobile robot 100, executes either or both of directing the face 31 toward the head of a nearby person (object response 1) and outputting a sound to nearby persons (object response 2). As with person response 1 described above, a response of standing still in front of the obstacle (object response 3) may also be executed. The dotted arrow in the figure indicates the movement route of the mobile robot 100.
 FIG. 10(a) shows a state in which the mobile robot 100 has stopped in front of an obstacle (an object). At this time, the obstacle detection unit 13 detects the positions of persons around the mobile robot 100, in particular the positions of their heads.
 Then, as shown in FIG. 10(b), the obstacle handling unit 14 (head drive control unit 14a) may direct the face 31 toward the head of a nearby person as an object handling process (object response 1). This allows the mobile robot 100 to actively signal to the people around it that its movement route is blocked by an obstacle. Further, by outputting a predetermined sound to the person toward whom the face 31 is directed (object response 2), the robot can directly ask nearby people to remove the obstacle from the movement route. The type of sound output at this time may differ from that used in the person handling process and may be, for example, speech such as the following.
 That is, the obstacle handling unit 14 (notification control unit 14b) may, in the object handling process (object response 2), output to nearby people speech requesting that the obstacle be removed from the movement route, such as "Please move this object". This allows the mobile robot 100 to more clearly ask the people around it to remove the obstacle from the movement route in order to keep that route clear. A so-called directional speaker may be adopted as the speaker 5 that outputs the sound. This makes it possible to ask only a specific nearby person for help, even at sites where sounds must not be emitted indiscriminately. Object response 2 may also be executed in the stationary state shown in FIG. 10(a) (object response 3); in that case, the mobile robot 100 may output a sound toward its surroundings, such as "Could someone move this?", without limiting the target of the sound. When outputting sound in the object handling process, the type of sound, its volume, the number of repetitions, and the output duration may likewise be configurable, as described above for the person handling process.
 Although not illustrated, the obstacle handling unit 14 may, as an object handling process, use the arm 40 to point at the obstacle it wants removed (object response 4). This allows the mobile robot 100 to more clearly convey to the people around it which obstacle it wants removed from the movement route. Various motions may be adopted for pointing at the obstacle. For example, the mobile robot 100 may direct the tip of the arm 40 toward the obstacle in the way a person points at an object, or may additionally move the tip of the arm 40 back and forth toward the obstacle to point at it with greater emphasis. Object response 4 may also be executed in combination with any one or more of object responses 1 to 3 described above.
 The above are the details of the object handling process of the present embodiment. When the obstacle is an object, the obstacle handling unit 14 is configured to execute, with respect to the surrounding people, one or more predetermined responses selected from object responses 1 to 4 described above.
 The above are the details of the movement control performed by the mobile robot 100 of one embodiment. The flowcharts shown in FIGS. 4, 5, and 7 are examples for explaining the movement control of the present embodiment; they do not require execution exactly as illustrated and may be modified as appropriate as long as the technical effects described above are achieved.
 For example, as described above, step S133 shown in FIG. 5 may be deleted. In that case, the mobile robot 100 may be configured to attempt to generate an avoidance route when an obstacle is detected, and to execute the obstacle handling process when no suitable avoidance route is generated.
 Alternatively, when an obstacle is detected (S131, YES) and the obstacle has not been removed from the movement route even after a predetermined time has elapsed since the predetermined response was executed as the obstacle handling process, the mobile robot 100 may attempt to generate a movement route (avoidance route) that avoids the obstacle and reaches the destination from the current location (S134). In other words, when the mobile robot 100 detects an obstacle, it may be configured to first execute the obstacle handling process and then generate an avoidance route if the obstacle is still not removed from the movement route after the predetermined time has elapsed. The predetermined time here may be set as appropriate and is not particularly limited.
 The above is the detail of the mobile robot 100 of this embodiment. As described above, when an obstacle is detected on the movement route, the mobile robot 100 regenerates a movement route that avoids the obstacle and can reach the destination and, if such a route cannot be regenerated, executes a predetermined response for removing the obstacle from the movement route. Alternatively, when an obstacle is detected on the movement route, the mobile robot 100 executes a predetermined response for removing the obstacle from the movement route and, if the obstacle is not removed even after the predetermined response has been executed, regenerates a movement route that avoids the obstacle and can reach the destination. As a result, even when its movement route is blocked by an obstacle, the mobile robot 100 can actively secure a route and move to the destination, suppressing a decrease in work efficiency.
 Although embodiments of the present invention have been described above, the above embodiments show only some examples of application of the present invention, and are not intended to limit the technical scope of the present invention to the specific configurations of the above embodiments.
 For example, the notification means 5 need not be a speaker, and may be a lamp, a laser, a display, or the like that appeals to human vision using light. Further, even when performing a predetermined notification using the notification means 5 configured in this way, the obstacle handling unit 14 may change the strength of the appeal at predetermined time intervals in the obstacle handling process. For example, even when a predetermined notification is performed using a display as the notification means 5, the obstacle handling unit 14 may change the content shown on the display so that the strength of the appeal increases at predetermined time intervals, in the same manner as the predetermined notification using the speaker described above.
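The time-based escalation described here, which applies to a speaker and a display alike, can be illustrated with a short sketch. The message tiers, volumes, and interval below are invented for illustration; the publication leaves them open ("set as appropriate").

```python
import time

# Illustrative escalation tiers, ordered from weakest to strongest appeal.
# The actual wording, volume steps, and interval are design choices that
# the description leaves open.
ESCALATION_TIERS = [
    {"message": "Excuse me, may I pass?", "volume": 0.4},
    {"message": "Please clear the route.", "volume": 0.7},
    {"message": "Route blocked. Please move the obstacle.", "volume": 1.0},
]

def escalate(notify, is_cleared, interval_s=0.1, max_rounds=3):
    """Raise the strength of the appeal at each predetermined interval
    until the route is cleared or the strongest tier has been issued.
    Returns the number of notifications issued."""
    for round_no, tier in enumerate(ESCALATION_TIERS[:max_rounds]):
        notify(tier["message"], tier["volume"])  # speaker, display, lamp, ...
        time.sleep(interval_s)
        if is_cleared():
            return round_no + 1
    return max_rounds
```

The same loop serves any notification means: `notify` may drive a speaker (varying volume and phrasing), a display (varying the shown text), or a lamp (varying blink intensity).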
 Further, the configuration of the mobile robot 100 shown in FIG. 2 may be changed as appropriate as long as it includes at least the head 30 and is configured to be movable along a movement route. For example, the mobile robot 100 need not include the robot main body 10; in that case, the head 30 may be provided directly on the moving carriage 20.
 The present invention is applicable at least in industries that manufacture robot control devices and the like.
 1 Controller
 3 Head drive means (drive means)
 5 Speaker
 11 Movement route setting unit
 12 Movement control unit
 13 Obstacle detection unit
 14 Obstacle handling unit
 21 Person detection unit (detection unit)
 22 Collision prediction unit (collision determination unit)
 30 Head
 40 Arm
 100 Mobile robot
 300 Obstacle

Claims (19)

  1.  A robot control device for controlling a mobile robot, comprising:
     a movement route generation unit that generates a movement route from a current location to a destination;
     a movement control unit that moves the mobile robot along the generated movement route;
     an obstacle detection unit that detects an obstacle on the movement route within a predetermined distance; and
     an obstacle handling unit that executes a predetermined response for removing the obstacle from the movement route, wherein
     the obstacle detection unit detects whether or not the obstacle is a person, and
     the obstacle handling unit executes one or more of the predetermined responses selected depending on whether or not the obstacle is a person.
  2.  The robot control device according to claim 1, wherein the mobile robot further comprises:
     a head whose front serves as a face; and
     a driving means that drives the head, and
     when the detected obstacle is a person,
     the obstacle detection unit detects the position of the person's head, and
     the obstacle handling unit, as the predetermined response, controls the driving means to direct the face toward the person's head.
  3.  The robot control device according to claim 2, further comprising a specifying unit that, when the detected obstacle is a person and there are a plurality of such persons, identifies the person closest to the mobile robot, wherein
     the obstacle handling unit, when directing the face toward a person's head as the predetermined response, directs the face toward the head of the person identified by the specifying unit.
  4.  The robot control device according to any one of claims 1 to 3, wherein the mobile robot further comprises:
     a head whose front serves as a face; and
     a driving means that drives the head, and
     when the detected obstacle is not a person,
     the obstacle detection unit detects the position of the head of a person in the vicinity of the mobile robot, and
     the obstacle handling unit, as the predetermined response, controls the driving means to direct the face toward the head of the person in the vicinity.
  5.  The robot control device according to any one of claims 1 to 4, further comprising a notification means for performing a predetermined notification as the predetermined response.
  6.  The robot control device according to claim 5, wherein the mobile robot comprises a speaker as the notification means, and
     the obstacle handling unit, as the predetermined response, uses the speaker to output a predetermined sound selected depending on whether or not the obstacle is a person.
  7.  The robot control device according to claim 6, wherein the predetermined sound includes a plurality of types of voice differing in strength of appeal, and
     the obstacle handling unit changes the type of the predetermined sound at predetermined time intervals so that the strength of appeal increases.
  8.  The robot control device according to claim 6 or 7, wherein the obstacle handling unit increases the volume of the predetermined sound at predetermined time intervals.
  9.  The robot control device according to any one of claims 6 to 8, wherein at least one of the type, volume, number of outputs, and output duration of the predetermined sound output by the obstacle handling unit is configurable in advance.
  10.  The robot control device according to any one of claims 6 to 9, wherein the speaker is arranged so that the predetermined sound is output from the head.
  11.  The robot control device according to any one of claims 5 to 10, wherein the obstacle handling unit changes the strength of appeal by the notification means at predetermined time intervals.
  12.  The robot control device according to any one of claims 1 to 11, wherein, when the obstacle is detected, the movement route generation unit generates a movement route that avoids the obstacle and reaches the destination from the current location, and
     the obstacle handling unit executes the predetermined response when a movement route that avoids the obstacle and reaches the destination from the current location cannot be generated.
  13.  The robot control device according to any one of claims 1 to 11, wherein, when the obstacle is detected and the obstacle has not been removed from the movement route even after a predetermined time has elapsed since the obstacle handling unit executed the predetermined response, the movement route generation unit generates a movement route that avoids the obstacle and reaches the destination from the current location.
  14.  The robot control device according to any one of claims 1 to 13, wherein the movement control unit decelerates or stops the mobile robot when the obstacle is detected during movement.
  15.  The robot control device according to claim 14, wherein, when the mobile robot has stopped upon detection of the obstacle, the obstacle handling unit maintains the stopped state of the mobile robot as the predetermined response.
  16.  The robot control device according to any one of claims 1 to 15, wherein the mobile robot further comprises one or more arms having a manipulator mechanism.
  17.  The robot control device according to claim 16, wherein, when the detected obstacle is not a person, the obstacle handling unit points at the obstacle with the arm as the predetermined response.
  18.  A robot control method for controlling a mobile robot, comprising:
     a movement route generation step of generating a movement route from a current location to a destination;
     a movement control step of moving the mobile robot along the generated movement route;
     an obstacle detection step of detecting an obstacle on the movement route; and
     an obstacle handling step of executing a predetermined response for removing the obstacle from the movement route, wherein
     the obstacle detection step detects whether or not the obstacle is a person, and
     the obstacle handling step executes one or more of the predetermined responses selected depending on whether or not the obstacle is a person.
  19.  A robot control program for controlling a mobile robot, comprising:
     a movement route generation step of generating a movement route from a current location to a destination;
     a movement control step of moving the mobile robot along the generated movement route;
     an obstacle detection step of detecting an obstacle on the movement route; and
     an obstacle handling step of executing a predetermined response for removing the obstacle from the movement route, wherein
     the obstacle detection step detects whether or not the obstacle is a person, and
     the obstacle handling step executes one or more of the predetermined responses selected depending on whether or not the obstacle is a person.
PCT/JP2021/012351 2020-04-30 2021-03-24 Robot control device, method, and program WO2021220679A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020080586A JP2021171905A (en) 2020-04-30 2020-04-30 Robot control device, method, and program
JP2020-080586 2020-04-30

Publications (1)

Publication Number Publication Date
WO2021220679A1 true WO2021220679A1 (en) 2021-11-04

Family

ID=78281162

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/012351 WO2021220679A1 (en) 2020-04-30 2021-03-24 Robot control device, method, and program

Country Status (2)

Country Link
JP (1) JP2021171905A (en)
WO (1) WO2021220679A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115958575A (en) * 2023-03-16 2023-04-14 中国科学院自动化研究所 Humanoid dexterous operation mobile robot

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023089817A1 (en) * 2021-11-22 2023-05-25 三菱電機株式会社 Information processing device, simulation system, simulation method, and simulation program
JP7434635B1 (en) 2023-03-28 2024-02-20 Kddi株式会社 Information processing device, information processing method and program
JP7434634B1 (en) 2023-03-28 2024-02-20 Kddi株式会社 Information processing device, information processing method and program
JP7397228B1 (en) 2023-03-31 2023-12-12 Kddi株式会社 Information processing device, information processing method, program and information processing system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001129787A (en) * 1999-11-02 2001-05-15 Atr Media Integration & Communications Res Lab Autonomous mobile robot
JP2011204145A (en) * 2010-03-26 2011-10-13 Sony Corp Moving device, moving method and program
JP2013537487A (en) * 2010-05-20 2013-10-03 アイロボット コーポレイション Mobile human interface robot
WO2015147149A1 (en) * 2014-03-28 2015-10-01 ヤンマー株式会社 Autonomously traveling work vehicle
JP2016081403A (en) * 2014-10-21 2016-05-16 株式会社Ihiエアロスペース Unmanned moving body and method of creating route for unmanned moving body



Also Published As

Publication number Publication date
JP2021171905A (en) 2021-11-01

Similar Documents

Publication Publication Date Title
WO2021220679A1 (en) Robot control device, method, and program
US10293484B2 (en) Robot device and method of controlling the robot device
JP6936081B2 (en) robot
US9020682B2 (en) Autonomous mobile body
JP5987842B2 (en) Communication pull-in system, communication pull-in method and communication pull-in program
US9079307B2 (en) Autonomous locomotion apparatus, autonomous locomotion method, and program for autonomous locomotion apparatus
EP3169487A1 (en) Virtual safety cages for robotic devices
KR20170042547A (en) Humanoid robot with collision avoidance and trajectory recovery capabilities
JP2009113190A (en) Autonomous working robot and method of controlling operation of autonomous working robot
US11154991B2 (en) Interactive autonomous robot configured for programmatic interpretation of social cues
JP2017528779A (en) Vehicle vehicle tracking and control system using passive tracking elements
JP2016151897A (en) Mobile body control device and mobile body control method
US11241790B2 (en) Autonomous moving body and control program for autonomous moving body
KR20190105214A (en) Robot cleaner for escaping constrained situation through artificial intelligence and operating method thereof
JP2003140747A (en) Autonomous moving device and system for operating the same
WO2019244644A1 (en) Mobile body control device, mobile body control method, and program
JP2013188815A (en) Control device, control method and computer program
JP5552710B2 (en) Robot movement control system, robot movement control program, and robot movement control method
KR20210026595A (en) Method of moving in administrator mode and robot of implementing thereof
WO2021220678A1 (en) Mobile robot, method for controlling mobile robot, and control program
JP2020021351A (en) Robot, robot control program and robot control method
JP2009151382A (en) Traveling object
JP7258438B2 (en) ROBOT, ROBOT CONTROL PROGRAM AND ROBOT CONTROL METHOD
JP7360792B2 (en) Mobile object, learning device, and learning device manufacturing method
WO2021235141A1 (en) Moving body and control device, and method and program therefor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21795466

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21795466

Country of ref document: EP

Kind code of ref document: A1