WO2023171500A1 - Travel robot system - Google Patents

Travel robot system

Info

Publication number
WO2023171500A1
Authority
WO
WIPO (PCT)
Prior art keywords
obstacle
guided vehicle
automatic guided
robot
travel
Prior art date
Application number
PCT/JP2023/007606
Other languages
French (fr)
Japanese (ja)
Inventor
秀樹 長末
昌昭 中川
Original Assignee
DMG Mori Co., Ltd. (DMG森精機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by DMG Mori Co., Ltd. (DMG森精機株式会社)
Publication of WO2023171500A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00: Manipulators mounted on wheels or on carriages
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions

Definitions

  • The present invention relates to a traveling robot system that includes an automatic guided vehicle and a robot mounted on the automatic guided vehicle.
  • A known traveling robot system of this kind includes a storage unit that stores an environmental map, a travel route determination unit that determines a scheduled travel route from the vehicle's own position on the environmental map to a travel destination, a travel control unit that causes the automatic guided vehicle to travel along the scheduled travel route determined by the travel route determination unit, and an obstacle detection unit that detects obstacles.
  • When an obstacle is detected by the obstacle detection unit, the travel control unit searches for an avoidance route and causes the automatic guided vehicle to travel along it. The automatic guided vehicle thus travels to the destination while avoiding obstacles, repeating obstacle detection and avoidance-route searches.
  • If the travel control unit cannot find an avoidance route, the automatic guided vehicle stops near the obstacle.
  • Besides systems in which the automatic guided vehicle performs avoidance travel in this way, systems are also known in which, when an obstacle is detected by the obstacle detection unit, the automatic guided vehicle is stopped immediately without searching for an avoidance route.
  • In that case, the operator can remove the obstacle and restart the automatic guided vehicle.
  • However, the operator cannot easily determine which obstacle caused the automatic guided vehicle to stop, and may be unable to identify the obstacle to remove, so the recovery work cannot be done efficiently.
  • The present invention has been made in view of the above circumstances, and its object is to provide a traveling robot system that automatically stops the automatic guided vehicle when it is predicted to interfere with an obstacle, and that allows surrounding workers to easily recognize the obstacle that caused the stop and should be removed.
  • A first aspect of the invention relates to a traveling robot system comprising an automatic guided vehicle; a travel route determination unit that determines a scheduled travel route of the automatic guided vehicle; a travel control unit that causes the automatic guided vehicle to travel along the scheduled travel route determined by the travel route determination unit; a robot mounted on the automatic guided vehicle; and a robot control unit that controls the operation of the robot. The system further comprises an object detection sensor, mounted on the automatic guided vehicle, that detects an object existing within a predetermined range of the automatic guided vehicle and outputs position information from which the position of the object can be specified; an obstacle detection unit that detects whether the object detected by the object detection sensor is an obstacle placed in a travelable area of the automatic guided vehicle; a passage area calculation unit that calculates a planned passage area of the automatic guided vehicle along the scheduled travel route when the object is detected as an obstacle by the obstacle detection unit; and an interference prediction unit that predicts, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output from the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle if it continues to travel along the scheduled travel route.
  • When the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle, the travel control unit stops the automatic guided vehicle before it interferes with the obstacle, and the robot control unit causes the robot to perform an obstacle designation operation pointing to the obstacle.
  • According to this aspect, when the object detection sensor detects an object, the obstacle detection unit determines whether the object is an obstacle placed in the travelable area. If it is determined to be an obstacle, the passage area calculation unit calculates the planned passage area along the scheduled travel route of the automatic guided vehicle. Then, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output from the object detection sensor, the interference prediction unit predicts whether the automatic guided vehicle will interfere with the obstacle. If interference is predicted, the automatic guided vehicle stops traveling under the control of the travel control unit, and the robot performs the obstacle designation operation under the control of the robot control unit.
  • This obstacle designation operation is an operation in which the robot points to the obstacle. When the robot performs it, surrounding workers can recognize that the cause of the automatic guided vehicle's stoppage is the obstacle rather than a malfunction, and can easily identify the obstacle that must be removed to restart the automatic guided vehicle.
  • Moreover, since the obstacle designation operation is performed by the robot already mounted on the automatic guided vehicle, there is no need to separately install a display monitor or speaker to indicate the obstacle causing the interference, and the system as a whole can be constructed at low cost.
  • A second aspect of the invention likewise relates to a traveling robot system comprising an automatic guided vehicle, a travel route determination unit, a travel control unit, a robot mounted on the automatic guided vehicle, a robot control unit, an object detection sensor, an obstacle detection unit, a passage area calculation unit, and an interference prediction unit, each configured as in the first aspect.
  • In this aspect, when interference is predicted, the travel route determination unit determines whether an avoidance route that can avoid the interference exists. If the travel route determination unit determines that no such avoidance route exists, the travel control unit stops the automatic guided vehicle before it interferes with the obstacle, and the robot control unit causes the robot to perform an obstacle designation operation pointing to the obstacle that is predicted to cause the interference.
  • According to this aspect, when interference is predicted, the travel route determination unit determines whether an avoidance route that can avoid the interference exists. If it determines that an avoidance route exists, the current scheduled travel route is updated to the avoidance route, and the automatic guided vehicle travels along the updated route under the control of the travel control unit. If, on the other hand, the travel route determination unit determines that no avoidance route exists, the current scheduled travel route is not updated; the automatic guided vehicle stops without performing avoidance travel under the control of the travel control unit, and the robot, under the control of the robot control unit, performs an obstacle designation operation pointing to the obstacle predicted to interfere with the automatic guided vehicle. The same effects as in the first aspect are therefore obtained.
  • Preferably, the robot is an articulated robot that performs a predetermined task, and the obstacle designation operation is an operation in which the work-side tip of the robot is positioned facing, or above, the obstacle that the interference prediction unit predicts will interfere with the automatic guided vehicle.
  • In this case, the worker can easily recognize the obstacle that caused the automatic guided vehicle to stop by looking at the obstacle pointed to by the work-side tip of the robot.
  • Alternatively, the robot may be an articulated robot to which a camera equipped with a lighting device is attached, and the obstacle designation operation may be an operation that causes the robot to take a posture in which the illumination light of the lighting device is directed toward the obstacle predicted to interfere with the automatic guided vehicle.
  • In this case, the worker can easily recognize the obstacle that caused the automatic guided vehicle to stop by following the illumination light of the lighting device. Furthermore, the entire system can be constructed at low cost by reusing, for example, the camera illumination that the robot already uses to recognize objects to be grasped.
  • As described above, in the traveling robot system, it is determined whether the object detected by the object detection sensor is an obstacle placed in the travelable area of the automatic guided vehicle. If so, the planned passage area of the automatic guided vehicle along the scheduled travel route is calculated, and based on the calculated area and the position information of the obstacle output from the object detection sensor, it is predicted whether the automatic guided vehicle will interfere with the obstacle if it continues along the scheduled travel route. If interference is predicted, the automatic guided vehicle is stopped and the robot performs an obstacle designation operation pointing to the obstacle. The automatic guided vehicle thus stops automatically before interfering with the obstacle, and surrounding workers can easily recognize the cause of the stop and the obstacle to be removed.
  • FIG. 1 is an explanatory diagram for explaining the schematic configuration and an operation example of a traveling robot system according to an embodiment.
  • FIG. 2 is an external perspective view showing an automatic guided vehicle equipped with a robot.
  • FIG. 3 is a block diagram showing a schematic configuration of a control system.
  • FIG. 4A is a plan view showing an obstacle designation operation by the robot.
  • FIG. 4B is a side view of the vehicle, seen from the left side, showing an obstacle designation operation by the robot.
  • FIG. 5 is a flowchart showing an example of obstacle designation control.
  • FIG. 6 is a diagram corresponding to FIG. 4B, showing Modification 1.
  • FIG. 7 is a diagram corresponding to FIG. 4A, showing Modification 2.
  • FIG. 8 is a diagram corresponding to FIG. 4B, showing Modification 2.
  • FIG. 9 is an explanatory diagram for explaining an operation example of a traveling robot system according to another embodiment.
  • FIG. 1 is an explanatory diagram for explaining the schematic configuration of a traveling robot system 1 according to an embodiment.
  • This traveling robot system 1 includes an automatic guided vehicle 10 and a robot 20 mounted on the upper part of the automatic guided vehicle 10, and as described later, it is predicted that the automatic guided vehicle 10 will interfere with an obstacle 5.
  • the automatic guided vehicle 10 is stopped and the robot 20 is configured to perform an obstacle designation operation of pointing to the obstacle 5.
  • The traveling robot system 1 further includes a host control device 40, installed in the factory building, that performs wireless communication with an on-vehicle control device 30 (see FIG. 3) housed in the guided vehicle body 11 of the automatic guided vehicle 10, and controls the automatic guided vehicle 10 and the robot 20 through it. Under the control of the on-vehicle control device 30, which receives instructions from the host control device 40, the automatic guided vehicle 10 travels through work positions P1 and P2 adjacent to work stations 3a and 3b installed in the building. Each of the work stations 3a and 3b is, for example, a machine tool.
  • Under the control of the on-vehicle control device 30, the robot 20 performs a workpiece attachment/detachment operation (an example of a predetermined operation) when the automatic guided vehicle 10 stops at the work position P1 or P2 of the respective work station 3a or 3b.
  • the work station 3 and obstacles 5 shown in FIG. 1 are merely examples, and their shapes, numbers, and placement positions are not limited to the example shown in FIG. 1.
  • FIG. 2 is an external perspective view showing the automatic guided vehicle 10 on which the robot 20 is mounted.
  • In the following description, "front" and "rear" mean the front and rear in the longitudinal direction of the vehicle, and "left" and "right" mean the left and right in the vehicle width direction.
  • the automatic guided vehicle 10 has a guided vehicle main body 11 in the shape of a rectangular parallelepiped that is long in the longitudinal direction of the vehicle.
  • The guided vehicle main body 11 accommodates the on-vehicle control device 30 inside, and the robot 20 is mounted on its upper surface.
  • Four drive wheels 12, at the front, rear, left, and right, are attached to the guided vehicle main body 11.
  • The four drive wheels 12 are independently rotationally driven by traveling motors 13a (see FIG. 3).
  • each drive wheel 12 has a vertically extending steering shaft (not shown), and is configured to be able to be steered around the vertical axis by a steering motor 13b connected to the steering shaft.
  • the automatic guided vehicle 10 is capable of moving straight, traveling sideways, or turning by changing the steering angle of each drive wheel 12 using the steering motor 13b under the control of the on-vehicle control device 30.
  • the traveling motor 13a and the steering motor 13b will be referred to as the traveling actuator 13 unless they are particularly distinguished.
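  The independent drive and steering just described can be illustrated with a short sketch. Everything here is a hypothetical helper, not from the patent: the wheel order (front-left, front-right, rear-left, rear-right), the dimensions, and the sign conventions are assumptions.

```python
import math

def wheel_steering_angles(mode, half_length=0.4, half_width=0.3):
    """Steering angles (rad, from the vehicle's forward axis) for the
    four drive wheels 12, for three of the motions named in the text.

    Wheel order, dimensions, and sign conventions are assumptions.
    """
    if mode == "straight":
        return [0.0, 0.0, 0.0, 0.0]
    if mode == "sideways":
        # Pure lateral travel: every wheel is steered 90 degrees.
        return [math.pi / 2] * 4
    if mode == "turn":
        # Turning in place: each wheel is set tangent to the circle
        # through its mounting point around the vehicle centre.
        a = math.atan2(half_length, half_width)
        return [-a, a, a, -a]  # FL, FR, RL, RR
    raise ValueError(f"unknown mode: {mode}")
```

  Steering every wheel independently is what lets the vehicle translate in any direction or rotate in place without a conventional steering linkage.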
  • A distance sensor 14 (an example of an object detection sensor) is attached to the front side of the guided vehicle main body 11.
  • the distance sensor 14 is a sensor for measuring the distance to another object, and is configured by, for example, a LIDAR (Light Detection and Ranging) device.
  • Objects detected by the distance sensor 14 include fixed installations, such as machine tools installed in the building, and obstacles 5 placed in the travelable area (the area excluding fixed installations) of the automatic guided vehicle 10.
  • The distance sensor 14 has a light emitting section, and irradiates objects existing within a predetermined range in front of the automatic guided vehicle 10 by scanning the laser beam emitted from the light emitting section in the vertical and horizontal directions.
  • The distance sensor 14 measures the distance to the light irradiation position on the object surface from the time it takes the laser beam to reflect off the object and return, and outputs the measured distance, correlated with the scanning-angle information at the time of measurement, as distance data.
  • This distance data corresponds to position information for specifying the position of the object, and is transmitted in real time to the host control device 40 via the on-vehicle control device 30.
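  The conversion from one such measurement (distance plus the scanning angles at measurement time) to a point usable as object position information might look as follows. The angle conventions and the vehicle frame (x forward, y left, z up) are assumptions; the patent only states that distance is correlated with the scanning-angle information.

```python
import math

def distance_data_to_point(distance_m, azimuth_rad, elevation_rad):
    """Convert one distance-sensor 14 measurement into a Cartesian
    point in an assumed vehicle frame: x forward, y left, z up.
    """
    horiz = distance_m * math.cos(elevation_rad)   # ground-plane range
    return (horiz * math.cos(azimuth_rad),         # forward
            horiz * math.sin(azimuth_rad),         # left
            distance_m * math.sin(elevation_rad))  # up
```

  A point measured 2 m straight ahead at zero elevation, for example, maps to (2.0, 0.0, 0.0) in this frame.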
  • The robot 20 is mounted on the front end of the upper surface of the guided vehicle main body 11.
  • A pallet 4 that holds workpieces before and after processing is loaded on the upper surface of the guided vehicle main body 11, on the rear side of the robot 20.
  • The robot 20 is a six-axis articulated robot having a first arm 21, a second arm 22, a third arm 23, and six axes A1 to A6, and a hand 24 is attached to its tip.
  • the hand 24 has three gripping claws 24a that are slidable in the radial direction with respect to the center thereof, and grips the workpiece by pinching it from the outside in the radial direction with the three gripping claws 24a.
  • a hand-mounted camera 25 (corresponding to a camera equipped with an illumination device) is attached to the upper part of the hand 24 to take an image of the work to be gripped.
  • the hand-mounted camera 25 includes a camera body 25a and a ring illumination 25b (an example of a lighting device) attached to the camera body 25a coaxially with the optical axis thereof.
  • the ring illumination 25b is provided to ensure illuminance during imaging by the camera body 25a.
  • the camera body 25a and the ring illumination 25b are controlled by a robot control section 31b, which will be described later.
  • the hand-mounted camera 25 transmits image data captured by the camera body 25a to the robot control unit 31b.
  • This control system 100 includes the vehicle-mounted control device 30 and the host control device 40.
  • The on-vehicle control device 30 is a microcomputer including a CPU, ROM, RAM, etc., and includes a CPU 31 and a wireless communication section 32, as shown in FIG. 3.
  • the wireless communication unit 32 includes a transmitting circuit, a receiving circuit, and a transmitting/receiving antenna, and transmits and receives various signals and data to and from the higher-level control device 40 by wireless communication in response to instructions from the CPU 31.
  • Examples of this data include the distance data of objects detected by the distance sensor 14.
  • the CPU 31 is connected to the travel actuator 13, the distance sensor 14, the robot 20, and the hand-mounted camera 25 so as to be able to send and receive signals.
  • the CPU 31 functions as a travel control section 31a and a robot control section 31b by executing a computer program stored in a ROM or the like.
  • the travel control unit 31a controls the travel actuator 13 so that the automatic guided vehicle 10 travels along a scheduled travel route V calculated by a travel route determining unit 41b of a higher-level control device 40, which will be described later.
  • the robot control unit 31b causes the robot 20 to perform a work operation or an obstacle designation operation.
  • the work operation is a predetermined operation that the robot 20 performs on each work station 3, and includes, for example, work for attaching and detaching a workpiece.
  • The robot control unit 31b is configured to correct the working posture of the robot 20 based on the image captured by the hand-mounted camera 25 when causing the robot 20 to perform a work operation, thereby correcting the position of the hand 24 relative to the workpiece.
  • The obstacle designation operation is an operation in which the robot 20 points to the obstacle 5 when the obstacle 5 is the cause of the automatic guided vehicle 10 stopping. Details of the obstacle designation operation will be described later.
  • The host control device 40 is a microcomputer having a CPU, ROM, RAM, etc., and includes a CPU 41, a wireless communication section 42, and a map data storage section 43.
  • the wireless communication unit 42 includes a transmitting circuit, a receiving circuit, and a transmitting/receiving antenna, and transmits and receives various signals and data to and from the vehicle-mounted control device 30 by wireless communication in response to instructions from the CPU 41.
  • the map data storage unit 43 is a functional unit that stores map data, and is composed of a storage medium such as a magnetic disk, for example.
  • the map data includes information on the travelable area of the automatic guided vehicle 10 within the building.
  • the travelable area is an area excluding fixed installations such as machine tools, and is an area in which the automatic guided vehicle 10 can physically travel.
  • this map data is automatically generated based on the distance data output from the distance sensor 14.
  • The method for generating the map data is not limited to this; for example, the map data may be generated manually by an operator operating an operation panel (not shown), such as a touch panel, provided on the automatic guided vehicle 10.
  • the CPU 41 functions as a job generation section 41a, a travel route determination section 41b, an obstacle detection section 41c, a passage area calculation section 41d, and an interference prediction section 41e by executing a computer program stored in the ROM or the like.
  • The job generation unit 41a acquires the loading status of conveyed objects on the automatic guided vehicle 10 and the retention status of conveyed objects at each work station 3, and determines the movement start point and movement destination of the automatic guided vehicle 10 based on them.
  • the travel route determination unit 41b determines (calculates) a scheduled travel route V from the travel start point determined by the job generation unit 41a to the travel destination.
  • This planned travel route V is determined so that the travel distance of the automatic guided vehicle 10 is the shortest within the travelable area defined in the map data. Note that the condition is not limited to the shortest moving distance; for example, the condition that the power consumption is the minimum may be adopted.
  • The obstacle detection unit 41c detects, based on the distance data of the object received from the on-vehicle control device 30 and the map data, whether the object is an obstacle 5 placed in the travelable area of the automatic guided vehicle 10.
  • Since the travelable area excludes the fixed installations, such as machine tools, shown in the map data, the obstacle detection unit 41c does not detect fixed installations already present in the map data as obstacles 5; only objects placed in the travelable area are detected as obstacles 5.
  • The passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10 along the scheduled travel route V determined by the travel route determination unit 41b.
  • the passing area calculation unit 41d calculates as the expected passing area R the area through which the planar shape of the automatic guided vehicle 10 is predicted to pass.
  • the planned passage area R may be a three-dimensional area in consideration of the three-dimensional shape of the automatic guided vehicle 10.
  • The interference prediction unit 41e predicts, based on the planned passage area R calculated by the passage area calculation unit 41d and the distance data (position information) of the obstacle 5 output from the distance sensor 14, whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues to travel along the scheduled travel route V. Specifically, the interference prediction unit 41e first identifies the coordinate position of the obstacle 5 on the map data based on the distance data of the obstacle 5 output from the distance sensor 14.
  • The interference prediction unit 41e then determines whether at least a part of the obstacle 5 overlaps the calculated planned passage area R in plan view. If they overlap, it predicts that the automatic guided vehicle 10 and the obstacle 5 will interfere; if not, it predicts that no interference will occur.
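  A minimal sketch of this plan-view interference test, under the simplifying assumption that the planned passage area R is the route polyline dilated by half the vehicle width (the actual passage area calculation unit 41d sweeps the vehicle's planar shape, which this does not reproduce):

```python
import math

def point_segment_distance(p, a, b):
    """Shortest distance from 2-D point p to segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0.0:
        return math.hypot(px - ax, py - ay)
    # Clamp the projection of p onto the segment to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len2))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def predicts_interference(route, obstacle_points, half_width=0.5):
    """Predict interference when any measured obstacle point lies
    inside the band of width 2*half_width around the route polyline.
    `half_width` is an assumed vehicle half-width, not from the patent.
    """
    for p in obstacle_points:
        for a, b in zip(route, route[1:]):
            if point_segment_distance(p, a, b) <= half_width:
                return True
    return False
```

  An obstacle point 0.3 m beside the route line would thus be flagged, while one 2 m away would not.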
  • the prediction result by the interference prediction unit 41e is transmitted to the travel control unit 31a and robot control unit 31b of the vehicle-mounted control device 30 via the wireless communication unit 42.
  • When the interference prediction unit 41e predicts interference, the travel control unit 31a stops driving the traveling motors 13a and stops the automatic guided vehicle 10.
  • the timing of stopping the automatic guided vehicle 10 is preferably, for example, when the distance between the automatic guided vehicle 10 and the obstacle 5 becomes a predetermined distance or less (for example, 1 m or less).
  • the robot control unit 31b causes the robot 20 to perform an obstacle designation operation.
  • the obstacle designation operation in this example is an operation in which illumination light is emitted toward the obstacle 5 from the ring illumination 25b attached to the third arm 23 of the robot 20.
  • FIGS. 4A and 4B show the operating posture of the robot 20 when executing the obstacle designation operation: FIG. 4A is a plan view of the robot 20 seen from above, and FIG. 4B is a side view of the robot 20 seen from the left side.
  • When causing the robot 20 to perform the obstacle designation operation, the robot control unit 31b first makes the robot 20 take a predetermined forward-leaning posture (for example, the posture shown in FIG. 2). The tilt angles of the arms 21 to 23 in this posture are stored in a storage section (not shown) through a prior teaching operation. From this forward-leaning posture, the robot control unit 31b then rotates the entire robot 20 around the axis A1 and rotates the third arm 23 around the axis A6, so that the light irradiation direction of the ring illumination 25b of the hand-mounted camera 25 is directed toward the obstacle 5 (see FIGS. 4A and 4B).
  • the robot control unit 31b controls the rotation angle of the robot 20 around the axes A1 and A6 so that the optical axis of the ring illumination 25b (the central axis of the ring illumination 25b) passes through the center of gravity G of the obstacle 5.
  • the center of gravity position G of the obstacle 5 may be calculated based on the three-dimensional shape of the obstacle 5 estimated from the distance data detected by the distance sensor 14.
  • Alternatively, the center position C of the side surface of the obstacle 5 facing the automatic guided vehicle 10 may be calculated, and the rotation angles of the robot 20 around the axes A1 and A6 may be determined so that the optical axis passes through this center position C.
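  Aiming the optical axis at a reference point such as the center of gravity G can be sketched as a pan/tilt computation. This is a simplification: it treats the light as a pan/tilt unit at a fixed position, whereas the patent's A1/A6 rotations also displace the light itself, and the function names are illustrative.

```python
import math

def designation_angles(light_pos, target):
    """Pan (about the vertical axis, cf. A1) and tilt angles that aim
    the optical axis of the ring illumination 25b from its current
    position `light_pos` (x, y, z) at the obstacle reference point
    `target` (e.g. the centre of gravity G).
    """
    dx = target[0] - light_pos[0]
    dy = target[1] - light_pos[1]
    dz = target[2] - light_pos[2]
    pan = math.atan2(dy, dx)                   # rotation about vertical
    tilt = math.atan2(dz, math.hypot(dx, dy))  # negative = aim downward
    return pan, tilt
```

  For a light 1 m above the floor aiming at a point on the floor 1 m ahead, this gives a pan of 0 and a downward tilt of 45 degrees.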
  • the robot control unit 31b causes the ring illumination 25b to emit illumination light toward the obstacle 5, with the optical axis of the ring illumination 25b directed toward the gravity center position G of the obstacle 5.
  • In this example, illumination light is continuously emitted toward the obstacle 5 from the ring illumination 25b.
  • However, the manner in which the illumination light is emitted is not limited to this; for example, it may be alternately turned on and off (that is, blinked).
  • FIG. 5 is a flowchart illustrating an example of obstacle designation control executed in cooperation between the host control device 40 and the vehicle-mounted control device 30.
  • In step S1, the travel route determination unit 41b determines (calculates) a scheduled travel route V for the automatic guided vehicle 10 from the travel start point to the travel destination.
  • In this determination, a large number of route candidates are generated using, for example, a genetic algorithm, and the shortest of the generated candidates is selected as the scheduled travel route V.
  • In step S2, the travel control unit 31a controls the travel actuator 13 to start the automatic guided vehicle 10 traveling along the scheduled travel route V determined in step S1.
  • In step S3, it is determined whether the obstacle 5 has been detected by the obstacle detection unit 41c. If this determination is NO, the process returns; if YES, the process proceeds to step S4.
  • In step S4, the passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10 along the scheduled travel route V determined in step S1.
  • In step S5, the interference prediction unit 41e predicts, based on the planned passage area R calculated in step S4 and the distance data of the obstacle 5 output from the distance sensor 14, whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues traveling along the scheduled travel route V. If this determination is NO, the process returns; if YES, the process advances to step S6.
  • In step S6, the travel control unit 31a determines, based on the distance data of the obstacle 5 output from the distance sensor 14, whether the distance between the automatic guided vehicle 10 and the obstacle 5 has become a predetermined distance or less (for example, 1 m or less). If this determination is NO, the process returns; if YES, the process advances to step S7.
  • In step S7, the travel control unit 31a stops driving the traveling motors 13a to stop the automatic guided vehicle 10.
  • In step S8, the robot control unit 31b causes the robot 20 to perform the obstacle designation operation described above, and the process then returns.
In this example, the robot 20 starts the obstacle designation operation after the automatic guided vehicle 10 has stopped; however, the operation is not limited to this, and may be started before or at the same time as the automatic guided vehicle 10 stops.
FIG. 1 shows an example in which the movement start point of the automatic guided vehicle 10 is a work position P1 set for the first work station 3a, and the movement destination is a work position P2 set for the second work station 3b.
First, the travel route determination unit 41b calculates a planned travel route V from the work position P1 (the movement start point) to the work position P2 (the movement destination), after which the automatic guided vehicle 10 starts traveling along the planned travel route V under the control of the travel control unit 31a.
Next, the passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10, and the interference prediction unit 41e predicts, based on the planned passage area R calculated by the passage area calculation unit 41d and the distance data of the obstacle 5 output from the distance sensor 14, whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues traveling along the planned travel route V. In the example of FIG. 1, interference is predicted, and the prediction result to that effect is transmitted from the interference prediction unit 41e to the vehicle-mounted control device 30.
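One simple way to realize such an interference prediction is to model the planned passage area R as the planned travel route V swept by the vehicle's half-width, and test the obstacle points reported by the distance sensor against that corridor. The function names and the polyline/point representations below are assumptions for illustration, not the patent's actual method.

```python
import math

def _point_segment_dist(p, a, b):
    """Distance from point p to segment ab (all (x, y) tuples)."""
    ax, ay = a
    bx, by = b
    px, py = p
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    if ab2 == 0.0:
        return math.dist(p, a)
    # Parameter of the closest point on the segment, clamped to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    return math.dist(p, (ax + t * abx, ay + t * aby))

def predicts_interference(route, obstacle_points, vehicle_half_width):
    """True if any obstacle point lies inside the planned passage area,
    modeled as the route polyline swept by the vehicle half-width."""
    return any(
        _point_segment_dist(p, a, b) <= vehicle_half_width
        for a, b in zip(route, route[1:])
        for p in obstacle_points
    )
```

An obstacle point 0.3 m off a straight route triggers the prediction for a 0.5 m half-width vehicle, while a point 2 m off does not.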
Upon receiving this prediction result, the travel control unit 31a stops the automatic guided vehicle 10, and the robot control unit 31b causes the robot 20 to perform the obstacle designation operation.
As a result, the automatic guided vehicle 10 stops at an intermediate position P3 before interfering with the obstacle 5, the optical axis of the ring illumination 25b attached to the third arm 23 of the robot 20 is directed toward the obstacle 5, and illumination light is emitted from the ring illumination 25b toward the obstacle 5.
Surrounding workers can thus recognize that the obstacle 5 illuminated by the ring illumination 25b is the cause of the stoppage of the automatic guided vehicle 10, and can take appropriate measures such as removing the obstacle 5.
As described above, when it is predicted that the automatic guided vehicle 10 will interfere with the obstacle 5, the automatic guided vehicle 10 is stopped and the robot 20 mounted on it performs the obstacle designation operation. Interference between the automatic guided vehicle 10 and the obstacle 5 is thereby avoided, and surrounding workers can easily recognize the obstacle 5 that caused the automatic guided vehicle 10 to stop.
Moreover, since the existing robot 20 mounted on the automatic guided vehicle 10 is used to notify surrounding workers of the obstacle 5 that caused the stoppage, there is no need to separately provide notification means such as a display monitor or a speaker, and the entire system can be constructed at low cost.
Furthermore, since the ring illumination 25b illuminates the obstacle 5 that caused the automatic guided vehicle 10 to stop, surrounding workers can recognize that obstacle even more clearly. In addition, since the ring illumination 25b is the illumination device attached to the hand-mounted camera 25, the entire system can be constructed at lower cost than when a dedicated illumination device is separately provided.
FIG. 6 shows modification 1, in which the obstacle designation operation performed by the robot 20 differs from that of the above embodiment.
In modification 1, the obstacle designation operation is an operation that moves the hand 24 (corresponding to the working-side tip of the robot 20) attached to the tip of the third arm 23 so that it faces the obstacle 5.
Specifically, the surface of the hand 24 on the gripping claw 24a side is made to face the obstacle 5. In this obstacle designation operation, it is preferable to operate the robot 20 so that the axis of the third arm 23 passes through the center-of-gravity position G of the obstacle 5. Instead of the center-of-gravity position G, the center position C of the side surface of the obstacle 5 closer to the automatic guided vehicle 10 may be used.
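Directing the arm axis through the center-of-gravity position G reduces, geometrically, to computing the yaw and pitch of the line from the arm to G. The following is a hedged sketch under that geometric reading: `aim_angles` is an invented helper, and a real controller would additionally solve the robot's inverse kinematics to realize these angles with its joints.

```python
import math

def aim_angles(arm_origin, target):
    """Yaw and pitch (radians) that make an axis from arm_origin pass
    through target; both arguments are (x, y, z) points."""
    dx = target[0] - arm_origin[0]
    dy = target[1] - arm_origin[1]
    dz = target[2] - arm_origin[2]
    yaw = math.atan2(dy, dx)                    # rotation about the vertical axis
    pitch = math.atan2(dz, math.hypot(dx, dy))  # elevation toward the target
    return yaw, pitch
```

For a target level with the arm at 45 degrees to the left, this yields yaw = pi/4 and pitch = 0; a target below the arm yields a negative pitch.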
FIGS. 7A and 7B show modification 2, in which the obstacle designation operation performed by the robot 20 differs from those of the embodiment and modification 1.
In modification 2, the obstacle designation operation is an operation that positions the hand 24 (corresponding to the working-side tip of the robot 20) attached to the tip of the third arm 23 above the obstacle 5. This operation is preferably performed so that a gap is secured between the hand 24 and the upper surface of the obstacle 5, and so that the hand 24 is positioned above the end of the obstacle 5 closer to the automatic guided vehicle 10.
Since the robot 20 takes a posture in which the hand 24 is positioned above the obstacle 5 when the automatic guided vehicle 10 stops, a worker can recognize at a glance, from the posture of the robot 20, that the obstacle 5 is the cause of the stoppage of the automatic guided vehicle 10.
Moreover, since the hand 24 is positioned directly above the obstacle 5 that caused the automatic guided vehicle 10 to stop, the obstacle 5 can be pointed out even more clearly than in the embodiment and modification 1. Further, unlike the embodiment described above, the ring illumination 25b need not be operated during the obstacle designation operation, which improves energy efficiency.
In the embodiment described above, when the interference prediction unit 41e predicts that the automatic guided vehicle 10 will interfere with the obstacle 5, the automatic guided vehicle 10 is stopped without searching for an avoidance route.
However, the configuration is not limited to this. The travel route determination unit 41b may determine whether an avoidance route exists that avoids interference between the automatic guided vehicle 10 and the obstacle 5; if such a route is determined to exist, the current planned travel route V is updated to that avoidance route, whereas if no avoidance route is determined to exist, the current planned travel route V is not updated.
In this case, the automatic guided vehicle 10 stops only after the travel route determination unit 41b determines that no avoidance route avoiding the obstacle 5 exists, whereupon the robot 20 executes the obstacle designation operation under the control of the robot control unit 31b.
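This reroute-first behavior can be sketched as a small decision function. The names below (`handle_predicted_interference`, `find_avoidance_route`) are hypothetical stand-ins for the travel route determination unit 41b and the control units; the actual avoidance-route search is abstracted behind a callback.

```python
def handle_predicted_interference(current_route, find_avoidance_route,
                                  stop_vehicle, designate_obstacle):
    """Variant behavior: try to reroute first; stop and point only when
    no avoidance route exists.

    find_avoidance_route: callable returning an avoidance route or None.
    Returns the route to keep following, or None if the vehicle stopped.
    """
    avoidance = find_avoidance_route()
    if avoidance is not None:
        return avoidance          # planned travel route V is updated
    stop_vehicle()                # no avoidance route: stop first...
    designate_obstacle()          # ...then point at the blocking obstacle
    return None
```

With a successful search the route is simply swapped and travel continues; only a failed search triggers the stop-and-designate sequence.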
Assume, for example, that the automatic guided vehicle 10 can travel between the obstacles 5a and 5b. As shown by the solid line in FIG. 8, if the automatic guided vehicle 10 stops between the two obstacles 5a and 5b, a conventional traveling robot system has the problem that surrounding workers cannot recognize which of the two obstacles 5a and 5b caused the automatic guided vehicle 10 to stop.
In contrast, in the present embodiment, under the control of the robot control unit 31b, the robot 20 executes the obstacle designation operation of pointing to the obstacle 5 that caused the automatic guided vehicle 10 to stop (the obstacle 5a in the example of FIG. 8). Surrounding workers can therefore recognize at a glance that the automatic guided vehicle 10 has stopped because of the obstacle 5a pointed to by the robot 20.
In the embodiment described above, the automatic guided vehicle 10 is configured to travel autonomously without a track; however, the present invention is not limited to this, and the automatic guided vehicle 10 may be configured to run on a track along a guide such as a magnetic tape or a light-reflective tape.
In the embodiment described above, the control system 100 includes the host control device 40 and the vehicle-mounted control device 30; however, the configuration is not limited to this. The functional units of the host control device 40 may be integrated into the vehicle-mounted control device 30, or conversely, the functional units of the vehicle-mounted control device 30 may be integrated into the host control device 40 so that the automatic guided vehicle 10 is controlled remotely.
In the embodiment described above, the hand 24 is attached to the tip of the third arm 23 of the robot 20, but the hand 24 is not essential; for example, an air blow device may be attached instead of the hand 24. In that case, the air blow device corresponds to the working-side tip of the robot 20.
In the embodiment described above, the distance sensor 14 is configured to scan the laser beam in both the horizontal and vertical directions, but it may be configured to scan only in the horizontal direction, for example. Even in this case, the two-dimensional position of the obstacle 5 can be specified based on the distance data output from the distance sensor 14, so the robot 20 can perform the operation of pointing to the obstacle 5 (the obstacle designation operation) based on that two-dimensional position.
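Recovering the two-dimensional obstacle position from a horizontal-only scan is a standard polar-to-Cartesian conversion over the beam angles. The sketch below assumes a simple range-list scan format with a fixed angular step; it is illustrative, not the distance sensor 14's actual interface.

```python
import math

def scan_to_points(ranges_m, start_angle_rad, angle_step_rad, max_range_m):
    """Convert one horizontal laser scan (a list of range readings) into
    2-D obstacle positions in the sensor frame; readings at or beyond
    max_range_m are treated as 'no return' and dropped."""
    points = []
    for i, r in enumerate(ranges_m):
        if r < max_range_m:
            a = start_angle_rad + i * angle_step_rad
            points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

The resulting (x, y) points are exactly the kind of position information against which the planned passage area can be checked.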
In the embodiment described above, the distance sensor 14 detects objects existing within a predetermined range in front of the automatic guided vehicle 10; however, the present invention is not limited to this, and the sensor may detect objects existing within a predetermined range on the left side, right side, or rear side of the vehicle.
In the embodiment described above, the object detection sensor is the distance sensor 14, but the object detection sensor is not limited to this and may be, for example, an image sensor.
In that case, the coordinate position of each pixel and the corresponding brightness value output from the image sensor function as the position information for specifying the position of the object.
The present invention also encompasses any appropriate combination of the above embodiment and the modifications.

Abstract

This traveling robot system: determines whether an object detected by an object detection sensor is an obstacle placed in a region in which an automatic guided vehicle can travel (step S3); calculates a planned passage region of the automatic guided vehicle along a planned travel route V (step S4) if the object is determined to be such an obstacle; predicts, on the basis of the calculated planned passage region and the position information on the obstacle output by the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle (step S5); and, if interference is predicted, stops the automatic guided vehicle and causes a robot to execute an obstacle designation operation of pointing to the obstacle (steps S7 and S8).

Description

Traveling robot system
The present invention relates to a traveling robot system that includes an automatic guided vehicle and a robot mounted on the automatic guided vehicle.
Conventionally, as an example of such a traveling robot system, the system disclosed in Japanese Patent Application Laid-Open No. 2004-042148 (Patent Document 1 below) is known. This traveling robot system includes a storage unit that stores an environmental map, a travel route determination unit that determines a planned travel route from the vehicle's own position on the environmental map to a movement destination, a travel control unit that causes the automatic guided vehicle to travel along the planned travel route determined by the travel route determination unit, and an obstacle detection unit that detects obstacles. When the obstacle detection unit detects an obstacle, the travel control unit causes the automatic guided vehicle to travel along an avoidance route. By repeating obstacle detection and avoidance-route search in this way, the automatic guided vehicle travels to the movement destination while avoiding obstacles. On the other hand, if the travel control unit cannot find an avoidance route, the automatic guided vehicle stops near the obstacle.
In the traveling robot system of Patent Document 1, the automatic guided vehicle is configured to perform avoidance travel to avoid obstacles. However, systems are also known in which, when the obstacle detection unit detects an obstacle, the automatic guided vehicle is stopped immediately without searching for an avoidance route.
Japanese Patent Application Publication No. 2004-042148
In the conventional traveling robot systems described above, if the automatic guided vehicle stops because of an obstacle, a worker can remove that obstacle to let the automatic guided vehicle resume traveling. However, when, for example, multiple obstacles exist around the automatic guided vehicle, the worker cannot easily determine which obstacle caused the vehicle to stop, and the removal work therefore cannot be performed efficiently.
Moreover, in conventional traveling robot systems, when the automatic guided vehicle is stopped, the worker cannot determine whether the cause is a malfunction of the vehicle or the presence of an obstacle, and recovery work is therefore time-consuming.
The present invention has been made in view of the above circumstances, and its object is to provide a traveling robot system that automatically stops the automatic guided vehicle when the vehicle is predicted to interfere with an obstacle, while allowing surrounding workers to easily recognize the cause of the stoppage and the obstacle to be removed.
A first aspect of the invention relates to a traveling robot system comprising an automatic guided vehicle, a travel route determination unit that determines a planned travel route of the automatic guided vehicle, a travel control unit that causes the automatic guided vehicle to travel along the planned travel route determined by the travel route determination unit, a robot mounted on the automatic guided vehicle, and a robot control unit that controls the operation of the robot, the system further comprising:
an object detection sensor mounted on the automatic guided vehicle, which detects an object existing within a predetermined range from the automatic guided vehicle and outputs position information from which the position of the object can be specified;
an obstacle detection unit that determines whether the object detected by the object detection sensor is an obstacle placed in a travelable area of the automatic guided vehicle;
a passage area calculation unit that calculates a planned passage area of the automatic guided vehicle along the planned travel route when the obstacle detection unit determines that the object is an obstacle; and
an interference prediction unit that predicts, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output from the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle if it continues traveling along the planned travel route,
wherein the travel control unit is configured to stop the automatic guided vehicle before it interferes with the obstacle when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle, and
the robot control unit is configured to cause the robot to execute an obstacle designation operation of pointing to the obstacle when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle.
According to the first aspect, when the object detection sensor mounted on the automatic guided vehicle detects an object within a predetermined range from the vehicle, the obstacle detection unit determines whether the object is an obstacle placed in the travelable area. If the object is determined to be an obstacle, the passage area calculation unit calculates the planned passage area along the planned travel route of the automatic guided vehicle. Then, based on the calculated planned passage area and the position information of the obstacle output from the object detection sensor, the interference prediction unit predicts whether the automatic guided vehicle will interfere with the obstacle. When interference is predicted, the automatic guided vehicle stops traveling under the control of the travel control unit, and the robot executes the obstacle designation operation under the control of the robot control unit. The obstacle designation operation is an operation in which the robot points to the obstacle; when the robot performs it, surrounding workers can recognize that the cause of the stoppage is an obstacle rather than a malfunction, and can easily identify the obstacle that must be removed before the automatic guided vehicle can resume traveling.
Moreover, since the obstacle designation operation is performed by the existing robot mounted on the automatic guided vehicle, there is no need to separately provide a display monitor, speaker, or other means for notifying workers of the interfering obstacle, and the entire system can therefore be constructed at low cost.
A second aspect of the invention relates to a traveling robot system comprising an automatic guided vehicle, a travel route determination unit that determines a planned travel route of the automatic guided vehicle, a travel control unit that causes the automatic guided vehicle to travel along the planned travel route determined by the travel route determination unit, a robot mounted on the automatic guided vehicle, and a robot control unit that controls the operation of the robot, the system further comprising:
an object detection sensor mounted on the automatic guided vehicle, which detects an object existing within a predetermined range from the automatic guided vehicle and outputs position information from which the position of the object can be specified;
an obstacle detection unit that determines whether the object detected by the object detection sensor is an obstacle placed in a travelable area of the automatic guided vehicle;
a passage area calculation unit that calculates a planned passage area of the automatic guided vehicle along the planned travel route when the obstacle detection unit determines that the object is an obstacle; and
an interference prediction unit that predicts, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output from the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle if it continues traveling along the planned travel route,
wherein the travel route determination unit is configured, when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle, to determine whether an avoidance route exists that can avoid the interference, and to update the current planned travel route to that avoidance route when it determines that such a route exists,
the travel control unit is configured to stop the automatic guided vehicle before it interferes with the obstacle when the travel route determination unit determines that no avoidance route capable of avoiding the interference exists, and
the robot control unit is configured to cause the robot to execute an obstacle designation operation of pointing to the obstacle predicted to cause the interference when the travel route determination unit determines that no such avoidance route exists.
According to the second aspect, when the interference prediction unit predicts that the automatic guided vehicle will interfere with an obstacle, the travel route determination unit determines whether an avoidance route exists that can avoid the interference. If an avoidance route is determined to exist, the current planned travel route is updated to that avoidance route, and the automatic guided vehicle travels along it under the control of the travel control unit. If, on the other hand, the travel route determination unit determines that no avoidance route exists, the current planned travel route is not updated. In this case, the automatic guided vehicle stops without performing avoidance travel under the control of the travel control unit, and the robot, under the control of the robot control unit, executes the obstacle designation operation of pointing to the obstacle predicted to interfere with the automatic guided vehicle. The same effects as those of the first aspect can therefore be obtained.
In a third aspect, the robot is an articulated robot that performs a predetermined task, and the obstacle designation operation is an operation that makes the working-side tip of the robot face, or be positioned above, the obstacle predicted by the interference prediction unit to interfere with the automatic guided vehicle.
According to the third aspect, when the automatic guided vehicle stops and the robot executes the obstacle designation operation, a worker can easily recognize the obstacle that caused the stoppage by looking at the obstacle pointed to by the working-side tip of the robot.
In a fourth aspect, the robot is an articulated robot to which a camera equipped with an illumination device is attached, and the obstacle designation operation is an operation that makes the robot take a posture in which the illumination light of the illumination device is directed toward the obstacle predicted by the interference prediction unit to interfere with the automatic guided vehicle.
According to the fourth aspect, when the automatic guided vehicle stops and the robot executes the obstacle designation operation, a worker can easily recognize the obstacle that caused the stoppage by looking at where the illumination light shines. Furthermore, by using as the illumination device, for example, the illumination of a camera that the robot already uses to recognize objects to be gripped, the entire system can be constructed at low cost.
As described above, the traveling robot system according to the present invention determines whether an object detected by the object detection sensor is an obstacle placed in the travelable area of the automatic guided vehicle. If so, it calculates the planned passage area of the automatic guided vehicle along the planned travel route and, based on the calculated planned passage area and the position information of the obstacle output from the object detection sensor, predicts whether the automatic guided vehicle will interfere with the obstacle if it continues traveling along the planned travel route. When interference is predicted, the automatic guided vehicle is stopped and the robot is made to execute the obstacle designation operation of pointing to the obstacle. The automatic guided vehicle is thus stopped automatically before interfering with the obstacle, while surrounding workers can easily recognize the cause of the stoppage and the obstacle to be removed.
FIG. 1 is an explanatory diagram illustrating the schematic configuration and an operation example of a traveling robot system according to an embodiment. FIG. 2 is an external perspective view showing an automatic guided vehicle equipped with a robot. FIG. 3 is a block diagram showing the schematic configuration of a control system. FIG. 4A is a plan view showing an obstacle designation operation by the robot. FIG. 4B is a side view, seen from the left side of the vehicle, showing the obstacle designation operation by the robot. FIG. 5 is a flowchart showing an example of obstacle designation control. FIG. 6 is a view corresponding to FIG. 4B, showing modification 1. FIG. 7A is a view corresponding to FIG. 4A, showing modification 2. FIG. 7B is a view corresponding to FIG. 4B, showing modification 2. FIG. 8 is an explanatory diagram illustrating an operation example of a traveling robot system according to another embodiment.
《Embodiment》
FIG. 1 is an explanatory diagram for explaining the schematic configuration of a traveling robot system 1 according to an embodiment. This traveling robot system 1 includes an automatic guided vehicle 10 and a robot 20 mounted on the upper part of the automatic guided vehicle 10. As described later, when the automatic guided vehicle 10 is predicted to interfere with an obstacle 5, the system stops the automatic guided vehicle 10 and causes the robot 20 to execute an obstacle designation operation of pointing to the obstacle 5.
The traveling robot system 1 further includes a vehicle-mounted control device 30 (see FIG. 3) housed in the guided vehicle body 11 of the automatic guided vehicle 10, and a host control device 40 installed in a factory building, which controls the automatic guided vehicle 10 and the robot 20 while communicating wirelessly with the vehicle-mounted control device 30. Under the control of the vehicle-mounted control device 30, which receives commands from the host control device 40, the automatic guided vehicle 10 travels via work positions P1 and P2 adjacent to work stations 3a and 3b installed in the building. Each of the work stations 3a and 3b is constituted by, for example, a machine tool. Under the control of the vehicle-mounted control device 30, the robot 20 performs workpiece loading and unloading (an example of a predetermined task) when the automatic guided vehicle 10 stops at the work positions P1 and P2 of the work stations 3a and 3b. Note that the work stations 3 and the obstacle 5 shown in FIG. 1 are examples; their shapes, numbers, and positions are not limited to those in FIG. 1.
 FIG. 2 is an external perspective view showing the automatic guided vehicle 10 on which the robot 20 is mounted. In the following description, unless otherwise specified, "front" and "rear" mean the front and rear in the vehicle longitudinal direction, and "left" and "right" mean the left and right in the vehicle width direction. As shown in the figure, the automatic guided vehicle 10 has a guided vehicle body 11 in the shape of a rectangular parallelepiped elongated in the vehicle longitudinal direction. The guided vehicle body 11 accommodates the on-vehicle control device 30 therein and carries the robot 20 on its upper surface. Four drive wheels 12 (front, rear, left, and right) are attached at the four corners of the lower end of the guided vehicle body 11. The four drive wheels 12 are independently rotationally driven by traveling motors 13a (see FIG. 3) connected to their respective axles. Each drive wheel 12 also has a vertically extending steering shaft (not shown) and can be steered about the vertical axis by a steering motor 13b connected to the steering shaft. Under the control of the on-vehicle control device 30, the automatic guided vehicle 10 can travel straight, travel sideways, or turn by changing the steering angle of each drive wheel 12 with the steering motors 13b. In the following description, the traveling motors 13a and the steering motors 13b are collectively referred to as the traveling actuators 13 unless they need to be distinguished.
 A distance sensor 14 (an example of an object detection sensor) is attached to the front side surface of the guided vehicle body 11. The distance sensor 14 measures the distance to other objects and is configured by, for example, a LIDAR (Light Detection and Ranging) device. Examples of such objects include fixed installations such as machine tools installed in the building, and an obstacle 5 placed in the travelable area of the automatic guided vehicle 10 (the area excluding fixed installations). The distance sensor 14 has a light emitting unit and irradiates objects within a predetermined range in front of the automatic guided vehicle 10 with laser light by scanning the laser beam emitted from the light emitting unit in the vertical and horizontal directions. The distance sensor 14 then measures the distance to the irradiated position on the object surface by measuring the time until the laser light is reflected by the object and returns, and outputs the measured distance in association with the scanning angle at the time of measurement as distance data. This distance data corresponds to position information for specifying the position of the object and is transmitted in real time to the host control device 40 via the on-vehicle control device 30.
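As a minimal sketch of the measurement described above, the round-trip time can be converted to a one-way distance and combined with the two scan angles to give a point in the sensor frame. The coordinate frame and function names below are illustrative assumptions, not taken from the document:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_to_distance(round_trip_time_s: float) -> float:
    """One-way distance from the laser round-trip time (time of flight)."""
    return C * round_trip_time_s / 2.0

def to_cartesian(distance_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert (distance, horizontal scan angle, vertical scan angle) to
    x/y/z in a sensor frame assumed here as x forward, y left, z up."""
    horiz = distance_m * math.cos(elevation_rad)
    return (horiz * math.cos(azimuth_rad),
            horiz * math.sin(azimuth_rad),
            distance_m * math.sin(elevation_rad))
```

Accumulating such points over a scan yields the distance data from which the object position (and, later, the map data) can be derived.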
 The robot 20 is mounted on the front end portion of the upper surface of the guided vehicle body 11. A pallet 4 that holds workpieces before and after machining is loaded on the upper surface of the guided vehicle body 11 behind the robot 20. The robot 20 is a six-axis articulated robot having a first arm 21, a second arm 22, a third arm 23, and six axes A1 to A6, and a working hand 24 is attached to the distal end of the third arm 23. The hand 24 has three gripping claws 24a that can slide radially with respect to its center, and grips a workpiece by pinching it from the radially outer side with the three gripping claws 24a.
 A hand-mounted camera 25 (corresponding to a camera equipped with an illumination device) for imaging the workpiece to be gripped is attached to the upper part of the hand 24. The hand-mounted camera 25 includes a camera body 25a and a ring illumination 25b (an example of an illumination device) attached to the camera body 25a coaxially with its optical axis. The ring illumination 25b is provided to ensure illuminance during imaging by the camera body 25a. The camera body 25a and the ring illumination 25b are controlled by a robot control unit 31b described later. The hand-mounted camera 25 transmits image data captured by the camera body 25a to the robot control unit 31b.
 Next, a control system 100 that controls the traveling robot system 1 will be described with reference to FIG. 3. The control system 100 includes the on-vehicle control device 30 and the host control device 40.
 The on-vehicle control device 30 is a microcomputer having a CPU, ROM, RAM, and the like, and includes a CPU 31 and a wireless communication unit 32, as shown in FIG. 3.
 The wireless communication unit 32 has a transmission circuit, a reception circuit, and a transmission/reception antenna, and transmits and receives various signals and data to and from the host control device 40 by wireless communication in response to commands from the CPU 31. An example of such data is the distance data of an object detected by the distance sensor 14.
 The CPU 31 is connected to the traveling actuators 13, the distance sensor 14, the robot 20, and the hand-mounted camera 25 so as to be able to exchange signals with them. The CPU 31 functions as a travel control unit 31a and a robot control unit 31b by executing a computer program stored in the ROM or the like.
 The travel control unit 31a controls the traveling actuators 13 so that the automatic guided vehicle 10 travels along a scheduled travel route V calculated by a travel route determination unit 41b of the host control device 40, described later.
 The robot control unit 31b causes the robot 20 to perform a work operation or an obstacle designation operation. The work operation is a predetermined operation that the robot 20 performs on each work station 3 and includes, for example, attaching and detaching a workpiece. In this example, when causing the robot 20 to perform a work operation, the robot control unit 31b corrects the position of the hand 24 relative to the workpiece by correcting the working posture based on the image captured by the hand-mounted camera 25. The obstacle designation operation is an operation in which the robot 20 points at an obstacle 5 when the obstacle 5 is the cause of the automatic guided vehicle 10 stopping. Details of the obstacle designation operation will be described later.
 The host control device 40 is a microcomputer having a CPU, ROM, RAM, and the like, and includes a CPU 41, a wireless communication unit 42, and a map data storage unit 43.
 The wireless communication unit 42 has a transmission circuit, a reception circuit, and a transmission/reception antenna, and transmits and receives various signals and data to and from the on-vehicle control device 30 by wireless communication in response to commands from the CPU 41.
 The map data storage unit 43 is a functional unit that stores map data and is configured by a storage medium such as a magnetic disk. The map data includes information on the travelable area of the automatic guided vehicle 10 within the building. The travelable area is the area excluding fixed installations such as machine tools, in which the automatic guided vehicle 10 can physically travel. In this example, the map data is automatically generated based on the distance data output from the distance sensor 14. The method of generating the map data is not limited to this; for example, the map data may be generated manually by an operator operating an operation panel (not shown) such as a touch panel provided on the automatic guided vehicle 10.
 The CPU 41 functions as a job generation unit 41a, a travel route determination unit 41b, an obstacle detection unit 41c, a passage area calculation unit 41d, and an interference prediction unit 41e by executing a computer program stored in the ROM or the like.
 The job generation unit 41a acquires the loading status of conveyed objects on the automatic guided vehicle 10 and the retention status of conveyed objects at each work station 3, and determines the movement start point and movement destination of the automatic guided vehicle 10 based on the acquired statuses.
 The travel route determination unit 41b determines (calculates) the scheduled travel route V from the movement start point determined by the job generation unit 41a to the movement destination. The scheduled travel route V is determined so that the travel distance of the automatic guided vehicle 10 is the shortest within the travelable area defined in the map data. The condition is not limited to the shortest travel distance; for example, a condition that minimizes power consumption may be adopted instead.
 Based on the distance data of an object received from the on-vehicle control device 30 and the map data, the obstacle detection unit 41c detects whether the object is an obstacle 5 placed in the travelable area of the automatic guided vehicle 10. As described above, the travelable area is the area excluding fixed installations such as machine tools shown in the map data; the obstacle detection unit 41c does not detect fixed objects such as machine tools that already exist on the map data as obstacles 5, and detects only objects placed in the travelable area as obstacles 5.
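As a rough illustration of this filtering, the sketch below checks detected points against an occupancy-grid rendering of the map data. The grid encoding (True = travelable cell, False = fixed installation) and the function name are assumptions made for the example only:

```python
from typing import List, Tuple

# Assumed encoding for this sketch: True = travelable, False = fixed installation.
GridMap = List[List[bool]]

def detect_obstacles(points: List[Tuple[int, int]], grid: GridMap) -> List[Tuple[int, int]]:
    """Return only the detected points that lie inside the travelable area.

    Points coinciding with fixed installations already present in the map
    (machine tools, walls, ...) are ignored, mirroring how the obstacle
    detection unit reports only objects standing in the travelable area.
    """
    obstacles = []
    for row, col in points:
        in_bounds = 0 <= row < len(grid) and 0 <= col < len(grid[0])
        if in_bounds and grid[row][col]:
            obstacles.append((row, col))
    return obstacles
```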
 When an obstacle 5 is detected by the obstacle detection unit 41c, the passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10 along the scheduled travel route V determined by the travel route determination unit 41b. In this example, the passage area calculation unit 41d calculates, as the planned passage area R, the area through which the planar shape of the automatic guided vehicle 10 is predicted to pass. The planned passage area R may instead be a three-dimensional area that takes the three-dimensional shape of the automatic guided vehicle 10 into account.
 Based on the planned passage area R of the automatic guided vehicle 10 calculated by the passage area calculation unit 41d and the distance data (position information) of the obstacle 5 output from the distance sensor 14, the interference prediction unit 41e predicts whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues to travel along the scheduled travel route V. Specifically, the interference prediction unit 41e identifies the coordinate position of the obstacle 5 on the map data based on the distance data of the obstacle 5 output from the distance sensor 14. The interference prediction unit 41e then determines whether at least a part of the obstacle 5 overlaps the calculated planned passage area R in plan view; if it determines that they overlap, it predicts that the automatic guided vehicle 10 and the obstacle 5 will interfere, and if it determines that they do not overlap, it predicts that no interference will occur. The prediction result of the interference prediction unit 41e is transmitted to the travel control unit 31a and the robot control unit 31b of the on-vehicle control device 30 via the wireless communication unit 42.
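A minimal plan-view overlap test in the spirit of this prediction might look as follows, under the simplifying assumption (not made in the document) that both the planned passage area R and the obstacle footprint are approximated by axis-aligned rectangles:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    """Axis-aligned rectangle in plan view."""
    min_x: float
    min_y: float
    max_x: float
    max_y: float

def predict_interference(passage_area: Rect, obstacle_footprint: Rect) -> bool:
    """Interference is predicted when at least part of the obstacle
    footprint overlaps the planned passage area R in plan view."""
    return (passage_area.min_x < obstacle_footprint.max_x and
            obstacle_footprint.min_x < passage_area.max_x and
            passage_area.min_y < obstacle_footprint.max_y and
            obstacle_footprint.min_y < passage_area.max_y)
```

For arbitrarily shaped areas, the same decision would be made with a general polygon-intersection test instead of the rectangle shortcut above.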
 When the interference prediction unit 41e predicts that the automatic guided vehicle 10 will interfere with the obstacle 5, the travel control unit 31a stops driving the traveling motors 13a to stop the automatic guided vehicle 10. The automatic guided vehicle 10 is preferably stopped, for example, when the distance between the automatic guided vehicle 10 and the obstacle 5 becomes equal to or less than a predetermined distance (for example, 1 m).
 When the interference prediction unit 41e predicts that the automatic guided vehicle 10 will interfere with the obstacle 5, the robot control unit 31b causes the robot 20 to perform the obstacle designation operation.
 The obstacle designation operation in this example is an operation of emitting illumination light toward the obstacle 5 from the ring illumination 25b attached to the third arm 23 of the robot 20. FIGS. 4A and 4B show the posture of the robot 20 when executing the obstacle designation operation; FIG. 4A is a plan view of the robot 20 seen from above, and FIG. 4B is a side view of the robot 20 seen from the left.
 When causing the robot 20 to perform the obstacle designation operation, the robot control unit 31b first causes the robot 20 to take a predetermined forward-leaning posture (for example, the posture shown in FIG. 2). The tilt angles of the arms 21 to 23 in this forward-leaning posture are stored in advance in a storage unit (not shown) by a teaching operation. Starting from this forward-leaning posture, the robot control unit 31b rotates the entire robot 20 about the axis A1 and rotates the third arm 23 about the axis A6, thereby directing the light irradiation direction of the ring illumination 25b of the hand-mounted camera 25 toward the obstacle 5 (see FIGS. 4A and 4B). In this example, the robot control unit 31b determines the rotation angles of the robot 20 about the axes A1 and A6 so that the optical axis of the ring illumination 25b (the central axis of the ring illumination 25b) passes through the center-of-gravity position G of the obstacle 5. The center-of-gravity position G of the obstacle 5 may be calculated based on the three-dimensional shape of the obstacle 5 estimated from the distance data detected by the distance sensor 14. Instead of the center-of-gravity position G, the center position C of the side surface of the obstacle 5 facing the automatic guided vehicle 10 may be calculated, and the rotation angles of the robot 20 about the axes A1 and A6 may be determined so that the optical axis of the ring illumination 25b passes through the center position C.
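In a highly simplified form, the aiming described here reduces to computing a yaw angle (about the vertical axis A1) and an elevation angle from the robot toward the target point G or C. The sketch below ignores the real link offsets of the arm and the mapping from elevation to the A6 rotation, so it is illustrative only:

```python
import math

def aiming_angles(base_xyz, target_xyz):
    """Return (yaw, pitch) in radians pointing a ray from the robot base
    toward the target point (e.g. the obstacle's center of gravity G).

    Simplification: offsets between axis A1, axis A6, and the ring light
    are ignored; yaw stands in for the rotation about A1 and pitch for
    the tilt applied at the arm tip.
    """
    dx = target_xyz[0] - base_xyz[0]
    dy = target_xyz[1] - base_xyz[1]
    dz = target_xyz[2] - base_xyz[2]
    yaw = math.atan2(dy, dx)                    # rotation about the vertical axis
    pitch = math.atan2(dz, math.hypot(dx, dy))  # elevation toward the target
    return yaw, pitch
```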
 The robot control unit 31b then causes the ring illumination 25b to emit illumination light toward the obstacle 5 with its optical axis directed at the center-of-gravity position G of the obstacle 5. In this example, the illumination light is emitted continuously from the ring illumination 25b toward the obstacle 5. The manner of emitting the illumination light is not limited to this; for example, the light may be alternately turned on and off (that is, blinked).
 FIG. 5 is a flowchart illustrating an example of the obstacle designation control executed by the host control device 40 and the on-vehicle control device 30 in cooperation.
 In step S1, the travel route determination unit 41b determines (calculates) the scheduled travel route V of the automatic guided vehicle 10 from the movement start point to the movement destination. When determining the scheduled travel route V, a large number of route candidates are generated using, for example, a genetic algorithm, and the shortest of the generated candidates is calculated as the scheduled travel route V.
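The final selection in this step — picking the shortest of the generated candidates — can be sketched as below; candidate generation itself (e.g. the genetic algorithm) is omitted, and the waypoint representation is an assumption for the example:

```python
import math

def route_length(route):
    """Total length of a candidate route given as a list of (x, y) waypoints."""
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

def choose_scheduled_route(candidates):
    """Pick the scheduled travel route V as the shortest candidate."""
    return min(candidates, key=route_length)
```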
 In step S2, the travel control unit 31a controls the traveling actuators 13 to start the automatic guided vehicle 10 traveling along the scheduled travel route V determined in step S1.
 In step S3, it is determined whether the obstacle detection unit 41c has detected an obstacle 5; if this determination is NO, the process returns, and if it is YES, the process proceeds to step S4.
 In step S4, the passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10 along the scheduled travel route V calculated in step S1.
 In step S5, based on the planned passage area R calculated in step S4 and the distance data of the obstacle 5 output from the distance sensor 14, the interference prediction unit 41e predicts whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues to travel along the scheduled travel route V. If this determination is NO, the process returns, and if it is YES, the process proceeds to step S6.
 In step S6, based on the distance data of the obstacle 5 output from the distance sensor 14, the travel control unit 31a determines whether the distance between the automatic guided vehicle 10 and the obstacle 5 is equal to or less than a predetermined distance (for example, 1 m); if this determination is NO, the process returns, and if it is YES, the process proceeds to step S7.
 In step S7, the travel control unit 31a stops driving the traveling motors 13a to stop the automatic guided vehicle 10.
 In step S8, the robot control unit 31b causes the robot 20 to perform the obstacle designation operation described above, and the process then returns. In this example, the robot 20 starts the obstacle designation operation after the automatic guided vehicle 10 has stopped; however, this is not limiting, and the obstacle designation operation may be started before or at the same time as the automatic guided vehicle 10 stops.
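Steps S1 through S8 of the flowchart can be summarized as a simple control loop. Every method on `controller` below is a placeholder standing in for the corresponding functional unit (41a–41e, 31a, 31b), not an API defined in the document:

```python
STOP_DISTANCE_M = 1.0  # example threshold from step S6

def obstacle_designation_control(controller):
    """One pass through the obstacle designation control of FIG. 5."""
    route = controller.plan_route()                              # S1
    controller.start_travel(route)                               # S2
    while not controller.reached_destination():
        obstacle = controller.detect_obstacle()                  # S3
        if obstacle is None:
            continue
        area = controller.passage_area(route)                    # S4
        if not controller.predict_interference(area, obstacle):  # S5
            continue
        if controller.distance_to(obstacle) > STOP_DISTANCE_M:   # S6
            continue
        controller.stop_vehicle()                                # S7
        controller.point_at(obstacle)                            # S8
        break
```

In the document the loop is split across two devices (steps S1 and S3–S5 on the host control device 40, steps S2 and S6–S8 on the on-vehicle control device 30); the single-controller form above is a deliberate simplification.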
 An operation example of the traveling robot system 1 configured as described above will now be described with reference to FIG. 1. FIG. 1 shows an example in which the movement start point of the automatic guided vehicle 10 is the work position P1 set for the first work station 3a and the movement destination is the work position P2 set for the second work station 3b. First, the travel route determination unit 41b calculates the scheduled travel route V from the work position P1 (the movement start point) to the work position P2 (the movement destination), and then, under the control of the travel control unit 31a, the automatic guided vehicle 10 starts traveling along the scheduled travel route V. After the start of travel, when the obstacle detection unit 41c detects the obstacle 5 at an intermediate position P3, the passage area calculation unit 41d calculates the planned passage area R of the automatic guided vehicle 10, and the interference prediction unit 41e then predicts, based on the planned passage area R calculated by the passage area calculation unit 41d and the distance data of the obstacle 5 output from the distance sensor 14, whether the automatic guided vehicle 10 will interfere with the obstacle 5 if it continues to travel along the scheduled travel route V. In the example of FIG. 1, interference is predicted, so a prediction result to that effect is transmitted from the interference prediction unit 41e to the on-vehicle control device 30. In the on-vehicle control device 30, in response to this prediction result, the travel control unit 31a stops the automatic guided vehicle 10 and the robot control unit 31b causes the robot 20 to perform the obstacle designation operation. As a result, the automatic guided vehicle 10 stops at the intermediate position P3 before interfering with the obstacle 5, the optical axis of the ring illumination 25b attached to the third arm 23 of the robot 20 is directed at the obstacle 5, and illumination light is emitted from the ring illumination 25b toward the obstacle 5. Surrounding workers can thereby recognize that the obstacle 5 illuminated by the ring illumination 25b is the cause of the automatic guided vehicle 10 stopping, and can take appropriate measures such as removing the obstacle 5.
 As described above, in this embodiment, when it is predicted that the automatic guided vehicle 10 will interfere with the obstacle 5, the automatic guided vehicle 10 is stopped and the robot 20 mounted on the automatic guided vehicle 10 is caused to perform the obstacle designation operation. Interference between the automatic guided vehicle 10 and the obstacle 5 is thereby avoided, while surrounding workers can easily recognize the obstacle 5 that caused the automatic guided vehicle 10 to stop. In addition, since the existing robot 20 mounted on the automatic guided vehicle 10 can be used to notify surrounding workers of the obstacle 5 that caused the automatic guided vehicle 10 to stop, there is no need to separately provide notification means such as a display monitor or a speaker, and the entire system can be configured at low cost.
 Furthermore, in this embodiment, the ring illumination 25b illuminates the obstacle 5 that caused the automatic guided vehicle 10 to stop, so surrounding workers can recognize that obstacle 5 even more clearly than when the obstacle 5 is merely pointed at with the tip of an arm or the like. Moreover, since the ring illumination 25b is an illumination device attached to the hand-mounted camera 25, the entire system can be configured at lower cost than when a dedicated illumination device is separately provided.
《Modification 1》
 FIG. 6 shows Modification 1. In this modification, the obstacle designation operation by the robot 20 differs from that of the above embodiment. Except for this point, the hardware configuration and control processing are the same as in the above embodiment, so a detailed description is omitted.
 That is, in this modification, the obstacle designation operation by the robot 20 is an operation of making the hand 24 attached to the distal end of the third arm 23 of the robot 20 (corresponding to the working-side distal end of the robot 20) face the obstacle 5.
 In this obstacle designation operation, the surface of the hand 24 on the gripping claw 24a side preferably faces the obstacle 5. It is also preferable to operate the robot 20 so that the axis of the third arm 23 passes through the center-of-gravity position G of the obstacle 5. Instead of the center-of-gravity position G, the center position C of the side surface of the obstacle 5 closer to the automatic guided vehicle 10 may be used.
 According to this modification, when the automatic guided vehicle 10 stops, if the robot 20 takes a posture in which the hand 24 attached to the third arm 23 faces the obstacle 5, a worker can see the posture of the robot 20 and recognize at a glance that the obstacle 5 is the cause of the automatic guided vehicle 10 stopping. In addition, since the ring illumination 25b does not need to be operated when executing the obstacle designation operation as in the above embodiment, energy efficiency can be improved.
《Modification 2》
 FIGS. 7A and 7B show Modification 2. In this modification, the obstacle designation operation by the robot 20 differs from those of the above embodiment and Modification 1. Except for this point, the hardware configuration and control processing are the same as in the above embodiment, so a detailed description is omitted.
 That is, in this modification, the obstacle designation operation by the robot 20 is an operation of positioning the hand 24 attached to the distal end of the third arm 23 of the robot 20 (corresponding to the working-side distal end of the robot 20) above the obstacle 5. This obstacle designation operation is preferably executed so that a gap is secured between the hand 24 and the upper surface of the obstacle 5. It is also preferable to execute this obstacle designation operation so that the hand 24 is positioned above the end of the obstacle 5 closer to the automatic guided vehicle 10.
 According to this modification, when the automatic guided vehicle 10 stops and the robot 20 assumes a posture in which the hand 24 is positioned above the obstacle 5, an operator can tell at a glance, simply by looking at the robot's posture, that the obstacle 5 is the cause of the stop. In particular, because the obstacle designation operation of this modification places the hand 24 directly above the obstacle 5, the obstacle 5 that caused the automatic guided vehicle 10 to stop can be indicated even more clearly than in the embodiment and Modification 1. Moreover, unlike the embodiment described above, the ring illumination 25b need not be operated when the obstacle designation operation is executed, which improves energy efficiency.
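 The geometry of Modification 2 can be sketched as follows. This is an illustrative example, not part of the patent; the function name, the obstacle-edge input, and the clearance value are all hypothetical stand-ins for whatever the robot control unit 31b actually computes.

```python
# Hypothetical sketch: a target hand position for the obstacle designation
# operation of Modification 2 -- above the edge of the obstacle closest to
# the vehicle, with a gap kept to the obstacle's upper surface.

CLEARANCE_M = 0.05  # assumed gap between the hand and the obstacle's top face

def designation_pose(obstacle_near_edge_xy, obstacle_top_z):
    """Return an (x, y, z) target that places the hand above the near edge
    of the obstacle, offset upward by the clearance."""
    x, y = obstacle_near_edge_xy
    return (x, y, obstacle_top_z + CLEARANCE_M)

# Obstacle edge at (1.2, 0.4), top surface 0.80 m high -> hand hovers at ~0.85 m.
pose = designation_pose((1.2, 0.4), 0.80)
```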
 《Other embodiments》
 In the embodiment and each modification described above, when the interference prediction unit 41e predicts that the automatic guided vehicle 10 will interfere with the obstacle 5, the vehicle is stopped without searching for an avoidance route. The invention is not limited to this. For example, the travel route determination unit 41b may determine whether an avoidance route exists that avoids the interference between the automatic guided vehicle 10 and the obstacle 5; if it determines that such a route exists, the current planned travel route V is updated to that avoidance route, whereas if no avoidance route exists, the current planned travel route V is left unchanged. In this configuration, after the obstacle detection unit 41c detects the obstacle 5, the automatic guided vehicle 10 stops only when the travel route determination unit 41b determines that no route avoiding the obstacle 5 exists, and the robot 20 then executes the obstacle designation operation under the control of the robot control unit 31b.
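 The decision flow of this alternative embodiment can be sketched as below. This is an illustrative example only; `find_avoidance_route` and the return tuple are hypothetical abstractions of units 41b, 31a, and 31b, not the patent's implementation.

```python
# Hypothetical sketch: route update vs. stop-and-designate decision when an
# interference with an obstacle is predicted.

def on_interference_predicted(planned_route, obstacle, find_avoidance_route):
    """Return (route, stop, designate): if an avoidance route exists, switch
    to it and keep driving; otherwise stop and run the obstacle designation."""
    avoidance = find_avoidance_route(planned_route, obstacle)
    if avoidance is not None:
        return avoidance, False, False   # planned route V updated, no stop
    return planned_route, True, True     # no avoidance route: stop and point

# No avoidance route found -> vehicle stops and the robot points.
route, stop, designate = on_interference_predicted(
    ["A", "B", "C"], {"pos": (2.0, 0.5)}, lambda r, o: None)
```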
 With such a traveling robot system 1, when, for example, two obstacles 5a and 5b protrude alternately from the left and right side walls forming the travel path of the automatic guided vehicle 10 as shown in FIG. 8, the vehicle can weave between the two obstacles. If, as shown by the solid line in FIG. 8, the automatic guided vehicle 10 stops between the obstacles 5a and 5b, a conventional traveling robot system gives surrounding workers no way of telling which of the two obstacles caused the stop. In the traveling robot system 1 of the present invention, by contrast, even when the automatic guided vehicle 10 stops between the two obstacles 5a and 5b, the robot 20 executes the obstacle designation operation under the control of the robot control unit 31b and points at the obstacle 5 that caused the stop (obstacle 5a in the example of FIG. 8). Surrounding workers can therefore recognize at a glance that the automatic guided vehicle 10 stopped because of the obstacle 5a pointed to by the robot 20.
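 The selection of which of several detected obstacles caused the stop can be sketched as an intersection test against the planned passage area R. This is an illustrative example under simplifying assumptions (the passage area is approximated as an axis-aligned rectangle and obstacles as points); all names are hypothetical, not the interference prediction unit 41e's actual method.

```python
# Hypothetical sketch: pick the obstacle lying inside the planned passage
# area R, so the robot can point at the one that caused the stop.

def causing_obstacle(passage_area, obstacles):
    """passage_area: (xmin, ymin, xmax, ymax); obstacles: {name: (x, y)}.
    Return the name of the first obstacle inside the passage area, else None."""
    xmin, ymin, xmax, ymax = passage_area
    for name, (x, y) in obstacles.items():
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return name
    return None

# Obstacle 5a lies inside the passage area; 5b is clear of it.
print(causing_obstacle((0.0, -0.5, 3.0, 0.5),
                       {"5a": (1.5, 0.2), "5b": (4.0, -0.3)}))  # prints 5a
```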
 In the embodiment and each modification described above, the automatic guided vehicle 10 is configured to travel autonomously without a track, but the invention is not limited to this. The automatic guided vehicle 10 may instead be configured to travel on a fixed course along a guide such as magnetic tape or light-reflective tape.
 In the embodiment and each modification described above, the control system 100 consists of the host control device 40 and the vehicle-mounted control device 30, but the invention is not limited to this. The functional units of the host control device 40 may be consolidated into the vehicle-mounted control device 30, or conversely, the functional units of the vehicle-mounted control device 30 may be consolidated into the host control device 40 so that the automatic guided vehicle 10 is controlled remotely.
 In the embodiment and each modification described above, the hand 24 is attached to the tip of the third arm 23 of the robot 20, but the hand 24 is not strictly necessary. For example, when the work performed by the robot 20 is an air-blowing operation, an air blow device may be attached in place of the hand 24. In that case, the air blow device corresponds to the working-side tip of the robot 20.
 In the embodiment and each modification described above, the distance sensor 14 scans its laser beam both horizontally and vertically, but it may instead be configured to scan only horizontally, for example. Even in that case, the planar position of the obstacle 5 can be identified from the distance data output by the distance sensor 14, so the robot 20 can still be made to execute the operation of pointing at the obstacle 5 (the obstacle designation operation) based on that planar position.
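 Recovering a planar obstacle position from a horizontal-only scan can be sketched as a polar-to-Cartesian conversion of each (beam angle, measured range) return. This is an illustrative example; the function name and sensor-frame convention are assumptions, not taken from the patent.

```python
# Hypothetical sketch: convert one return of a horizontal 2D laser scan
# into a planar (x, y) position in the sensor frame.
import math

def planar_position(angle_rad, range_m):
    """Polar (angle, range) -> Cartesian (x, y) in the sensor frame,
    with x pointing straight ahead of the sensor."""
    return (range_m * math.cos(angle_rad), range_m * math.sin(angle_rad))

# A return at 30 degrees to the left, 2 m away.
x, y = planar_position(math.radians(30.0), 2.0)
```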
 In the embodiment and each modification described above, the distance sensor 14 detects objects within a predetermined range in front of the automatic guided vehicle 10, but the invention is not limited to this; objects within a predetermined range to the left, right, or rear of the vehicle may be detected instead, for example.
 In the embodiment and each modification described above, the object detection sensor is the distance sensor 14, but the sensor is not limited to this and may be, for example, an image sensor. In that case, the coordinate position and brightness value of each pixel output by the image sensor serve as the position information for identifying the position of the object.
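 One simple way pixel coordinates and brightness values can serve as position information is a brightness-threshold centroid. This is an illustrative sketch only; the threshold, the tiny sample "image," and the centroid approach are hypothetical, not the patent's image-processing method.

```python
# Hypothetical sketch: estimate an object's image position as the centroid
# of pixels whose brightness exceeds a threshold.

def bright_centroid(image, threshold):
    """image: list of rows of brightness values. Return the (row, col)
    centroid of pixels above the threshold, or None if there are none."""
    hits = [(r, c) for r, row in enumerate(image)
            for c, v in enumerate(row) if v > threshold]
    if not hits:
        return None
    n = len(hits)
    return (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)

img = [[10, 10, 10, 10],
       [10, 200, 210, 10],
       [10, 205, 220, 10],
       [10, 10, 10, 10]]
print(bright_centroid(img, 128))  # prints (1.5, 1.5)
```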
 The present invention also includes any combination of the embodiment and the modifications described above.
 The description of the embodiments above is illustrative in all respects and not restrictive. Those skilled in the art may make modifications and changes as appropriate. The scope of the present invention is defined not by the embodiments described above but by the claims, and includes all changes that fall within the meaning and range of equivalency of the claims.
R   Planned passage area
V   Planned travel route
1   Traveling robot system
5   Obstacle
5a  Obstacle
5b  Obstacle
10  Automatic guided vehicle
14  Distance sensor (object detection sensor)
20  Robot
24  Hand (working-side tip of the robot)
25  Hand-mounted camera (camera)
25b Ring illumination (lighting device)
31a Travel control unit
31b Robot control unit
41a Job generation unit
41b Travel route determination unit
41c Obstacle detection unit
41d Passage area calculation unit
41e Interference prediction unit

Claims (4)

  1.  A traveling robot system comprising: an automatic guided vehicle; a travel route determination unit that determines a planned travel route for the automatic guided vehicle; a travel control unit that causes the automatic guided vehicle to travel along the planned travel route determined by the travel route determination unit; a robot mounted on the automatic guided vehicle; and a robot control unit that controls operation of the robot, the traveling robot system further comprising:
     an object detection sensor mounted on the automatic guided vehicle, the object detection sensor detecting an object present within a predetermined range from the automatic guided vehicle and outputting position information from which the position of the object can be identified;
     an obstacle detection unit that detects whether the object detected by the object detection sensor is an obstacle placed in a travelable area of the automatic guided vehicle;
     a passage area calculation unit that, when the obstacle detection unit detects that the object is an obstacle, calculates a planned passage area of the automatic guided vehicle along the planned travel route; and
     an interference prediction unit that predicts, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output by the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle if it continues to travel along the planned travel route,
     wherein the travel control unit is configured to stop the automatic guided vehicle before it interferes with the obstacle when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle, and
     the robot control unit is configured to cause the robot to execute an obstacle designation operation of pointing at the obstacle when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle.
  2.  A traveling robot system comprising: an automatic guided vehicle; a travel route determination unit that determines a planned travel route for the automatic guided vehicle; a travel control unit that causes the automatic guided vehicle to travel along the planned travel route determined by the travel route determination unit; a robot mounted on the automatic guided vehicle; and a robot control unit that controls operation of the robot, the traveling robot system further comprising:
     an object detection sensor mounted on the automatic guided vehicle, the object detection sensor detecting an object present within a predetermined range from the automatic guided vehicle and outputting position information from which the position of the object can be identified;
     an obstacle detection unit that detects whether the object detected by the object detection sensor is an obstacle placed in a travelable area of the automatic guided vehicle;
     a passage area calculation unit that, when the obstacle detection unit detects that the object is an obstacle, calculates a planned passage area of the automatic guided vehicle along the planned travel route; and
     an interference prediction unit that predicts, based on the planned passage area calculated by the passage area calculation unit and the position information of the obstacle output by the object detection sensor, whether the automatic guided vehicle will interfere with the obstacle if it continues to travel along the planned travel route,
     wherein the travel route determination unit is configured to, when the interference prediction unit predicts that the automatic guided vehicle will interfere with the obstacle, determine whether an avoidance route capable of avoiding the interference exists and, when it determines that such a route exists, update the current planned travel route to the avoidance route,
     the travel control unit is configured to stop the automatic guided vehicle before it interferes with the obstacle when the travel route determination unit determines that no avoidance route capable of avoiding the interference exists, and
     the robot control unit is configured to, when the travel route determination unit determines that no avoidance route capable of avoiding the interference exists, cause the robot to execute an obstacle designation operation of pointing at the obstacle predicted to cause the interference.
  3.  The traveling robot system according to claim 1 or 2, wherein the robot is an articulated robot that performs a predetermined task, and
     the obstacle designation operation is an operation of causing the working-side tip of the robot to face, or to be positioned above, the obstacle predicted by the interference prediction unit to interfere with the automatic guided vehicle.
  4.  The traveling robot system according to any one of claims 1 to 3, wherein the robot is an articulated robot,
     a camera equipped with a lighting device is attached to the robot, and
     the obstacle designation operation is an operation of causing the robot to take a posture in which illumination light of the lighting device is directed at the obstacle predicted by the interference prediction unit to interfere with the automatic guided vehicle.
PCT/JP2023/007606 2022-03-11 2023-03-01 Travel robot system WO2023171500A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-038441 2022-03-11
JP2022038441A JP7285354B1 (en) 2022-03-11 2022-03-11 traveling robot system

Publications (1)

Publication Number Publication Date
WO2023171500A1 2023-09-14

Family

ID=86538381

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/007606 WO2023171500A1 (en) 2022-03-11 2023-03-01 Travel robot system

Country Status (2)

Country Link
JP (1) JP7285354B1 (en)
WO (1) WO2023171500A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0926826A (en) * 1995-07-07 1997-01-28 Tokyu Car Corp Obstacle detection method and device for automated guided vehicle
JP2009113190A (en) * 2007-11-09 2009-05-28 Toyota Motor Corp Autonomous working robot and method of controlling operation of autonomous working robot


Also Published As

Publication number Publication date
JP2023132875A (en) 2023-09-22
JP7285354B1 (en) 2023-06-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23766676

Country of ref document: EP

Kind code of ref document: A1