CN111065981A - Moving body and moving body system - Google Patents

Moving body and moving body system

Info

Publication number
CN111065981A
Authority
CN
China
Legal status
Pending
Application number
CN201880057317.5A
Other languages
Chinese (zh)
Inventor
市川明男
Current Assignee
Nidec Shimpo Corp
Original Assignee
Nidec Shimpo Corp
Application filed by Nidec Shimpo Corp filed Critical Nidec Shimpo Corp
Publication of CN111065981A

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions


Abstract

The management device has: a 1st communication circuit that communicates with each of a plurality of mobile bodies; and a 1st control circuit that determines a travel path for each mobile body and transmits a signal indicating that path to each mobile body via the 1st communication circuit. Each mobile body has: a 2nd communication circuit that communicates with the 1st communication circuit; a sensor that detects obstacles; and a 2nd control circuit that moves the mobile body along the travel path determined by the 1st control circuit. When the sensor detects an obstacle, the 2nd control circuit causes the mobile body to avoid the obstacle and transmits a signal indicating the presence of the obstacle via the 2nd communication circuit. When such a signal is transmitted from any of the mobile bodies, the 1st control circuit changes the route of any other mobile body that is expected to pass along the path where the obstacle is present.

Description

Moving body and moving body system
Technical Field
The present disclosure relates to a mobile body and a mobile body system.
Background
Research and development of moving bodies such as automated guided vehicles and mobile robots is advancing. For example, Patent Documents 1 to 3 listed below disclose techniques relating to such moving bodies.
Documents of the prior art
Patent document
Patent document 1: japanese laid-open patent publication No. 2009-223634
Patent document 2: japanese patent laid-open publication No. 2009-205652
Patent document 3: japanese patent laid-open publication No. 2005-242489
Disclosure of Invention
Problems to be solved by the invention
Embodiments of the present disclosure provide a technique for making the operation of a plurality of autonomously movable moving bodies smoother.
Means for solving the problems
A management device according to an exemplary embodiment of the present disclosure manages the operation of a plurality of mobile bodies that can move autonomously. The management device includes: a 1st communication circuit that communicates with each of the plurality of mobile bodies; and a 1st control circuit that determines a travel path for each of the plurality of mobile bodies and transmits a signal indicating that path to each mobile body via the 1st communication circuit. Each of the plurality of mobile bodies has: a 2nd communication circuit that communicates with the 1st communication circuit; at least one sensor that detects obstacles; and a 2nd control circuit that moves the mobile body along the travel path determined by the 1st control circuit. When the sensor detects an obstacle, the 2nd control circuit causes the mobile body to avoid the obstacle and transmits a signal indicating the presence of the obstacle via the 2nd communication circuit. When such a signal is transmitted from any of the plurality of mobile bodies, the 1st control circuit changes the path of any mobile body that is expected to pass along the path where the obstacle is present.
The above general aspects may be implemented as a system, a method, an integrated circuit, a computer program, or a recording medium, or as any combination of systems, apparatuses, methods, integrated circuits, computer programs, and recording media.
Effects of the invention
According to the embodiments of the present disclosure, when one moving body performs an operation to avoid an obstacle, the paths of other moving bodies are changed to paths that do not collide with the obstacle. The operation of the moving body system can therefore be made smoother.
Drawings
Fig. 1 is a diagram schematically showing the structure of a mobile body system 100 of an exemplary embodiment of the present disclosure.
Fig. 2A shows an example of a case where there is no obstacle on the traveling path of the mobile body 10A.
Fig. 2B shows an example of the avoidance operation in a case where an obstacle 70 exists between marks M1 and M2 on the travel path of the mobile body 10A.
Fig. 2C is a diagram showing an example of the route after the change.
Fig. 2D is a diagram showing another example of the route after the change.
Fig. 3 is a diagram showing an example of data of the travel path of each mobile body 10 managed by the management device 50.
Fig. 4 is a flowchart showing an example of the operation of the 1st control circuit 51 of the management device 50.
Fig. 5 is a flowchart showing an example of the operation of the 2nd control circuit 14a of the mobile body 10.
Fig. 6 is a diagram showing an outline of the control system for controlling the travel of each AGV according to the present disclosure.
Fig. 7 is a diagram showing an example of the travel space S in which the AGV is located.
Fig. 8A is a diagram showing an AGV and a traction trolley before connection.
Fig. 8B is a diagram showing the AGV and the traction trolley after being connected.
Fig. 9 is an external view of an exemplary AGV according to the present embodiment.
Fig. 10A is a diagram showing a 1st example of the hardware configuration of an AGV.
Fig. 10B is a diagram showing a 2nd example of the hardware configuration of the AGV.
Fig. 11A is a diagram showing an AGV that generates a map while moving.
Fig. 11B is a diagram showing an AGV that generates a map while moving.
Fig. 11C is a diagram showing an AGV that generates a map while moving.
Fig. 11D is a diagram showing an AGV that generates a map while moving.
Fig. 11E is a diagram showing an AGV that generates a map while moving.
Fig. 11F is a diagram schematically illustrating a part of the completed map.
Fig. 12 is a diagram showing an example of a map in which one floor is configured by a plurality of partial maps.
Fig. 13 is a diagram showing an example of the hardware configuration of the operation management device.
Fig. 14 is a diagram schematically showing an example of the travel path of the AGV determined by the operation management device.
Detailed Description
< word >
Before describing the embodiments of the present disclosure, definitions of words used in the present specification will be described.
An "automated guided vehicle" (AGV) is a trackless vehicle onto which a load is loaded manually or automatically, which travels automatically to a designated location, and from which the load is unloaded manually or automatically. "Automated guided vehicles" include unmanned tractors and unmanned forklifts.
The term "unmanned" means that no human is required to steer the vehicle; it does not exclude the case where the unmanned vehicle carries a person (for example, a person who handles goods).
An "unmanned tractor" is a trackless vehicle that travels automatically to a designated location while towing a trolley onto and from which goods are loaded and unloaded manually or automatically.
An "unmanned forklift" is a trackless vehicle that has a mast for raising and lowering a fork or the like for transferring a load, automatically transfers the load to the fork or the like, automatically travels to a designated place, and performs an automatic load handling operation.
A "trackless vehicle" is a mobile body (vehicle) having wheels and an electric motor or engine that rotates the wheels.
The "mobile body" is a device that moves while carrying a person or a load, and includes a wheel that generates a driving force (traction) for movement, a bipedal or multi-legged running device, or a driving device such as a propeller. The term "moving body" in the present disclosure includes not only an unmanned carrier in a narrow sense but also a mobile robot, a service robot, and an unmanned aerial vehicle.
The "automatic travel" includes travel of the automated guided vehicle based on an instruction from an operation management system of a computer connected by communication and autonomous travel based on a control device included in the automated guided vehicle. The autonomous traveling includes not only traveling of the automated guided vehicle toward the destination along a predetermined route but also traveling following the tracking target. Further, the automated guided vehicle may temporarily perform manual travel based on an instruction from the operator. The "automatic travel" generally includes both "guided" travel and "unguided" travel, but in the present disclosure, it refers to "unguided" travel.
The "guided type" refers to a method in which guides are installed continuously or intermittently and the automated guided vehicle is guided along them.
The "unguided type" refers to guidance without installed guides. The automated guided vehicle of the embodiments of the present disclosure includes a self-position estimation device and can travel without guides.
The "self-position estimation device" is a device that estimates a self-position on an environment map from sensor data acquired by an external sensor such as a laser range finder.
The "external sensor" is a sensor that senses the state of the outside of the moving body. Examples of external sensors include laser range finders (also called range sensors), cameras (or image sensors), LIDAR (Light Detection and Ranging), millimeter-wave radar, and magnetic sensors.
The "internal sensor" is a sensor that senses the state of the inside of the moving body. Examples of the internal sensors include a rotary encoder (hereinafter, may be simply referred to as "encoder"), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
"SLAM" is an abbreviation of Simultaneous Localization and Mapping, and means performing self-position estimation and environment map creation simultaneously.
< exemplary embodiment >
Hereinafter, examples of the moving body and the moving body system of the present disclosure will be described with reference to the drawings. Unnecessarily detailed description may be omitted: for example, detailed descriptions of well-known matters and repeated descriptions of substantially identical structures. This is to keep the following description from becoming needlessly redundant and to make it easier for those skilled in the art to understand. The figures and the following description are provided so that those skilled in the art can fully understand the present disclosure; they are not intended to limit the subject matter recited in the claims. In the following description, identical or similar components are denoted by the same reference numerals.
Fig. 1 is a diagram schematically showing the structure of a mobile body system 100 of an exemplary embodiment of the present disclosure. The mobile body system 100 includes a plurality of mobile bodies 10 that can autonomously move, and an operation management device (hereinafter, may be simply referred to as "management device") 50 that manages operations of the plurality of mobile bodies 10. Fig. 1 shows two mobile bodies 10 as an example. The mobile body system 100 may include three or more mobile bodies 10. In the present embodiment, the moving object 10 is an Automated Guided Vehicle (AGV). In the following description, the moving object 10 may be referred to as an "AGV 10". The mobile body 10 may be another type of mobile body such as a bipedal or multi-footed walking robot, a hovercraft, or an unmanned aerial vehicle.
The management device 50 has a 1st communication circuit 54 that communicates with each of the plurality of mobile bodies 10 via a network, and a 1st control circuit 51 that controls the 1st communication circuit 54. The 1st control circuit 51 determines a travel path for each of the plurality of mobile bodies 10 and transmits a signal indicating each path to the respective mobile bodies 10 via the 1st communication circuit 54. The travel path may be determined individually for each mobile body 10, or all mobile bodies 10 may move along the same travel path. In the present embodiment, the travel paths of at least two of the mobile bodies 10 at least partially overlap.
The "signal indicating the travel path" transmitted from the management device 50 to each mobile body 10 may include, for example, information indicating the positions of a plurality of points on the route from the initial position to the destination position. In this specification, such a point is sometimes referred to as a "mark". The marks may be set along the travel path of each mobile body 10 at intervals of, for example, several tens of centimeters (cm) to several meters (m).
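As an illustration only, such marks might be represented as simple waypoint records placed at regular intervals along a straight segment. This is a sketch: the `Mark` fields and the `spaced_marks` helper are assumptions, since the text does not prescribe a data format.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class Mark:
    """A waypoint on a travel path: a position plus the heading the
    mobile body should face when it reaches that position."""
    x: float      # metres
    y: float      # metres
    theta: float  # heading in radians, measured from the x-axis

def spaced_marks(start, end, spacing):
    """Place marks roughly every `spacing` metres along the straight
    segment from `start` to `end` (both (x, y) tuples)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy)
    heading = math.atan2(dy, dx)
    n = max(1, int(length // spacing))
    return [Mark(start[0] + dx * i / n, start[1] + dy * i / n, heading)
            for i in range(n + 1)]
```

For a 4 m straight segment with 1 m spacing, this yields five marks sharing the same heading.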
Each of the plurality of mobile bodies 10 moves along its travel path in accordance with instructions from the management device 50. In a typical example, each mobile body 10 has a storage device that stores environment map data (sometimes simply called the "environment map") and an external sensor that periodically scans the environment and outputs sensor data at each scan. In this case, each mobile body 10 moves along the travel path while estimating its position and orientation (pose) by matching the sensor data against the environment map data.
Each mobile body 10 has a function of detecting an obstacle on its travel path and a function of avoiding the obstacle. Each mobile body 10 has a 2nd communication circuit 14e capable of communicating with the 1st communication circuit 54 via the network, at least one obstacle sensor 19 that detects obstacles, and a 2nd control circuit 14a that controls the movement and communication of the mobile body 10. The 2nd control circuit 14a controls a driving device, not shown, to move the mobile body 10 along the travel path determined by the 1st control circuit 51. When the sensor 19 detects an obstacle on the travel path, the 2nd control circuit 14a causes the mobile body 10 to avoid the obstacle. At that time, the 2nd control circuit 14a transmits a signal indicating the presence of the obstacle to the 1st communication circuit 54 via the 2nd communication circuit 14e.
The "signal indicating the presence of an obstacle" may include, for example, position information of an obstacle, information of a trajectory of a moving object after avoiding the obstacle, or information indicating the presence or absence of the obstacle. The signal indicating the presence of an obstacle may also contain information relating to the size of the obstacle or the area occupied by the obstacle.
When a signal indicating the presence of an obstacle is transmitted from any of the plurality of mobile bodies 10, the 1st control circuit 51 of the management device 50 changes the path of any mobile body 10 that is expected to pass along the path where the obstacle is present.
As an example, consider the case where the signal indicating the travel path includes information indicating the positions of a plurality of points (marks) on the path. When a signal indicating the presence of an obstacle is transmitted from any of the plurality of mobile bodies 10, the 1st control circuit 51 determines the two adjacent marks between which the obstacle is located. The 1st control circuit 51 then changes the path of any mobile body 10 that is expected to pass along a route including those two marks to a route that does not include them.
By such an operation, the following mobile body 10 can smoothly move along the new route without being affected by the obstacle. After one mobile body 10 finds an obstacle, the other mobile bodies 10 do not need to perform an operation to avoid the obstacle. Therefore, the operation of the moving body system can be made smoother.
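A minimal sketch of the rerouting rule just described, under the assumption that routes are lists of (x, y) marks and that shifting the two bracketing marks sideways clears the obstacle. The helper names and the lateral-shift strategy are illustrative, not a prescribed method:

```python
import math

def bracketing_marks(path, obstacle_xy):
    """Return the indices (i, i + 1) of the two adjacent marks on `path`
    (a list of (x, y) tuples) whose segment passes closest to the
    reported obstacle position."""
    def seg_dist(a, b, p):
        # Distance from point p to the segment a-b.
        ax, ay = a
        bx, by = b
        px, py = p
        dx, dy = bx - ax, by - ay
        if dx == 0 and dy == 0:
            return math.hypot(px - ax, py - ay)
        t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
        t = max(0.0, min(1.0, t))
        return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

    i = min(range(len(path) - 1),
            key=lambda k: seg_dist(path[k], path[k + 1], obstacle_xy))
    return i, i + 1

def reroute(path, i, j, offset=1.0):
    """Replace marks i and j with laterally shifted marks (like M1' and
    M2' in Fig. 2C) so the segment between them clears the obstacle."""
    new_path = list(path)
    for k in (i, j):
        x, y = path[k]
        new_path[k] = (x, y + offset)
    return new_path
```

A larger `offset` corresponds to the bigger route change of Fig. 2D.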
An example of an operation when a path is changed will be described below with reference to fig. 2A to 2D. Here, as an example, a case where the "signal indicating the travel path" includes information indicating positions of a plurality of points (marks) on the path from the initial position to the destination position, and the "signal indicating the presence of an obstacle" includes information indicating a position of an obstacle will be described. The "information indicating the position of the obstacle" is not limited to the information of the position (coordinates) of the obstacle itself, and may be information of the position (coordinates) or trajectory of the mobile body 10 after the avoidance operation.
Fig. 2A shows an example of a case where there is no obstacle on the travel path of the mobile body 10A. In this case, the mobile body 10A moves along a predetermined travel path (dashed arrow in the figure). More specifically, the mobile body 10A moves from the initial position to the destination position by tracking, in order, a plurality of marks instructed by the 1st control circuit 51 of the management device 50 (only marks M1 and M2 are illustrated in Fig. 2A). The movement between marks is linear. The mobile body 10A may acquire the position information of all marks on the travel path in advance, or may request the position information of the next mark from the management device 50 each time it reaches a mark.
Fig. 2B shows an example of the avoidance operation in a case where an obstacle 70 exists between marks M1 and M2 on the travel path of the mobile body 10A. The obstacle 70 is an object that does not exist on the environment map; it may be, for example, cargo, a person, or another moving object. The travel path of the mobile body 10A was determined in advance on the assumption that no such obstacle exists.
When it finds the obstacle 70 on its route using the sensor 19, the mobile body 10A performs an operation to avoid the obstacle 70. For example, the mobile body 10A avoids the obstacle 70 by appropriately combining motions such as turning right, turning left, and rotating. In the example of Fig. 2B, on finding the obstacle 70, the mobile body 10A performs the following operations.
(1) Rotate the traveling direction about 90 degrees to the right just before the obstacle 70 (for example, several tens of centimeters in front of it), then advance a distance approximately equal to the width of the obstacle 70. The width of the obstacle 70 can be measured with, for example, the sensor 19 or a laser range finder.
(2) Rotate the traveling direction about 90 degrees to the left, then advance a distance slightly longer than the width of the obstacle 70.
(3) Rotate the traveling direction about 90 degrees to the left, then advance a distance approximately equal to the width of the obstacle 70.
(4) Rotate the traveling direction about 90 degrees to the right, then advance to the mark M2.
The avoidance operation of the obstacle 70 by the mobile body 10A is not limited to this example, and any algorithm may be used.
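Steps (1) to (4) above amount to a rectangular detour. Expressed as a turn/advance command sequence it could look like the following sketch; a real controller would close the loop over odometry and the obstacle sensor, and the `turn`/`advance` command tuples are purely illustrative:

```python
def rectangular_detour(width, margin=0.3):
    """Turn/advance command sequence for steps (1)-(4): a rectangular
    detour around an obstacle of the given `width` in metres, with
    `margin` extra metres when passing alongside it. Angles are in
    degrees; positive is a left turn, negative a right turn."""
    return [
        ("turn", -90), ("advance", width),           # (1) right, then past the side
        ("turn", +90), ("advance", width + margin),  # (2) left, then past the obstacle
        ("turn", +90), ("advance", width),           # (3) left, then back onto the line
        ("turn", -90),                               # (4) right; continue to the next mark
    ]
```

The two lateral advances are equal and the turns sum to zero, so the body ends up back on its original line with its original heading.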
On finding the obstacle 70, the mobile body 10A transmits a signal indicating the presence of the obstacle 70 to the management device 50. The mobile body 10A transmits to the management device 50 a signal indicating that the obstacle 70 is present between marks M1 and M2, or a signal indicating the trajectory (a set of coordinates) of the avoidance operation it performed between marks M1 and M2. When the mobile body 10A can measure the coordinates and size of the obstacle 70 using a laser range finder, information on those coordinates and that size may also be included in the signal.
On receiving a signal indicating the presence of the obstacle 70 from the mobile body 10A, the 1st control circuit 51 of the management device 50 determines whether there is a following mobile body 10 that is expected to pass along a route including the two marks M1 and M2. If such a mobile body 10 exists, the 1st control circuit 51 changes its path to one that does not include the two marks M1 and M2.
Fig. 2C is a diagram showing an example of the changed route. In this example, the path of the following mobile body 10B is changed to one that is slightly shifted so as not to collide with the obstacle 70. The 1st control circuit 51 of the management device 50 realizes the path change by replacing marks M1 and M2 with marks M1' and M2'.
Fig. 2D is a diagram showing another example of the changed route. In this example, the route of the following mobile body 10B is changed substantially: the positions of the new marks M1' and M2' differ greatly from those of the original marks M1 and M2.
By performing the route change as described above, the following mobile object 10B can smoothly move to the destination without performing an operation of avoiding the obstacle 70.
Fig. 3 is a diagram showing an example of the travel-path data of each mobile body 10 managed by the management device 50. Such data may be recorded in a storage device (not shown in Fig. 1) provided in the management device 50. As shown in Fig. 3, the data indicating the travel path of each mobile body 10 may include information on a plurality of marks on the path. The information for each mark may include the position of the mark (for example, its x- and y-coordinates) and the orientation of the moving body 10 at that position (for example, the angle θ with respect to the x-axis). In Fig. 3 the information for each mark is written symbolically, for example M11(x11, y11, θ11), but in practice it is recorded as specific numerical values.
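The per-body route table of Fig. 3 could be held, for example, as a mapping from a body ID to its ordered list of (x, y, θ) marks. This is a sketch with assumed identifiers; the text only specifies that the values are recorded as concrete numbers:

```python
# Hypothetical route table in the style of Fig. 3: each mobile body ID
# maps to an ordered list of marks, each recorded as concrete numbers
# (x, y, theta).
routes = {
    "AGV-1": [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)],
    "AGV-2": [(0.0, 1.0, 0.0), (1.0, 1.0, 0.0), (2.0, 1.0, 0.0)],
}

def next_mark(routes, body_id, current_index):
    """Return the mark the given body should head to after the mark at
    `current_index`, or None when the route is finished."""
    path = routes[body_id]
    return path[current_index + 1] if current_index + 1 < len(path) else None
```

Such a lookup would back the per-mark instruction scheme described for Fig. 2A, where a body may request the next mark each time it reaches one.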
Fig. 4 is a flowchart showing an example of the operation of the 1st control circuit 51 of the management device 50. In this example, the 1st control circuit 51 performs the following operations.
Step S101: determine the travel path of each mobile body 10. The path is determined in accordance with an instruction from a user or administrator, or with a predetermined program.
Step S102: start instructing each mobile body 10 to move. The timing of this instruction likewise follows an instruction from the user or administrator, or a predetermined program.
Step S103: it is determined whether or not a notification of the presence of an obstacle is received from any of the mobile bodies 10. If the determination is yes, the process proceeds to step S104. If the determination is no, step S103 is executed again.
Step S104: it is determined whether or not there is a subsequent mobile body 10 that is expected to pass through the path where the obstacle exists. This determination can be made, for example, by comparing the position of the obstacle with the path of each mobile body 10. If the determination is yes, the process proceeds to step S105. If the determination is no, the process returns to step S103.
Step S105: the route of the following mobile object 10 is changed, and the route change is instructed to the mobile object 10. Then, the process shifts to step S103.
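The flow of steps S101 to S105 can be condensed into a small event loop. The sketch below processes a batch of obstacle notifications; the callbacks and the proximity test stand in for decision logic the text leaves unspecified:

```python
import math

def passes_through(path, obstacle, tol=0.5):
    """True if any mark on `path` lies within `tol` metres of the
    reported obstacle position (a stand-in for the S104 check)."""
    return any(math.hypot(x - obstacle[0], y - obstacle[1]) <= tol
               for x, y in path)

def management_loop(bodies, plan_route, reroute, notifications):
    """Condensed sketch of steps S101-S105. `plan_route` and `reroute`
    are caller-supplied stand-ins for unspecified logic."""
    routes = {b: plan_route(b) for b in bodies}           # S101: decide routes
    # S102: movement instructions would be sent to each body here.
    for obstacle in notifications:                        # S103: obstacle reported
        for b in bodies:
            if passes_through(routes[b], obstacle):       # S104: follower affected?
                routes[b] = reroute(routes[b], obstacle)  # S105: change its route
    return routes
```

In this sketch, only bodies whose routes pass near the reported obstacle have their routes rewritten; the rest keep their original paths.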
Fig. 5 is a flowchart showing an example of the operation of the 2nd control circuit 14a of the mobile body 10. In this example, the 2nd control circuit 14a performs the following operations after the mobile body starts moving.
Step S201: it is determined whether the obstacle sensor 19 detects an obstacle. If the determination is yes, the process proceeds to step S202. If the determination is no, the process proceeds to step S203.
Step S202: transmit a signal indicating the presence of the obstacle to the management device 50, and perform an operation to avoid the obstacle.
Step S203: it is determined whether or not an instruction to change the route has been received from the management apparatus 50. If the determination is yes, the process proceeds to step S204. If the determination is no, the process returns to step S201.
Step S204: move along the indicated altered path.
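Likewise, steps S201 to S204 on the vehicle side reduce to a per-iteration decision over the obstacle sensor and the command channel. This is a sketch; the state dictionary and action labels are assumptions:

```python
def body_step(obstacle_detected, route_update, state):
    """One iteration of the Fig. 5 loop for a single mobile body.
    `state` holds the current path and a log of signals sent to the
    management device; the updated state is returned."""
    if obstacle_detected:                     # S201: obstacle sensed?
        state["sent"].append("obstacle")      # S202: notify the management device
        state["action"] = "avoid"             #       and start the avoidance motion
    elif route_update is not None:            # S203: route change received?
        state["path"] = route_update          # S204: follow the changed path
        state["action"] = "follow"
    else:
        state["action"] = "follow"            # otherwise keep following the path
    return state
```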
The above operations are examples and can be changed as appropriate. Several modifications of the present embodiment are described below.
After the route has been changed, when a signal indicating that the obstacle has been removed is input, the 1st control circuit 51 may return the changed route to the original route. The signal indicating that the obstacle has been removed may be transmitted by another mobile body 10 moving near that location, or may be input manually by an administrator or user.
When a signal indicating the presence of an obstacle is first transmitted, the 1st control circuit 51 may, instead of changing the route of a mobile body 10 expected to pass along the path where the obstacle is present, request that mobile body 10 to avoid the obstacle. Then, when signals indicating the presence of the obstacle have been transmitted from n moving bodies (n being an integer of 2 or more), or have been transmitted n times, the 1st control circuit 51 may start changing the paths of mobile bodies 10 expected to pass along that path. With this operation, the route is changed only when the obstacle persists, so frequent route changes can be avoided when the obstacle is present only briefly.
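The n-report threshold described here is essentially a debounce. One possible sketch, in which reports are grouped by rounding their positions onto a grid (the grouping strategy is an assumption the text does not specify):

```python
from collections import Counter

class ObstacleDebounce:
    """Trigger a route change only after an obstacle at roughly the
    same position has been reported `n` times (the threshold from the
    text; position rounding is an assumed way to group reports)."""
    def __init__(self, n, grid=1.0):
        self.n = n
        self.grid = grid
        self.counts = Counter()

    def report(self, position):
        """Record one obstacle report; return True when the threshold
        is reached and rerouting should start."""
        key = (round(position[0] / self.grid), round(position[1] / self.grid))
        self.counts[key] += 1
        return self.counts[key] >= self.n
```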
Each mobile body 10 may further include a laser range finder, a storage device that holds an environment map, and a position estimation device that determines and outputs an estimate of the position and orientation of the mobile body 10 on the environment map by comparing the data output from the laser range finder with the environment map. In this case, the 2nd control circuit 14a moves the mobile body 10 based on the position and orientation estimates output from the position estimation device and the signal indicating the travel path transmitted from the 1st control circuit 51.
The 1st control circuit 51 may transmit the environment map to each mobile body 10, or instruct it to update the environment map, depending on the situation. For example, when a signal indicating that the obstacle has been removed is not input within a certain period (for example, several hours to several days) after a signal indicating the presence of the obstacle was transmitted from one of the mobile bodies 10, the 1st control circuit 51 may instruct each mobile body 10 to update its environment map to include information on the obstacle.
A more specific example in which the moving body is an automated guided vehicle is described below. In the following description, the automated guided vehicle is abbreviated "AGV" (Automated Guided Vehicle). Unless a contradiction arises, the following description applies equally to moving bodies other than AGVs, such as bipedal or multi-legged walking robots, drones, hovercraft, and manned vehicles.
(1) Basic structure of system
Fig. 6 shows a basic configuration example of an exemplary moving body management system 100 of the present disclosure. The moving object management system 100 includes at least one AGV10 and an operation management device 50 that manages the operation of the AGV 10. Fig. 6 also shows a terminal device 20 operated by the user 1.
The AGV10 is an unmanned transport vehicle capable of "unguided" travel without a guide such as a magnetic tape during travel. The AGV10 can estimate its own position and transmit the estimated result to the terminal device 20 and the operation management device 50. The AGV10 can automatically travel in the travel space S in accordance with an instruction from the operation management device 50. The AGV10 can also operate in a "tracking mode" in which it moves following a person or other moving object.
The operation management device 50 is a computer system that manages the travel of each AGV 10 by tracking the position of each AGV 10. The operation management device 50 may be a desktop PC, a notebook PC, and/or a server computer. The operation management device 50 communicates with each AGV 10 via a plurality of access points 2. For example, the operation management device 50 transmits to each AGV 10 the coordinates of the position to which that AGV 10 should head next. Each AGV 10 periodically, for example every 100 milliseconds, transmits data indicating its position and orientation to the operation management device 50. When the AGV 10 reaches the instructed position, the operation management device 50 transmits the coordinates of the next position to head for. The AGV 10 can also travel within the travel space S in accordance with operations that the user 1 inputs to the terminal device 20. An example of the terminal device 20 is a tablet computer. Typically, travel of the AGV 10 using the terminal device 20 is performed while the map is being created, and travel using the operation management device 50 is performed after the map has been created.
Fig. 7 shows an example of the travel space S in which three AGVs 10a, 10b, and 10c are located. All of the AGVs are traveling in the depth direction of the figure. The AGVs 10a and 10b are carrying loads placed on their tops. The AGV 10c follows the AGV 10b ahead of it. For convenience of explanation, the reference numerals 10a, 10b, and 10c are used in Fig. 7, but hereinafter they are collectively referred to as the AGV 10.
In addition to carrying a load placed on its top, the AGV 10 can transport a load using a traction trolley connected to it. Fig. 8A shows the AGV 10 and the traction trolley 5 before connection. Casters are provided on the legs of the traction trolley 5. The AGV 10 is mechanically connected to the traction trolley 5. Fig. 8B shows the AGV 10 and the traction trolley 5 after connection. When the AGV 10 travels, the traction trolley 5 is pulled by it. By towing the traction trolley 5, the AGV 10 can carry the load placed on the trolley.
The method of connecting the AGV10 to the traction trolley 5 is arbitrary. Here, one example will be described. A plate 6 is fixed to the roof of the AGV10. A guide 7 having a slit is provided on the traction trolley 5. The AGV10 approaches the traction trolley 5 and inserts the plate 6 into the slit of the guide 7. When the insertion is completed, the AGV10 passes an electromagnetic lock pin, not shown, through the plate 6 and the guide 7 and applies an electromagnetic lock. Thus, the AGV10 and the traction trolley 5 are physically connected.
Reference is again made to fig. 6. Each AGV10 and the terminal device 20 can be connected one-to-one, for example, and perform communication according to the Bluetooth (registered trademark) standard. Each AGV10 and the terminal device 20 can also communicate with each other by Wi-Fi (registered trademark) using one or more access points 2. The plurality of access points 2 are connected to each other via, for example, a switching hub 3. Fig. 6 shows two access points 2a, 2 b. The AGV10 is wirelessly connected to the access point 2 a. The terminal device 20 is wirelessly connected to the access point 2 b. The data transmitted from the AGV10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20. The data transmitted from the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV 10. This realizes bidirectional communication between the AGV10 and the terminal device 20. The plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3. This also enables two-way communication between the operation management device 50 and each AGV 10.
(2) Making of environment map
In order to allow the AGV10 to travel while estimating its own position, a map in the travel space S is created. A position estimation device and a laser range finder are mounted on AGV10, and a map can be created using the output of the laser range finder.
The AGV10 transitions to a data acquisition mode through an operation by the user. In the data acquisition mode, the AGV10 starts to acquire sensor data using the laser range finder. The laser range finder periodically emits a laser beam of, for example, infrared rays or visible light to the surroundings to scan the travel space S. The laser beam is reflected by the surface of a structure such as a wall or a pillar, or by an object placed on the floor. The laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs data indicating the measured position of each reflection point. The position of each reflection point reflects the arrival direction and the distance of the reflected light. The data of the measurement results is sometimes referred to as "measurement data" or "sensor data".
The position estimation device accumulates sensor data in the storage device. When the acquisition of the sensor data in the moving space S is completed, the sensor data stored in the storage device is transmitted to the external device. The external device is, for example, a computer having a processor for signal processing and installed with a mapping program.
The processor for signal processing of the external device superimposes the sensor data acquired for each scan on each other. The map of the space S can be created by repeating the superimposition processing by the signal processing processor. The external device transmits the created map data to the AGV 10. The AGV10 stores the created map data in an internal storage device. The external device may be the operation management device 50 or may be another device.
Instead of an external device, the AGV10 may create the map itself. The processing performed by the signal processing processor of the external device described above may be performed by a circuit such as a microcontroller unit (microcomputer) of the AGV10. When the map is created within the AGV10, it is no longer necessary to transmit the accumulated sensor data to an external device. Since the data volume of the sensor data is generally large, eliminating this transmission avoids occupying the communication line.
The AGV10 can travel in the travel space S for acquiring the sensor data in accordance with an operation by the user. For example, the AGV10 wirelessly receives, from the user via the terminal device 20, a travel command instructing it to move in each of the front, rear, left, and right directions. The AGV10 travels forward, backward, leftward, and rightward within the travel space S according to the travel command to create the map. When the AGV10 is connected by wire to a manipulator such as a joystick, it can likewise travel forward, backward, leftward, and rightward in the travel space S in accordance with a control signal from the manipulator to create the map. The sensor data may also be acquired by a person walking while pushing a measurement carriage on which the laser range finder is mounted.
In addition, although a plurality of AGVs 10 are shown in fig. 6 and 7, one AGV may be used. When there are a plurality of AGVs 10, the user 1 can select one AGV10 from the plurality of registered AGVs using the terminal device 20 to create the map of the travel space S.
After the map is created, each AGV10 can automatically travel while estimating its own position using the map. The process of estimating the self position will be described later.
(3) AGV structure
Fig. 9 is an external view of an exemplary AGV10 according to the present embodiment. The AGV10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a conveyance table 13, a travel control device 14, and a laser range finder 15. Two drive wheels 11a and 11b are provided on the right and left sides of the AGV10, respectively. The four casters 11c, 11d, 11e, and 11f are disposed at the four corners of the AGV 10. In addition, although the AGV10 also has a plurality of motors connected to the two drive wheels 11a and 11b, the plurality of motors are not shown in FIG. 9. Fig. 9 shows one drive wheel 11a and two caster wheels 11c and 11e on the right side of the AGV10 and a caster wheel 11f on the left rear portion, but the left drive wheel 11b and the left front caster wheel 11d are not shown because they are hidden by the frame 12. The four casters 11c, 11d, 11e, and 11f can freely turn. In the following description, the drive wheels 11a and 11b are also referred to as wheels 11a and 11b, respectively.
The AGV10 also has at least one obstacle sensor 19 for detecting obstacles. In the example of fig. 9, four obstacle sensors 19 are provided at four corners of the frame 12. The number and arrangement of the obstacle sensors 19 may also be different from the example of fig. 9. The obstacle sensor 19 may be a device capable of measuring a distance, such as an infrared sensor, an ultrasonic sensor, or a stereo camera. When the obstacle sensor 19 is an infrared sensor, for example, infrared light is emitted at regular intervals, and the time until the reflected infrared light returns is measured, whereby an obstacle existing within a certain distance can be detected. When the AGV10 detects an obstacle on the route based on a signal output from at least one obstacle sensor 19, it performs an operation of avoiding the obstacle.
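As a rough illustration of the time-of-flight principle described above for an infrared obstacle sensor, the measured round-trip time of the reflected light can be converted into a distance and compared with a detection threshold. The following is a minimal sketch, not taken from the embodiment itself; the function names and the threshold handling are assumptions.

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0  # propagation speed of the emitted infrared light

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance to a reflecting obstacle from the measured round-trip time.

    The light travels to the obstacle and back, so the one-way distance
    is half of the total path length.
    """
    return SPEED_OF_LIGHT_M_S * round_trip_s / 2.0

def obstacle_within(round_trip_s: float, threshold_m: float) -> bool:
    """True when the reflection indicates an obstacle closer than threshold_m."""
    return distance_from_round_trip(round_trip_s) < threshold_m
```

For example, a round trip of 20 nanoseconds corresponds to an obstacle roughly 3 m away.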
The travel control device 14 is a device that controls the operation of the AGV10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a substrate on which the integrated circuit and the electronic components are mounted. The travel control device 14 performs the above-described transmission and reception of data with the terminal device 20 and preprocessing calculation.
The laser range finder 15 is, for example, an optical device that measures the distance to a reflection point by emitting a laser beam 15a of infrared rays or visible light and detecting the reflected light of the laser beam 15a. In the present embodiment, the laser range finder 15 of the AGV10 emits the pulsed laser beam 15a while changing its direction every 0.25 degrees within a space spanning 135 degrees to each of the left and right of the front of the AGV10 (270 degrees in total), for example, and detects the reflected light of each laser beam 15a. This makes it possible to acquire distance data for reflection points in a total of 1081 directions, one every 0.25 degrees. In the present embodiment, the scanning of the surrounding space by the laser range finder 15 is substantially parallel to the ground surface and planar (two-dimensional). However, the laser range finder 15 may also scan in the height direction.
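The figures in this paragraph can be checked with a small calculation: a 270-degree sweep sampled every 0.25 degrees yields 270 / 0.25 = 1080 intervals and therefore 1081 sample directions. A minimal sketch (the function name and sign convention are assumptions):

```python
def scan_angles(half_range_deg: float = 135.0, step_deg: float = 0.25):
    """Sample directions of the laser range finder, in degrees relative
    to the front of the AGV (negative = one side, positive = the other;
    the sign convention is an assumption of this sketch)."""
    n_intervals = int(round(2 * half_range_deg / step_deg))  # 1080 intervals
    return [-half_range_deg + i * step_deg for i in range(n_intervals + 1)]
```

With the default parameters this reproduces the 1081 directions of the example in the text.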
The AGV10 can create a map of the space S based on the position and orientation (direction) of the AGV10 and the scanning result of the laser range finder 15. The map can reflect the arrangement of structures such as walls and pillars around the AGV and objects placed on the floor. The data of the map is stored in a storage device provided in the AGV 10.
Generally, the combination of the position and the attitude of a moving body is referred to as a pose. The pose of a moving body in a two-dimensional plane is expressed by position coordinates (x, y) in an XY rectangular coordinate system and an angle θ with respect to the X axis. Hereinafter, the position and attitude of the AGV10, i.e., the pose (x, y, θ), may be simply referred to as the "position".
The position of the reflection point viewed from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance. In the present embodiment, the laser range finder 15 outputs sensor data expressed in polar coordinates. However, the laser range finder 15 may convert the position expressed by polar coordinates into rectangular coordinates and output the rectangular coordinates.
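The conversion mentioned here is the standard polar-to-rectangular mapping. A minimal sketch (names are assumptions):

```python
import math

def polar_to_rectangular(distance: float, angle_rad: float):
    """Convert one reflection point from the polar form output by the
    laser range finder (distance, angle) to rectangular coordinates
    (x, y) in the sensor's own frame."""
    return (distance * math.cos(angle_rad), distance * math.sin(angle_rad))
```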
The construction and operating principle of a laser range finder are well known, and therefore further detailed explanation is omitted in this specification. Examples of objects that can be detected by the laser range finder 15 are people, goods, shelves, and walls.
The laser range finder 15 is an example of an external sensor for sensing a surrounding space and acquiring sensor data. As another example of such an external sensor, an image sensor and an ultrasonic sensor are considered.
The travel control device 14 compares the measurement result of the laser range finder 15 with map data held by the travel control device itself to estimate the current position of the travel control device itself. The map data to be held may be map data created by another AGV 10.
Fig. 10A shows an example of the 1 st hardware configuration of the AGV 10. Fig. 10A also shows a specific configuration of the travel control device 14.
The AGV10 has a travel control device 14, a laser range finder 15, two motors 16a and 16b, a drive device 17, wheels 11a and 11b, and two rotary encoders 18a and 18 b.
The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14 e. The microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the position estimation device 14e are connected via a communication bus 14f, and can transmit and receive data to and from each other. The laser rangefinder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 14a, the position estimation device 14e, and/or the memory 14 b. The microcomputer 14a also functions as the 2 nd control circuit 14a shown in fig. 1.
The microcomputer 14a is a processor or a control circuit (computer) that executes an arithmetic operation for controlling the entire AGV10 including the travel control device 14. Typically, the microcomputer 14a is a semiconductor integrated circuit. The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal as a control signal to the driving device 17 to control the driving device 17 so as to adjust the voltage applied to the motor. Thereby, the motors 16a and 16b are rotated at desired rotation speeds, respectively.
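As a simplified illustration of how a PWM signal adjusts the motor voltage, the sketch below maps a target rotation speed to a duty cycle. It assumes, as a simplification not stated in the text, that the average applied voltage and the resulting steady-state speed scale linearly with the duty cycle.

```python
def pwm_duty_for_speed(target_rpm: float, max_rpm: float) -> float:
    """Duty cycle (0.0-1.0) of the PWM control signal for one motor.

    The linear voltage/speed relation used here is an assumed
    simplification; real motor control closes a feedback loop on the
    encoder signal instead of relying on this open-loop mapping.
    """
    if max_rpm <= 0:
        raise ValueError("max_rpm must be positive")
    return max(0.0, min(1.0, target_rpm / max_rpm))
```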
One or more control circuits (for example, a microcomputer) for controlling the driving of the left and right motors 16a and 16b may be provided independently of the microcomputer 14 a. For example, the motor drive device 17 may have two microcomputers that control the driving of the motors 16a and 16b, respectively. The two microcomputers can use the encoder information output from the encoders 18a and 18b, respectively, to perform coordinate calculations to infer the distance the AGV10 has traveled from a given initial position. In addition, the two microcomputers may also control the motor drive circuits 17a and 17b using the encoder information.
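The coordinate calculation from encoder information mentioned above is commonly done by differential-drive dead reckoning. A minimal sketch, assuming the per-wheel travel distances have already been derived from the encoder counts (the names and the midpoint-angle approximation are choices of this sketch, not of the embodiment):

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """One dead-reckoning pose update for a two-drive-wheel vehicle.

    d_left / d_right: distance travelled by each wheel since the last
    update (derived from the rotary encoder counts); wheel_base:
    distance between the two drive wheels.  Returns the new (x, y, theta).
    """
    d_center = (d_left + d_right) / 2.0          # distance of the body center
    d_theta = (d_right - d_left) / wheel_base    # change of heading
    # advance along the average heading over the step
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    return x, y, theta + d_theta
```

Repeating this update yields the odometry data mentioned later in the text.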
The memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14 a. The memory 14b may be used as a work memory for the microcomputer 14a and the position estimation device 14e to perform arithmetic operations.
The storage device 14c is a nonvolatile semiconductor storage device. However, the storage device 14c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk. The storage device 14c may include a head device for writing and/or reading data to or from any recording medium, and a control device for the head device.
The storage device 14c stores map data M of the space S to be traveled, and data (travel path data) R of one or more travel paths. The map data M can be created by the AGV10 operating in the map creation mode and stored in the storage device 14 c. The travel path data R is transmitted from the outside after the map data M is made. In the present embodiment, the map data M and the travel route data R are stored in the same storage device 14c, but may be stored in different storage devices.
An example of the travel route data R will be described.
When the terminal device 20 is a tablet computer, the AGV10 receives the travel path data R indicating the travel path from the tablet computer. The travel path data R at this time includes marker data indicating the positions of a plurality of markers. A "marker" indicates a passing position (via point) through which the AGV10 is to travel. The travel path data R includes at least position information of a start marker indicating the travel start position and an end marker indicating the travel end position. The travel path data R may further include position information of markers of one or more intermediate via points. When the travel path includes one or more intermediate via points, the path from the start marker through those via points in order to the end marker is defined as the travel path. In addition to the coordinate data of each marker, the data for each marker may include data of the orientation (angle) of the AGV10 before it moves to the next marker and of its travel speed. When the AGV10 stops temporarily at the position of each marker, estimates its own position, and notifies the terminal device 20, the data of each marker may further include data of an acceleration time required to accelerate to the travel speed and/or a deceleration time required to decelerate from the travel speed and stop at the position of the next marker.
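The contents of the travel path data R described above can be pictured as a simple data structure. The field names below are illustrative assumptions; the actual format of R is not specified in this description.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Marker:
    """One via point of a travel path (field names are hypothetical)."""
    x: float
    y: float
    angle_deg: Optional[float] = None    # orientation before moving to the next marker
    speed_m_s: Optional[float] = None    # travel speed toward the next marker
    accel_s: Optional[float] = None      # time to accelerate to speed_m_s
    decel_s: Optional[float] = None      # time to decelerate to a stop at the next marker

@dataclass
class TravelPath:
    """Ordered markers: first = start marker, last = end marker."""
    markers: List[Marker] = field(default_factory=list)

    def is_valid(self) -> bool:
        # a path needs at least a start marker and an end marker
        return len(self.markers) >= 2
```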
The movement of the AGV10 may be controlled not by the terminal device 20 but by the operation management apparatus 50 (e.g., a PC and/or a server computer). In this case, it may be that the operation management device 50 instructs the AGV10 to move to the next marker each time the AGV10 reaches a mark. For example, the AGV10 receives, as the travel path data R indicating the travel path, coordinate data of a destination position to be followed, or data of a distance from the destination position and an angle of travel to be performed, from the operation management device 50.
The AGV10 can travel along the stored travel route while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
The communication circuit 14d is a wireless communication circuit that performs wireless communication in accordance with the Bluetooth (registered trademark) and/or Wi-Fi (registered trademark) standards, for example. Both standards include wireless communication using frequencies in the 2.4 GHz band. For example, in the mode in which the AGV10 is caused to travel to create a map, the communication circuit 14d performs wireless communication in accordance with the Bluetooth (registered trademark) standard and communicates one-to-one with the terminal device 20.
The position estimation device 14e performs a process of creating a map and a process of estimating the position of the device itself during travel. The position estimation device 14e creates a map of the moving space S based on the position and posture of the AGV10 and the scanning result of the laser range finder. While traveling, the position estimation device 14e receives sensor data from the laser range finder 15, and reads out the map data M stored in the storage device 14 c. The self position (x, y, θ) on the map data M is identified by matching the local map data (sensor data) created from the scanning result of the laser range finder 15 with the map data M of a larger range. The position estimation device 14e generates data indicating "reliability" indicating how well the local map data matches the map data M. Each data of the self position (x, y, θ) and the reliability can be transmitted from the AGV10 to the terminal device 20 or the operation management device 50. The terminal device 20 or the operation management device 50 can receive the respective data of the own position (x, y, θ) and the reliability and display them on a built-in or connected display device.
In the present embodiment, the microcomputer 14a and the position estimation device 14e are separate components, but this is merely an example. The microcomputer 14a and the position estimation device 14e may instead be a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both. Fig. 10A shows a chip circuit 14g that includes the microcomputer 14a and the position estimation device 14e. Hereinafter, an example in which the microcomputer 14a and the position estimation device 14e are provided independently will be described.
Two motors 16a and 16b are respectively mounted to the two wheels 11a and 11b to rotate the respective wheels. That is, the two wheels 11a and 11b are driving wheels, respectively. In this description, a case will be described where the motor 16a and the motor 16b are motors that drive the right and left wheels, respectively, of the AGV 10.
The moving body 10 also has an encoder unit 18 that measures the rotational position or the rotational speed of the wheels 11a and 11 b. The encoder unit 18 includes a 1 st rotary encoder 18a and a 2 nd rotary encoder 18 b. The 1 st rotary encoder 18a measures rotation of an arbitrary position of the power transmission mechanism from the motor 16a to the wheel 11 a. The 2 nd rotary encoder 18b measures rotation of an arbitrary position of the power transmission mechanism from the motor 16b to the wheel 11 b. The encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14 a. The microcomputer 14a may control the movement of the mobile body 10 not only by using the signal received from the position estimation device 14e but also by using the signal received from the encoder unit 18.
The driving device 17 has motor driving circuits 17a and 17b, and the motor driving circuits 17a and 17b are used to adjust voltages applied to the two motors 16a and 16b, respectively. The motor drive circuits 17a and 17b each include a so-called inverter circuit. The motor drive circuits 17a and 17b turn on or off the current flowing in each motor according to a PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motors.
Fig. 10B shows an example of the 2 nd hardware configuration of the AGV 10. The 2 nd hardware configuration example is different from the 1 st hardware configuration example (fig. 10A) in that a laser positioning system 14h is provided and a microcomputer 14a is connected to each component in a one-to-one correspondence.
The laser positioning system 14h has a position inference device 14e and a laser range finder 15. The position estimation device 14e and the laser range finder 15 are connected by, for example, an ethernet (registered trademark) cable. The operations of the position estimation device 14e and the laser range finder 15 are as described above. The laser positioning system 14h outputs information indicating the attitude (x, y, θ) of the AGV10 to the microcomputer 14 a.
The microcomputer 14a has various general-purpose I/O interfaces or general-purpose input/output ports (not shown). The microcomputer 14a is directly connected to other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h, via the general-purpose input/output port.
Fig. 10B is the same as fig. 10A except for the above-described structure. Therefore, the description of the same structure is omitted.
The AGV10 according to the embodiment of the present disclosure may have a safety sensor such as a bumper switch, not shown. The AGV10 may also have inertial measurement devices such as gyroscopic sensors. By using the measurement data of the internal sensors such as the rotary encoders 18a and 18b and the inertial measurement unit, the moving distance and the amount of change (angle) in the posture of the AGV10 can be estimated. The estimated values of these distances and angles are referred to as odometer data, and can function to assist the position and posture information acquired by the position estimation device 14 e.
(4) Map data
Fig. 11A to 11F schematically show an AGV10 that travels while acquiring sensor data. The user 1 may manually move the AGV10 while operating the terminal device 20. Alternatively, a unit having the travel control device 14 shown in figs. 10A and 10B, or the AGV10 itself, may be mounted on a dolly, and the user 1 may push or pull the dolly to acquire the sensor data.
An AGV10 that uses a laser rangefinder 15 to scan the surrounding space is shown in FIG. 11A. The laser beam is radiated at every predetermined step angle to perform scanning. The illustrated scanning range is a schematically illustrated example, and is different from the scanning range of 270 degrees in total.
In fig. 11A to 11F, the positions of the reflection points of the laser beam are schematically shown by a plurality of black dots 4 indicated by a symbol "·". The scanning of the laser beam is performed in a short cycle during the change of the position and posture of the laser range finder 15. Therefore, the number of actual reflection points is much larger than the number of reflection points 4 shown in the figure. The position estimation device 14e accumulates the position of the black dot 4 acquired along with the travel, for example, in the memory 14 b. The map data is gradually completed by the AGV10 continuing to scan while traveling. In fig. 11B to 11E, only the scanning range is shown for the sake of simplicity. This scanning range is illustrative and is different from the above-described example of 270 degrees in total.
The map may be created using the microcomputer 14a in the AGV10 or an external computer based on sensor data acquired after the sensor data is acquired in an amount necessary for creating the map. Alternatively, the map may be created in real time based on sensor data acquired by the moving AGV 10.
Fig. 11F schematically shows a part of the completed map 80. In the map shown in fig. 11F, a free space is defined by a Point Cloud (Point Cloud) corresponding to a set of reflection points of the laser beam. Another example of the map is an occupied grid map that distinguishes space occupied by an object from free space in units of a grid. The position estimation device 14e stores map data (map data M) in the memory 14b or the storage device 14 c. The number and density of black dots shown in the drawings are examples.
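The occupied grid map mentioned above can be sketched minimally: each reflection point, already transformed into world coordinates, marks the grid cell containing it as occupied, and all other cells are treated as free. The cell size and names below are assumptions.

```python
def build_occupancy_grid(reflection_points, cell_size=0.1):
    """Set of occupied cells of a grid map.

    reflection_points: iterable of (x, y) world coordinates of laser
    reflection points; cell_size: edge length of one grid cell in the
    same units.  Cells never hit by a reflection are considered free.
    """
    occupied = set()
    for x, y in reflection_points:
        occupied.add((int(x // cell_size), int(y // cell_size)))
    return occupied
```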
The map data thus obtained may be shared by multiple AGVs 10.
A typical example of an algorithm for the AGV10 to infer its location from map data is ICP (Iterative closest point) matching. As described above, the self position (x, y, θ) on the map data M can be estimated by matching the local map data (sensor data) created from the scanning result of the laser range finder 15 with the map data M in a wider range.
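A very reduced 2D version of ICP matching can be sketched as follows: repeatedly pair each sensor point with its nearest map point and solve for the rigid transform that best aligns the pairs. Real implementations add outlier rejection, convergence tests, and spatial indexing (e.g. k-d trees) for the nearest-neighbour search; this sketch is illustrative only.

```python
import math

def best_rigid_transform(src, dst):
    """Least-squares 2D rotation + translation mapping paired points src -> dst."""
    n = len(src)
    cxs = sum(p[0] for p in src) / n; cys = sum(p[1] for p in src) / n
    cxd = sum(p[0] for p in dst) / n; cyd = sum(p[1] for p in dst) / n
    dot = cross = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs -= cxs; ys -= cys; xd -= cxd; yd -= cyd
        dot += xs * xd + ys * yd
        cross += xs * yd - ys * xd
    theta = math.atan2(cross, dot)
    c, s = math.cos(theta), math.sin(theta)
    tx = cxd - (c * cxs - s * cys)
    ty = cyd - (s * cxs + c * cys)
    return theta, tx, ty

def icp_align(sensor_pts, map_pts, iterations=10):
    """Align sensor points to the map by iterating nearest-neighbour
    matching and rigid-transform estimation; returns the moved points."""
    pts = list(sensor_pts)
    for _ in range(iterations):
        matched = [min(map_pts,
                       key=lambda m, p=p: (m[0] - p[0]) ** 2 + (m[1] - p[1]) ** 2)
                   for p in pts]
        theta, tx, ty = best_rigid_transform(pts, matched)
        c, s = math.cos(theta), math.sin(theta)
        pts = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in pts]
    return pts
```

The accumulated transform corresponds to the estimated pose correction (x, y, θ) relative to the map data M.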
When the area in which the AGV10 travels is large, the data amount of the map data M increases. This may increase the time required to create the map, or require a long time for estimating the self position. When such a problem arises, the map data M may be divided into a plurality of pieces of local map data, which are created and recorded separately.
Fig. 12 shows an example of covering the entire area of one floor of one factory by a combination of four partial map data M1, M2, M3, M4. In this example, one local map data covers an area of 50m × 50 m. The boundary portion of two maps adjacent in each of the X direction and the Y direction is provided with a rectangular repetition area having a width of 5 m. This overlapping area is referred to as a "map switching area". When the AGV10 traveling while referring to one local map reaches the map switching area, the travel is switched to travel with reference to another local map adjacent thereto. The number of local maps is not limited to four, and may be set as appropriate according to the area of the floor on which the AGV10 travels, and the performance of the computer that performs the map creation and the self position estimation. The size of the local map data and the width of the overlap area are not limited to the above-described examples, and may be set arbitrarily.
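The switching rule described above can be sketched as a simple check in the current local map's own coordinates. The sketch assumes the map origin is at its lower-left corner and treats every edge as potentially shared with a neighbour; in practice only edges that actually have an adjacent local map would trigger a switch.

```python
LOCAL_MAP_SIZE_M = 50.0   # each local map covers 50 m x 50 m (example from the text)
SWITCH_WIDTH_M = 5.0      # width of the map switching (overlap) area

def in_map_switching_area(x, y, size=LOCAL_MAP_SIZE_M, width=SWITCH_WIDTH_M):
    """True when position (x, y), expressed in the current local map's
    coordinates, lies in the overlap strip near one of its edges."""
    def near_edge(v):
        return v < width or v > size - width
    return near_edge(x) or near_edge(y)
```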
(5) Configuration example of operation management device
Fig. 13 shows an example of the hardware configuration of the operation management device 50. The operation management device 50 has a CPU 51, a memory 52, a position database (position DB)53, a communication circuit 54, a map database (map DB)55, and an image processing circuit 56.
The CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57, and can transmit and receive data to and from each other.
The CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50. Typically, the CPU 51 is a semiconductor integrated circuit. The CPU 51 functions as the 1 st control circuit 51 shown in fig. 1.
The memory 52 is a volatile storage device that stores a computer program executed by the CPU 51. The memory 52 may be used as a work memory for the CPU 51 to perform operations.
The position DB 53 stores position data indicating positions that can be destinations of the AGVs 10. The position data may be expressed by coordinates virtually set by a manager in a factory, for example. The location data is determined by the manager.
The communication circuit 54 performs wired communication in accordance with, for example, the ethernet (registered trademark) standard. The communication circuit 54 is connected to the access point 2 (fig. 1) by a wire, and can communicate with the AGV10 via the access point 2. The communication circuit 54 receives data from the CPU 51 via the bus 57 that should be sent to the AGV 10. In addition, the communication circuit 54 sends data (notification) received from the AGV10 to the CPU 51 and/or the memory 52 via the bus 57.
The map DB 55 stores data of maps inside a factory or the like where the AGV10 travels. This map may be the same as map 80 (fig. 11F) or may be different. The data format is not limited as long as it is a map having a one-to-one correspondence relationship with the position of each AGV 10. For example, the map stored in the map DB 55 may be a map created by CAD.
The position DB 53 and the map DB 55 may be constructed on a nonvolatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disk.
The image processing circuit 56 is a circuit that generates video data to be displayed on the monitor 58. The image processing circuit 56 operates only when the manager operates the operation management device 50. Further detailed description is omitted in the present embodiment. The monitor 58 may be integrated with the operation management device 50. The processing of the image processing circuit 56 may also be performed by the CPU 51.
(6) Actions of the operation management device
The operation of the operation management device 50 will be described in brief with reference to fig. 14. Fig. 14 is a diagram schematically showing an example of the travel path of the AGV10 determined by the operation management device 50.
The operations of the AGV10 and the operation management device 50 are summarized as follows. Here, an example will be described in which an AGV10 currently located at a position (marker) M1 travels through several positions to a marker Mn+1 (n: a positive integer of 1 or more) that is the final destination. The position DB 53 records coordinate data of each position, such as the marker M2 to be passed after the marker M1 and the marker M3 to be passed after the marker M2.

The CPU 51 of the operation management device 50 reads the coordinate data of the marker M2 with reference to the position DB 53 and generates a travel command directed toward the marker M2. The communication circuit 54 transmits the travel command to the AGV10 via the access point 2.

The CPU 51 periodically receives data indicating the current position and attitude from the AGV10 via the access point 2. In this way, the operation management device 50 can track the position of each AGV10. When determining that the current position of the AGV10 matches the marker M2, the CPU 51 reads the coordinate data of the marker M3, generates a travel command directed toward the marker M3, and transmits it to the AGV10. That is, when determining that the AGV10 has reached a certain position, the operation management device 50 transmits a travel command toward the next position to be passed. In this way, the AGV10 can reach the marker Mn+1, which is the final destination.
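The dispatch rule just described — issue the travel command for the next marker once the AGV reports a position matching the current target — can be sketched as follows. The position tolerance and all names are illustrative assumptions.

```python
def next_travel_command(current_pos, markers, reached_tol=0.1):
    """Operation-management dispatch sketch.

    current_pos: (x, y) most recently reported by the AGV.
    markers: ordered coordinates [M1, M2, ..., Mn+1] from the position DB.
    Returns the coordinates of the next marker to head for, or None when
    the AGV is at the final destination or between markers.
    """
    for i, (mx, my) in enumerate(markers):
        dx, dy = current_pos[0] - mx, current_pos[1] - my
        if (dx * dx + dy * dy) ** 0.5 <= reached_tol:
            if i + 1 < len(markers):
                return markers[i + 1]   # travel command toward the next marker
            return None                 # final destination reached
    return None                         # not at any marker: keep the current command
```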
Industrial applicability
The moving body and the moving body management system of the present disclosure can be suitably used for moving and carrying goods, parts, finished products, and the like in factories, warehouses, construction sites, logistics, hospitals, and the like.
Description of the reference symbols
1: a user; 2a, 2 b: an access point; 10: AGVs (mobiles); 14: a travel control device; 14 a: a microcomputer (2 nd control circuit); 14 b: a memory; 14 c: a storage device; 14 d: a communication circuit (2 nd communication circuit); 14 e: a position inferring device; 16a, 16 b: a motor; 15: a laser range finder; 17: a drive device; 17a, 17 b: a motor drive circuit; 18: an encoder unit; 18a, 18 b: a rotary encoder; 19: an obstacle sensor; 20: a terminal device (a mobile computer such as a tablet computer); 50: an operation management device; 51: a CPU (1 st control circuit); 52: a memory; 53: a location database (location DB); 54: a communication circuit (1 st communication circuit); 55: a map database (map DB); 56: an image processing circuit; 100: a mobile body management system; 101: a moving body; 103: an external sensor; 105: a position inferring device; 107: a storage device; 109: a controller; 111: a drive device.

Claims (6)

1. A management device that manages the operation of a plurality of moving bodies capable of autonomous movement, wherein,
the management device includes:
a 1st communication circuit that communicates with each of the plurality of moving bodies; and
a 1st control circuit that determines a travel path of each of the plurality of moving bodies and transmits a signal indicating the travel path to each of the plurality of moving bodies via the 1st communication circuit,
each of the plurality of moving bodies has:
a 2nd communication circuit that communicates with the 1st communication circuit;
at least one sensor that detects an obstacle; and
a 2nd control circuit that moves the moving body along the travel path determined by the 1st control circuit, causes the moving body to avoid an obstacle when the sensor detects the obstacle, and transmits a signal indicating the presence of the obstacle via the 2nd communication circuit,
wherein, when the signal indicating the presence of the obstacle is transmitted from any of the plurality of moving bodies, the 1st control circuit changes the path of each moving body, among the plurality of moving bodies, that is expected to pass through the path in which the obstacle is present.
2. The management device according to claim 1, wherein,
the signal indicating the travel path contains information indicating positions of a plurality of points on the path from an initial position to a destination position,
the signal indicating the presence of the obstacle contains information indicating the position of the obstacle,
when the signal indicating the presence of the obstacle is transmitted from any of the plurality of moving bodies and the position of the obstacle is located between two adjacent points of the plurality of points, the 1st control circuit changes the path of each moving body, among the plurality of moving bodies, that is expected to pass through a path including the two points to a path not including the two points.
3. The management device according to claim 1 or 2, wherein,
the 1st control circuit restores the changed path to the original path when a signal indicating that the obstacle has been removed is input.
4. The management device according to any one of claims 1 to 3, wherein,
when the signal indicating the presence of the obstacle is transmitted for the first time, the 1st control circuit does not change the path of any moving body expected to pass through the path in which the obstacle is present, and
the 1st control circuit starts changing the path of each moving body expected to pass through the path in which the obstacle is present when the signal indicating the presence of the obstacle has been transmitted from n moving bodies, where n is any integer of 2 or more.
5. The management device according to any one of claims 1 to 4, wherein,
each of the moving bodies further includes:
a laser range finder;
a storage device that holds an environment map; and
a position inference device that determines and outputs an estimated value of the position and orientation of the moving body on the environment map by referring to data output from the laser range finder and to the environment map,
the 2nd control circuit moves the moving body based on the estimated value output from the position inference device and on the signal indicating the travel path transmitted from the 1st control circuit, and
when a signal indicating that the obstacle has been removed has not been input for a certain period after the signal indicating the presence of the obstacle was transmitted from any of the plurality of moving bodies, the 1st control circuit instructs each of the plurality of moving bodies to update its environment map to an environment map including information on the obstacle.
6. A moving body system including:
the management device of any one of claims 1 to 5; and
a plurality of moving bodies.
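For readers skimming the claims, the rerouting policy of claims 1, 3, and 4 can be combined into one small sketch. Everything here is an assumption layered on the claim language: paths are lists of waypoint identifiers, `RouteManager`, `report_obstacle`, and `obstacle_removed` are invented names, and "changing" a path is simplified to dropping the blocked waypoint rather than planning a real detour.

```python
class RouteManager:
    """Hedged sketch of the 1st control circuit's rerouting behavior."""

    def __init__(self, n_reports_required=2):
        self.n = n_reports_required   # claim 4: change paths only after n >= 2 reports
        self.reports = {}             # obstacle position -> set of reporting vehicle ids
        self.routes = {}              # vehicle id -> list of waypoint identifiers
        self.saved_routes = {}        # original routes, kept for the claim 3 restore

    def report_obstacle(self, vehicle_id, obstacle_pos):
        """Record one vehicle's obstacle report; reroute once n vehicles agree."""
        reporters = self.reports.setdefault(obstacle_pos, set())
        reporters.add(vehicle_id)
        if len(reporters) < self.n:
            return []                 # claim 4: the first report(s) do not trigger a change
        rerouted = []
        for vid, route in self.routes.items():
            if obstacle_pos in route:
                self.saved_routes.setdefault(vid, list(route))
                # Simplified "change": drop the blocked waypoint from the path.
                self.routes[vid] = [p for p in route if p != obstacle_pos]
                rerouted.append(vid)
        return rerouted

    def obstacle_removed(self, obstacle_pos):
        """Claim 3: restore the original paths once the obstacle is gone."""
        self.reports.pop(obstacle_pos, None)
        for vid, route in self.saved_routes.items():
            self.routes[vid] = route
        self.saved_routes.clear()

mgr = RouteManager()
mgr.routes = {"AGV1": ["A", "B", "C"], "AGV2": ["B", "D"]}
print(mgr.report_obstacle("AGV1", "B"))  # prints [] (first report ignored)
print(mgr.report_obstacle("AGV2", "B"))  # prints ['AGV1', 'AGV2']
print(mgr.routes["AGV1"])                # prints ['A', 'C']
mgr.obstacle_removed("B")
print(mgr.routes["AGV1"])                # prints ['A', 'B', 'C']
```

A real implementation would replan around the obstacle instead of deleting the waypoint, and claim 5 additionally pushes an updated environment map to every vehicle when the obstacle persists; both are omitted here for brevity.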
CN201880057317.5A 2017-09-25 2018-09-20 Moving body and moving body system Pending CN111065981A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017183531 2017-09-25
JP2017-183531 2017-09-25
PCT/JP2018/034905 WO2019059307A1 (en) 2017-09-25 2018-09-20 Moving body and moving body system

Publications (1)

Publication Number Publication Date
CN111065981A true CN111065981A (en) 2020-04-24

Family

ID=65809839

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880057317.5A Pending CN111065981A (en) 2017-09-25 2018-09-20 Moving body and moving body system

Country Status (3)

Country Link
JP (1) JP7136426B2 (en)
CN (1) CN111065981A (en)
WO (1) WO2019059307A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111806460A (en) * 2020-07-17 2020-10-23 青岛蚂蚁机器人有限责任公司 Automatic guide transport vechicle control system
TWI784786B (en) * 2020-11-16 2022-11-21 日商豐田自動織機股份有限公司 Automated guided vehicle control device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230333568A1 (en) * 2019-05-17 2023-10-19 Murata Machinery, Ltd. Transport vehicle system, transport vehicle, and control method
CN110567471B (en) * 2019-08-09 2020-10-09 易普森智慧健康科技(深圳)有限公司 Indoor traffic control method based on position
JP7338611B2 (en) 2020-11-16 2023-09-05 株式会社豊田自動織機 Controller for automatic guided vehicle

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11259130A (en) * 1998-03-06 1999-09-24 Nissan Motor Co Ltd Method for setting route of automated guided vehicle and method for controlling automated guided vehicle
WO2002023297A1 (en) * 2000-09-11 2002-03-21 Kunikatsu Takase Mobile body movement control system
JP2010231698A (en) * 2009-03-30 2010-10-14 Advanced Telecommunication Research Institute International Network robot system, and robot control device, method and program
CN103268111A (en) * 2013-05-28 2013-08-28 重庆大学 Networked distribution type multiple-mobile-robot system
CN105974925A (en) * 2016-07-19 2016-09-28 合肥学院 AGV trolley driving control method and system
CN106325280A (en) * 2016-10-20 2017-01-11 上海物景智能科技有限公司 Multirobot collision preventing method and system
CN106548231A (en) * 2016-11-24 2017-03-29 北京地平线机器人技术研发有限公司 Mobile controller, mobile robot and the method for moving to optimal interaction point
CN106774345A (en) * 2017-02-07 2017-05-31 上海仙知机器人科技有限公司 A kind of method and apparatus for carrying out multi-robot Cooperation
JP2017130121A (en) * 2016-01-22 2017-07-27 株式会社ダイヘン Mobile body, and server
JP2017134794A (en) * 2016-01-29 2017-08-03 パナソニックIpマネジメント株式会社 Mobile robot control system, and server device for controlling mobile robots
CN107015566A (en) * 2017-06-05 2017-08-04 河池学院 A kind of multirobot detecting system based on LabVIEW

Also Published As

Publication number Publication date
JPWO2019059307A1 (en) 2020-10-15
JP7136426B2 (en) 2022-09-13
WO2019059307A1 (en) 2019-03-28

Similar Documents

Publication Publication Date Title
JP7168211B2 (en) Mobile object that avoids obstacles and its computer program
US20190294181A1 (en) Vehicle, management device, and vehicle management system
JP6825712B2 (en) Mobiles, position estimators, and computer programs
JP7081881B2 (en) Mobiles and mobile systems
CN110998472A (en) Mobile object and computer program
JP7136426B2 (en) Management device and mobile system
CN110998473A (en) Position estimation system and mobile body having the same
JPWO2019187816A1 (en) Mobiles and mobile systems
JP2019053391A (en) Mobile body
JPWO2019054209A1 (en) Map making system and map making device
JP7111424B2 (en) Mobile object, position estimation device, and computer program
JP2019175137A (en) Mobile body and mobile body system
CN111971633B (en) Position estimation system, mobile body having the position estimation system, and recording medium
JP2019175136A (en) Mobile body
JP2019179497A (en) Moving body and moving body system
JP2019079171A (en) Movable body
JP2020166702A (en) Mobile body system, map creation system, route creation program and map creation program
JP2019067001A (en) Moving body
CN112578789A (en) Moving body
JPWO2019069921A1 (en) Mobile
JP2019148871A (en) Movable body and movable body system
JPWO2019059299A1 (en) Operation management device
JP2019175138A (en) Mobile body and management device
JP2020166701A (en) Mobile object and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200424