CN111052026A - Moving body and moving body system


Info

Publication number: CN111052026A
Application number: CN201880057308.6A
Authority: CN (China)
Prior art keywords: mobile body, AGV10, map, location, controller
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 安达信也
Current Assignee: Nidec Shimpo Corp
Original Assignee: Nidec Shimpo Corp
Application filed by Nidec Shimpo Corp

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/12: Target-seeking control

Abstract

A mobile body according to an embodiment is capable of autonomous travel and includes: a drive device that moves the mobile body; an external sensor; a position estimation device that sequentially outputs position information indicating the position and orientation of the mobile body based on sensor data output from the external sensor; a storage device that stores the position information output from the position estimation device; and a controller that controls the drive device to move the mobile body. After the mobile body moves from a 1st location to a 2nd location, the controller returns the mobile body to the 1st location along the reverse of the path from the 1st location to the 2nd location, based on the position information stored in the storage device.

Description

Moving body and moving body system
Technical Field
The present disclosure relates to a mobile body and a mobile body system.
Background
Research and development of mobile bodies such as automated guided vehicles and mobile robots is advancing. For example, Japanese Patent Application Laid-Open No. 2008-084135 discloses a mobile robot that registers a map and a movement path while moving so as to follow a person. Japanese Patent Application Laid-Open No. 2006-285635 discloses a mobile robot that, when called by a person, moves to where the person is and, when the task is finished, returns to the vicinity of its original position on the route.
Documents of the Prior Art
Patent Documents
Patent Document 1: Japanese Patent Application Laid-Open No. 2008-084135
Patent Document 2: Japanese Patent Application Laid-Open No. 2006-285635
Disclosure of Invention
Problems to be solved by the invention
The present disclosure provides a technique for further improving the convenience of work using a mobile body.
Means for solving the problems
A mobile body according to an exemplary embodiment of the present disclosure is capable of autonomous movement and includes: a drive device that moves the mobile body; an external sensor; a position estimation device that sequentially outputs position information indicating the position and orientation of the mobile body based on sensor data output from the external sensor; a storage device that stores the position information output from the position estimation device; and a controller that controls the drive device to move the mobile body. After the mobile body moves from a 1st location to a 2nd location, the controller returns the mobile body to the 1st location along the reverse of the path from the 1st location to the 2nd location, based on the position information stored in the storage device.
The general or specific aspects described above can also be implemented by a system, a method, an integrated circuit, a computer program, or a recording medium. Alternatively, the present invention may be implemented by any combination of systems, apparatuses, methods, integrated circuits, computer programs, and recording media.
Effects of the invention
According to the embodiments of the present disclosure, after a mobile body moves from a 1st location to a 2nd location, it can be returned to the 1st location without being given any information about the return path. Therefore, the convenience of work using the mobile body can be improved.
Drawings
Fig. 1 is a block diagram showing a schematic configuration of a mobile body according to an exemplary embodiment of the present disclosure.
Fig. 2 is a diagram showing an outline of the control system for controlling travel of each AGV according to the present disclosure.
Fig. 3 is a diagram showing an example of a travel space in which an AGV is located.
Fig. 4A is a diagram showing an AGV and a traction trolley prior to connection.
FIG. 4B is a diagram illustrating the AGV and the traction trolley after connection.
Fig. 5 is an external view of an exemplary AGV according to the present embodiment.
Fig. 6A is a diagram showing an example of the 1 st hardware configuration of an AGV.
Fig. 6B is a diagram showing an example of the 2 nd hardware configuration of the AGV.
Fig. 7A is a diagram showing an AGV that generates a map while moving.
Fig. 7B is a diagram showing an AGV that generates a map while moving.
Fig. 7C is a diagram showing an AGV that generates a map while moving.
Fig. 7D is a diagram showing an AGV that generates a map while moving.
Fig. 7E is a diagram showing an AGV that generates a map while moving.
Fig. 7F is a diagram schematically showing a part of the completed map.
Fig. 8 is a diagram showing an example of a map in which one floor is configured by a plurality of partial maps.
Fig. 9 is a diagram showing an example of the hardware configuration of the operation management device.
Fig. 10 is a diagram schematically showing an example of the travel path of the AGV determined by the operation management device.
Fig. 11A is a diagram schematically showing an example of an AGV that operates in the tracking mode.
Fig. 11B is a diagram schematically showing another example of an AGV operating in the tracking mode.
Fig. 12 is a flowchart showing the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13A is a 1st diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13B is a 2nd diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13C is a 3rd diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13D is a 4th diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13E is a 5th diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 13F is a 6th diagram showing an example of the operation of the AGV according to the exemplary embodiment of the present disclosure.
Fig. 14 is a diagram showing another example of the operation of the AGV during the return trip.
Fig. 15A is a diagram showing an example of position information recorded at the time of an outbound trip.
Fig. 15B is a diagram showing an example of position information at the time of return.
Fig. 15C is a diagram showing another example of the position information at the time of the return trip.
Detailed Description
<Terms>
Before describing embodiments of the present disclosure, definitions of the terms used in the present specification will be given.
An "automated guided vehicle" (AGV) is a trackless vehicle that manually or automatically loads a load onto a subject, automatically travels to a designated location, and then manually or automatically unloads the load. "automated guided vehicles" include unmanned tractors and unmanned forklifts.
The term "unmanned" means that no human is required to maneuver the vehicle, and does not exclude the case where an unmanned vehicle carries a "human (e.g., a person handling goods)".
An "unmanned tractor" is a trackless vehicle that travels automatically to the indicated location, towing a trolley that loads and unloads goods manually or automatically.
An "unmanned forklift" is a trackless vehicle that has a mast for raising and lowering a fork or the like for transferring a load, automatically transfers the load to the fork or the like, automatically travels to a designated place, and performs an automatic load handling operation.
A "trackless vehicle" is a mobile body (vehicle) having wheels and an electric motor or engine that rotates the wheels.
The "mobile body" is a device that moves while carrying a person or a load, and includes a driving device such as wheels, bipedal or multi-legged running devices, and propellers, which generate a driving force (traction) for movement. The term "moving body" in the present disclosure includes not only an unmanned carrier in a narrow sense but also a mobile robot, a service robot, and an unmanned aerial vehicle.
The "automatic travel" includes travel of the automated guided vehicle based on an instruction from an operation management system of a computer connected by communication and autonomous travel based on a control device included in the automated guided vehicle. The autonomous traveling includes not only traveling of the automated guided vehicle toward the destination along a predetermined route but also traveling following the tracking target. Further, the automated guided vehicle may temporarily perform manual travel based on an instruction from the operator. The "automatic travel" generally includes both "guided" travel and "unguided" travel, but in the present disclosure, it refers to "unguided" travel.
The "guide type" refers to a method of continuously or intermittently providing a guide body and guiding an automated guided vehicle by the guide body.
The "unguided type" refers to a type of guidance without providing a guide body. The automated guided vehicle according to the embodiment of the present disclosure has its own position estimating device, and can travel without guidance.
The "self-position estimation device" is a device that estimates a self-position on an environment map from sensor data acquired by an external sensor such as a laser range finder.
The "external sensor" is a sensor that senses a state of the outside of the moving body. Examples of ambient sensors are laser range finders (also called range sensors), cameras (or image sensors), LIDAR (Light Detection and ranging), millimeter-wave radar and magnetic sensors.
The "internal sensor" is a sensor that senses the state of the inside of the moving body. Examples of the internal sensors include a rotary encoder (hereinafter, may be simply referred to as "encoder"), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
"SLAM (スラム)" is an abbreviation for Simultaneous Localization and Mapping, and means that self-position estimation and environment Mapping are performed simultaneously.
<Exemplary embodiment>
Hereinafter, examples of the mobile body and the mobile body system of the present disclosure will be described with reference to the drawings. Overly detailed description may be omitted; for example, detailed descriptions of already well-known matters and duplicate descriptions of substantially identical configurations may be omitted. This is to avoid unnecessary redundancy and to make the description easier for those skilled in the art to understand. The inventors provide the accompanying drawings and the following description so that those skilled in the art can fully understand the present disclosure; they are not intended to limit the subject matter recited in the claims.
Fig. 1 is a block diagram showing a schematic configuration of a mobile body according to an exemplary embodiment of the present disclosure. The movable body 101 includes an external sensor 103, a position estimation device 105, a storage device 107, a controller 109, and a drive device 111.
The drive device 111 has a mechanism for moving the mobile body 101. The drive device 111 may include, for example, at least one electric motor for driving (hereinafter simply referred to as a "motor"), not shown, and a motor control circuit that controls the motor. The external sensor 103 is a sensor that senses the external environment, such as a laser range finder, a camera, a radar, or a LIDAR. The position estimation device 105 estimates the position and orientation of the mobile body from the sensor data output from the external sensor 103, and sequentially outputs information indicating the estimated position and orientation (referred to as "position information" in this specification). The storage device 107 stores the position information sequentially output from the position estimation device 105 while the mobile body 101 moves. The controller 109 controls the drive device 111 to move the mobile body 101.
After the mobile body 101 moves from a 1st location to a 2nd location, the controller 109 returns the mobile body 101 to the 1st location based on the position information stored in the storage device 107. At this time, the controller 109 returns the mobile body 101 to the 1st location along the reverse of the path from the 1st location to the 2nd location. The path from the 1st location to the 2nd location is referred to as the "outbound path", and the path from the 2nd location back to the 1st location is referred to as the "return path".
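The core of this mechanism can be illustrated with a short sketch. The following Python snippet is a minimal illustration under assumed names (Pose and PathRecorder are not from the patent): it records the pose stream on the outbound path and emits the recorded waypoints in reverse order as the target sequence for the return path.

```python
# Minimal sketch: record outbound poses, replay them reversed on the return.
from dataclasses import dataclass

@dataclass
class Pose:
    x: float      # position [m] on the environment map
    y: float      # position [m]
    theta: float  # heading [rad] relative to the map's X axis

class PathRecorder:
    def __init__(self):
        self._outbound: list[Pose] = []

    def record(self, pose: Pose) -> None:
        """Store a pose sequentially output by the position estimation device."""
        self._outbound.append(pose)

    def return_waypoints(self) -> list[Pose]:
        """Waypoints for the return path: the outbound poses in reverse order."""
        return list(reversed(self._outbound))

recorder = PathRecorder()
for p in [Pose(0.0, 0.0, 0.0), Pose(1.0, 0.2, 0.1), Pose(2.0, 0.5, 0.2)]:  # outbound
    recorder.record(p)
print(recorder.return_waypoints()[0])  # first return target: near the 2nd location
```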
According to the present embodiment, after the mobile body 101 moves from the 1st location to the 2nd location, the mobile body 101 can be returned to the 1st location without any specific instruction regarding the return path. This can improve the efficiency of work using the mobile body 101, for example, the efficiency of a transport operation in which a load is loaded at one of the 1st and 2nd locations and unloaded at the other.
In one embodiment, the mobile body 101 is an automated guided vehicle, and environment map data (hereinafter also simply referred to as an "environment map") is recorded in the storage device 107. The position estimation device 105 can determine estimated values of the position and orientation of the mobile body on the environment map by matching the sensor data against the environment map recorded in the storage device 107, and can output these estimated values as the position information.
The controller 109 can operate the mobile body 101 in at least one of a mode in which the mobile body 101 moves based on route information transmitted from an external device and a mode in which the mobile body 101 tracks a moving object (referred to as the "tracking mode"). In the tracking mode, the controller 109 controls the drive device 111 so that the mobile body 101 tracks the moving object based on the sensor data output from the external sensor 103. The moving object may be, for example, a person or another mobile body. In the tracking mode, the controller 109 moves the mobile body 101 from the 1st location to the 2nd location while tracking the person or the other mobile body, and then returns the mobile body 101 to the 1st location.
After the mobile body 101 moves from the 1st location to the 2nd location, the controller 109 may return the mobile body 101 to the 1st location in response to a return instruction provided, for example, by a user's operation or by other means. The return instruction need not contain any indication specifying the return path. The controller 109 may also start the operation of returning the mobile body 101 to the 1st location when the position or state of the mobile body 101 satisfies a set condition. For example, when the mobile body 101 is used to transport loads, the controller 109 may start the return operation when the completion of unloading or loading is detected after the mobile body 101 has reached the 2nd location. The unloading or loading can be detected, for example, by a sensor provided on the mobile body 101 or near the 2nd location. In one example, a tag such as an RFID tag may be attached to each article placed on the mobile body 101, and the sensor can detect that unloading or loading of all articles is complete by reading the information recorded on each tag.
When returning the mobile body 101 from the 2nd location to the 1st location, the controller 109 may turn the mobile body 101 around so that it faces the direction opposite to that of the outbound trip before starting to move. Alternatively, the controller 109 may move the mobile body 101 while keeping it facing the same direction as on the outbound trip. In the latter case, the controller 109 makes the mobile body 101 travel backward by rotating each motor that drives the mobile body 101 in the direction opposite to that used on the outbound trip.
A more specific example in which the mobile body is an automated guided vehicle will be described below. In the following description, the automated guided vehicle is abbreviated as "AGV" (Automated Guided Vehicle). Unless a contradiction arises, the following description applies similarly to mobile bodies other than AGVs, for example, mobile robots, drones, and manned vehicles.
(1) Basic structure of system
Fig. 2 shows a basic configuration example of an exemplary mobile body management system 100 of the present disclosure. The mobile body management system 100 includes at least one AGV10 and an operation management device 50 that manages the operation of the AGV10. Fig. 2 also shows a terminal device 20 operated by a user 1.
The AGV10 is an unmanned transport vehicle capable of "unguided" travel without a guide such as a magnetic tape during travel. The AGV10 can estimate its own position and transmit the estimated result to the terminal device 20 and the operation management device 50. The AGV10 can automatically travel in the travel space S in accordance with an instruction from the operation management device 50. The AGV10 can also operate in a "tracking mode" in which it moves following a person or other moving object.
The operation management device 50 is a computer system that manages the travel of each AGV10 by tracking the position of each AGV 10. The operation management device 50 may be a desktop PC, a notebook PC, and/or a server computer. The operation management device 50 communicates with each AGV10 via a plurality of access points 2. For example, the operation management device 50 transmits data of coordinates of a position to which each AGV10 should be next directed to each AGV 10. Each AGV10 periodically transmits data indicating its position and orientation (orientation) to the operation management device 50, for example, every 100 milliseconds. When the AGV10 reaches the instructed position, the operation management device 50 transmits data of the coordinates of the position to be directed next. The AGV10 can also travel within the travel space S in accordance with the operation of the user 1 input to the terminal device 20. An example of the terminal device 20 is a tablet computer. Typically, the travel of the AGV10 using the terminal device 20 is performed when the map is created, and the travel of the AGV10 using the operation management device 50 is performed after the map is created.
Fig. 3 shows an example of the travel space S in which three AGVs 10a, 10b, and 10c are located. All of the AGVs are assumed to be traveling in the depth direction of the figure. The AGVs 10a and 10b are transporting loads placed on their top plates. The AGV 10c follows the AGV 10b traveling ahead of it. For convenience of explanation, reference numerals 10a, 10b, and 10c are used in fig. 3, but hereinafter these are collectively referred to as the AGV10.
In addition to transporting a load placed on its top plate, the AGV10 can transport a load by means of a traction trolley connected to it. Fig. 4A shows the AGV10 and the traction trolley 5 before connection. Casters are provided on the legs of the traction trolley 5, and the AGV10 is mechanically connected to it. Fig. 4B shows the AGV10 and the traction trolley 5 after connection. When the AGV10 travels, the traction trolley 5 is pulled by the AGV10, which can thereby transport the load placed on the traction trolley 5.
The method of coupling the AGV10 to the traction trolley 5 is arbitrary; one example is described here. A plate 6 is fixed to the top of the AGV10, and a guide 7 having a slit is provided on the traction trolley 5. The AGV10 approaches the traction trolley 5 and inserts the plate 6 into the slit of the guide 7. When the insertion is complete, the AGV10 passes an electromagnetic lock pin, not shown, through the plate 6 and the guide 7 and engages the electromagnetic lock. The AGV10 is thereby physically connected to the traction trolley 5.
Reference is again made to fig. 2. Each AGV10 and the terminal device 20 can, for example, be connected one-to-one and communicate in accordance with the Bluetooth (registered trademark) standard. Each AGV10 and the terminal device 20 can also communicate by Wi-Fi (registered trademark) using one or more access points 2. The plurality of access points 2 are connected to each other via, for example, a switching hub 3. Fig. 2 shows two access points 2a and 2b. The AGV10 is wirelessly connected to the access point 2a, and the terminal device 20 is wirelessly connected to the access point 2b. Data transmitted from the AGV10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20. Data transmitted from the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV10. This realizes bidirectional communication between the AGV10 and the terminal device 20. The plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3, which likewise enables bidirectional communication between the operation management device 50 and each AGV10.
(2) Making of environment map
In order for the AGV10 to travel while estimating its own position, a map of the travel space S is created. The AGV10 is equipped with a position estimation device and a laser range finder, and the map can be created using the output of the laser range finder.
The AGV10 transitions to the data acquisition mode through a user operation. In the data acquisition mode, the AGV10 starts acquiring sensor data using the laser range finder. The laser range finder periodically emits an infrared or visible laser beam to its surroundings to scan the surrounding space S. The laser beam is reflected by the surfaces of structures such as walls and pillars and of objects placed on the floor. The laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs data indicating the measured position of each reflection point. The position of each reflection point reflects the arrival direction and distance of the reflected light. The data of the measurement results is sometimes referred to as "measurement data" or "sensor data".
The position estimation device accumulates the sensor data in the storage device. When acquisition of the sensor data in the travel space S is completed, the sensor data accumulated in the storage device is transmitted to an external device. The external device is, for example, a computer that has a signal processing processor and on which a map creation program is installed.
The signal processing processor of the external device superimposes the sensor data acquired in successive scans on one another. By repeating this superimposition processing, the map of the space S can be created. The external device transmits the created map data to the AGV10, which stores it in its internal storage device. The external device may be the operation management device 50 or another device.
The AGV10 may create the map itself instead of relying on an external device. The processing described above as performed by the signal processing processor of the external device may be performed by a circuit such as the microcontroller unit (microcomputer) of the AGV10. When the map is created within the AGV10, the accumulated sensor data no longer needs to be transmitted to an external device. Since the data volume of sensor data is generally large, not having to transmit it avoids occupying the communication line.
The AGV10 can travel through the travel space S to acquire sensor data in accordance with user operations. For example, the AGV10 wirelessly receives, via the terminal device 20, travel commands from the user instructing movement in each of the front, rear, left, and right directions, and travels through the travel space S accordingly to create the map. When the AGV10 is connected by wire to an operating device such as a joystick, it can likewise travel through the travel space S in accordance with control signals from that device to create the map. The sensor data may also be acquired by a person walking while pushing a measurement cart on which a laser range finder is mounted.
In addition, although a plurality of AGVs 10 are shown in fig. 2 and 3, one AGV may be used. When there are a plurality of AGVs 10, the user 1 can select one AGV10 from the plurality of registered AGVs using the terminal device 20 to create the map of the travel space S.
After the map is created, each AGV10 can automatically travel while estimating its own position using the map. The process of estimating the self position will be described later.
(3) AGV structure
Fig. 5 is an external view of an exemplary AGV10 according to the present embodiment. The AGV10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a conveyance table 13, a travel control device 14, and a laser range finder 15. Two drive wheels 11a and 11b are provided on the right and left sides of the AGV10, respectively. The four casters 11c, 11d, 11e, and 11f are disposed at the four corners of the AGV 10. In addition, although the AGV10 also has a plurality of motors connected to the two drive wheels 11a and 11b, the plurality of motors are not shown in FIG. 5. In fig. 5, one driving wheel 11a and two casters 11c and 11e on the right side of the AGV10 and a left rear caster 11f are shown, but the left driving wheel 11b and the left front caster 11d are not shown because they are hidden by the frame 12. The four casters 11c, 11d, 11e, and 11f can freely turn. In the following description, the drive wheels 11a and 11b are also referred to as wheels 11a and 11b, respectively.
The AGV10 also has at least one obstacle sensor 19 for detecting obstacles. In the example of fig. 5, four obstacle sensors 19 are provided at four corners of the frame 12. The number and arrangement of the obstacle sensors 19 may also be different from the example of fig. 5. The obstacle sensor 19 may be a device capable of measuring a distance, such as an infrared sensor, an ultrasonic sensor, or a stereo camera. When the obstacle sensor 19 is an infrared sensor, for example, infrared light is emitted at regular intervals, and the time until the reflected infrared light returns is measured, whereby an obstacle existing within a certain distance can be detected. When the AGV10 detects an obstacle on the route based on a signal output from at least one obstacle sensor 19, it performs an operation of avoiding the obstacle.
The travel control device 14 is a device that controls the operation of the AGV10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a substrate on which the integrated circuit and the electronic components are mounted. The travel control device 14 performs the above-described transmission and reception of data with the terminal device 20 and preprocessing calculation.
The laser range finder 15 is, for example, an optical device that emits a laser beam 15a of infrared or visible light and detects the reflected light of the laser beam 15a to measure the distance to each reflection point. In the present embodiment, the laser range finder 15 of the AGV10 emits the pulsed laser beam 15a while changing its direction in steps of 0.25 degrees over a range of 135 degrees to each of the left and right of the front of the AGV10 (270 degrees in total), and detects the reflected light of each beam. This makes it possible to acquire distance data to reflection points in a total of 1081 directions determined at 0.25-degree intervals. In the present embodiment, the scan of the surrounding space by the laser range finder 15 is substantially parallel to the floor and planar (two-dimensional), but the laser range finder 15 may also scan in the height direction.
The AGV10 can create a map of the space S based on the position and orientation (direction) of the AGV10 and the scanning result of the laser range finder 15. The map can reflect the arrangement of structures such as walls and pillars around the AGV and objects placed on the floor. The data of the map is stored in a storage device provided in the AGV 10.
Generally, the position and orientation of a mobile body are referred to as its pose. The position and orientation of a mobile body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY rectangular coordinate system and an angle θ with respect to the X axis. Hereinafter, the position and orientation of the AGV10, i.e., the pose (x, y, θ), may be simply referred to as the "position".
The position of a reflection point viewed from the emission position of the laser beam 15a can be expressed in polar coordinates determined by an angle and a distance. In the present embodiment, the laser range finder 15 outputs sensor data expressed in polar coordinates. However, the laser range finder 15 may convert positions expressed in polar coordinates into rectangular coordinates before outputting them.
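As a concrete illustration of the polar-to-rectangular conversion just mentioned, the following sketch (an illustrative assumption, not the patent's code) converts one 270-degree scan of 1081 beams at 0.25-degree steps into (x, y) points in the sensor frame:

```python
# Convert one scan (1081 ranges, as described above) to rectangular points.
import math

def scan_to_points(ranges_m: list[float]) -> list[tuple[float, float]]:
    """Convert 1081 range readings [m] into (x, y) points, x pointing forward."""
    start = math.radians(-135.0)           # leftmost beam relative to the front
    step = math.radians(0.25)              # angular resolution
    points = []
    for i, r in enumerate(ranges_m):
        a = start + i * step
        points.append((r * math.cos(a), r * math.sin(a)))
    return points

pts = scan_to_points([2.0] * 1081)         # a synthetic scan: everything at 2 m
print(len(pts), pts[540])                  # the middle beam points straight ahead
```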
The construction and operating principle of the laser rangefinder are well known, and therefore further detailed explanation is omitted in this specification. Examples of objects that can be detected by the laser range finder 15 are people, goods, shelves, walls.
The laser range finder 15 is an example of an external sensor that senses the surrounding space and acquires sensor data. Other examples of such external sensors include image sensors and ultrasonic sensors.
The travel control device 14 estimates its own current position by comparing the measurement results of the laser range finder 15 with the map data that it holds. The map data held may have been created by another AGV10.
Fig. 6A shows an example of the 1 st hardware configuration of the AGV 10. Fig. 6A also shows a specific configuration of the travel control device 14.
The AGV10 has the travel control device 14, the laser range finder 15, two motors 16a and 16b, the drive device 17, the wheels 11a and 11b, and two rotary encoders 18a and 18b.
The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14e. These are connected via a communication bus 14f and can exchange data with one another. The laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown) and transmits measurement data, i.e., its measurement results, to the microcomputer 14a, the position estimation device 14e, and/or the memory 14b.
The microcomputer 14a is a processor or a control circuit (computer) that executes an arithmetic operation for controlling the entire AGV10 including the travel control device 14. Typically, the microcomputer 14a is a semiconductor integrated circuit. The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal as a control signal to the driving device 17 to control the driving device 17 so as to adjust the voltage applied to the motor. Thereby, the motors 16a and 16b are rotated at desired rotation speeds, respectively.
One or more control circuits (for example, a microcomputer) for controlling the driving of the left and right motors 16a and 16b may be provided independently of the microcomputer 14 a. For example, the motor drive device 17 may have two microcomputers that control the driving of the motors 16a and 16b, respectively. The two microcomputers can use the encoder information output from the encoders 18a and 18b, respectively, to perform coordinate calculations to infer the distance the AGV10 has traveled from a given initial position. In addition, the two microcomputers may also control the motor drive circuits 17a and 17b using the encoder information.
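The coordinate calculation from encoder information mentioned above is, in a standard differential-drive formulation, a few lines of arithmetic. The sketch below is a generic version of that calculation; the wheel radius, tread, and ticks-per-revolution values are illustrative assumptions, not the patent's parameters:

```python
# Differential-drive odometry from the two wheel encoders (generic sketch).
import math

WHEEL_RADIUS_M = 0.05
TREAD_M = 0.30            # distance between the left and right drive wheels
TICKS_PER_REV = 2048

def update_pose(x, y, theta, d_ticks_left, d_ticks_right):
    """Advance the pose (x, y, theta) by one encoder sampling interval."""
    dl = 2 * math.pi * WHEEL_RADIUS_M * d_ticks_left / TICKS_PER_REV
    dr = 2 * math.pi * WHEEL_RADIUS_M * d_ticks_right / TICKS_PER_REV
    d = (dl + dr) / 2             # distance traveled by the body center
    dtheta = (dr - dl) / TREAD_M  # change in heading
    x += d * math.cos(theta + dtheta / 2)
    y += d * math.sin(theta + dtheta / 2)
    return x, y, theta + dtheta

pose = (0.0, 0.0, 0.0)
for _ in range(10):               # both wheels forward at equal speed
    pose = update_pose(*pose, 100, 100)
print(pose)                        # moved straight along the X axis
```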
The memory 14b is a volatile storage device that stores the computer program executed by the microcomputer 14a. The memory 14b can also be used as a working memory when the microcomputer 14a and the position estimation device 14e perform computations.
The storage device 14c is a nonvolatile semiconductor storage device. However, the storage device 14c may instead be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disc, and may include a head device for writing and/or reading data on such a recording medium and a controller for the head device.
The storage device 14c stores the map data M of the travel space S and travel route data R of one or more travel routes. The map data M is created by the AGV10 operating in the map creation mode and stored in the storage device 14c. The travel route data R is transmitted from outside after the map data M has been created. In the present embodiment, the map data M and the travel route data R are stored in the same storage device 14c, but they may be stored in different storage devices.
An example of the travel route data R will be described.
When the terminal device 20 is a tablet computer, the AGV10 receives travel route data R indicating a travel route from the tablet computer. The travel route data R here includes marker data indicating the positions of a plurality of markers. A "marker" indicates a passing position (via point) of the traveling AGV10. The travel route data R includes at least position information of a start marker indicating the travel start position and an end marker indicating the travel end position. The travel route data R may further include position information of markers for one or more intermediate via points; in that case, a route that runs from the start marker through the intermediate via points in order and reaches the end marker is defined as the travel route. In addition to the coordinate data of a marker, the data of each marker may include the orientation (angle) and the travel speed of the AGV10 until it moves to the next marker. When the AGV10 stops temporarily at each marker position, estimates its own position, and notifies the terminal device 20, the data of each marker may also include the acceleration time required to accelerate to the travel speed and/or the deceleration time required to decelerate from the travel speed and stop at the position of the next marker.
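One plausible way to organize the marker data described above is shown below; the field names and values are illustrative assumptions, not the patent's data format:

```python
# An assumed layout for the marker data contained in travel route data R.
from dataclasses import dataclass

@dataclass
class Marker:
    x: float                  # coordinates on the environment map [m]
    y: float
    theta: float              # orientation to hold until the next marker [rad]
    speed: float              # travel speed toward the next marker [m/s]
    accel_time: float = 0.0   # time to accelerate to `speed` [s]
    decel_time: float = 0.0   # time to decelerate and stop at the next marker [s]

# A travel route: start marker, one intermediate via point, end marker.
route_R = [
    Marker(0.0, 0.0, 0.0, 1.0, accel_time=2.0),
    Marker(5.0, 0.0, 1.57, 1.0),
    Marker(5.0, 8.0, 1.57, 0.0, decel_time=2.0),
]
print(f"route with {len(route_R)} markers, ends at ({route_R[-1].x}, {route_R[-1].y})")
```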
The movement of the AGV10 may be controlled not by the terminal device 20 but by the operation management device 50 (for example, a PC and/or a server computer). In this case, the operation management device 50 may instruct the AGV10 to move to the next marker each time the AGV10 reaches a marker. For example, the AGV10 receives from the operation management device 50, as travel route data R indicating the travel route, the coordinate data of the next target position, or the distance to that target position and the angle through which the AGV10 should turn.
The AGV10 can travel along the stored travel route while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
The communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Bluetooth (registered trademark) and/or Wi-Fi (registered trademark) standards. Both standards include wireless communication standards that use frequencies in the 2.4 GHz band. For example, in the mode in which the AGV10 travels to create a map, the communication circuit 14d performs one-to-one communication with the terminal device 20 in accordance with the Bluetooth (registered trademark) standard.
The position estimation device 14e performs the map creation process and, during travel, the process of estimating its own position. The position estimation device 14e creates the map of the travel space S based on the position and orientation of the AGV10 and the scan results of the laser range finder. During travel, the position estimation device 14e receives sensor data from the laser range finder 15 and reads the map data M stored in the storage device 14c. The self-position (x, y, θ) on the map data M is identified by matching local map data (sensor data) created from the scan results of the laser range finder 15 against the wider-range map data M. The position estimation device 14e also generates "reliability" data indicating how well the local map data matches the map data M. The self-position (x, y, θ) and the reliability data can be transmitted from the AGV10 to the terminal device 20 or the operation management device 50, which can receive them and display them on a built-in or connected display device.
In the present embodiment, the microcomputer 14a and the position estimation device 14e are separate components, but this is merely an example. They may instead be implemented as a single chip circuit or semiconductor integrated circuit that performs the operations of both. Fig. 6A shows a chip circuit 14g that includes the microcomputer 14a and the position estimation device 14e. Hereinafter, the case where the microcomputer 14a and the position estimation device 14e are provided separately will be described.
Two motors 16a and 16b are respectively mounted to the two wheels 11a and 11b to rotate the respective wheels. That is, the two wheels 11a and 11b are driving wheels, respectively. In this description, a case will be described where the motor 16a and the motor 16b are motors that drive the right and left wheels, respectively, of the AGV 10.
The AGV10 also has an encoder unit 18 that measures the rotational position or rotational speed of the wheels 11a and 11b. The encoder unit 18 includes a 1st rotary encoder 18a and a 2nd rotary encoder 18b. The 1st rotary encoder 18a measures rotation at some point in the power transmission mechanism from the motor 16a to the wheel 11a, and the 2nd rotary encoder 18b measures rotation at some point in the power transmission mechanism from the motor 16b to the wheel 11b. The encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a. The microcomputer 14a can control the movement of the AGV10 using not only the signal received from the position estimation device 14e but also the signal received from the encoder unit 18.
The driving device 17 has motor driving circuits 17a and 17b, and the motor driving circuits 17a and 17b are used to adjust voltages applied to the two motors 16a and 16b, respectively. The motor drive circuits 17a and 17b each include a so-called inverter circuit. The motor drive circuits 17a and 17b turn on or off the current flowing in each motor according to a PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motors.
Fig. 6B shows an example of the 2 nd hardware configuration of the AGV 10. The 2 nd hardware configuration example is different from the 1 st hardware configuration example (fig. 6A) in that a laser positioning system 14h is provided and a microcomputer 14a is connected to each component in a one-to-one correspondence.
The laser positioning system 14h includes the position estimation device 14e and the laser range finder 15, which are connected by, for example, an Ethernet (registered trademark) cable. The operations of the position estimation device 14e and the laser range finder 15 are as described above. The laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV10 to the microcomputer 14a.
The microcomputer 14a has various general-purpose I/O interfaces or general-purpose input/output ports (not shown). The microcomputer 14a is directly connected to other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h, via the general-purpose input/output port.
Fig. 6B is the same as fig. 6A except for the above-described structure. Therefore, the description of the same structure is omitted.
The AGV10 according to the embodiment of the present disclosure may have safety sensors such as bumper switches, not shown. The AGV10 may also have an inertial measurement unit such as a gyro sensor. Using the measurement data of internal sensors such as the rotary encoders 18a and 18b and the inertial measurement unit, the travel distance and the amount of change in orientation (angle) of the AGV10 can be estimated. These estimated distances and angles are called odometry data and can serve to supplement the position and orientation information acquired by the position estimation device 14e.
(4) Map data
Figs. 7A to 7F schematically show the AGV10 traveling while acquiring sensor data. The user 1 may move the AGV10 manually while operating the terminal device 20. Alternatively, a unit including the travel control device 14 shown in figs. 6A and 6B, or the AGV10 itself, may be placed on a cart, and the user 1 may push or pull the cart to acquire the sensor data.
Fig. 7A shows the AGV10 scanning the surrounding space using the laser range finder 15. The laser beam is emitted at every predetermined step angle to perform the scan. The illustrated scanning range is a schematic example and differs from the actual total scanning range of 270 degrees described above.
In figs. 7A to 7F, the positions of the reflection points of the laser beam are schematically shown by a plurality of black dots 4 indicated by the symbol "·". The laser beam scans in short cycles while the position and orientation of the laser range finder 15 change, so the number of actual reflection points is much larger than the number of reflection points 4 shown in the figures. The position estimation device 14e accumulates the positions of the black dots 4 acquired during travel, for example, in the memory 14b. The map data is gradually completed as the AGV10 continues to scan while traveling. In figs. 7B to 7E, only the scanning range is shown for simplicity; it is illustrative and differs from the above example of 270 degrees in total.
The map may be created by the microcomputer 14a in the AGV10 or by an external computer, based on the sensor data, after the amount of sensor data necessary for map creation has been acquired. Alternatively, the map may be created in real time based on sensor data acquired while the AGV10 moves.
Fig. 7F schematically shows a part of the completed map 40. In the map shown in fig. 7F, free space is delimited by a point cloud corresponding to the set of reflection points of the laser beam. Another example of a map is an occupancy grid map that distinguishes, in units of grid cells, space occupied by objects from free space. The position estimation device 14e stores the map data (map data M) in the memory 14b or the storage device 14c. The numbers and densities of the black dots shown in the figures are examples.
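As an illustration of the occupancy grid representation mentioned above, the following toy sketch (the grid size and resolution are assumptions) bins laser reflection points into cells that are marked occupied:

```python
# Toy occupancy grid: cells containing a reflection point become occupied.
import numpy as np

CELL_M = 0.05                     # grid resolution: 5 cm per cell
SIZE = 200                        # 200 x 200 cells = a 10 m x 10 m map

def build_grid(points_xy: np.ndarray) -> np.ndarray:
    """Mark cells containing at least one reflection point as occupied (1)."""
    grid = np.zeros((SIZE, SIZE), dtype=np.uint8)
    idx = np.floor(points_xy / CELL_M).astype(int) + SIZE // 2  # origin at center
    ok = (idx >= 0).all(axis=1) & (idx < SIZE).all(axis=1)
    grid[idx[ok, 1], idx[ok, 0]] = 1
    return grid

wall = np.column_stack([np.linspace(-2, 2, 100), np.full(100, 3.0)])  # a wall 3 m ahead
print(build_grid(wall).sum(), "occupied cells")
```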
The map data thus obtained may be shared by multiple AGVs 10.
A typical example of an algorithm used by the AGV10 to estimate its own position from map data is ICP (Iterative Closest Point) matching. As described above, the self-position (x, y, θ) on the map data M can be estimated by matching the local map data (sensor data) created from the scan results of the laser range finder 15 against the wider-range map data M.
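The following is a compact, generic 2-D ICP iteration, offered as a sketch of the matching idea and not as the patent's algorithm: it alternates nearest-neighbor pairing with a closed-form rigid-transform fit.

```python
# Generic 2-D ICP sketch: align scan points P onto map points Q.
import numpy as np

def icp_2d(P, Q, iters=20):
    """Return rotation R (2x2) and translation t (2,) with Q ~ P @ R.T + t."""
    R, t = np.eye(2), np.zeros(2)
    for _ in range(iters):
        moved = P @ R.T + t
        # Pair each scan point with its nearest map point (brute force).
        d2 = ((moved[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
        matched = Q[d2.argmin(axis=1)]
        # Closed-form rigid fit (Kabsch/SVD) between the paired point sets.
        mp, mq = moved.mean(0), matched.mean(0)
        H = (moved - mp).T @ (matched - mq)
        U, _, Vt = np.linalg.svd(H)
        dR = Vt.T @ U.T
        if np.linalg.det(dR) < 0:   # guard against a reflection solution
            Vt[-1] *= -1
            dR = Vt.T @ U.T
        R, t = dR @ R, dR @ t + (mq - dR @ mp)
    return R, t

# Synthetic check: a scan that is a rotated/shifted copy of the map points
# is re-aligned (exact recovery is typical for small offsets like this).
rng = np.random.default_rng(0)
Q = rng.uniform(-5, 5, (200, 2))                  # "map" points
a = np.radians(10)
R_true = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
P = (Q - np.array([0.3, 0.1])) @ R_true           # scan = inverse-transformed map
R_est, t_est = icp_2d(P, Q)
print(np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0])), t_est)  # ~10 deg, ~[0.3 0.1]
```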
When the area in which the AGV10 travels is large, the data volume of the map data M increases. This may increase the time required to create the map or make self-position estimation take much longer. When such problems arise, the map data M may be created and recorded as a plurality of partial map data.
Fig. 8 shows an example in which the entire area of one floor of a factory is covered by a combination of four partial map data M1, M2, M3, and M4. In this example, each partial map covers an area of 50 m x 50 m. A rectangular overlap region 5 m wide is provided at the boundary of each pair of maps adjacent in the X and Y directions; this overlap region is called the "map switching area". When the AGV10, traveling while referring to one partial map, reaches the map switching area, it switches to traveling with reference to the adjacent partial map. The number of partial maps is not limited to four and may be set as appropriate according to the floor area over which the AGV10 travels and the performance of the computer that performs map creation and self-position estimation. The size of the partial map data and the width of the overlap region are likewise not limited to the above examples and may be set arbitrarily.
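The map switching behavior described above can be sketched as follows; the hysteresis rule and the grid layout of partial maps are illustrative assumptions:

```python
# Sketch: choose the partial map from the pose, with hysteresis in the overlap.
PARTIAL = 50.0   # side length of one partial map [m]
OVERLAP = 5.0    # width of the map switching area [m]

def covers(i: int, j: int, x: float, y: float) -> bool:
    """True if partial map (i, j) covers (x, y), including its overlap band."""
    x0 = i * PARTIAL - OVERLAP / 2
    y0 = j * PARTIAL - OVERLAP / 2
    return (x0 <= x <= x0 + PARTIAL + OVERLAP) and (y0 <= y <= y0 + PARTIAL + OVERLAP)

def select_map(current: tuple[int, int], x: float, y: float) -> tuple[int, int]:
    """Keep the current partial map while it still covers the pose (hysteresis);
    otherwise switch to the map whose 50 m x 50 m cell contains the pose."""
    if covers(*current, x, y):
        return current
    return (int(x // PARTIAL), int(y // PARTIAL))

m = (0, 0)
for x in (48.0, 51.0, 53.5):          # driving across the boundary near x = 50 m
    m = select_map(m, x, 10.0)
    print(f"x = {x} m -> partial map {m}")
```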
(5) Configuration example of operation management device
Fig. 9 shows an example of the hardware configuration of the operation management device 50. The operation management device 50 has a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
The CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57 and can exchange data with one another.
The CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50. Typically, the CPU 51 is a semiconductor integrated circuit.
The memory 52 is a volatile storage device that stores a computer program executed by the CPU 51. The memory 52 may be used as a work memory for the CPU 51 to perform operations.
The position DB 53 stores position data indicating positions that can serve as destinations of each AGV10. The position data may be expressed, for example, by coordinates virtually set within the factory by an administrator, and is determined by the administrator.
The communication circuit 54 performs wired communication conforming to, for example, the Ethernet (registered trademark) standard. The communication circuit 54 is connected to the access points 2 (fig. 2) by wire and can communicate with the AGV10 via the access points 2. The communication circuit 54 receives from the CPU 51, via the bus 57, data to be transmitted to the AGV10, and sends data (notifications) received from the AGV10 to the CPU 51 and/or the memory 52 via the bus 57.
The map DB 55 stores data of maps inside a factory or the like where the AGV10 travels. This map may be the same as map 40 (fig. 7F) or may be different. The data format is not limited as long as it is a map having a one-to-one correspondence relationship with the position of each AGV 10. For example, the map stored in the map DB 55 may be a map created by CAD.
The position DB 53 and the map DB 55 may be constructed on a nonvolatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disk.
The image processing circuit 56 is a circuit that generates video data to be displayed on the monitor 58. The image processing circuit 56 operates only when an administrator operates the operation management device 50, and further details are omitted in the present embodiment. The monitor 58 may be integrated with the operation management device 50. The CPU 51 may also perform the processing of the image processing circuit 56.
(6) Actions of the operation management device
The operation of the operation management device 50 will be described in brief with reference to fig. 10. Fig. 10 is a diagram schematically showing an example of the travel path of the AGV10 determined by the operation management device 50.
The operations of the AGV10 and the operation management device 50 are summarized as follows. An example will be described in which the AGV10, currently located at position M1, travels through several positions to position Mn+1, its final destination (n: a positive integer of 1 or more). The position DB 53 records coordinate data of each position, for example position M2 to be passed after position M1, and position M3 to be passed after position M2.
The CPU 51 of the operation management device 50 refers to the position DB 53, reads the coordinates of position M2, and generates a travel command directed toward position M2. The communication circuit 54 transmits the travel command to the AGV10 via the access point 2.
The CPU 51 periodically receives data indicating the current position and orientation from the AGV10 via the access point 2. The operation management device 50 can thereby track the position of each AGV10. When the CPU 51 determines that the current position of the AGV10 matches position M2, it reads the coordinates of position M3, generates a travel command directed toward position M3, and transmits it to the AGV10. That is, when the operation management device 50 determines that the AGV10 has reached a certain position, it transmits a travel command directed toward the next position to be passed. The AGV10 can thereby reach the final destination position Mn+1. The passing positions and destination positions of the AGV10 described above are sometimes referred to as "markers".
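The dispatch behavior just described can be sketched as follows; the coordinates, tolerance, and function names are illustrative assumptions:

```python
# Sketch: send a travel command toward the next marker once the current one
# is reached, based on the periodically reported AGV position.
import math

positions = [(0.0, 0.0), (5.0, 0.0), (5.0, 8.0), (12.0, 8.0)]  # M1 .. Mn+1
TOLERANCE_M = 0.1   # how close the reported position must be to the target

def on_position_report(target_idx: int, reported: tuple[float, float]) -> int:
    """Handle one periodic position report; return the current target index."""
    tx, ty = positions[target_idx]
    if math.hypot(reported[0] - tx, reported[1] - ty) < TOLERANCE_M:
        if target_idx + 1 < len(positions):
            target_idx += 1
            print(f"travel command: go to {positions[target_idx]}")
        else:
            print("final destination reached")
    return target_idx

idx = 1                                               # first target after M1 is M2
for report in [(2.5, 0.0), (5.0, 0.02), (5.0, 8.0)]:  # simulated AGV reports
    idx = on_position_report(idx, report)
```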
(7) Example of AGV operation
Next, a more specific example of the operation of the AGV10 will be described.
The AGV10 of the present embodiment can operate in a "tracking mode" in which it tracks a moving object, which is distinct from the mode in which it travels in accordance with instructions from the operation management device 50.
Fig. 11A and 11B are diagrams schematically showing an example of the AGV10 operating in the tracking mode. An example of the path of the AGV10 is schematically shown by the dashed lines in the figure. In the example of fig. 11A, the AGV10 moves following the movement of the user 1. In the example of fig. 11B, the AGV10 moves following the movement of another AGV10A for guidance.
The microcomputer 14a of the AGV10 detects a moving object from the sensor data acquired by the laser range finder 15. For example, the microcomputer 14a detects a target object such as the legs of the user 1 ahead or the rear bumper of another AGV 10A. The microcomputer 14a controls the movement of the AGV10 so that the distance between the target object and the AGV10 remains substantially constant (for example, several tens of centimeters). The AGV10 can thereby track the moving object ahead of it. The moving object may also be recognized by an external sensor other than the laser range finder 15, such as an image sensor or an ultrasonic sensor. When an image sensor is used, the microcomputer 14a may recognize the moving object through image processing.
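The distance-keeping control described above can be approximated by a simple proportional rule. The following sketch is an illustration only; the gain and set distance are assumptions, not values from the patent:

```python
# Toy proportional controller for keeping the gap to the tracked object constant.
SET_DISTANCE_M = 0.5   # "several tens of cm" between the AGV and the object
GAIN = 0.8             # proportional gain [1/s]
MAX_SPEED = 1.2        # speed limit [m/s]

def tracking_speed(measured_distance_m: float) -> float:
    """Forward speed command that keeps the gap near SET_DISTANCE_M."""
    error = measured_distance_m - SET_DISTANCE_M
    return max(-MAX_SPEED, min(MAX_SPEED, GAIN * error))

for d in (0.5, 1.0, 0.3):
    print(f"gap {d} m -> speed {tracking_speed(d):+.2f} m/s")
```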
The AGV10 starts moving from the 1st location P1, its initial location, and moves to the 2nd location P2, its destination, while tracking the moving object. In this example, a load is placed on the AGV10 at the 1st location P1, and the load is unloaded from the AGV10 at the 2nd location P2.
While the AGV10 moves from the 1st location P1 to the 2nd location P2, the microcomputer 14a stores the position information (x, y, θ) sequentially output from the position estimation device 14e in the storage device 14c. The microcomputer 14a may store this position information in the storage device 14c as is, or may thin it out as necessary. For example, the microcomputer 14a may store the position information periodically output from the position estimation device 14e at a rate of once every N outputs (N is an arbitrary integer of 2 or more). Alternatively, the microcomputer 14a may store the position information in the storage device 14c each time the AGV10 travels a fixed distance. The distance the AGV10 has traveled can be calculated from the position information output from the position estimation device 14e or from the odometry information output from the encoder unit 18. Thinning out the position information output from the position estimation device 14e reduces the data volume and increases the processing speed.
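The two thinning strategies just described can be sketched as follows; the value of N and the distance threshold are illustrative assumptions:

```python
# Sketch: keep every Nth pose, or keep a pose per fixed distance traveled.
import math

def thin_every_nth(poses, n=5):
    """Keep one pose out of every n sequential outputs."""
    return poses[::n]

def thin_by_distance(poses, min_dist_m=0.2):
    """Keep a pose only after the AGV has moved min_dist_m since the last kept one."""
    kept = [poses[0]]
    for x, y, theta in poses[1:]:
        if math.hypot(x - kept[-1][0], y - kept[-1][1]) >= min_dist_m:
            kept.append((x, y, theta))
    return kept

poses = [(0.01 * i, 0.0, 0.0) for i in range(1000)]   # 1 cm per output, 10 m total
print(len(thin_every_nth(poses)), len(thin_by_distance(poses)))  # prints 200 50
```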
After the AGV10 arrives at the 2nd location P2 and the unloading is completed, the microcomputer 14a returns the AGV10 from the 2nd location P2 to the 1st location P1. The completion of unloading can be notified to the microcomputer 14a, for example, by the user 1 pressing a switch or button (not shown) provided on the AGV10 or operating the terminal device 20. The notification is input to the microcomputer 14a as a return instruction. Triggered by this return instruction, the microcomputer 14a starts the operation of returning the AGV10 to the 1st location P1. The return instruction may also be transmitted to the microcomputer 14a from another device such as the operation management device 50. The operation management device 50 can detect that unloading has been completed at the 2nd location P2, for example, through the user 1 operating the terminal device 20. Alternatively, the operation management device 50 may detect the completion of unloading from a sensor that reads information from an IC tag, such as an RFID tag, attached to each article placed on the AGV10. Such a sensor may be located, for example, on a rack or the like near the AGV10 or the 2nd location P2.
The microcomputer 14a may also return the AGV10 to the 1st location P1 when a preset condition is satisfied, without an explicit return command being given. The following conditions, for example, are conceivable:
when a sensor that reads the information of the tags attached to the articles detects that all of the articles have been unloaded;
when a predetermined time (for example, about 5 minutes to 1 hour) has elapsed after the AGV10 reaches the 2nd location P2; and
when the AGV10 is manually moved into a specific area near the 2nd location P2.
In this way, the controller can start the operation of returning the mobile body 10 to the 1st location when the position or state of the AGV10 satisfies a preset condition (hereinafter sometimes referred to as the "return condition").
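A minimal sketch of such a return-condition check might look as follows; the AGV interface (tag_reader_count, pose), the area object, and the dwell limit are hypothetical, introduced only for illustration.

```python
import time

def return_condition_met(agv, arrival_time, trigger_area, dwell_limit_s=600):
    """Return True when any of the listed return conditions holds."""
    all_unloaded = agv.tag_reader_count() == 0            # all tags gone
    waited = time.time() - arrival_time > dwell_limit_s   # dwell timeout
    x, y, _ = agv.pose()
    moved_by_hand = trigger_area.contains(x, y)           # pushed into area
    return all_unloaded or waited or moved_by_hand
```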
On the return trip, the microcomputer 14a returns the AGV10 to the 1st location P1 by retracing the path from the 1st location P1 to the 2nd location P2 based on the position information stored in the storage device 14c. Here, the travel paths of the AGV10 on the outbound and return trips need not coincide exactly. For example, if an obstacle that was not present on the outbound trip appears partway along the route while the mobile body 10 returns to the location P1, the microcomputer 14a may control the operation of the AGV10 so as to avoid the obstacle. The obstacle can be detected by at least one obstacle sensor 19 (Fig. 5) provided on the AGV10.
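A minimal sketch of the retracing itself: replay the recorded outbound poses in reverse order. Here `agv.goto` is a hypothetical motion primitive that drives to a target pose and detours around obstacles reported by the obstacle sensor 19.

```python
def drive_return_trip(agv, waypoints, tolerance=0.1):
    """Retrace the outbound route by visiting its poses in reverse."""
    for x, y, theta in reversed(waypoints):
        agv.goto(x, y, theta, tolerance=tolerance)  # hypothetical primitive
```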
Fig. 12 is a flowchart showing the operation of the AGV10 according to the present embodiment. When the AGV10 starts moving from the 1st location P1, the microcomputer 14a stores the position information in the storage device 14c, for example, at fixed time intervals or fixed distance intervals (step S101). The stored set of position information is recorded in the storage device 14c as the route information of the outbound trip. When the AGV10 reaches the 2nd location (yes in step S102), the microcomputer 14a determines whether a return command has been issued or whether the return condition is satisfied (step S103). If the determination is yes, the microcomputer 14a returns the mobile body 10 to the 1st location P1, its initial position, by retracing the route based on the position information recorded in the storage device 14c (step S104).
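Putting the steps of Fig. 12 together, the flow could be arranged as below. The AGV interface (reached, pose, step, return_command_received) and the helpers from the earlier sketches (WaypointRecorder, return_condition_met, drive_return_trip) are assumptions for illustration, not the disclosed implementation.

```python
import time

def run_trip(agv, recorder, destination, trigger_area):
    # S101: record (decimated) poses while moving on the outbound trip.
    while not agv.reached(destination):          # S102: arrival check
        recorder.on_pose(*agv.pose())
        agv.step()
    # S103: wait for a return command or a satisfied return condition.
    arrival_time = time.time()
    while not (agv.return_command_received()
               or return_condition_met(agv, arrival_time, trigger_area)):
        time.sleep(0.1)
    # S104: retrace the recorded outbound route back to the 1st location.
    drive_return_trip(agv, recorder.waypoints)
```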
Next, the operation of the AGV10 will be described in more detail with reference to Figs. 13A to 13F. Here, an example in which three AGVs 10A, 10B, and 10C move in a queue will be described. The leading AGV10A moves from its initial location to its destination in accordance with a path instruction from the operation management device 50. The 2nd AGV 10B follows the leading AGV10A. The 3rd AGV 10C follows the 2nd AGV 10B.
The leading AGV10A holds an environment map in the storage device 14c. The position estimation device 14e of the AGV10A estimates the AGV's own position while moving by matching the point cloud data output from the laser range finder 15 against the environment map. When an obstacle that is not on the environment map appears on the route, the AGV10A can perform an operation to avoid it. The AGV10A can also update the environment map while moving.
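As a rough picture of such matching, the following sketch scores candidate poses by counting how many scan points fall on occupied cells of the map and searches around an odometry guess. The grid representation, search ranges, and step sizes are illustrative assumptions; a real implementation would typically use a dedicated scan-matching method such as ICP.

```python
import numpy as np

def match_score(grid, resolution, pose, scan_xy):
    """Count scan points landing on occupied cells for a candidate pose."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    # Transform scan points (robot frame, shape (N, 2)) into the map frame.
    world = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
    cells = np.floor(world / resolution).astype(int)
    h, w = grid.shape
    ok = (cells[:, 0] >= 0) & (cells[:, 0] < w) \
       & (cells[:, 1] >= 0) & (cells[:, 1] < h)
    return grid[cells[ok, 1], cells[ok, 0]].sum()

def estimate_pose(grid, resolution, scan_xy, guess, span=0.2, step=0.05):
    """Brute-force search around an odometry guess for the best pose."""
    gx, gy, gt = guess
    candidates = [(gx + dx, gy + dy, gt + dt)
                  for dx in np.arange(-span, span, step)
                  for dy in np.arange(-span, span, step)
                  for dt in np.arange(-0.1, 0.1, 0.02)]
    return max(candidates,
               key=lambda p: match_score(grid, resolution, p, scan_xy))
```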
The AGVs 10B and 10C following the leading AGV10A need not hold an environment map at departure. They can create an environment map while moving from their initial locations to their destinations. In this case, the other AGVs traveling ahead are not reflected in the environment map. The environment map can be created by the position estimation device 14e from the position information of the point cloud periodically output from the laser range finder 15.
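A minimal sketch of accumulating scans into such a map, using an occupancy grid whose size and resolution are chosen only for illustration:

```python
import numpy as np

class GridMapper:
    """Sketch of building an occupancy-grid map from successive scans."""

    def __init__(self, size_m=50.0, resolution=0.05):
        self.resolution = resolution
        n = int(size_m / resolution)
        self.grid = np.zeros((n, n), dtype=np.uint8)

    def add_scan(self, pose, scan_xy):
        """Mark the cells hit by a scan taken at `pose` as occupied."""
        x, y, theta = pose
        c, s = np.cos(theta), np.sin(theta)
        world = scan_xy @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        cells = np.floor(world / self.resolution).astype(int)
        h, w = self.grid.shape
        ok = (cells[:, 0] >= 0) & (cells[:, 0] < w) \
           & (cells[:, 1] >= 0) & (cells[:, 1] < h)
        self.grid[cells[ok, 1], cells[ok, 0]] = 1
```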
Figs. 13A to 13C show an example of the outbound operation, from the start of movement to arrival at the destination. Figs. 13D to 13F show an example of the return operation, from the destination back to the initial location. The initial location (1st location) and the destination (2nd location) differ for each AGV. For example, as shown in Fig. 13A, the initial location of the leading AGV10A is ahead of the initial locations of the AGVs 10B and 10C. The same applies to the destinations.
As shown in Fig. 13B, during the outbound trip the three AGVs 10A, 10B, and 10C move toward their destinations while maintaining a substantially constant inter-vehicle distance. Each of the AGVs 10A, 10B, and 10C records its own position information as it moves. The position information may be recorded, for example, at fixed time intervals (for example, several milliseconds to several seconds) or at fixed distance intervals (for example, several tens of millimeters to several meters). The recording frequency may be lower than the frequency at which the position estimation device 14e outputs the position information.
The AGVs 10B and 10C can create or update an environment map while tracking the leading AGV10A. In that case, the microcomputer 14a creates or updates the environment map based on the position information sequentially output from the position estimation device 14e while causing the AGV 10B or 10C to track the leading AGV10A. The AGVs 10B and 10C may instead obtain the environment map from the leading AGV10A or the operation management device 50 by communication.
If the AGVs 10B and 10C following the leading AGV10A lose track of their own positions, they can acquire position information from the leading AGV10A or the operation management device 50. The operation management device 50 communicates with the leading AGV10A and tracks its position and attitude.
In this example, only the leading AGV10A moves in accordance with instructions from the operation management device 50, but the other AGVs 10B and 10C may also operate in accordance with such instructions. In that case, the AGVs 10B and 10C need not have the tracking function.
As shown in Fig. 13C, when all of the AGVs 10A, 10B, and 10C reach their destinations, the unloading operation is performed. When the operation is completed, the user gives a return command to each AGV by pressing a button provided on that AGV or by operating the terminal device 20. The AGVs then return to their original locations in the order in which the return commands were given. In the example of Figs. 13D to 13F, return commands are given in the order AGV 10C, 10B, 10A. The microcomputer 14a of each AGV estimates its own position while moving, and controls the movement while checking the correspondence between its own position and orientation (x, y, θ) and the position information (x, y, θ) recorded in the storage device 14c. The frequency of this correspondence check on the return trip may be lower than the frequency of position recording on the outbound trip. Finally, as shown in Fig. 13F, all of the AGVs 10A, 10B, and 10C return to their respective initial positions.
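The correspondence check on the return trip can be pictured as advancing through the recorded poses one by one; in the following sketch the waypoint tolerance is an illustrative assumption.

```python
import math

def next_waypoint(current_pose, remaining, reach_tol=0.15):
    """Compare the estimated pose with the recorded poses and return the
    recorded pose to steer toward, advancing once it is reached."""
    if not remaining:
        return None
    x, y, _ = current_pose
    wx, wy, _ = remaining[0]
    if math.hypot(wx - x, wy - y) < reach_tol:
        remaining.pop(0)  # waypoint reached; advance to the next one
    return remaining[0] if remaining else None
```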
The position of each AGV shown in Fig. 13F need not exactly coincide with the initial position shown in Fig. 13A and may be slightly offset. Furthermore, each AGV may stop partway along the travel path instead of returning all the way to its initial position. In that case, a point partway along the movement path is interpreted as the "1st location". Neither the 1st location nor the 2nd location means a strict point; each may be an area with a certain extent.
In the example shown in Figs. 13D to 13F, each AGV retraces the path toward its initial position while keeping the same orientation as on the outbound trip. That is, each AGV runs in reverse. With this manner of moving, the sensor data from the laser range finder 15 is similar on the outbound and return trips. This is advantageous, particularly when the environment map is created or updated during the outbound trip, because it makes position estimation on the return trip easier.
Fig. 14 is a diagram showing another example of the operation of an AGV on the return trip. In the example of Fig. 14, each AGV returns to its initial position facing the direction opposite to its orientation on the outbound trip. In this case, when the microcomputer 14a of each AGV starts the return operation, it first turns the AGV around. With this example, since the traveling direction of each AGV always coincides with the front direction of the laser range finder 15, the microcomputer 14a does not need to change the way it controls each motor between the outbound and return trips.
Fig. 15A is a diagram showing an example of the position information recorded on the outbound trip. Fig. 15B is a diagram showing an example of the position information on the return trip when the AGV runs in reverse while keeping the same orientation as on the outbound trip. Fig. 15C is a diagram showing another example, in which the AGV returns to its initial position facing the direction opposite to the outbound orientation. In the example of Fig. 15C, the angle θ on the return trip differs from the angle θ on the outbound trip by 180 degrees. On the return trip, the microcomputer 14a sets position information as shown in Fig. 15B or Fig. 15C and returns the AGV to its initial position along the same path as the outbound trip.
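The relationship between Figs. 15A to 15C can be sketched as a transformation of the recorded outbound poses; the function and flag names below are assumptions for illustration.

```python
import math

def return_poses(outbound, turn_around=False):
    """Derive return-trip target poses from the recorded outbound poses.

    turn_around=False corresponds to Fig. 15B (reverse running, same
    orientation as the outbound trip); turn_around=True corresponds to
    Fig. 15C (the AGV turns and faces the opposite direction, so theta
    is shifted by 180 degrees).
    """
    poses = list(reversed(outbound))
    if turn_around:
        poses = [(x, y, (theta + math.pi) % (2 * math.pi))
                 for x, y, theta in poses]
    return poses
```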
Industrial applicability
The mobile body and the mobile body system of the present disclosure can be suitably used for moving and carrying loads such as goods, parts, and finished products in factories, warehouses, construction sites, logistics facilities, hospitals, and the like.
Description of the reference symbols
1: a user; 2a, 2b: access points; 10: AGV (mobile body); 14: a travel control device; 14a: a microcomputer (arithmetic circuit); 14b: a memory; 14c: a storage device; 14d: a communication circuit; 14e: a position estimation device; 15: a laser range finder; 16a, 16b: motors; 17: a drive device; 17a, 17b: motor drive circuits; 18: an encoder unit; 18a, 18b: rotary encoders; 19: an obstacle sensor; 20: a terminal device (a mobile computer such as a tablet computer); 50: an operation management device; 51: a CPU; 52: a memory; 53: a location database (location DB); 54: a communication circuit; 55: a map database (map DB); 56: an image processing circuit; 100: a mobile body management system; 101: a mobile body; 103: an external sensor; 105: a position estimation device; 107: a storage device; 109: a controller; 111: a drive device.

Claims (11)

1. A mobile body capable of autonomous movement, wherein,
the mobile body includes:
a drive device that moves the mobile body;
an external sensor;
a position estimation device that sequentially outputs position information indicating a position and a posture of the mobile body based on sensor data output from the external sensor;
a storage device that stores the position information output from the position estimation device; and
a controller that controls the drive device to move the mobile body, and
after the mobile body moves from a 1st location to a 2nd location, the controller returns the mobile body to the 1st location by retracing the path from the 1st location to the 2nd location based on the position information stored in the storage device.
2. The mobile body according to claim 1, wherein
the storage device stores an environment map, and
the position estimation device determines an estimated value of the position and the posture of the mobile body on the environment map by matching the sensor data output from the external sensor against the environment map, and outputs the estimated value as the position information.
3. The mobile body according to claim 1 or 2, wherein
the controller is capable of causing the mobile body to operate in a tracking mode in which the controller controls the drive device based on the sensor data output from the external sensor so that the mobile body moves from the 1st location to the 2nd location while tracking a moving object.
4. The mobile body according to claim 3, wherein
the controller creates or updates an environment map based on the position information sequentially output from the position estimation device while causing the mobile body to track the moving object in the tracking mode.
5. The mobile body according to any one of claims 1 to 4, wherein
the controller starts the operation of returning the mobile body to the 1st location in response to a return command from a user or another device after the mobile body moves from the 1st location to the 2nd location.
6. The mobile body according to any one of claims 1 to 5, wherein
the controller starts the operation of returning the mobile body to the 1st location when the position or state of the mobile body satisfies a set condition.
7. The mobile body according to claim 6, wherein
the mobile body is an automated guided vehicle, and
the controller starts the operation of returning the mobile body to the 1st location upon detecting that unloading or loading has been completed after the mobile body reaches the 2nd location.
8. The mobile body according to any one of claims 1 to 7, wherein
the controller decimates the position information sequentially output from the position estimation device and stores the decimated position information in the storage device.
9. The mobile body according to any one of claims 1 to 8, wherein
the mobile body further includes at least one obstacle sensor that detects an obstacle, and
the controller causes the mobile body to avoid the obstacle when the obstacle sensor detects the obstacle partway along the path on which the mobile body returns to the 1st location.
10. The mobile body according to any one of claims 1 to 9, wherein
the controller moves the mobile body from the 1st location to the 2nd location in accordance with an instruction from an operation management device connected to the mobile body via a network.
11. A mobile body system comprising:
at least one mobile body; and
an operation management device that manages operation of the at least one mobile body, wherein
the at least one mobile body is the mobile body according to any one of claims 1 to 10.
CN201880057308.6A 2017-09-13 2018-08-31 Moving body and moving body system Pending CN111052026A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2017175645 2017-09-13
JP2017-175645 2017-09-13
PCT/JP2018/032447 WO2019054208A1 (en) 2017-09-13 2018-08-31 Mobile body and mobile body system

Publications (1)

Publication Number Publication Date
CN111052026A 2020-04-21

Family

ID=65723688

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880057308.6A Pending CN111052026A (en) 2017-09-13 2018-08-31 Moving body and moving body system

Country Status (3)

Country Link
JP (1) JP7081881B2 (en)
CN (1) CN111052026A (en)
WO (1) WO2019054208A1 (en)


Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230333568A1 (en) * 2019-05-17 2023-10-19 Murata Machinery, Ltd. Transport vehicle system, transport vehicle, and control method
JP7003097B2 (en) * 2019-10-30 2022-01-20 株式会社東芝 Automated guided vehicle
DE102020105334A1 (en) * 2020-02-28 2021-09-02 Viatcheslav Tretyakov Method for controlling a driverless transport vehicle and control system adapted to carry out the method
CN113703460B (en) * 2021-08-31 2024-02-09 上海木蚁机器人科技有限公司 Method, device and system for identifying vacant position of navigation vehicle
KR102491670B1 (en) * 2022-09-01 2023-01-27 엘케이시스(주) Apparatus for Generating Map for Autonomous Driving

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002321180A (en) * 2001-04-24 2002-11-05 Matsushita Electric Ind Co Ltd Robot control system
JP2004341864A (en) * 2003-05-16 2004-12-02 Yaskawa Electric Corp Moving carriage
JP2006285635A (en) * 2005-03-31 2006-10-19 Toshiba Corp Display guidance apparatus, robot system, and method of display guidance in robot system
JP2007004434A (en) * 2005-06-23 2007-01-11 Yoichiro Sawa Mobile robot system
JP2008084135A (en) * 2006-09-28 2008-04-10 Toshiba Corp Movement control method, mobile robot and movement control program
JP2013218541A (en) * 2012-04-10 2013-10-24 Panasonic Corp Control method for mobile robot
CN104149781A (en) * 2014-07-07 2014-11-19 广东技术师范学院天河学院 Intelligent carrier control system and control method
CN105159322A (en) * 2015-08-24 2015-12-16 铜陵学院 Servo control system based on automatic two-wheel middle-low-speed type fire-fighting robot
CN105197076A (en) * 2015-09-28 2015-12-30 三峡大学 Automatic moving vehicle
CN105759821A (en) * 2016-04-12 2016-07-13 深圳市正阳精密装备有限公司 Wheel type transport robot control system and method for automatic production workshop
CN106595648A (en) * 2016-11-04 2017-04-26 华为机器有限公司 Navigation method and terminal

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111538299A (en) * 2020-04-23 2020-08-14 新石器慧通(北京)科技有限公司 Unmanned vehicle for carrying port containers and control method
CN111538299B (en) * 2020-04-23 2021-08-17 新石器慧通(北京)科技有限公司 Unmanned vehicle for carrying port containers and control method
CN112904857A (en) * 2021-01-20 2021-06-04 广东顺德工业设计研究院(广东顺德创新设计研究院) Automatic guided vehicle control method and device and automatic guided vehicle

Also Published As

Publication number Publication date
JP7081881B2 (en) 2022-06-07
WO2019054208A1 (en) 2019-03-21
JPWO2019054208A1 (en) 2020-10-15

Similar Documents

Publication Publication Date Title
JP7168211B2 (en) Mobile object that avoids obstacles and its computer program
JP2019168942A (en) Moving body, management device, and moving body system
JP7081881B2 (en) Mobiles and mobile systems
US20200264616A1 (en) Location estimation system and mobile body comprising location estimation system
JP7136426B2 (en) Management device and mobile system
CN110998472A (en) Mobile object and computer program
US20200363212A1 (en) Mobile body, location estimation device, and computer program
JP2020057307A (en) System and method for processing map data for use in self-position estimation, and moving entity and control system for the same
US11537140B2 (en) Mobile body, location estimation device, and computer program
JP2019053391A (en) Mobile body
JP7164085B2 (en) Work transport method using moving body, computer program, and moving body
JPWO2019054209A1 (en) Map making system and map making device
WO2019194079A1 (en) Position estimation system, moving body comprising said position estimation system, and computer program
JP7243014B2 (en) moving body
JP2019175137A (en) Mobile body and mobile body system
JP2019079171A (en) Movable body
JP2019179497A (en) Moving body and moving body system
JP2019067001A (en) Moving body
CN112578789A (en) Moving body
JP2020166702A (en) Mobile body system, map creation system, route creation program and map creation program
JP2019148871A (en) Movable body and movable body system
JPWO2019069921A1 (en) Mobile
WO2019059299A1 (en) Operation management device
JP2019175138A (en) Mobile body and management device
JP2020166701A (en) Mobile object and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20200421)