WO2018110568A1 - Mobile body performing an obstacle avoidance operation, and computer program therefor - Google Patents

Mobile body performing an obstacle avoidance operation, and computer program therefor

Info

Publication number
WO2018110568A1
WO2018110568A1 (PCT/JP2017/044621)
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
route
obstacle
path
detour
Prior art date
Application number
PCT/JP2017/044621
Other languages
English (en)
Japanese (ja)
Inventor
赤松 政弘
俊太 佐藤
健 阪井
大野 良治
Original Assignee
日本電産シンポ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 日本電産シンポ株式会社 (Nidec-Shimpo Corporation)
Priority to JP2018556702A (granted as JP7168211B2)
Publication of WO2018110568A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0238Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D1/024Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/60Other road transportation technologies with climate change mitigation effect
    • Y02T10/72Electric energy management in electromobility

Definitions

  • This disclosure relates to a moving object that performs an obstacle avoidance operation.
  • JP 2009-223634 A, JP 2009-205652 A, and JP 2005-242489 A disclose systems that control the movement of each moving body so that a plurality of autonomous moving bodies do not collide with each other.
  • Certain non-limiting exemplary embodiments of the present disclosure provide a technique that allows a moving object to smoothly avoid an obstacle when an obstacle exists on the movement path.
  • According to an exemplary embodiment of the present disclosure, a mobile body capable of autonomous movement comprises: a plurality of drive wheels; a plurality of motors respectively connected to the plurality of drive wheels; an obstacle sensor that detects an obstacle in the traveling direction; an external sensor that repeatedly scans the environment and outputs sensor data for each scan; a position estimation device that generates and outputs position information indicating estimated values of the position and orientation of the mobile body based on the sensor data; a control circuit that controls the movement of the mobile body by rotating the plurality of motors while referring to the position information output from the position estimation device; and a storage device that stores detour route data defining a reference detour route, which is a combination of a first path in a first direction and a second path in a second direction different from the first direction.
  • While the mobile body is moving along a preset travel route, when the obstacle sensor detects at a first position that an obstacle is present in the traveling direction, the control circuit uses the detour route data to set a detour route from the first position to a second position on the preset travel route, and moves the mobile body along the detour route.
  • The detour route is defined by a reference detour route that is a combination of a first path in a first direction and a second path in a second direction different from the first direction. This facilitates the setting of a detour route.
  • FIG. 1A is a diagram illustrating a moving route during normal traveling of the AGV.
  • FIG. 1B is a diagram illustrating a stop operation when the AGV detects an obstacle.
  • FIG. 1C is a diagram illustrating an obstacle avoidance operation of AGV.
  • FIG. 1D is a diagram for explaining a reference detour route.
  • FIG. 2 is a diagram illustrating an example of the moving space S where the AGV exists.
  • FIG. 3 is a diagram showing the AGV and the towing cart before being connected.
  • FIG. 4A is a diagram showing an AGV and a towing cart before being connected.
  • FIG. 4B is a diagram showing the connected AGV and towing cart.
  • FIG. 5 is an external view of an exemplary AGV according to the present embodiment.
  • FIG. 6A is a diagram illustrating a first hardware configuration example of AGV.
  • FIG. 6B is a diagram illustrating a second hardware configuration example of AGV.
  • FIG. 7A is a diagram illustrating an AGV that generates a map while moving.
  • FIG. 7B is a diagram illustrating an AGV that generates a map while moving.
  • FIG. 7C is a diagram illustrating an AGV that generates a map while moving.
  • FIG. 7D is a diagram illustrating an AGV that generates a map while moving.
  • FIG. 7E is a diagram illustrating an AGV that generates a map while moving.
  • FIG. 7F is a diagram schematically showing a part of the completed map.
  • FIG. 8 is a diagram illustrating an example in which a map of one floor is configured by a plurality of partial maps.
  • FIG. 9 is a diagram illustrating a hardware configuration example of the operation management apparatus.
  • FIG. 10 is a diagram schematically illustrating an example of the movement route of the AGV determined by the operation management device.
  • FIG. 11A is a diagram illustrating an example of the detour route B when the obstacle 70 exists on the travel route R.
  • FIG. 11B is a diagram for describing the notation on the drawing related to the obstacle detection operation and the detour operation performed by the moving body 10.
  • FIG. 12 is a diagram illustrating an example of a detour route when there are a plurality of obstacles 70a.
  • FIG. 13 is a diagram illustrating an example of a detour route when there are a plurality of obstacles 70a.
  • FIG. 14 is a diagram illustrating an example of a detour route when there are a plurality of obstacles 70a.
  • FIG. 15 is a diagram illustrating three examples in which it is not possible to return to the original movement route R.
  • FIG. 16 is a diagram illustrating three examples in which it is not possible to return to the original travel route R.
  • FIG. 17 is a diagram illustrating three examples in which it is not possible to return to the original travel route R.
  • FIG. 18A is a flowchart illustrating a processing procedure of the microcomputer 14a according to the exemplary embodiment.
  • FIG. 18B is a flowchart illustrating a processing procedure of the microcomputer 14a according to the exemplary embodiment.
  • FIG. 19A is a diagram for explaining an operation example of the moving body 10 according to another example.
  • FIG. 19B is a diagram for describing an operation example of the moving body 10 according to another example.
  • FIG. 19C is a diagram for describing an operation example of the moving body 10 according to another example.
  • FIG. 19D is a diagram for describing an operation example of the moving body 10 according to another example.
  • FIG. 19E is a diagram for describing an operation example of the moving body 10 according to another example.
  • FIG. 20 is a flowchart showing a processing procedure of the microcomputer 14a according to another exemplary embodiment.
  • In the present disclosure, an “automated guided vehicle” (AGV) includes unmanned tow trucks and unmanned forklifts.
  • The term “unmanned” means that no person is required to steer the vehicle; it does not exclude the automated guided vehicle transporting a “person” (for example, a person who loads and unloads luggage).
  • An “unmanned tow truck” is a trackless vehicle that automatically pulls, to a designated location, a cart on which packages are loaded and unloaded manually or automatically.
  • An “unmanned forklift” is a trackless vehicle equipped with a mast for raising and lowering a load-transfer fork or the like; it automatically transfers a load onto the fork, automatically travels to a designated location, and performs automatic load-handling work.
  • a trackless vehicle is a vehicle that includes wheels and an electric motor or engine that rotates the wheels.
  • a “moving body” is a device that carries a person or a load and moves, and includes a driving device such as a wheel, a biped or multi-legged walking device, and a propeller that generate driving force (traction) for movement.
  • the term “mobile body” in the present disclosure includes not only a narrow automatic guided vehicle but also a mobile robot, a service robot, and a drone.
  • “Autonomous travel” includes travel based on commands from an operation management system (a computer connected to the automated guided vehicle by communication) and autonomous travel by a control device included in the automated guided vehicle. Autonomous travel includes not only travel toward a destination along a predetermined route but also travel that follows a tracking target. The automated guided vehicle may also temporarily perform manual travel based on an instruction from a worker. “Automatic travel” generally includes both “guided” travel and “guideless” travel, but in the present disclosure it means “guideless” travel.
  • The “guided” type is a method in which guides (for example, magnetic tape) are installed continuously or intermittently, and the guided vehicle is guided using these guides.
  • The “guideless” type is a method of guiding the vehicle without installing such guides.
  • the automatic guided vehicle in the embodiment of the present disclosure includes a self-position estimation device and can travel in a guideless manner.
  • Self-position estimation device is a device that estimates a self-position on an environmental map based on sensor data acquired by an external sensor such as a laser range finder.
  • External sensor is a sensor that senses the external state of the moving body.
  • the external sensor include a laser range finder (also referred to as a range sensor), a camera (or an image sensor), a LIDAR (Light Detection and Ranging), a millimeter wave radar, an ultrasonic sensor, and a magnetic sensor.
  • Internal sensor is a sensor that senses the internal state of a moving object.
  • Examples of the internal sensor include a rotary encoder (hereinafter sometimes simply referred to as “encoder”), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
  • SLAM is an abbreviation for “Simultaneous Localization and Mapping”, which means that self-position estimation and environmental map creation are performed simultaneously.
  • FIGS. 1A to 1C illustrate a basic operation example of the moving body 10 according to an exemplary embodiment of the present disclosure.
  • FIGS. 1A to 1C show a travel route R connecting the start point S and the end point G.
  • Hereinafter, the moving body 10 may be referred to as an “automated guided vehicle” (AGV), and the movement route may be referred to as a “travel route”.
  • In this example, the travel route R is a straight line.
  • the moving body 10 includes at least an obstacle sensor that detects an obstacle in the traveling direction of the moving body 10.
  • FIG. 1A shows a moving body 10 that travels along a travel route R.
  • FIG. 1B shows the moving body 10 stopping when the obstacle sensor detects that an obstacle 70 exists on the travel route R. The moving body 10 resumes traveling when the obstacle 70 is removed.
  • FIG. 1C shows the moving body 10 setting and traveling along a detour route that avoids the obstacle 70 when the obstacle sensor detects that the obstacle 70 exists on the travel route R.
  • FIG. 1C shows a detour route B on the left side.
  • “Right” and “left” refer to the direction in which the obstacle 70 on the travel route R is avoided, relative to the traveling direction. Whether the avoidance direction is the right side or the left side can be set in advance in the moving body 10.
  • The detour route B is set using the “reference detour route”.
  • FIG. 1D shows an example of a typical reference detour route.
  • the “reference bypass route” is a route defined by a combination of routes in at least two directions. That is, the reference detour route is a route defined by a combination of a first route relating to the first direction and a second route relating to a second direction different from the first direction.
  • the first direction is a direction perpendicular to the route R (up and down direction on the paper surface), and the second direction is a direction parallel to the travel route R.
  • the moving body 10 moves in the first direction, for example, using the first route as one unit, and moves in the second direction, using the second route as one unit.
  • In FIG. 1D, the first direction and the second direction of the reference detour route are orthogonal to each other, but they need not be orthogonal. If they are not orthogonal, the reference detour route may have paths in one or more additional directions different from the first and second directions. Accordingly, although the detour in the example of FIG. 1D is U-shaped, it may instead have, for example, the shape of half a hexagon obtained by dividing a hexagon in two with a straight line passing through its center.
  • In FIG. 1D, the distance of the first path and the distance of the second path are both fixed values and equal. However, they need not be the same value. In the embodiment described later, an example is given in which the distance of the second path is not a fixed value.
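The U-shaped reference detour route described above can be sketched as a short waypoint generator. This is a hypothetical illustration, not code from the patent: the function name, the assumption that the travel route R runs along the +x axis, and the parameters d1 (first-path distance) and d2 (second-path distance) are assumptions for the sketch.

```python
def u_shaped_detour(start, d1, d2, avoid_left=True):
    """Return the corner points of a U-shaped detour beginning at `start`.

    `start` is the (x, y) first position where the obstacle was detected;
    the travel route R is assumed to run along the +x axis. `d1` is the
    length of the first (perpendicular) path, `d2` of the second
    (parallel) path.
    """
    x, y = start
    side = 1.0 if avoid_left else -1.0
    return [
        (x, y),                   # first position (obstacle detected)
        (x, y + side * d1),       # move along the first path (perpendicular)
        (x + d2, y + side * d1),  # move along the second path (parallel)
        (x + d2, y),              # second position, back on the travel route
    ]
```

For example, `u_shaped_detour((0.0, 0.0), 1.0, 2.0)` yields the four corners of a left-side detour that offsets the moving body by 1 m, advances 2 m past the obstacle, and returns to the original route.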
  • the moving body is an automatic guided vehicle
  • the automatic guided vehicle may be described as “AGV” using an abbreviation.
  • FIG. 2 shows a basic configuration example of an exemplary mobile management system 100 according to the present disclosure.
  • the mobile management system 100 includes at least one AGV 10 and an operation management device 50 that manages the operation of the AGV 10.
  • FIG. 2 also shows a terminal device 20 operated by the user 1.
  • The AGV 10 is an automated guided vehicle capable of “guideless” traveling, which does not require a magnetic tape or other guide for traveling.
  • the AGV 10 can perform self-position estimation and transmit the estimation result to the terminal device 20 and the operation management device 50.
  • the AGV 10 can automatically travel in the moving space S in accordance with a command from the operation management device 50.
  • the AGV 10 can also operate in a “tracking mode” in which it moves following a person or other moving body.
  • the operation management device 50 is a computer system that tracks the position of each AGV 10 and manages the running of each AGV 10.
  • the operation management device 50 may be a desktop PC, a notebook PC, and / or a server computer.
  • the operation management device 50 communicates with each AGV 10 via the plurality of access points 2. For example, the operation management apparatus 50 transmits the data of the coordinates of the position where each AGV 10 should go next to each AGV 10.
  • Each AGV 10 transmits data indicating its own position and orientation to the operation management device 50 periodically, for example, every 100 milliseconds.
  • the operation management device 50 transmits data on the coordinates of the position to be further headed.
  • the AGV 10 can travel in the moving space S according to the operation of the user 1 input to the terminal device 20.
  • An example of the terminal device 20 is a tablet computer.
  • the travel of the AGV 10 using the terminal device 20 is performed at the time of map creation, and the travel of the AGV 10 using the operation management device 50 is performed after the map creation.
  • FIG. 3 shows an example of a moving space S in which three AGVs 10a, 10b, and 10c exist. Assume that all AGVs are traveling in the depth direction in the figure. The AGVs 10a and 10b are transporting loads placed on the top board. The AGV 10c travels following the front AGV 10b.
  • In FIG. 3, reference numerals 10a, 10b, and 10c are assigned, but hereinafter they are collectively described as “AGV 10”.
  • the AGV 10 can also transport packages using a tow truck connected to itself, in addition to a method of transporting packages placed on the top board.
  • FIG. 4A shows the AGV 10 and the towing cart 5 before being connected. A caster is provided on each foot of the traction cart 5. The AGV 10 is mechanically connected to the traction cart 5.
  • FIG. 4B shows the AGV 10 and the traction cart 5 connected. When the AGV 10 travels, the tow cart 5 is pulled by the AGV 10. By pulling the tow cart 5, the AGV 10 can transport the load placed on the tow cart 5.
  • connection method between the AGV 10 and the towing cart 5 is arbitrary.
  • a plate 6 is fixed to the top plate of the AGV 10.
  • the pulling cart 5 is provided with a guide 7 having a slit.
  • the AGV 10 approaches the tow truck 5 and inserts the plate 6 into the slit of the guide 7.
  • the AGV 10 passes an electromagnetic lock pin (not shown) through the plate 6 and the guide 7 and applies an electromagnetic lock. Thereby, AGV10 and tow cart 5 are physically connected.
  • Each AGV 10 and the terminal device 20 can be connected, for example, on a one-to-one basis, and can perform communication based on the Bluetooth (registered trademark) standard.
  • Each AGV 10 and the terminal device 20 can perform communication based on Wi-Fi (registered trademark) using one or a plurality of access points 2.
  • the plurality of access points 2 are connected to each other via, for example, the switching hub 3.
  • FIG. 2 shows two access points 2a and 2b.
  • the AGV 10 is wirelessly connected to the access point 2a.
  • the terminal device 20 is wirelessly connected to the access point 2b.
  • the data transmitted by the AGV 10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20.
  • the data transmitted by the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV 10. Thereby, bidirectional communication between the AGV 10 and the terminal device 20 is realized.
  • the plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3. Thereby, bidirectional communication is also realized between the operation management device 50 and each AGV 10.
  • a map in the moving space S is created so that the AGV 10 can travel while estimating its own position.
  • the AGV 10 is equipped with a position estimation device and a laser range finder, and a map can be created using the output of the laser range finder.
  • The AGV 10 shifts to a data acquisition mode in response to a user operation.
  • the AGV 10 starts acquiring sensor data using the laser range finder.
  • The laser range finder periodically scans the surrounding space S by emitting, for example, an infrared or visible laser beam.
  • the laser beam is reflected by the surface of a structure such as a wall or a pillar or an object placed on the floor.
  • the laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs measurement result data indicating the position of each reflection point.
  • The position of each reflection point reflects the direction of arrival and the distance of the reflected light.
  • Data of measurement results obtained by one scan may be referred to as “measurement data” or “sensor data”.
  • the position estimation device accumulates sensor data in a storage device.
  • the sensor data accumulated in the storage device is transmitted to the external device.
  • the external device is, for example, a computer having a signal processor and having a mapping program installed therein.
  • The signal processor of the external device superimposes the sensor data obtained for each scan.
  • By repeatedly performing this superimposing process, the signal processor can create a map of the moving space S.
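The superimposing process above can be sketched as accumulating per-scan point sets into a simple grid of hit counts. This is a hypothetical illustration; the patent does not specify the map representation, and the cell size and function name are assumptions.

```python
def accumulate_scans(scans, cell=0.05):
    """Merge scans (each a list of world-frame (x, y) points in meters)
    into a hit-count map keyed by grid-cell indices.

    Cells with many hits across scans correspond to walls, pillars, and
    other structures; this is the crudest form of scan superposition.
    """
    grid = {}
    for scan in scans:
        for x, y in scan:
            key = (int(x // cell), int(y // cell))
            grid[key] = grid.get(key, 0) + 1
    return grid
```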
  • the external device transmits the created map data to the AGV 10.
  • the AGV 10 stores the created map data in an internal storage device.
  • the external device may be the operation management device 50 or another device.
  • the AGV 10 may create the map instead of the external device.
  • a circuit such as a microcontroller unit (microcomputer) of the AGV 10 may perform the processing performed by the signal processor of the external device described above.
  • The volume of sensor data is generally large. Since the sensor data does not need to be transmitted to an external device, occupation of the communication line can be avoided.
  • the movement in the movement space S for acquiring sensor data can be realized by the AGV 10 traveling according to the user's operation.
  • the AGV 10 receives a travel command instructing movement in the front, rear, left, and right directions from the user via the terminal device 20 wirelessly.
  • the AGV 10 travels forward, backward, left and right in the moving space S according to the travel command, and creates a map.
  • Alternatively, the AGV 10 may create a map by traveling forward, backward, left, and right in the moving space S according to a control signal from a steering device.
  • Sensor data may also be acquired by a person walking around while pushing a measurement carriage equipped with a laser range finder.
  • the number of AGVs may be one.
  • the user 1 can use the terminal device 20 to select one AGV 10 from the plurality of registered AGVs and create a map of the moving space S.
  • each AGV 10 can automatically travel while estimating its own position using the map. A description of the process of estimating the self position will be given later.
  • FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
  • the AGV 10 includes two drive wheels 11a and 11b, four casters 11c, 11d, 11e, and 11f, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
  • the two drive wheels 11a and 11b are provided on the right side and the left side of the AGV 10, respectively.
  • the four casters 11c, 11d, 11e, and 11f are arranged at the four corners of the AGV 10.
  • The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, but these motors are not shown in FIG. 5.
  • FIG. 5 shows one drive wheel 11a and two casters 11c and 11e located on the right side of the AGV 10, and a caster 11f located at the left rear; the left drive wheel 11b and the left front caster 11d are hidden behind the frame 12 and are not clearly shown.
  • the four casters 11c, 11d, 11e, and 11f can freely turn.
  • the drive wheels 11a and the drive wheels 11b are also referred to as wheels 11a and wheels 11b, respectively.
  • the traveling control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), electronic components, and a board on which they are mounted.
  • the traveling control device 14 performs the above-described data transmission / reception with the terminal device 20 and the preprocessing calculation.
  • the laser range finder 15 is an optical instrument that measures the distance to the reflection point by, for example, emitting an infrared or visible laser beam 15a and detecting the reflected light of the laser beam 15a.
  • The laser range finder 15 of the AGV 10 emits a pulsed laser beam, for example, into the space within a range of 135 degrees to the left and right of the front of the AGV 10 (270 degrees in total), changing direction in steps of 0.25 degrees, and detects the reflected light of each laser beam 15a. This yields data of the distance to the reflection point in each direction determined by the angle, for a total of 1081 steps of 0.25 degrees each.
  • the scan of the surrounding space performed by the laser range finder 15 is substantially parallel to the floor surface and is planar (two-dimensional). However, the laser range finder 15 may perform scanning in the height direction.
  • the AGV 10 can create a map of the space S based on the position and orientation (orientation) of the AGV 10 and the scan result of the laser range finder 15.
  • the map may reflect the arrangement of walls, pillars and other structures around the AGV, and objects placed on the floor.
  • the map data is stored in a storage device provided in the AGV 10.
  • the position and posture of the moving body are called poses.
  • The position and orientation of the moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and posture of the AGV 10, that is, the pose (x, y, θ), may hereinafter be simply referred to as “position”.
  • the position of the reflection point viewed from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
  • the laser range finder 15 outputs sensor data expressed in polar coordinates.
  • the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
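The polar-to-Cartesian conversion described above can be sketched as follows, using the 270-degree field of view and 0.25-degree steps (1081 beams) given earlier. The frame convention (x pointing straight ahead, beams swept from -135 to +135 degrees) is an assumption for illustration, not a specification from the patent.

```python
import math

def scan_to_points(distances, fov_deg=270.0, step_deg=0.25):
    """Convert one scan of per-beam distances (meters) to (x, y) points
    in the sensor frame, with x pointing straight ahead.

    Beam i is at angle (-fov/2 + i * step) relative to the forward axis,
    so 1081 beams at 0.25-degree steps cover the full 270-degree sweep.
    """
    half = fov_deg / 2.0
    points = []
    for i, r in enumerate(distances):
        angle = math.radians(-half + i * step_deg)
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points
```

With 1081 equal distances, the middle beam (index 540) points straight ahead along the x axis.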
  • Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
  • the laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
  • the traveling control device 14 can estimate its current position by comparing the measurement result of the laser range finder 15 with the map data held by itself.
  • the stored map data may be map data created by another AGV 10.
  • FIG. 6A shows a first hardware configuration example of the AGV 10.
  • FIG. 6A also shows a specific configuration of the travel control device 14.
  • the AGV 10 includes a travel control device 14, a laser range finder 15, two motors 16a and 16b, a drive device 17, wheels 11a and 11b, and two rotary encoders 18a and 18b.
  • the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, a position estimation device 14e, and an obstacle sensor 14j.
  • the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the position estimation device 14e are connected by a communication bus 14f and can exchange data with each other.
  • the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as a measurement result to the microcomputer 14a, the position estimation device 14e, and / or the memory 14b.
  • the microcomputer 14a is a processor or a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
  • the microcomputer 14a is a semiconductor integrated circuit.
  • the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 17 to control the drive device 17 and adjust the voltage applied to the motor.
  • One or more control circuits (for example, microcomputers) for controlling the driving of the left and right motors 16a and 16b may be provided independently of the microcomputer 14a.
  • the motor driving device 17 may include two microcomputers that control the driving of the motors 16a and 16b, respectively. These two microcomputers may perform coordinate calculations using the encoder information output from the encoders 18a and 18b, respectively, and estimate the moving distance of the AGV 10 from a given initial position.
  • the two microcomputers may control the motor drive circuits 17a and 17b using encoder information.
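The estimation of the moving distance from encoder information mentioned above can be sketched as standard differential-drive dead reckoning. This is a generic textbook formulation, not the patent's method; the track width value is an assumption, and the conversion of encoder ticks to wheel travel in meters is assumed to be done elsewhere.

```python
import math

def update_pose(pose, d_left, d_right, track=0.4):
    """Advance a pose (x, y, theta) given the distances traveled by the
    left and right drive wheels (already converted from encoder ticks
    to meters). `track` is the distance between the two drive wheels.
    """
    x, y, theta = pose
    d = (d_left + d_right) / 2.0          # distance of the body center
    d_theta = (d_right - d_left) / track  # change in heading
    # Move along the average heading over the step (midpoint rule).
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    return (x, y, theta + d_theta)
```

Calling this once per encoder sampling interval accumulates a pose estimate from a given initial position, as the two motor microcomputers are described as doing.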
  • the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the position estimation device 14e perform calculations.
  • the storage device 14c is a non-volatile semiconductor memory device.
  • the storage device 14c may be a magnetic recording medium typified by a hard disk or an optical recording medium typified by an optical disk.
  • the storage device 14c may include a head device for writing and / or reading data on any recording medium and a control device for the head device.
  • the storage device 14c stores map data M of the traveling space S, data of one or a plurality of travel routes (travel route data) R, and detour route data Bd.
  • the map data M is created by the AGV 10 operating in the map creation mode and stored in the storage device 14c.
  • the travel route data R is transmitted from the outside after the map data M is created.
  • the detour route data Bd is prepared in advance and stored in the storage device 14c in order to define the reference detour route.
  • the reference detour route may be defined as a combination of a first route relating to the first direction and a second route relating to a second direction different from the first direction.
  • the map data M, the travel route data R, and the detour route data Bd are stored in the same storage device 14c, but may be stored in different storage devices.
  • the AGV 10 receives travel route data R indicating a travel route from the tablet computer.
  • the travel route data R at this time includes marker data indicating the positions of a plurality of markers. “Marker” indicates the passing position (route point) of the traveling AGV 10.
  • the travel route data R includes at least position information of a start marker indicating a travel start position and an end marker indicating a travel end position.
  • the travel route data R may further include position information of one or more intermediate waypoint markers. When the travel route includes one or more intermediate waypoints, the route from the start marker through those waypoints in order to the end marker is defined as the travel route.
  • the data of each marker may include data on the direction (angle) and traveling speed of the AGV 10 until moving to the next marker, in addition to the coordinate data of the marker.
  • the data of each marker may also include the acceleration time required to reach the travel speed, and/or the deceleration time required to slow down from the travel speed and stop at the position of the next marker.
  • the operation management device 50 may control the movement of the AGV 10 instead of the terminal device 20. In that case, the operation management device 50 may instruct the AGV 10 to move to the next marker every time the AGV 10 reaches a marker. For example, the AGV 10 receives from the operation management device 50, as the travel route data R indicating the travel route, the coordinate data of the next target position, or the distance to the target position and the angle at which to travel.
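As an illustration of the travel route data R described above, the marker structure can be sketched in Python as follows. The class and field names are hypothetical; the patent does not specify a concrete data format, only that each marker carries coordinates and, optionally, direction, speed, and acceleration/deceleration times.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Marker:
    x: float                              # marker coordinates in the map frame
    y: float
    heading_deg: Optional[float] = None   # direction toward the next marker
    speed: Optional[float] = None         # travel speed toward the next marker
    accel_time: Optional[float] = None    # time to accelerate to `speed`
    decel_time: Optional[float] = None    # time to decelerate before the next marker

@dataclass
class TravelRoute:
    start: Marker                         # travel start position (start marker)
    end: Marker                           # travel end position (end marker)
    waypoints: List[Marker] = field(default_factory=list)

    def ordered_markers(self) -> List[Marker]:
        """Markers in travel order: start, intermediate waypoints, end."""
        return [self.start, *self.waypoints, self.end]

route = TravelRoute(
    start=Marker(0.0, 0.0),
    waypoints=[Marker(5.0, 0.0, heading_deg=0.0, speed=1.0)],
    end=Marker(10.0, 0.0),
)
print(len(route.ordered_markers()))  # → 3
```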
  • the AGV 10 can travel along the stored travel route while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
  • the communication circuit 14d is a wireless communication circuit that performs wireless communication based on, for example, Bluetooth (registered trademark) and / or Wi-Fi (registered trademark) standards. Each standard includes a wireless communication standard using a frequency of 2.4 GHz band. For example, in a mode in which the AGV 10 is run to create a map, the communication circuit 14d performs wireless communication based on the Bluetooth (registered trademark) standard and communicates with the terminal device 20 one-on-one.
  • the position estimation device 14e performs map creation processing and self-position estimation processing when traveling.
  • the position estimation device 14e creates a map of the moving space S based on the position and orientation of the AGV 10 and the scan result of the laser range finder.
  • the position estimation device 14e receives sensor data from the laser range finder 15 and reads the map data M stored in the storage device 14c.
  • by matching local map data created from the scan results of the laser range finder 15 against the map data M, the position estimation device 14e estimates its own position (x, y, θ) on the map.
  • the position estimation device 14e generates “reliability” data representing the degree to which the local map data matches the map data M.
  • Each data of self-position (x, y, ⁇ ) and reliability can be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50.
  • the terminal device 20 or the operation management device 50 can receive each data of its own position (x, y, ⁇ ) and reliability and display it on a built-in or connected display device.
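The "reliability" above is described only as the degree to which the local map data matches the map data M. One plausible way to quantify such a match, shown purely as an illustration (the patent does not disclose the actual metric), is the fraction of scan points lying close to some map point:

```python
import math

def match_reliability(scan_points, map_points, tol=0.1):
    """Illustrative 'reliability' metric: the fraction of local scan points
    that lie within `tol` meters of some map point. This is an assumption
    about how the degree of match could be quantified, not the patent's
    actual algorithm."""
    if not scan_points:
        return 0.0
    matched = 0
    for sx, sy in scan_points:
        if any(math.hypot(sx - mx, sy - my) <= tol for mx, my in map_points):
            matched += 1
    return matched / len(scan_points)

map_pts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
scan_pts = [(0.05, 0.0), (1.02, 0.01), (5.0, 5.0)]
print(match_reliability(scan_pts, map_pts))  # 2 of 3 scan points match → ~0.667
```

A value near 1.0 would indicate that the local map agrees well with the stored map data M, so the estimated position (x, y, θ) can be trusted.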
  • the microcomputer 14a and the position estimation device 14e are separate components in this example, but this is not a limitation. A single chip circuit or semiconductor integrated circuit capable of performing the operations of both the microcomputer 14a and the position estimation device 14e may be used instead.
  • FIG. 6A shows a chip circuit 14g including the microcomputer 14a and the position estimation device 14e. In the following, an example in which the microcomputer 14a and the position estimation device 14e are provided separately is described.
  • the obstacle sensor 14j is an infrared sensor that detects an obstacle, for example, by emitting infrared rays and detecting reflected light.
  • the two motors 16a and 16b are attached to the two wheels 11a and 11b, respectively, and rotate each wheel. That is, the two wheels 11a and 11b are drive wheels, respectively.
  • the motor 16a and the motor 16b are described as being motors that drive the right wheel and the left wheel of the AGV 10, respectively.
  • the moving body 10 further includes an encoder unit 18 that measures the rotational positions or rotational speeds of the wheels 11a and 11b.
  • the encoder unit 18 includes a first rotary encoder 18a and a second rotary encoder 18b.
  • the first rotary encoder 18a measures the rotation at any position of the power transmission mechanism from the motor 16a to the wheel 11a.
  • the second rotary encoder 18b measures the rotation at any position of the power transmission mechanism from the motor 16b to the wheel 11b.
  • the encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a.
  • the microcomputer 14a may control the movement of the moving body 10 using the signal received from the encoder unit 18 as well as the signal received from the position estimation device 14e.
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of motor drive circuits 17a and 17b includes a so-called inverter circuit.
  • the motor drive circuits 17a and 17b turn on or off the current flowing through each motor by a PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motor.
  • FIG. 6B shows a second hardware configuration example of the AGV 10.
  • the second hardware configuration example differs from the first hardware configuration example (FIG. 6A) in that it has a laser positioning system 14h and in that the microcomputer 14a is connected to each component in a one-to-one relationship.
  • the laser positioning system 14 h includes a position estimation device 14 e and a laser range finder 15.
  • the position estimation device 14e and the laser range finder 15 are connected by, for example, an Ethernet (registered trademark) cable. Each operation of the position estimation device 14e and the laser range finder 15 is as described above.
  • the laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV 10 to the microcomputer 14a.
  • the microcomputer 14a has various general purpose I / O interfaces or general purpose input / output ports (not shown).
  • the microcomputer 14a is directly connected to other components in the travel control device 14 such as the communication circuit 14d and the laser positioning system 14h via the general-purpose input / output port.
  • the AGV 10 in the embodiment of the present disclosure may include a safety sensor such as a bumper switch (not shown).
  • the AGV 10 may include an inertial measurement device such as a gyro sensor.
  • by using the inertial measurement device, the movement distance and the change in orientation (angle) of the AGV 10 can be estimated.
  • These estimated values of distance and angle are called odometry data, and can exhibit a function of assisting position and orientation information obtained by the position estimation device 14e.
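The odometry data mentioned above can be derived from the encoder measurements with the standard differential-drive update. The formula below is the common midpoint approximation, shown for illustration; the patent does not specify the exact calculation:

```python
import math

def update_odometry(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive odometry sketch: estimate the new pose (x, y, theta)
    from the distances traveled by the left and right wheels, as measured via
    the rotary encoders 18a and 18b. Midpoint approximation; `wheel_base` is
    the distance between the two drive wheels."""
    d_center = (d_left + d_right) / 2.0        # distance moved by the body center
    d_theta = (d_right - d_left) / wheel_base  # change in heading (radians)
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Straight run: both wheels travel 1 m, heading unchanged.
x, y, theta = update_odometry(0.0, 0.0, 0.0, 1.0, 1.0, wheel_base=0.5)
print(x, y, theta)  # → 1.0 0.0 0.0
```

Accumulating such updates gives the distance and angle estimates that assist the position and orientation information obtained by the position estimation device 14e.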
  • FIGS. 7A to 7F schematically show the AGV 10 that moves while acquiring sensor data.
  • the user 1 may move the AGV 10 manually while operating the terminal device 20.
  • the sensor data may be acquired by placing the unit including the travel control device 14 shown in FIGS. 6A and 6B, or the AGV 10 itself on a cart, and the user 1 manually pushing or driving the cart.
  • FIG. 7A shows an AGV 10 that scans the surrounding space using the laser range finder 15.
  • a laser beam is emitted at every predetermined step angle, and scanning is performed.
  • the illustrated scan range is an example schematically shown, and is different from the above-described scan range of 270 degrees in total.
  • the position of the reflection point of the laser beam is schematically shown using a plurality of black spots 4 represented by the symbol “•”.
  • the laser beam scan is executed in a short cycle while the position and posture of the laser range finder 15 change. For this reason, the actual number of reflection points is much larger than the number of reflection points 4 shown in the figure.
  • the position estimation device 14e accumulates the position of the black spot 4 obtained as the vehicle travels, for example, in the memory 14b. By continuously performing scanning while the AGV 10 is traveling, the map data is gradually completed.
  • in FIG. 7B to FIG. 7E, only the scan range is shown for simplicity.
  • the scan range is an example, and is different from the above-described example of 270 degrees in total.
  • the map may be created by the microcomputer 14a in the AGV 10 or by an external computer on the basis of the sensor data, after the amount of sensor data necessary for map creation has been acquired. Alternatively, the map may be created in real time based on the sensor data acquired by the moving AGV 10.
  • FIG. 7F schematically shows a part of the completed map 40.
  • the free space is partitioned by a point cloud (Point Cloud) corresponding to a collection of laser beam reflection points.
  • Another example of the map is an occupied grid map that distinguishes between a space occupied by an object and a free space in units of grids.
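A minimal version of the occupancy grid map mentioned above can be sketched as follows. Real occupancy grids usually keep per-cell occupancy probabilities and an "unknown" state; this binary version, with assumed cell sizes, is only an illustration of the grid-based representation:

```python
def to_occupancy_grid(points, width, height, cell):
    """Build a minimal occupancy grid from laser reflection points: each grid
    cell containing at least one reflection point is marked occupied (1),
    all other cells are free (0). `cell` is the cell size in meters."""
    grid = [[0] * width for _ in range(height)]
    for x, y in points:
        gx, gy = int(x // cell), int(y // cell)
        if 0 <= gx < width and 0 <= gy < height:
            grid[gy][gx] = 1
    return grid

# Two reflection points falling into the two leftmost cells of the bottom row.
grid = to_occupancy_grid([(0.2, 0.3), (1.7, 0.1)], width=3, height=2, cell=1.0)
print(grid)  # → [[1, 1, 0], [0, 0, 0]]
```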
  • the position estimation device 14e accumulates map data (map data M) in the memory 14b or the storage device 14c.
  • the number or density of black spots shown in the figure is an example.
  • the map data obtained in this way can be shared by a plurality of AGVs 10.
  • a typical example of an algorithm in which the AGV 10 estimates its own position based on map data is ICP (Iterative Closest Point) matching.
  • the map data M may be created and recorded separately for a plurality of partial map data.
  • FIG. 8 shows an example in which the entire area of one floor of one factory is covered by a combination of four partial map data M1, M2, M3, and M4.
  • one partial map data covers an area of 50 m ⁇ 50 m.
  • a rectangular overlapping region having a width of 5 m is provided at the boundary between two adjacent maps in each of the X direction and the Y direction. This overlapping area is called a “map switching area”.
  • when the AGV 10 traveling while referring to one partial map enters the map switching area, it switches to referring to the adjacent partial map.
  • the number of partial maps is not limited to four, and may be set appropriately according to the area of the floor on which the AGV 10 travels, the performance of the computer that executes map creation and self-position estimation, and the like.
  • the size of the partial map data and the width of the overlapping area are not limited to the above example, and may be arbitrarily set.
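The partial-map arrangement above can be illustrated with a small lookup: positions inside a map switching area fall within the bounds of both adjacent partial maps, which is what allows the traveling AGV to hand over from one map to the next. The tuple representation and function name are illustrative, using the 50 m × 50 m / 5 m-overlap example:

```python
def partial_maps_containing(x, y, maps):
    """Return the partial maps whose area contains (x, y). `maps` is a list
    of (name, x_min, y_min, x_max, y_max) tuples. Adjacent maps overlap by
    the width of the map switching area, so a position inside the overlap
    matches both maps; the AGV can then switch to the adjacent map."""
    return [m for m in maps if m[1] <= x <= m[3] and m[2] <= y <= m[4]]

# Two 50 m x 50 m partial maps with a 5 m overlap in the X direction.
maps = [
    ("M1", 0.0, 0.0, 50.0, 50.0),
    ("M2", 45.0, 0.0, 95.0, 50.0),
]
print([m[0] for m in partial_maps_containing(47.0, 10.0, maps)])  # → ['M1', 'M2']
print([m[0] for m in partial_maps_containing(60.0, 10.0, maps)])  # → ['M2']
```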
  • FIG. 9 shows a hardware configuration example of the operation management device 50.
  • the operation management device 50 includes a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
  • the CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57, and can exchange data with each other.
  • the CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50.
  • the CPU 51 is a semiconductor integrated circuit.
  • the memory 52 is a volatile storage device that stores a computer program executed by the CPU 51.
  • the memory 52 can also be used as a work memory when the CPU 51 performs calculations.
  • the position DB 53 stores position data indicating each position that can be a destination of each AGV 10.
  • the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
  • the location data is determined by the administrator.
  • the communication circuit 54 performs wired communication conforming to, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 54 is connected to the access point 2 (FIG. 1) by wire and can communicate with the AGV 10 via the access point 2.
  • the communication circuit 54 receives data to be transmitted to the AGV 10 from the CPU 51 via the bus 57.
  • the communication circuit 54 transmits the data (notification) received from the AGV 10 to the CPU 51 and / or the memory 52 via the bus 57.
  • the map DB 55 stores map data inside the factory where the AGV 10 is traveling.
  • the map may be the same as or different from the map 40 (FIG. 7F).
  • the format of the data is not limited.
  • the map stored in the map DB 55 may be a map created by CAD.
  • the position DB 53 and the map DB 55 may be constructed on a nonvolatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disk.
  • the image processing circuit 56 is a circuit that generates video data to be displayed on the monitor 58.
  • the image processing circuit 56 operates only when the administrator operates the operation management device 50; further detailed description is omitted in the present embodiment.
  • the monitor 58 may be integrated with the operation management device 50. Further, the CPU 51 may perform the processing of the image processing circuit 56.
  • FIG. 10 is a diagram schematically illustrating an example of the movement route of the AGV 10 determined by the operation management device 50.
  • An outline of operations of the AGV 10 and the operation management device 50 is as follows.
  • consider an example in which an AGV 10 currently at a position M1 passes through several positions and travels to a final destination position Mn+1 (n: a positive integer of 1 or more).
  • in the position DB 53, coordinate data indicating positions such as the position M2 to be passed after the position M1 and the position M3 to be passed after the position M2 are recorded.
  • the CPU 51 of the operation management device 50 reads the coordinate data of the position M2 by referring to the position DB 53, and generates a travel command for moving the AGV 10 to the position M2.
  • the communication circuit 54 transmits a travel command to the AGV 10 via the access point 2.
  • the CPU 51 periodically receives data indicating the current position and orientation from the AGV 10 via the access point 2.
  • the operation management device 50 can track the position of each AGV 10.
  • when the CPU 51 determines that the current position of the AGV 10 matches the position M2, it reads the coordinate data of the position M3, generates a travel command directing the AGV 10 to the position M3, and transmits it to the AGV 10.
  • in this way, each time the AGV 10 reaches a position, the operation management device 50 transmits a travel command directing it to the next position to be passed.
  • the AGV 10 can reach the final target position M n + 1 .
  • the above-described passing position and target position of the AGV 10 may be referred to as “markers”.
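The marker-by-marker command logic of the operation management device 50 described above can be sketched as follows. The reach tolerance, index bookkeeping, and return format are assumptions for illustration; the patent only states that a command for the next position is issued each time a position is reached:

```python
import math

def next_travel_command(current_pos, markers, target_idx, reached_tol=0.1):
    """When the AGV's reported position matches the marker it is heading for,
    advance to the following marker and issue its coordinates as the next
    travel command; otherwise keep the current target. Returns
    (new_target_idx, command); command is None once the final target is
    reached."""
    tx, ty = markers[target_idx]
    if math.hypot(current_pos[0] - tx, current_pos[1] - ty) <= reached_tol:
        if target_idx + 1 < len(markers):
            return target_idx + 1, markers[target_idx + 1]
        return target_idx, None          # final target position reached
    return target_idx, markers[target_idx]

markers = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]  # M1, M2, M3
idx, cmd = next_travel_command((0.02, 0.0), markers, target_idx=0)
print(idx, cmd)  # reached M1, now head for M2 → 1 (5.0, 0.0)
```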
  • FIG. 11A shows an example of the detour route B when the obstacle 70 exists on the travel route R.
  • the microcomputer 14a uses the detour route data Bd to set the detour route B from the position P1 to the position P4 on the travel route, and moves the moving body 10 along the detour route B.
  • the detour path B shown in FIG. 11A is composed of the most basic one reference detour path.
  • here, the “reference detour route” is defined by a combination of the first route and the second route.
  • the reference detour path is defined as follows in the present embodiment. First, the first direction and the second direction are orthogonal to each other.
  • the reference detour route is defined by a first route from position P1 to position P2, a second route from position P2 to position P3, and a first route from position P3 to position P4.
  • the first path from the position P1 to the position P2 is a path that moves the moving body 10 by a distance D1 in a direction parallel to the first direction and away from the moving path R.
  • the second path from the position P2 to the position P3 is a path that moves the moving body 10 by a distance D2 parallel to the second direction.
  • the moving body 10 rotates counterclockwise and changes its posture from the second direction to the first direction.
  • the first path from the position P3 to the position P4 is a path that is parallel to the first direction and moves the moving body 10 by a distance D1 in a direction approaching the moving path R.
  • the moving body 10 rotates clockwise and changes its posture from the first direction to the second direction.
  • in this way, the moving body 10 can detour around the obstacle 70, return to the original movement route R, and thereafter move along the movement route R.
  • FIG. 11A shows a first maximum allowable distance Dmax1 related to the first direction and a second maximum allowable distance Dmax2 related to the second direction.
  • the detour route B is set such that the distance in the first direction from the travel route R is within the first maximum allowable distance Dmax1, and the travel distance in the second direction is within the second maximum allowable distance Dmax2.
  • the data of the first maximum allowable distance Dmax1 and the second maximum allowable distance Dmax2 are stored in advance in the storage device 14c, for example by the administrator of the moving body 10.
  • the reference detour path, the first maximum allowable distance Dmax1 and the second maximum allowable distance Dmax2 may be set independently for each moving body 10.
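The reference detour route defined above (P1 → P2 → P3 → P4, with orthogonal first and second directions) can be generated as a short waypoint list. The vector representation and function name are illustrative; here the second direction is taken as the direction of travel along the route R:

```python
def reference_detour(p1, d1, d2, first_dir=(1.0, 0.0), second_dir=(0.0, 1.0)):
    """Generate the waypoints P1..P4 of the reference detour route: move the
    distance D1 along the first direction away from the travel route R, move
    the distance D2 along the second direction (past the obstacle), then move
    D1 back toward R. `first_dir` and `second_dir` are orthogonal unit
    vectors."""
    x, y = p1
    p2 = (x + d1 * first_dir[0], y + d1 * first_dir[1])
    p3 = (p2[0] + d2 * second_dir[0], p2[1] + d2 * second_dir[1])
    p4 = (p3[0] - d1 * first_dir[0], p3[1] - d1 * first_dir[1])
    return [p1, p2, p3, p4]

# Detour: step 2 m off the route, advance 3 m past the obstacle, return.
print(reference_detour((0.0, 0.0), d1=2.0, d2=3.0))
# → [(0.0, 0.0), (2.0, 0.0), (2.0, 3.0), (0.0, 3.0)]
```

Checking that d1 stays within Dmax1 and the accumulated second-direction travel stays within Dmax2, as the text describes, would be done by the caller before committing to the route.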
  • FIG. 11B shows the meaning of each operation of symbols a to c as a legend. That is, as a legend, an obstacle determination operation a, a movement operation b, and a direction changing operation c between the first route and the second route are described.
  • the distance moved by the movement operation b may be referred to as “minimum avoidance distance”.
  • “Minimum avoidance distance” means the minimum distance away from the original route.
  • in FIG. 12 and subsequent figures, the same reference numerals are given to the respective operations, and the description of the operations a to c is omitted. For convenience of explanation, an operation a to c of particular interest may be denoted, for example, as “operation a1”.
  • FIGS. 12, 13, and 14 show examples of detour routes in the case where a plurality of obstacles exist. When the obstacle 70a is detected by the first obstacle determination operation a, the microcomputer 14a sets a reference detour route.
  • the microcomputer 14a performs the first movement operation b for detouring. After the movement of the distance D1, the microcomputer 14a detects a further obstacle 70b on the detour path in the second direction that is the traveling direction by the obstacle determination operation a1. Thereby, the microcomputer 14a further continues the movement operation b1 along the first direction.
  • the operation after the further movement of the distance D1 is the same as in the example after the position P2 in FIG. 11A. However, the distance moved by the last movement operation b2 is D2 × 2.
  • the microcomputer 14a performs the first moving operation b for detouring, and performs the direction changing operation c after moving the distance D1.
  • after the direction change, the microcomputer 14a detects the obstacle 70b on the detour path in the second direction, which is the traveling direction, by the obstacle determination operation a1.
  • the microcomputer 14a further performs the movement operation b2 in the direction away from the original movement route R along the first direction. After the movement of the further distance D1, it is the same as the example of FIG.
  • FIG. 14 shows the operation of the moving body 10 when an obstacle is detected during the plurality of moving operations b.
  • the operation when each obstacle is detected is in accordance with the example of FIGS.
  • the obstacle determination operation a1 will be particularly described.
  • the position where the obstacle determination operation a1 is performed corresponds to the position moved by the distance D2 from the position where the first determination operation a0 is performed when viewed in the second direction.
  • the microcomputer 14a determines whether or not it is possible to return toward the movement route R, in units of the distance D2 from the position where the first determination operation a0 was performed.
  • if an obstacle is detected, the movement operation b1 in the second direction is continued.
  • the direction changing operation c1 is an operation performed when there is no obstacle as a result of the determination operation. Thereby, the microcomputer 14a moves the moving body 10 in the direction approaching the moving route R along the first direction.
  • when a detour route cannot be set within the first maximum allowable distance Dmax1 in the first direction from the first determination operation a0 (FIG. 15), the microcomputer 14a determines that the moving body 10 cannot return to the movement route R and stops the moving body 10. Similarly, in the second direction, the microcomputer 14a stops the moving body 10 when a detour route cannot be set within the second maximum allowable distance Dmax2 (FIG. 16). Further, as shown in FIG. 17, the microcomputer 14a also stops the moving body 10 when an obstacle is detected in all return directions to the movement route R.
  • the administrator may carry the moving body 10 to a different position and restart the moving body 10.
  • FIGS. 18A and 18B are flowcharts showing the processing procedure of the microcomputer 14a according to the present embodiment.
  • the state of traveling on the movement route R is described as “normal operation” in FIG. 18A.
  • in step S10, the microcomputer 14a starts normal traveling.
  • in step S12, the microcomputer 14a determines whether an obstacle has been detected from the output of the obstacle sensor 14j. If no obstacle is detected, the normal operation of step S10 is continued. If an obstacle is detected, the process proceeds to step S14.
  • in step S14, the microcomputer 14a determines whether or not the first maximum allowable distance Dmax1 has been reached.
  • this determination is realized by storing the position at the time of deviating from the movement route R, using the position information output from the position estimation device 14e, and having the microcomputer 14a calculate the difference, with respect to the first direction, between that position and the current position. If it is determined that the distance has not been reached, the process proceeds to step S16. If it is determined that it has been reached, the process proceeds to step S26.
  • in step S16, the microcomputer 14a controls the moving direction so that the moving body 10 travels in the direction away from the movement route R along the first direction.
  • thereafter, the process proceeds to step S18.
  • the process of step S18 is started immediately before the moving body 10 starts moving.
  • in step S18, the microcomputer 14a again determines whether or not an obstacle has been detected, by the same processing as in step S12. If no obstacle is detected, the process proceeds to step S20. If an obstacle is detected, the process proceeds to step S22.
  • in step S20, the microcomputer 14a determines whether or not the movement by the minimum avoidance distance along the first direction has been completed.
  • the information on the movement distance is acquired by the same method as described in step S14.
  • in step S22, the microcomputer 14a determines whether or not the second maximum allowable distance Dmax2 has been reached. This process is also realized in the same way as step S14: the position at the time of deviating from the movement route R is stored using the position information output from the position estimation device 14e, and the microcomputer 14a calculates the difference, with respect to the second direction, between that position and the current position. If it is determined that the distance has not been reached, the process proceeds to step S24. If it is determined that it has been reached, the process proceeds to step S26.
  • in step S24, the microcomputer 14a controls the moving direction so that the moving body 10 travels along the second direction.
  • thereafter, the process proceeds to step S28 (FIG. 18B). The process of step S28 is started immediately before the moving body 10 starts moving.
  • in step S26, the microcomputer 14a stops the moving operation of the moving body 10.
  • this is because the avoidance of the obstacle 70 has failed.
  • at this time, the obstacle detection is stopped until the moving body 10 is restarted by the administrator, thereby suppressing power consumption.
  • in step S28 of FIG. 18B, the microcomputer 14a again determines whether or not an obstacle has been detected, by the same processing as in step S12. If no obstacle is detected, the process proceeds to step S30. If an obstacle is detected, the process returns to step S14.
  • in step S30, the microcomputer 14a determines whether or not the movement by the minimum avoidance distance along the second direction has been completed.
  • the information on the movement distance is acquired by the same method as that described in steps S14 and S20.
  • in step S32, the microcomputer 14a controls the moving direction so that the moving body 10 travels in the direction approaching the movement route R along the first direction.
  • thereafter, the process proceeds to step S34.
  • in step S34, the microcomputer 14a again determines whether an obstacle has been detected, by the same processing as in step S12. If no obstacle is detected, the process proceeds to step S36. If an obstacle is detected, the process returns to step S22.
  • in step S36, the microcomputer 14a determines whether or not the movement by the minimum avoidance distance along the first direction has been completed.
  • the information on the movement distance is acquired by the same method as described in steps S14, S20, and S30. If the movement is not completed, the process returns to step S32 and the movement is continued. If the movement is completed, the process proceeds to step S38.
  • in step S38, the microcomputer 14a determines whether or not the moving body 10 has reached the movement route R. At this time as well, the microcomputer 14a may compare, with respect to the first direction, whether the current position output from the position estimation device 14e is the same as the position at the time of deviating from the movement route R. If the moving body 10 has reached the movement route R, the process proceeds to step S40. If it has not reached the route, the processing from step S32 onward is executed while the movement is continued.
  • in step S40, the microcomputer 14a determines that the moving body 10 has returned to the movement route R and that the avoidance operation is completed. Thereby, the movement on the movement route R (normal operation) can be resumed.
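The flow of FIGS. 18A and 18B can be condensed into a short simulation. This is a deliberately simplified sketch: the first direction is taken as +x, the second direction as +y, movement is discretized into steps of d1 and d2, and the obstacle checks of steps S18/S28/S34 are reduced to a single look-ahead predicate supplied by the caller. All of these are assumptions for illustration, not the patent's control code:

```python
def avoid(obstacle_at, start, d1, d2, dmax1, dmax2):
    """Simplified sketch of the avoidance procedure of FIGS. 18A/18B.
    The moving body steps away from the route along +x in units of d1 while
    the cell ahead (+y) is blocked, advances one step of d2 along +y past the
    obstacle, then returns toward the route. Returns the path taken, or None
    when a maximum allowable distance is exceeded (the body stops, step S26)."""
    x, y = start
    path = [(x, y)]
    # Move away from the route (steps S14-S18) while the cell ahead is blocked.
    while obstacle_at(x, y + d2):
        if x + d1 - start[0] > dmax1:
            return None                  # step S26: Dmax1 exceeded, stop
        x += d1
        path.append((x, y))
    # Advance along the second direction past the obstacle (steps S28-S30).
    y += d2
    path.append((x, y))
    # Return toward the route (steps S32-S38).
    while x > start[0]:
        if obstacle_at(x - d1, y):
            y += d2                      # return blocked: advance further first
            if y - start[1] > dmax2:
                return None              # step S26: Dmax2 exceeded, stop
            path.append((x, y))
            continue
        x -= d1
        path.append((x, y))
    return path                          # step S40: back on the route R

# Single obstacle cell directly ahead of the start position.
blocked = {(0, 1)}
path = avoid(lambda x, y: (x, y) in blocked, (0, 0), 1, 1, dmax1=3, dmax2=3)
print(path)  # → [(0, 0), (1, 0), (1, 1), (1, 2), (0, 2)]
```

With an obstacle wall too wide to clear within Dmax1, the function returns None, corresponding to the stop of step S26.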
  • FIGS. 12 to 14 described above are examples relating to the setting of detour routes.
  • in those examples, the moving body 10 selects to move further away from the movement route R when an additional obstacle is detected.
  • note that the movement in the second direction may proceed not only in the direction from the start point S toward the end point G, but also in the direction returning from the end point G toward the start point S.
  • Obstacle avoidance operation by the moving body 10: the present inventors observed situations after an obstacle had been placed on the movement route R of the moving body 10. As a result, it was found that the placement of an obstacle is often temporary, and the obstacle is often removed after a certain period of time. On the other hand, if a detour route is set several times, the movement distance becomes long and the movement takes time. The present inventors thus identified the problem that the moving body 10 continues to travel on the detour route even though the obstacle has actually been removed.
  • FIGS. 19A to 19E are diagrams for explaining different operation examples of the moving body 10. The operations described below are different operation examples of the moving body 10 in the situation shown in FIG.
  • the microcomputer 14a detects the obstacle 70p by the operation a1 at the position P1. As a result, the detour route is set and the movement operation b is performed. Next, the microcomputer 14a detects the obstacle 70q in the traveling direction by the determination operation a2 at the position P2. In the example of FIG. 12, the microcomputer 14a has moved the moving body 10 along the further detour path b1.
  • at this time, the moving body 10 can select the detour route b2, and can also select another detour route b3. Detour routes in directions other than the two directions (the first direction and the second direction) may also be selectable.
  • in the present operation example, the microcomputer 14a selects, from among the plurality of detour routes, the route b2 that moves the moving body 10 back from the position P2 to the position P1.
  • FIG. 19C shows the moving body 10 returning to the position P1 along the path b2.
  • the position P1 is a position where the obstacle 70p is detected on the moving route R.
  • the microcomputer 14a of the moving body 10 that has returned to the position P1 determines again whether or not the obstacle 70p exists on the original movement route R at the position P1. According to the above observation, there is a high possibility that the obstacle 70p has been removed. Even if the obstacle 70p still exists, it may be removed after waiting for a certain period of time.
  • if the obstacle 70p has been removed, the microcomputer 14a of the moving body 10 moves the moving body 10 along the original movement route R.
  • if the obstacle 70p still exists, the microcomputer 14a of the moving body 10 causes the moving body 10 to wait at the position P1 for a predetermined time, waiting for the obstacle 70p to be removed. When the obstacle 70p is subsequently removed, the microcomputer 14a may move the moving body 10 along the original movement route R.
  • in this way, the moving body 10 travels on the detour route once, and when no obstacle exists on the detour route, it can return to the movement route R relatively quickly using that route.
  • even when an obstacle exists on the detour route, the moving body returns to the original position and determines whether the first obstacle has been removed. If it has been removed, travel on the movement route R can be resumed quickly. Even when the moving body waits for a certain period until the obstacle is removed, it may still be able to travel on the movement route R sooner than when detour routes are set repeatedly.
  • FIG. 20 is a flowchart showing the processing procedure for realizing the operations of FIGS. 19A to 19E.
  • the flowchart shown in FIG. 20 is an alternative process of FIG. 18B.
  • the processes shown in FIG. 18A are performed in common.
  • the process indicated by “B” circled in FIG. 18A is not included. This is because the process of returning from step S28 to step S14 included in FIG. 18B is not included in FIG.
  • step S28 differs from the process of FIG. 18B in that steps S50, S52, and S54 are provided in FIG. In FIG. 20, the same processes (steps) as in FIG. In this example, it is assumed that the obstacle detection in step S28 is performed immediately before moving along the second direction. That is, it is assumed that the obstacle 70 exists in the traveling direction with respect to the second direction, and the moving body 10 has not yet moved in the second direction.
  • In step S50, the microcomputer 14a selects a return route from the plurality of available detour routes.
  • The return route is a route that retraces, in reverse, the route traveled so far. Thereafter, the process proceeds to step S32.
  • In step S52, the microcomputer 14a determines whether an obstacle is detected.
  • The obstacle here means the obstacle 70 that made it necessary to set a detour route while traveling on the movement route R. While the obstacle continues to be detected, the microcomputer 14a waits, for example, until the obstacle is no longer detected. When no obstacle is detected, the process proceeds to step S54.
  • It is expected that the time until no obstacle is detected in step S52 is short.
  • In step S54, the microcomputer 14a resumes traveling along the movement route R.
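Steps S50 to S54 can be sketched as follows. This is an illustrative sketch only: the segment representation and every callable (`drive`, `sense_obstacle_on_R`, `resume_route`) are assumptions introduced here, not interfaces named in the disclosure.

```python
def return_and_resume(traveled_segments, sense_obstacle_on_R,
                      drive, resume_route):
    """Hypothetical sketch of steps S50-S54 in FIG. 20.

    S50: select the return route, i.e., the reverse of the segments
         traveled so far on the detour.
    S52: back at the original position, wait until the obstacle 70
         on route R is no longer detected.
    S54: resume traveling along route R.
    """
    # S50: negate each traveled segment and follow them in reverse order
    return_route = [(-dx, -dy) for (dx, dy) in reversed(traveled_segments)]
    for segment in return_route:
        drive(segment)
    # S52: poll until the blocking obstacle on route R disappears
    while sense_obstacle_on_R():
        pass
    # S54: resume route R
    resume_route("R")
    return return_route
```

Because the moving body has not yet moved in the second direction, retracing the first-direction segments is enough to bring it back to the original position on route R.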
  • The comprehensive or specific aspects described above may be realized by a system, a method, an integrated circuit, a computer program, or a recording medium.
  • They may also be realized by any combination of a system, an apparatus, a method, an integrated circuit, a computer program, and a recording medium.
  • The processes of FIGS. 18A, 18B, and 20 described above can be realized as a computer program executed by the microcomputer 14a, which is a computer.
  • The computer program can be recorded on a recording medium, such as an optical disc (e.g., a CD-ROM), a magnetic disk (e.g., a hard disk), or a semiconductor memory (e.g., a flash memory), and distributed.
  • The computer program can also be transmitted and received over a telecommunication line and can be the subject of a commercial transaction.
  • The moving body and moving body management system of the present disclosure can be suitably used for moving and transporting items such as packages, parts, and finished products in factories, warehouses, construction sites, logistics facilities, hospitals, and the like.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Optics & Photonics (AREA)
  • Electromagnetism (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A moving body (10) includes: a plurality of drive wheels driven by a plurality of motors; an obstacle sensor (14j); an external sensor (15) that repeatedly scans the surroundings and outputs sensor data for each scan; a position estimation device (14e) that successively generates and outputs position information on the moving body based on the sensor data; a control circuit (14a) that controls movement of the moving body with reference to the position information; and a storage device (14c) that stores detour route data defining a reference detour route, namely a combination of a first route relating to a first direction and a second route relating to a second direction different from the first. When an obstacle is detected in the traveling direction at a first position from the output of the obstacle sensor while the moving body is moving along a predetermined movement route, the control circuit sets a detour route from the first position to a second position on the predetermined movement route using the detour route data, and then causes the moving body to travel along the detour route.
PCT/JP2017/044621 2016-12-12 2017-12-12 Corps mobile effectuant une opération d'évitement d'obstacle et programme informatique associé WO2018110568A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018556702A JP7168211B2 (ja) 2016-12-12 2017-12-12 障害物の回避動作を行う移動体およびそのコンピュータプログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662432750P 2016-12-12 2016-12-12
US62/432,750 2016-12-12

Publications (1)

Publication Number Publication Date
WO2018110568A1 true WO2018110568A1 (fr) 2018-06-21

Family

ID=62559626

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/044621 WO2018110568A1 (fr) 2016-12-12 2017-12-12 Corps mobile effectuant une opération d'évitement d'obstacle et programme informatique associé

Country Status (3)

Country Link
JP (1) JP7168211B2 (fr)
TW (1) TWI665538B (fr)
WO (1) WO2018110568A1 (fr)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020053031A (ja) * 2018-09-21 2020-04-02 日本電産株式会社 移動体の制御方法及び移動体の制御システム
JP2020111160A (ja) * 2019-01-10 2020-07-27 シャープ株式会社 台車及び搬送システム
JP2020166633A (ja) * 2019-03-29 2020-10-08 本田技研工業株式会社 管理装置、管理方法、およびプログラム
JP2020166734A (ja) * 2019-03-29 2020-10-08 日本電産シンポ株式会社 搬送車
CN111800732A (zh) * 2019-04-03 2020-10-20 佐臻股份有限公司 虚实信息整合空间定位系统
CN112105890A (zh) * 2019-01-30 2020-12-18 百度时代网络技术(北京)有限公司 用于自动驾驶车辆的基于rgb点云的地图生成系统
CN113252035A (zh) * 2020-02-11 2021-08-13 胜薪科技股份有限公司 光学导航装置
JP2023069171A (ja) * 2021-11-05 2023-05-18 三菱ロジスネクスト株式会社 遠隔操作システム
WO2023109281A1 (fr) * 2021-12-14 2023-06-22 灵动科技(北京)有限公司 Procédé et dispositif de commande de la conduite d'un robot mobile autonome

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020066486A1 (fr) * 2018-09-27 2020-04-02 村田機械株式会社 Système de déplacement et procédé de commande d'un système de déplacement
JP7205310B2 (ja) * 2019-03-04 2023-01-17 トヨタ自動車株式会社 移動体の使用方法
TWI729369B (zh) * 2019-03-25 2021-06-01 佐臻股份有限公司 虛實訊息整合空間定位系統
TWI790035B (zh) * 2021-12-10 2023-01-11 神達電腦股份有限公司 自動搬運車的控制方法

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07281748A (ja) * 1994-04-15 1995-10-27 Nippondenso Co Ltd 自走体の運行方法、及び自走体の運行システム
JP2000276232A (ja) * 1999-03-25 2000-10-06 Fuji Heavy Ind Ltd 自律走行作業車の障害物回避制御装置
JP2012022467A (ja) * 2010-07-13 2012-02-02 Murata Mach Ltd 自律移動体
JP2013257743A (ja) * 2012-06-13 2013-12-26 Sumitomo Heavy Ind Ltd 移動体

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090164123A1 (en) * 2006-05-17 2009-06-25 Murata Kikai Kabushiki Kaisha Travel device for self-propelled device
TWM468391U (zh) * 2013-08-05 2013-12-21 台灣新光保全股份有限公司 導護機器人
CN105683666B (zh) * 2013-10-29 2018-10-26 三菱电机株式会社 空气净化器

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020053031A (ja) * 2018-09-21 2020-04-02 日本電産株式会社 移動体の制御方法及び移動体の制御システム
JP2020111160A (ja) * 2019-01-10 2020-07-27 シャープ株式会社 台車及び搬送システム
JP7181097B2 (ja) 2019-01-10 2022-11-30 シャープ株式会社 台車及び搬送システム
CN112105890B (zh) * 2019-01-30 2023-11-17 百度时代网络技术(北京)有限公司 用于自动驾驶车辆的基于rgb点云的地图生成系统
CN112105890A (zh) * 2019-01-30 2020-12-18 百度时代网络技术(北京)有限公司 用于自动驾驶车辆的基于rgb点云的地图生成系统
JP2021515254A (ja) * 2019-01-30 2021-06-17 バイドゥ ドットコム タイムス テクノロジー (ベイジン) カンパニー リミテッド 自動運転車のためのリアルタイム地図生成システム
US11227398B2 (en) 2019-01-30 2022-01-18 Baidu Usa Llc RGB point clouds based map generation system for autonomous vehicles
JP7019731B2 (ja) 2019-01-30 2022-02-15 バイドゥ ドットコム タイムス テクノロジー (ベイジン) カンパニー リミテッド 自動運転車のためのリアルタイム地図生成システム
JP2020166633A (ja) * 2019-03-29 2020-10-08 本田技研工業株式会社 管理装置、管理方法、およびプログラム
JP2020166734A (ja) * 2019-03-29 2020-10-08 日本電産シンポ株式会社 搬送車
CN111830960A (zh) * 2019-03-29 2020-10-27 日本电产新宝株式会社 搬运车
CN111800732A (zh) * 2019-04-03 2020-10-20 佐臻股份有限公司 虚实信息整合空间定位系统
CN113252035A (zh) * 2020-02-11 2021-08-13 胜薪科技股份有限公司 光学导航装置
JP7381181B2 (ja) 2021-11-05 2023-11-15 三菱ロジスネクスト株式会社 遠隔操作システム
JP2023069171A (ja) * 2021-11-05 2023-05-18 三菱ロジスネクスト株式会社 遠隔操作システム
WO2023109281A1 (fr) * 2021-12-14 2023-06-22 灵动科技(北京)有限公司 Procédé et dispositif de commande de la conduite d'un robot mobile autonome

Also Published As

Publication number Publication date
JPWO2018110568A1 (ja) 2019-10-24
TW201833702A (zh) 2018-09-16
TWI665538B (zh) 2019-07-11
JP7168211B2 (ja) 2022-11-09

Similar Documents

Publication Publication Date Title
WO2018110568A1 (fr) Corps mobile effectuant une opération d'évitement d'obstacle et programme informatique associé
JP2019168942A (ja) 移動体、管理装置および移動体システム
JP6816830B2 (ja) 位置推定システム、および当該位置推定システムを備える移動体
JP7081881B2 (ja) 移動体および移動体システム
JP6825712B2 (ja) 移動体、位置推定装置、およびコンピュータプログラム
WO2019026761A1 (fr) Corps mobile et programme informatique
JP7136426B2 (ja) 管理装置および移動体システム
JP2020057307A (ja) 自己位置推定のための地図データを加工する装置および方法、ならびに移動体およびその制御システム
WO2019187816A1 (fr) Corps mobile et système de corps mobile
JP2019053391A (ja) 移動体
WO2019054209A1 (fr) Système et dispositif de création de carte
JP7111424B2 (ja) 移動体、位置推定装置、およびコンピュータプログラム
JP2019175137A (ja) 移動体および移動体システム
WO2019194079A1 (fr) Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique
JP7243014B2 (ja) 移動体
JP2019179497A (ja) 移動体および移動体システム
JP2019079171A (ja) 移動体
JP2019067001A (ja) 移動体
JP2019165374A (ja) 移動体および移動体システム
WO2020213645A1 (fr) Système de création de carte, circuit de traitement de signal, corps mobile et procédé de création de carte
CN112578789A (zh) 移动体
JP2020166702A (ja) 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム
JP2019148871A (ja) 移動体および移動体システム
WO2019069921A1 (fr) Corps mobile
WO2019059299A1 (fr) Dispositif de gestion opérationnelle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17881051

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2018556702

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17881051

Country of ref document: EP

Kind code of ref document: A1