WO2019054209A1 - Map creation system and map creation device - Google Patents

Map creation system and map creation device

Info

Publication number
WO2019054209A1
WO2019054209A1 (PCT/JP2018/032448)
Authority
WO
WIPO (PCT)
Prior art keywords
map
space
data
sensor data
time
Prior art date
Application number
PCT/JP2018/032448
Other languages
English (en)
Japanese (ja)
Inventor
Shinya Adachi (信也 安達)
Original Assignee
Nidec-Shimpo Corporation (日本電産シンポ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec-Shimpo Corporation (日本電産シンポ株式会社)
Priority to JP2019541997A priority Critical patent/JPWO2019054209A1/ja
Publication of WO2019054209A1 publication Critical patent/WO2019054209A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38 - Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 - Creation or updating of map data
    • G01C21/3807 - Creation or updating of map data characterised by the type of data
    • G01C21/383 - Indoor data
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00 - Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/10 - Map spot or coordinate position indicators; Map reading aids

Definitions

  • The present disclosure relates to a map creation system and a map creation device.
  • Autonomous mobile robots have been developed that move autonomously through a space along a predetermined route.
  • Such an autonomous mobile robot senses the surrounding space using an external sensor such as a laser distance sensor, matches the sensing result against a map prepared in advance, and estimates (identifies) its current position and orientation.
  • The autonomous mobile robot can then move along the route while checking its current position and orientation.
  • JP 2005-326944 A and JP 2014-509417 A disclose techniques for creating a map used for such matching. In both, the surroundings are scanned while a laser distance sensor or laser scanner is moved, and a map of the scanned area is created. Objects existing in the area at the time of scanning are reflected in the map.
  • Patent documents: JP 2005-326944 A; JP 2014-509417 A.
  • The created map may therefore reflect movable objects such as packages.
  • When the robot performs matching using such a map, if a movable object has been removed, the robot's sensing result and the map may differ, and the estimation accuracy of the position and orientation may decrease.
  • One non-limiting exemplary embodiment of the present application provides a technique for creating a map that a moving body refers to in order to estimate its own position, taking the presence of movable objects into account.
  • An exemplary map creation system creates a map that a moving body refers to in order to estimate its own position. The system includes: one or more fixed sensors that are fixedly installed in the space in which the moving body travels, sense part of that space at different times, and output sensor data for each time; and a map creation device that receives the sensor data of each time and creates a map of at least part of the space.
  • The sensor data at each time indicates the positions of objects present in part of the space at that time.
  • The map creation device includes a storage device that stores the sensor data of each time,
  • and a signal processing circuit that creates local map data, a map of part of the space, based on the positions of the fixed objects commonly included in all the sensor data of each time.
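The selection of "fixed objects", points that appear in the sensor data of every time, can be sketched as the intersection of occupied grid cells across all scans. The following Python sketch is illustrative only; the function name, grid resolution, and data layout are assumptions, not the disclosed implementation.

```python
import numpy as np

def fixed_object_cells(scans, resolution=0.05):
    """Return the grid cells occupied in *every* scan.

    scans: list of (N, 2) arrays of reflection-point coordinates in meters,
    one array per sensing time. Cells occupied in all scans correspond to
    objects that did not move during the sensing period ("fixed objects").
    """
    cell_sets = []
    for points in scans:
        cells = {(int(cx), int(cy))
                 for cx, cy in np.floor(points / resolution).astype(int)}
        cell_sets.append(cells)
    return set.intersection(*cell_sets)

# A wall point seen at every time is kept; a transient point is not.
wall = np.array([[1.00, 2.00]])
box = np.array([[3.00, 0.50]])           # present only in the first scan
scans = [np.vstack([wall, box]), wall, wall]
print(fixed_object_cells(scans))          # {(20, 40)}
```

In a real system the scans would first be registered in a common coordinate frame; a fixed sensor makes this trivial because its pose never changes.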
  • The above general or specific aspects may be realized by a system, an apparatus, a method, an integrated circuit, a computer program, or a storage medium, or by any combination of these.
  • The map creation system makes it possible to create a map that a moving body refers to in order to estimate its own position, taking the presence of movable objects into account.
  • FIG. 1 is a block diagram showing a schematic configuration of a mobile unit in an exemplary embodiment of the present disclosure.
  • FIG. 2 is a diagram showing an outline of a control system that controls traveling of each AGV according to the present disclosure.
  • FIG. 3 is a diagram showing an example of a moving space in which an AGV is present.
  • FIG. 4A shows the AGV and tow truck before being connected.
  • FIG. 4B shows the connected AGV and tow truck.
  • FIG. 5 is an external view of an exemplary AGV according to the present embodiment.
  • FIG. 6A is a diagram illustrating an example of a first hardware configuration of an AGV.
  • FIG. 6B is a diagram showing an example of a second hardware configuration of an AGV.
  • FIG. 7A shows an AGV that generates a map while moving.
  • FIG. 7B is a diagram showing an AGV that generates a map while moving.
  • FIG. 7C is a diagram showing an AGV that generates a map while moving.
  • FIG. 7D is a diagram showing an AGV that generates a map while moving.
  • FIG. 7E is a diagram showing an AGV that generates a map while moving.
  • FIG. 7F is a view schematically showing a part of the completed map.
  • FIG. 8 is a diagram showing an example in which a map of one floor is configured by a plurality of partial maps.
  • FIG. 9 is a diagram showing an example of a hardware configuration of the operation management device.
  • FIG. 10 is a diagram schematically showing an example of the AGV movement route determined by the operation management device.
  • FIG. 11A is an overhead view of a moving space in which a plurality of fixed sensors are installed.
  • FIG. 11B is a view showing an example of the configuration of a map creation system including fixed sensors and a map creation device.
  • FIG. 12 is a perspective view showing the appearance of a laser range finder which is an example of a fixed sensor.
  • FIG. 13 is a hardware block diagram showing an internal configuration of a laser range finder which is an example of a fixed sensor.
  • FIG. 14 is a diagram showing an object present in the movement space and a fixed sensor including the object in the field of view.
  • FIG. 15 is a diagram showing a local map including sensor data output from a fixed sensor in a certain period.
  • FIG. 16 is a diagram showing a local map after updating.
  • FIG. 17 is a flowchart showing the procedure of processing of the map creating device.
  • FIG. 18 is a diagram showing transition of sensor data output from the fixed sensor when a plurality of objects appear at different times and different positions.
  • FIG. 19 is a view showing an example of a local map in which a fixed object and a movable object are shown in a distinguishable manner.
  • FIG. 20 is a diagram showing a point cloud representing a fixed object whose existing frequency is equal to or higher than a threshold.
  • FIG. 21 is a diagram showing a grid map displayed in darker color as the value of the presence frequency is larger.
  • FIG. 22 is a flowchart showing a procedure of processing of the map creating device for creating a map displaying the fixed object and the movable object.
  • An "unmanned transport vehicle" means a trackless vehicle onto which a load is placed manually or automatically, which travels automatically to a designated location, and from which the load is unloaded manually or automatically.
  • The "unmanned transport vehicle" includes unmanned tow vehicles and unmanned forklifts.
  • "Unmanned" means that steering the vehicle requires no person; it does not exclude the unmanned transport vehicle carrying a "person" (for example, a person who unloads packages).
  • An "unmanned tow truck" is a trackless vehicle that tows a cart, onto which loads are placed and from which they are removed manually or automatically, and travels automatically to a designated location.
  • An "unmanned forklift" is a trackless vehicle equipped with a mast for raising and lowering a load-transfer fork or the like, which automatically transfers a load onto the fork, travels automatically to a designated location, and performs automatic load handling.
  • A "trackless vehicle" is a vehicle that includes wheels and an electric motor or engine that rotates the wheels.
  • A "moving body" is a device that moves while carrying a person or a load, and includes a driving device, such as wheels, a biped or multi-legged walking device, or a propeller, that generates traction for movement.
  • The term "moving body" in the present disclosure includes not only unmanned transport vehicles in the narrow sense but also mobile robots, service robots, and drones.
  • "Automatic travel" includes travel based on instructions from a computer operation management system to which the unmanned transport vehicle is connected by communication, and autonomous travel by a control device provided in the unmanned transport vehicle.
  • Autonomous travel includes not only travel by the unmanned transport vehicle toward a destination along a predetermined route, but also travel following a tracking target.
  • The unmanned transport vehicle may also temporarily perform manual travel based on an instruction from a worker.
  • Although "automatic travel" generally includes both "guided" travel and "guideless" travel, in the present disclosure it means "guideless" travel.
  • The "guided type" is a system in which guides are installed continuously or intermittently and the unmanned transport vehicle is guided using those guides.
  • The "guideless type" is a system that guides the vehicle without installing guides.
  • The unmanned transport vehicle in the embodiment of the present disclosure includes a self-position estimation device and can travel in a guideless manner.
  • The "self-position estimation device" is a device that estimates the self-position on an environment map based on sensor data acquired by an external sensor such as a laser range finder.
  • An "external sensor" is a sensor that senses the state external to the moving body.
  • External sensors include, for example, a laser range finder (also referred to as a range sensor), a camera (or image sensor), LiDAR (Light Detection and Ranging), a millimeter-wave radar, and a magnetic sensor.
  • An "internal sensor" is a sensor that senses the state internal to the moving body.
  • Internal sensors include, for example, a rotary encoder (hereinafter sometimes simply referred to as an "encoder"), an acceleration sensor, and an angular acceleration sensor (for example, a gyro sensor).
  • "SLAM" (Simultaneous Localization and Mapping) refers to performing self-position estimation and map creation at the same time.
  • FIG. 1 is a block diagram showing a schematic configuration of a map creation system in an exemplary embodiment of the present disclosure.
  • The map creation system 101 is used to create a map that a moving body refers to in order to estimate its own position.
  • The map creation system 101 includes one or more fixed sensors 103a, 103b, 103c and a map creation device 105.
  • The fixed sensors 103a, 103b, and 103c are fixedly installed at different positions in the space in which the moving body moves (hereinafter referred to as the "moving space").
  • Each of the fixed sensors 103a, 103b, and 103c senses (scans) the surrounding space at periodic or random timing, and outputs scan data (sensor data) at each time.
  • The sensor data at each time indicates the positions of objects present in part of the moving space at that time.
  • The fixed sensors 103a, 103b, and 103c may be brought in and installed when a map of the moving space is newly created or re-created, and may be removed after collection of data for map creation is completed.
  • In other words, the fixed sensors 103a, 103b, and 103c need only be fixed at the time of acquiring scan data.
  • The map creation device 105 receives the sensor data of each time from each of the fixed sensors 103a, 103b, and 103c, and creates a map of at least part of the moving space. This will now be described more specifically.
  • The map creation device 105 includes a storage device 107, a signal processing circuit 109, and an interface device 111 that receives the sensor data of each time output from each fixed sensor.
  • The interface device 111 may be a wired or wireless communication terminal and/or a communication circuit.
  • In this example, the map creation device 105 receives sensor data each time it is output from each fixed sensor. Alternatively, each fixed sensor may store the sensor data of each time in an internal storage device after acquisition and transfer it to the map creation device 105 collectively later. The transfer may be performed via a wired or wireless communication line, or via a storage medium such as a memory card.
  • The storage device 107 stores the sensor data of each time received from each fixed sensor via the interface device 111.
  • The storage device 107 may store map data M of the entire moving space in advance. Furthermore, the storage device 107 may also acquire and store scan data from an AGV that moves around while scanning the entire moving space in order to create the map data. A map of the entire moving space can then be created by matching the scan data obtained from the AGV or the like against the sensor data of each time obtained from each fixed sensor.
  • The signal processing circuit 109 performs the following processing for each fixed sensor using the sensor data stored in the storage device 107.
  • For convenience of description, the fixed sensor 103a is taken as an example; the same applies to the other fixed sensors 103b and 103c.
  • The signal processing circuit 109 acquires from the storage device 107 a set SDa of sensor data obtained from the fixed sensor 103a at a plurality of different times.
  • The signal processing circuit 109 creates local map data based on the positions of the objects commonly included in all the sensor data of each time.
  • An "object commonly included in all the sensor data of each time" means an object that did not move during the sensing period of the fixed sensor, and is hereinafter referred to as a "fixed object".
  • A fixed object is typically a wall or a pillar existing in the moving space. Even a package that can be carried is included among the "fixed objects" if it did not move during the sensing period of the fixed sensor 103a.
  • The signal processing circuit 109 may update the map data M in the storage device 107 using the created local map data.
  • The updated map data M is transmitted, for example, to the moving bodies 10a to 10d, and is referred to by each moving body to estimate its own position.
  • The set SDa of sensor data acquired at a plurality of different times may include an object that is included not in all but only in some of the sensor data.
  • Such an object is hereinafter referred to as a "movable object".
  • A movable object is typically a load that was placed on the floor of the moving space and later moved or removed, or a load newly placed on the floor of the moving space.
  • Movable objects may also include moving AGVs and people.
  • The signal processing circuit 109 may create a local map indicating only the positions of fixed objects, or a local map indicating the positions of fixed objects and the positions of movable objects in a distinguishable manner.
  • An example of the latter is to indicate, in a distinguishable manner, the position of each movable object whose presence frequency (the ratio of the sensor data in which it appears, among the sensor data acquired at a plurality of different times) is equal to or greater than a predetermined value.
  • Examples of distinguishable display modes include displaying fixed objects and movable objects with different characters or icons, or displaying fixed objects in the darkest color while displaying each movable object in a lighter color the smaller its presence frequency (ratio) is.
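The presence-frequency display described above can be sketched in Python as follows. The thresholds, grid resolution, and function names are illustrative assumptions, not values from the disclosure; the sketch only shows the idea of classifying grid cells by how often they are occupied.

```python
import numpy as np
from collections import Counter

def presence_frequency(scans, resolution=0.05):
    """For each occupied grid cell, the fraction of scans in which it appears."""
    counts = Counter()
    for points in scans:
        cells = {(int(cx), int(cy))
                 for cx, cy in np.floor(points / resolution).astype(int)}
        counts.update(cells)
    return {cell: n / len(scans) for cell, n in counts.items()}

def classify_cells(freq, movable_threshold=0.5):
    """Cells present in every scan are fixed objects. Cells whose presence
    frequency is below 1.0 but at or above the threshold are shown as
    movable objects (a lower frequency would be drawn in a lighter color);
    rarer cells, e.g. a passing person, can be omitted from the map."""
    fixed = {c for c, f in freq.items() if f == 1.0}
    movable = {c: f for c, f in freq.items() if movable_threshold <= f < 1.0}
    return fixed, movable

# A wall seen in all 4 scans, a pallet seen in 2 of them, a person in 1.
wall, pallet, person = (1.0, 1.0), (2.0, 0.5), (3.0, 3.0)
scans = [np.array([wall, pallet, person]),
         np.array([wall, pallet]),
         np.array([wall]),
         np.array([wall])]
fixed, movable = classify_cells(presence_frequency(scans))
```

Here the wall cell is classified as fixed, the pallet cell as movable with frequency 0.5, and the briefly seen person is dropped.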
  • In the following, an unmanned transport vehicle may be abbreviated as "AGV" (Automated Guided Vehicle).
  • The following description also applies to moving bodies other than AGVs, for example mobile robots, drones, and manned vehicles, as long as no contradiction arises.
  • FIG. 2 shows an example of the basic configuration of an exemplary mobile management system 100 according to the present disclosure.
  • the mobile management system 100 includes at least one AGV 10 and an operation management apparatus 50 that manages the operation of the AGV 10.
  • the terminal device 20 operated by the user 1 is also shown in FIG.
  • The AGV 10 is an unmanned transport vehicle capable of "guideless" travel, which requires no guide such as magnetic tape for traveling.
  • the AGV 10 can perform self-position estimation, and can transmit the result of estimation to the terminal device 20 and the operation management device 50.
  • the AGV 10 can automatically travel in the moving space S in accordance with a command from the operation management device 50.
  • the AGV 10 can also operate in a "tracking mode" that moves following a person or other moving object.
  • the operation management device 50 is a computer system that tracks the position of each AGV 10 and manages traveling of each AGV 10.
  • The operation management device 50 may be a desktop PC, a laptop PC, and/or a server computer.
  • the operation management apparatus 50 communicates with each AGV 10 via the plurality of access points 2. For example, the operation management device 50 transmits, to each AGV 10, data of coordinates of a position to which each AGV 10 should go next.
  • Each AGV 10 periodically transmits data indicating its position and orientation to the operation management device 50, for example, every 100 milliseconds.
  • the operation management device 50 transmits data of coordinates of a position to be further advanced.
  • the AGV 10 can also travel in the moving space S in accordance with the operation of the user 1 input to the terminal device 20.
  • An example of the terminal device 20 is a tablet computer.
  • travel of the AGV 10 using the terminal device 20 is performed at the time of map creation, and travel of the AGV 10 using the operation management device 50 is performed after the map creation.
  • FIG. 3 shows an example of a moving space S in which three AGVs 10a, 10b, and 10c exist. All AGVs are assumed to travel in the depth direction in the figure. The AGVs 10a and 10b are carrying loads placed on their top plates. The AGV 10c runs following the AGV 10b ahead of it.
  • For convenience of description, reference marks 10a, 10b, and 10c are attached in FIG. 3, but these may hereinafter be referred to collectively as the AGV 10.
  • The AGV 10 can also transport a load using a tow truck connected to itself, in addition to transporting a load placed on its top plate.
  • FIG. 4A shows the AGV 10 and the tow truck 5 before being connected. Each leg of the tow truck 5 is provided with a caster. The AGV 10 is mechanically connected to the tow truck 5.
  • FIG. 4B shows the connected AGV 10 and tow truck 5. When the AGV 10 travels, the tow truck 5 is pulled by the AGV 10. By pulling the tow truck 5, the AGV 10 can transport the load placed on the tow truck 5.
  • The method of connecting the AGV 10 and the tow truck 5 is arbitrary; one example is described here.
  • A plate 6 is fixed to the top plate of the AGV 10.
  • The tow truck 5 is provided with a guide 7 having a slit.
  • The AGV 10 approaches the tow truck 5 and inserts the plate 6 into the slit of the guide 7.
  • The AGV 10 then passes an electromagnetic-lock pin (not shown) through the plate 6 and the guide 7 and engages the electromagnetic lock.
  • Thereby, the AGV 10 and the tow truck 5 are physically connected.
  • Each AGV 10 and the terminal device 20 can be connected, for example, on a one-to-one basis to perform communication conforming to the Bluetooth (registered trademark) standard.
  • Each AGV 10 and the terminal device 20 can also perform communication conforming to Wi-Fi (registered trademark) using one or more access points 2.
  • the plurality of access points 2 are connected to one another via, for example, a switching hub 3. Two access points 2a, 2b are shown in FIG.
  • the AGV 10 is wirelessly connected to the access point 2a.
  • the terminal device 20 is wirelessly connected to the access point 2b.
  • The data transmitted by the AGV 10 is received by the access point 2a, transferred to the access point 2b via the switching hub 3, and transmitted from the access point 2b to the terminal device 20.
  • The data transmitted by the terminal device 20 is received by the access point 2b, transferred to the access point 2a via the switching hub 3, and transmitted from the access point 2a to the AGV 10. Thereby, bidirectional communication between the AGV 10 and the terminal device 20 is realized.
  • the plurality of access points 2 are also connected to the operation management device 50 via the switching hub 3. Thereby, bidirectional communication is realized also between the operation management device 50 and each of the AGVs 10.
  • the AGV 10 transitions to the data acquisition mode by the operation of the user.
  • the AGV 10 starts acquiring sensor data using a laser range finder.
  • The laser range finder periodically scans the surrounding moving space S by emitting laser beams of, for example, infrared or visible light to its surroundings.
  • the laser beam is reflected by, for example, a surface such as a wall, a structure such as a pillar, or an object placed on the floor.
  • the laser range finder receives the reflected light of the laser beam, calculates the distance to each reflection point, and outputs measurement data indicating the position of each reflection point.
  • the direction of arrival of reflected light and the distance are reflected in the position of each reflection point.
  • Data of measurement results obtained by one scan may be referred to as "measurement data" or "sensor data”.
  • the position estimation device stores sensor data in a storage device.
  • the sensor data accumulated in the storage device is transmitted to the external device.
  • The external device is, for example, a computer that has a signal processor and in which a map creation program is installed.
  • The signal processor of the external device superimposes the sensor data obtained for each scan.
  • A map of the moving space S can be created by the signal processor repeatedly performing this superimposing process.
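The superimposing step can be sketched as follows, assuming the pose at which each scan was taken is known (in practice the poses are refined by odometry and scan matching); the function names are illustrative, not from the disclosure.

```python
import numpy as np

def scan_to_world(scan_xy, pose):
    """Transform scan points from the AGV's sensor frame into the world
    frame, given the pose (x, y, theta) at which the scan was taken."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s], [s, c]])
    return scan_xy @ rotation.T + np.array([x, y])

def overlay_scans(scans, poses):
    """Superimpose all scans into a single point cloud, the raw map."""
    return np.vstack([scan_to_world(s, p) for s, p in zip(scans, poses)])
```

For example, a point 1 m ahead of a robot that is facing along the +Y axis (theta = 90 degrees) ends up at world coordinates (0, 1).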
  • the external device transmits the created map data to the AGV 10.
  • the AGV 10 stores the created map data in an internal storage device.
  • the external device may be the operation management device 50 or another device.
  • the AGV 10 may create the map instead of the external device.
  • The processing performed by the signal processor of the external device described above may instead be performed by a circuit such as the microcontroller unit (microcomputer) of the AGV 10.
  • The data volume of sensor data is generally large. When the AGV 10 creates the map itself, the sensor data need not be transmitted to an external device, so occupation of the communication line can be avoided.
  • Movement within the moving space S for acquiring sensor data can be realized, for example, as follows.
  • the AGV 10 wirelessly receives a traveling instruction instructing movement in each of the front, rear, left, and right directions from the user via the terminal device 20.
  • the AGV 10 travels back and forth and left and right in the moving space S in accordance with a travel command to create a map.
  • the map may be created by traveling in the moving space S in the front, rear, left, and right according to a control signal from the steering apparatus.
  • the sensor data may be acquired by a person pushing on the measurement cart on which the laser range finder is mounted.
  • Although a plurality of AGVs 10 are shown in FIGS. 2 and 3, there may be only one AGV. When there are a plurality of AGVs 10, the user 1 can use the terminal device 20 to select one of the plurality of registered AGVs and have it create a map of the moving space S.
  • each AGV 10 can automatically travel while estimating its own position using the map.
  • The process of estimating the self-position will be described later.
  • FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
  • the AGV 10 has two drive wheels 11a and 11b, four casters 11c, 11d, 11e and 11f, a frame 12, a transport table 13, a travel control device 14, and a laser range finder 15.
  • the two drive wheels 11a and 11b are provided on the right and left sides of the AGV 10, respectively.
  • The four casters 11c, 11d, 11e, and 11f are disposed at the four corners of the AGV 10.
  • The AGV 10 also has a plurality of motors connected to the two drive wheels 11a and 11b, but the motors are not shown in FIG. 5.
  • FIG. 5 shows one drive wheel 11a and two casters 11c and 11e located on the right side of the AGV 10, and a caster 11f located at the left rear; the left drive wheel 11b and the left front caster 11d are hidden by the frame 12 and are not shown.
  • The four casters 11c, 11d, 11e, and 11f can pivot freely.
  • In the following description, the drive wheel 11a and the drive wheel 11b are also referred to as the wheel 11a and the wheel 11b, respectively.
  • the travel control device 14 is a device that controls the operation of the AGV 10, and mainly includes an integrated circuit including a microcomputer (described later), an electronic component, and a substrate on which the components are mounted.
  • the traveling control device 14 performs transmission and reception of data with the terminal device 20 described above and pre-processing calculation.
  • The laser range finder 15 is an optical device that measures the distance to a reflection point by emitting a laser beam 15a of, for example, infrared or visible light, and detecting the reflected light of the laser beam 15a.
  • The laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees within a range of 135 degrees to the left and right (270 degrees in total) with reference to the front of the AGV 10, and detects the reflected light of each laser beam 15a. This makes it possible to obtain data on the distance to a reflection point in each direction, at a total of 1081 steps of 0.25 degrees each.
  • the scan of the surrounding space performed by the laser range finder 15 is substantially parallel to the floor surface and planar (two-dimensional). However, the laser range finder 15 may scan in the height direction.
  • the AGV 10 can create a map of the space S based on the position and orientation (orientation) of the AGV 10 and the scan result of the laser range finder 15.
  • the map may reflect the surrounding walls of the AGV, structures such as columns, and the placement of objects placed on the floor. Map data is stored in a storage device provided in the AGV 10.
  • The position and orientation of a moving body are called a pose.
  • The position and orientation of a moving body in a two-dimensional plane are represented by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
  • The position and orientation of the AGV 10, that is, the pose (x, y, θ), may hereinafter be referred to simply as the "position".
  • the position of the reflection point viewed from the emission position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
  • the laser range finder 15 outputs sensor data represented by polar coordinates.
  • the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output it.
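The polar-to-orthogonal conversion for the scan geometry described above (270 degrees, 0.25-degree steps, 1081 readings) can be sketched as follows; the function name and the convention that the x axis points forward are illustrative assumptions.

```python
import numpy as np

def scan_polar_to_xy(distances):
    """Convert one 270-degree scan (1081 readings at 0.25-degree steps,
    from -135 to +135 degrees relative to the front of the AGV) into
    Cartesian coordinates, with the x axis pointing forward."""
    distances = np.asarray(distances, dtype=float)
    assert distances.shape == (1081,)      # 270 / 0.25 + 1 readings
    angles = np.deg2rad(-135.0 + 0.25 * np.arange(1081))
    return np.column_stack([distances * np.cos(angles),
                            distances * np.sin(angles)])
```

The center beam (index 540) points straight ahead, so a 2 m reading there maps to the point (2, 0).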
  • the structure and the operating principle of the laser range finder are known, so a further detailed description will be omitted herein.
  • Examples of objects that can be detected by the laser range finder 15 are people, packages, shelves, and walls.
  • The laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
  • Other examples of such external sensors include an image sensor and an ultrasonic sensor.
  • The travel control device 14 can estimate the current position of the AGV 10 by comparing the measurement result of the laser range finder 15 with the map data it holds.
  • The map data held may be map data created by another AGV 10.
  • FIG. 6A shows a first hardware configuration example of the AGV 10.
  • FIG. 6A also shows a specific configuration of the traveling control device 14.
  • the AGV 10 includes a travel control device 14, a laser range finder 15, two motors 16a and 16b, a drive device 17, wheels 11a and 11b, and two rotary encoders 18a and 18b.
  • the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a position estimation device 14e.
  • the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the position estimation device 14e are connected by a communication bus 14f and can exchange data with each other.
  • The laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as measurement results to the microcomputer 14a, the position estimation device 14e, and/or the memory 14b.
  • The microcomputer 14a is a processor or control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
  • Typically, the microcomputer 14a is a semiconductor integrated circuit.
  • The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal, which is a control signal, to the drive device 17, thereby controlling the drive device 17 so as to adjust the voltage applied to each motor. This causes each of the motors 16a and 16b to rotate at the desired rotational speed.
  • One or more control circuits (for example, microcomputers) for controlling the drive of the left and right motors 16a and 16b may be provided independently of the microcomputer 14a.
  • For example, the motor drive device 17 may be provided with two microcomputers that control the drive of the motors 16a and 16b, respectively.
  • These two microcomputers may each perform coordinate calculation using the encoder information output from the encoders 18a and 18b, to estimate the distance the AGV 10 has moved from a given initial position.
  • The two microcomputers may also control the motor drive circuits 17a and 17b using the encoder information.
  • The memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
  • the memory 14b can also be used as a work memory when the microcomputer 14a and the position estimation device 14e perform an operation.
  • The storage device 14c is a non-volatile semiconductor memory device.
  • The storage device 14c may instead be a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
  • The storage device 14c may include a head device for writing and/or reading data on the recording medium and a control device of the head device.
  • the storage device 14c stores map data M of the space S in which the vehicle travels and data (traveling route data) R of one or more traveling routes.
  • the map data M is created by the AGV 10 operating in the mapping mode and stored in the storage device 14c.
  • the travel route data R is transmitted from the outside after the map data M is created.
  • the map data M and the traveling route data R are stored in the same storage device 14c, but may be stored in different storage devices.
  • the AGV 10 receives traveling route data R indicating a traveling route from the tablet computer.
  • the travel route data R at this time includes marker data indicating the positions of a plurality of markers. “Marker” indicates the passing position (passing point) of the traveling AGV 10.
  • the travel route data R includes at least position information of a start marker indicating a travel start position and an end marker indicating a travel end position.
  • the travel route data R may further include positional information of markers at one or more intermediate waypoints. When the travel route includes one or more intermediate via points, a route from the start marker to the end marker via the travel via points in order is defined as the travel route.
  • the data of each marker may include, in addition to the coordinate data of the marker, data of the orientation (angle) and traveling speed of the AGV 10 until moving to the next marker.
  • The data of each marker may further include the acceleration time required to accelerate to the traveling speed, and/or the deceleration time required to decelerate from the traveling speed to a stop at the position of the next marker.
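The marker and travel route data described above can be sketched as simple records; the field and class names below are illustrative assumptions, since the patent does not define a concrete data format:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Marker:
    # Hypothetical field names; the patent describes the content, not the layout.
    x: float                  # marker coordinates on the map [m]
    y: float
    angle: float              # orientation of the AGV until the next marker [deg]
    speed: float              # traveling speed toward the next marker [m/s]
    accel_time: Optional[float] = None  # time to accelerate to `speed` [s]
    decel_time: Optional[float] = None  # time to decelerate to a stop [s]

@dataclass
class TravelRoute:
    markers: List[Marker]     # first = start marker, last = end marker

    @property
    def start(self) -> Marker:
        return self.markers[0]

    @property
    def end(self) -> Marker:
        return self.markers[-1]

# A route with one intermediate via point between the start and end markers.
route = TravelRoute(markers=[
    Marker(x=0.0, y=0.0, angle=90.0, speed=0.5),
    Marker(x=0.0, y=5.0, angle=0.0, speed=0.5),
    Marker(x=3.0, y=5.0, angle=0.0, speed=0.0),
])
```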
  • the operation management device 50 may control the movement of the AGV 10.
  • the operation management apparatus 50 may instruct the AGV 10 to move to the next marker each time the AGV 10 reaches the marker.
  • the AGV 10 receives, from the operation management apparatus 50, coordinate data of a target position to be headed to next, or data of a distance to the target position and data of an angle to be traveled as travel route data R indicating a travel route.
  • the AGV 10 can travel along the stored travel path while estimating its own position using the created map and the sensor data output from the laser range finder 15 acquired during travel.
  • the communication circuit 14d is, for example, a wireless communication circuit that performs wireless communication compliant with the Bluetooth (registered trademark) and / or the Wi-Fi (registered trademark) standard. Both standards include wireless communication standards using frequencies in the 2.4 GHz band. For example, in the mode in which the AGV 10 is run to create a map, the communication circuit 14d performs wireless communication conforming to the Bluetooth (registered trademark) standard, and communicates with the terminal device 20 on a one-to-one basis.
  • the position estimation device 14e performs map creation processing and estimation processing of the self position when traveling.
  • the position estimation device 14e creates a map of the moving space S based on the position and attitude of the AGV 10 and the scan result of the laser range finder.
  • the position estimation device 14e receives sensor data from the laser range finder 15, and reads out the map data M stored in the storage device 14c.
  • By comparing the sensor data with the map data M, the position estimation device 14e identifies its own position (x, y, θ) on the map data M.
  • The position estimation device 14e generates “reliability” data indicating the degree to which the local map data matches the map data M.
  • The data of the self position (x, y, θ) and the reliability can be transmitted from the AGV 10 to the terminal device 20 or the operation management device 50.
  • The terminal device 20 or the operation management device 50 can receive the data of the self position (x, y, θ) and the reliability and display them on a built-in or connected display device.
  • Although the microcomputer 14a and the position estimation device 14e are separate components here, this is only an example. They may be realized as a single chip circuit or semiconductor integrated circuit capable of independently performing each operation of the microcomputer 14a and the position estimation device 14e.
  • FIG. 6A shows a chip circuit 14g including the microcomputer 14a and the position estimation device 14e.
  • the microcomputer 14a and the position estimation device 14e are provided separately and independently will be described.
  • Two motors 16a and 16b are attached to two wheels 11a and 11b, respectively, to rotate each wheel. That is, the two wheels 11a and 11b are respectively drive wheels.
  • the motor 16a and the motor 16b are described as being motors for driving the right and left wheels of the AGV 10, respectively.
  • the moving body 10 further includes an encoder unit 18 that measures the rotational position or rotational speed of the wheels 11a and 11b.
  • the encoder unit 18 includes a first rotary encoder 18a and a second rotary encoder 18b.
  • the first rotary encoder 18a measures the rotation at any position of the power transmission mechanism from the motor 16a to the wheel 11a.
  • The second rotary encoder 18b measures the rotation at any position of the power transmission mechanism from the motor 16b to the wheel 11b.
  • the encoder unit 18 transmits the signals acquired by the rotary encoders 18a and 18b to the microcomputer 14a.
  • The microcomputer 14a may control the movement of the mobile unit 10 using not only the signal received from the position estimation device 14e but also the signal received from the encoder unit 18.
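The coordinate calculation from encoder information mentioned above can be sketched as standard differential-drive dead reckoning. The geometric parameters and function names below are illustrative assumptions, not taken from the patent:

```python
import math

def encoder_to_distance(ticks, ticks_per_rev, wheel_radius):
    """Convert rotary-encoder ticks to wheel travel distance [m]."""
    return 2.0 * math.pi * wheel_radius * ticks / ticks_per_rev

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Differential-drive dead reckoning: advance the pose (x, y, theta)
    by the distances traveled by the left and right drive wheels."""
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta += d_theta
    return x, y, theta

# Example: both encoders report the same ticks -> straight-line motion.
d = encoder_to_distance(ticks=1000, ticks_per_rev=2048, wheel_radius=0.05)
x, y, theta = update_pose(0.0, 0.0, 0.0, d, d, wheel_base=0.4)
```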
  • the drive device 17 has motor drive circuits 17a and 17b for adjusting the voltage applied to each of the two motors 16a and 16b.
  • Each of motor drive circuits 17a and 17b includes a so-called inverter circuit.
  • the motor drive circuits 17a and 17b turn on or off the current flowing to each motor by the PWM signal transmitted from the microcomputer 14a or the microcomputer in the motor drive circuit 17a, thereby adjusting the voltage applied to the motor.
  • FIG. 6B shows a second hardware configuration example of the AGV 10.
  • The second hardware configuration example differs from the first hardware configuration example (FIG. 6A) in that it has a laser positioning system 14h and in that the microcomputer 14a is connected to each component on a one-to-one basis.
  • The laser positioning system 14h includes the position estimation device 14e and the laser range finder 15.
  • the position estimation device 14e and the laser range finder 15 are connected by, for example, an Ethernet (registered trademark) cable.
  • the operations of the position estimation device 14e and the laser range finder 15 are as described above.
  • The laser positioning system 14h outputs information indicating the pose (x, y, θ) of the AGV 10 to the microcomputer 14a.
  • the microcomputer 14a has various general purpose I / O interfaces or general purpose input / output ports (not shown).
  • the microcomputer 14a is directly connected to other components in the travel control device 14, such as the communication circuit 14d and the laser positioning system 14h, via the general-purpose input / output port.
  • the AGV 10 in the embodiment of the present disclosure may include a safety sensor such as a bumper switch not shown.
  • the AGV 10 may include an inertial measurement device such as a gyro sensor.
  • FIGS. 7A to 7F schematically show the AGV 10 moving while acquiring sensor data.
  • the user 1 may move the AGV 10 manually while operating the terminal device 20.
  • the unit provided with the travel control device 14 shown in FIGS. 6A and 6B, or the AGV 10 itself may be mounted on a carriage, and sensor data may be acquired by the user 1 manually pushing or holding the carriage.
  • FIG. 7A shows an AGV 10 that scans the surrounding space using a laser range finder 15. A laser beam is emitted for each predetermined step angle and scanning is performed.
  • the illustrated scan range is an example schematically shown, and is different from the total scan range of 270 degrees described above.
  • The positions of the reflection points of the laser beam are schematically shown using a plurality of black points 4.
  • the scanning of the laser beam is performed at short intervals while the position and attitude of the laser range finder 15 change. Therefore, the number of actual reflection points is much larger than the number of reflection points 4 shown.
  • the position estimation device 14e stores, for example, in the memory 14b, the position of the black point 4 obtained as the vehicle travels.
  • the map data is gradually completed as the AGV 10 continues to scan while traveling.
  • In FIGS. 7B to 7E, only the scan range is shown for simplicity.
  • the scan range is an example, and is different from the above-described example of 270 degrees in total.
  • the map may be created using the microcomputer 14a in the AGV 10 or an external computer based on the sensor data after acquiring the sensor data of the amount necessary for creating the map. Alternatively, a map may be created in real time based on sensor data acquired by the moving AGV 10.
  • FIG. 7F schematically shows a part of the completed map 40.
  • free space is partitioned by a point cloud (Point Cloud) corresponding to a collection of reflection points of the laser beam.
  • Another example of the map is an occupied grid map that distinguishes space occupied by an object from free space in grid units.
  • the position estimation device 14e stores map data (map data M) in the memory 14b or the storage device 14c.
  • the illustrated number or density of black spots is an example.
  • the map data thus obtained may be shared by multiple AGVs 10.
  • a typical example of an algorithm in which the AGV 10 estimates its own position based on map data is ICP (Iterative Closest Point) matching.
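ICP matching can be sketched minimally as follows. This is a generic point-to-point ICP, not the patent's specific implementation; the brute-force nearest-neighbor search is for illustration only (a real implementation would use a k-d tree):

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst for
    paired 2D points (Kabsch/SVD), the core step of each ICP iteration."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid a reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(scan, map_points, iterations=20):
    """Minimal point-to-point ICP: repeatedly match each scan point to its
    nearest map point and re-fit the rigid transform."""
    pts = scan.copy()
    for _ in range(iterations):
        d = np.linalg.norm(pts[:, None, :] - map_points[None, :, :], axis=2)
        matched = map_points[d.argmin(axis=1)]
        R, t = best_rigid_transform(pts, matched)
        pts = pts @ R.T + t
    return pts

# Example: a scan that is a translated copy of part of the map.
map_points = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [2.0, 1.0], [2.0, 2.0]])
scan = map_points + np.array([0.3, -0.2])  # pose error to be corrected
aligned = icp(scan, map_points)
```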
  • the map data M may be created and recorded as data of a plurality of partial maps.
  • FIG. 8 shows an example in which the entire area of one floor of one factory is covered by a combination of four partial map data M1, M2, M3 and M4.
  • one partial map data covers an area of 50 m ⁇ 50 m.
  • a rectangular overlapping area of 5 m in width is provided at the boundary between two adjacent maps in each of the X direction and the Y direction. This overlapping area is called "map switching area".
  • When the AGV 10 traveling while referring to one partial map reaches the map switching area, it switches to traveling while referring to the adjacent partial map.
  • the number of partial maps is not limited to four, and may be appropriately set according to the area of the floor on which the AGV 10 travels, and the performance of a computer that executes map creation and self-position estimation.
  • the size of the partial map data and the width of the overlapping area are not limited to the above example, and may be set arbitrarily.
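The selection of the partial map to refer to can be sketched as below, using the numbers from the example of FIG. 8 (50 m x 50 m partial maps, 5 m overlap in a 2x2 layout). The coordinate conventions and the keep-current-map rule inside the switching area are illustrative assumptions:

```python
def partial_map_index(x, y, map_size=50.0, overlap=5.0, current=(0, 0)):
    """Return the (column, row) index of the partial map to refer to.
    The AGV keeps its current partial map until the position leaves that
    map's extent, so no spurious back-and-forth switching happens while
    it is inside the 5 m map switching area."""
    cx, cy = current
    def axis_index(p, c):
        lo = c * (map_size - overlap)      # extent of the current map on one axis
        hi = lo + map_size
        if p < lo:
            return c - 1                   # left/below: switch to the previous map
        if p >= hi:
            return c + 1                   # right/above: switch to the next map
        return c                           # still inside: keep the current map
    return axis_index(x, cx), axis_index(y, cy)

# Traveling right from map M1 (index (0, 0)): the switch happens only
# after leaving M1 entirely, i.e. at x >= 50 m.
assert partial_map_index(47.0, 10.0, current=(0, 0)) == (0, 0)  # in the switching area
assert partial_map_index(51.0, 10.0, current=(0, 0)) == (1, 0)  # switched to M2
```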
  • FIG. 9 shows a hardware configuration example of the operation management device 50.
  • the operation management apparatus 50 includes a CPU 51, a memory 52, a position database (position DB) 53, a communication circuit 54, a map database (map DB) 55, and an image processing circuit 56.
  • the CPU 51, the memory 52, the position DB 53, the communication circuit 54, the map DB 55, and the image processing circuit 56 are connected by a communication bus 57 and can exchange data with each other.
  • the CPU 51 is a signal processing circuit (computer) that controls the operation of the operation management device 50.
  • the CPU 51 is a semiconductor integrated circuit.
  • the memory 52 is a volatile storage device that stores a computer program that the CPU 51 executes.
  • the memory 52 can also be used as a work memory when the CPU 51 performs an operation.
  • the position DB 53 stores position data indicating each position that can be a destination of each AGV 10.
  • the position data may be represented, for example, by coordinates virtually set in the factory by the administrator. Location data is determined by the administrator.
  • the communication circuit 54 performs wired communication conforming to, for example, the Ethernet (registered trademark) standard.
  • the communication circuit 54 is connected to the access point 2 (FIG. 1) by wire, and can communicate with the AGV 10 via the access point 2.
  • the communication circuit 54 receives data to be transmitted to the AGV 10 from the CPU 51 via the bus 57.
  • the communication circuit 54 also transmits data (notification) received from the AGV 10 to the CPU 51 and / or the memory 52 via the bus 57.
  • the map DB 55 stores data of an internal map of a factory or the like on which the AGV 10 travels.
  • the map may be the same as or different from the map 40 (FIG. 7F).
  • the data format is not limited as long as the map has a one-to-one correspondence with the position of each AGV 10.
  • the map stored in the map DB 55 may be a map created by CAD.
  • the position DB 53 and the map DB 55 may be constructed on a non-volatile semiconductor memory, or may be constructed on a magnetic recording medium represented by a hard disk or an optical recording medium represented by an optical disc.
  • the image processing circuit 56 is a circuit that generates data of an image displayed on the monitor 58.
  • the image processing circuit 56 operates only when the administrator operates the operation management device 50. In the present embodiment, particularly the detailed description is omitted.
  • The monitor 58 may be integrated with the operation management device 50. Further, the CPU 51 may perform the processing of the image processing circuit 56.
  • FIG. 10 schematically shows an example of the movement route of the AGV 10 determined by the operation management device 50.
  • The outline of the operation of the AGV 10 and the operation management device 50 is as follows. In the following, an example will be described in which the AGV 10 is currently at position M1 and travels through several positions to the final target position Mn+1 (n: a positive integer).
  • In the position DB 53, coordinate data indicating positions such as the position M2 to be passed next after the position M1 and the position M3 to be passed next after the position M2 are recorded.
  • The CPU 51 of the operation management device 50 reads out the coordinate data of the position M2 with reference to the position DB 53, and generates a travel command directed to the position M2.
  • the communication circuit 54 transmits a traveling command to the AGV 10 via the access point 2.
  • the CPU 51 periodically receives data indicating the current position and attitude from the AGV 10 via the access point 2.
  • the operation management device 50 can track the position of each AGV 10.
  • When the CPU 51 determines that the current position of the AGV 10 matches the position M2, it reads the coordinate data of the position M3, generates a travel command directed to the position M3, and transmits it to the AGV 10. That is, when it determines that the AGV 10 has reached a certain position, the operation management device 50 transmits a traveling command directed to the next passing position.
  • As a result, the AGV 10 can reach the final target position Mn+1.
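The dispatch loop described above can be sketched as follows. All names are hypothetical, since the patent does not define a concrete API; the AGV simulation simply jumps to each commanded target:

```python
def run_operation_management(position_db, send_travel_command, get_current_position,
                             n, tolerance=0.1):
    """Sketch of the loop of the operation management device 50: read the
    coordinates of the next passing position M2, M3, ... from the position DB,
    send a travel command, and wait until the reported current position
    matches before issuing the command for the next position."""
    for k in range(2, n + 2):                  # positions M2 .. M(n+1)
        target = position_db[k]                # coordinate data of position Mk
        send_travel_command(target)
        while True:
            x, y = get_current_position()      # reported periodically by the AGV
            if abs(x - target[0]) <= tolerance and abs(y - target[1]) <= tolerance:
                break                          # reached Mk; proceed to the next one

# Minimal simulated AGV that moves straight to each commanded target.
db = {2: (1.0, 0.0), 3: (2.0, 0.0), 4: (2.0, 2.0)}
state = {"pos": (0.0, 0.0)}
log = []
def send(target):
    state["pos"] = target
    log.append(target)

run_operation_management(db, send, lambda: state["pos"], n=3)
```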
  • the passing position and the target position of the AGV 10 described above may be referred to as a “marker”.
  • A more specific example of the map creation system 200 according to the present embodiment will now be described.
  • FIG. 11A is an overhead view of the moving space S in which a plurality of fixed sensors 103 are installed.
  • FIG. 11B shows a configuration example of the map creation system 200 including the fixed sensors 103 and the map creation device 105.
  • the map creating device 105 is connected to the access point 2 and the operation management device 50 via the hub 3.
  • Each fixed sensor 103 is fixedly installed in the moving space S, senses the part of the moving space S included in its field of view at a plurality of different times, and wirelessly outputs the sensor data of each time to the map creation device 105.
  • As the fixed sensor 103, the above-mentioned “external sensor” can be used.
  • One example of the fixed sensor 103 is a laser range finder. Assume that the fixed sensor 103 starts scanning at time T and outputs data on the position of the reflection point in the direction determined by the angle, for a total of 1081 steps at every 0.25 degrees.
  • the output data is a point cloud (Point Cloud) corresponding to a collection of reflection points of the laser beam.
  • a set of these data is called "sensor data at time T”.
  • the number of sensor data at time T is at most 1081. If the reflected light does not return, the number of sensor data is smaller than 1081.
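The "sensor data at time T" described above can be sketched as the conversion of one scan into map-coordinate reflection points. The start-angle convention (the scan centered on the sensor heading) is an illustrative assumption:

```python
import math

def scan_to_points(distances, sensor_x, sensor_y, sensor_theta,
                   step_deg=0.25, steps=1081):
    """Convert one scan of the fixed sensor 103 into 'sensor data at time T':
    up to 1081 reflection points (one per 0.25-degree step, 270 degrees in
    total) in map coordinates. A distance of None models a step whose
    reflected light did not return; such a step yields no point, so the
    result can contain fewer than 1081 points."""
    start = sensor_theta - math.radians(step_deg * (steps - 1) / 2.0)
    points = []
    for i, d in enumerate(distances):
        if d is None:                          # no reflected light for this step
            continue
        a = start + math.radians(step_deg * i)
        points.append((sensor_x + d * math.cos(a), sensor_y + d * math.sin(a)))
    return points

# Example: a scan in which only 3 of the 1081 steps returned a reflection.
distances = [None] * 1081
distances[0] = 1.0
distances[540] = 2.0      # the center step points along the sensor heading
distances[1080] = 1.0
pts = scan_to_points(distances, 0.0, 0.0, 0.0)
```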
  • That the fixed sensor 103 is “fixed” in the movement space S means that the position and angle (attitude) of each fixed sensor 103 do not change, at least during scanning.
  • the fixed sensor 103 may not be firmly fixed to the floor as long as the position and posture do not change.
  • the laser range finder 15 built in the stationary AGV 10 may be used as the fixed sensor 103.
  • The access point 2 receives the wireless signals of the sensor data of each time output from each fixed sensor 103, and transmits the received sensor data to the map creation device 105 via the hub 3.
  • The map creation device 105 creates local map data, that is, a map (local map) of a part of the moving space S, based on the positions of fixed objects commonly included in all the sensor data of each time output from each fixed sensor 103.
  • the map creating device 105 updates the map of the moving space S using the local map.
  • the operation management device 50 updates the map managed by itself with the received map.
  • the operation management device 50 transmits, for example, the updated map to one or more AGVs managed by itself. This allows each AGV to estimate its own position by referring to the latest map.
  • the fixed sensor 103 includes a CPU 201, a memory 203, a communication circuit 205, a drive circuit 207, and a laser device 209.
  • the CPU 201 is a semiconductor integrated circuit that controls the operation of the fixed sensor 103.
  • the memory 203 is a storage device that temporarily accumulates the acquired sensor data.
  • the communication circuit 205 transmits sensor data by wireless communication conforming to the Wi-Fi (registered trademark) standard.
  • the drive circuit 207 generates a drive current to be supplied to the laser device 209 in accordance with a control signal from the CPU 201.
  • the laser device 209 emits an infrared or visible laser beam 15b based on the supplied drive current.
  • the laser device 209 has a light detector (not shown), and detects the reflected light of the emitted laser beam 15b.
  • the CPU 201 generates a timing at which a current flows to the drive circuit 207 and a control signal indicating a current value, and drives the drive circuit 207 by the control signal.
  • the CPU 201 also calculates the distance to the reflection point using the difference between the phase of the emitted laser beam 15 b and the phase of the reflected light acquired by the light detector of the laser device 209. Since the method of calculating the distance is well known, the description in the present specification is omitted.
  • the frequency of the laser beam 15b may not be one, but may be plural.
  • the direction in which the reflected light of the laser beam 15 b has arrived represents the direction in which the object is present as viewed from the fixed sensor 103.
  • the configuration of the laser range finder 15 (FIG. 6A, FIG. 6B, etc.) is also equivalent to that of the fixed sensor 103 described above.
  • FIG. 14 shows objects 221 and 223 existing in the movement space S, and fixed sensors 103p to 103s including the objects 221 or 223 in the field of view.
  • the field of view of each of the fixed sensors 103p to 103s is indicated by a fan centered on each of the fixed sensors.
  • the radius of the sector corresponds to the detectable distance of each fixed sensor, but the size of the radius of the sector shown is only an example.
  • the fixed sensor 103p detects the object 221, and the fixed sensors 103q to 103s detect the object 223.
  • the fixed sensor 103p can detect the direction in which the object 221 exists and the distance to the object 221. The same applies to fixed sensors 103 q to 103 s.
  • FIG. 15 shows a local map 41 including sensor data output from each of the fixed sensors 103p to 103s in a certain period.
  • the point cloud which is not present in the overall map obtained in advance but appears specifically in the sensor data is surrounded by alternate long and short dash lines 231 and 233.
  • The point groups inside the ellipses of the alternate long and short dash lines 231 and 233 are candidates for movable objects.
  • the sensor data in the alternate long and short dash line 231 is a set (point group) of reflection points output from the fixed sensor 103p, and indicates the position of the object 221.
  • the sensor data in the alternate long and short dash line 233 is a set (point group) of reflection points output from the fixed sensors 103 q to 103 s, and indicates the position of the object 223.
  • each of the fixed sensors 103p to 103s performs sensing once every hour for 24 hours.
  • Suppose that the local map 41 is obtained in the first eight hours, the local map 40 (FIG. 7F) in the next eight hours, and the local map 41 again in the remaining eight hours.
  • the objects 221 and 223 are “movable objects” because they are not included in common in all the sensor data of each time but are included in only a part.
  • the local map 40 is reflected on the entire map of the movement space S. That is, the position of the movable object is not reflected on the map of the movement space S, and only the position of the fixed object is reflected on the entire map of the movement space S.
  • the map generation device 105 updates the local map 41 to the local map 40 not including the movable object.
  • FIG. 16 shows the local map 40 after the update. It can be seen that no sensor data exists in the regions surrounded by the one-dot chain lines 241 and 243 of the local map 40, and that the movable objects 221 and 223 included in the local map 41 (FIG. 15) are not reflected.
  • the map generation device 105 maintains the local map 40 without updating it.
  • FIG. 17 is a flowchart showing the procedure of the process of the map creating device 105.
  • One or more fixed sensors 103 sense a part (partial space) of the movement space S at each time Tn (n: 1, ..., N; N is an integer of 2 or more) and output sensor data.
  • the series of times T n may be periodic or random.
  • Each fixed sensor does not have to sense simultaneously at time Tn. It suffices that the map creation device 105 can process the sensor data received from each fixed sensor within a predetermined period.
  • In step S10, the map creation device 105 receives the sensor data at each time Tn from the fixed sensors.
  • the signal processing circuit 109 stores the received sensor data in the storage device 107.
  • In step S11, the signal processing circuit 109 determines that an object appearing in the sensor data at all times from time T1 to time TN is a fixed object, and that an object appearing in the sensor data at only some times is a movable object.
  • For example, the signal processing circuit 109 can determine that an object indicated by the data obtained by computing the logical AND of the sensor data from time T1 to time TN is a fixed object.
  • the basis of this calculation is based on the fact that the position of the fixed object does not substantially change in the obtained sensor data, since the position and orientation of the fixed sensor 103 do not change.
  • The signal processing circuit 109 can determine that an object appearing in the data obtained by computing the logical AND of the negation of the above logical product with each of the sensor data from time T1 to time TN is a movable object.
  • In step S12, the signal processing circuit 109 creates a local map based on the positions of the fixed objects. Since the data obtained by the above logical AND calculation represents the positions of the fixed objects, the signal processing circuit 109 can create a local map using the calculation result. If the created local map already matches the corresponding part of the whole map, the map is not updated; if it does not match, the corresponding part of the whole map may be updated to the created local map.
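Steps S11 and S12 can be sketched on toy occupancy grids. The grid representation itself is an assumption introduced for illustration (the patent works on point clouds), but the logical AND over all times is as described:

```python
import numpy as np

# Toy occupancy grids of the same partial space at times T1..TN (True =
# a reflection point fell in that cell). Cell (0, 1) is occupied at every
# time (a fixed object such as a wall); cell (2, 2) only at some times
# (a movable object such as a temporarily placed load).
scans = [
    np.array([[0, 1, 0], [0, 0, 0], [0, 0, 1]], dtype=bool),
    np.array([[0, 1, 0], [0, 0, 0], [0, 0, 1]], dtype=bool),
    np.array([[0, 1, 0], [0, 0, 0], [0, 0, 0]], dtype=bool),
]

# Step S11: the logical AND over all times keeps only the fixed objects,
# because the position and attitude of the fixed sensor do not change.
fixed = np.logical_and.reduce(scans)

# Cells that appeared in some scan but not in every scan are candidate
# movable objects: (any over time) AND NOT (all over time).
movable = np.logical_or.reduce(scans) & ~fixed

# Step S12: the fixed-object cells become the local map.
local_map = fixed
```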
  • In step S11, there is a possibility that a fixed object present behind a movable object may be determined to be a movable object.
  • While the laser beam of the fixed sensor is blocked by the movable object, it cannot reach the fixed object; when the movable object is removed, the laser beam reaches the fixed object existing behind it.
  • The set (point group) of the reflection points of the laser beam then reflects the position of the fixed object, but it is not present in all of the sensor data from time T1 to time TN obtained so far.
  • the fixed object may be misidentified as a newly placed moving object.
  • Consider a case where a certain fixed sensor 103 outputs sensor data at times t1 and t2 (t1 < t2), respectively, and the two sensor data do not match.
  • Sensor data after time t2 can further be utilized to determine whether a point group at a relatively distant position continues to be present at the same position; if it does, it can be determined that the point group is a fixed object.
  • Sensor data before time t1 can likewise be utilized to determine whether a point group at a relatively distant position was present at the same position before; if it was, it can be determined that the point group is a fixed object.
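The persistence check described above can be sketched as a decision rule over a time series of observations. The rule below (judge a newly appeared point group fixed if it stays at the same position in every scan after its first appearance) is a sketch of the criterion, not a literal implementation from the patent:

```python
def is_fixed_object(presence):
    """Decide whether a point group that newly appeared (for example a fixed
    object that had been hidden behind a movable object) is fixed.
    `presence` is a list of booleans, one per scan time, True when the point
    group is observed at the same position. The point group is judged fixed
    if it continues to be present in every scan after its first appearance."""
    if True not in presence:
        return False
    first = presence.index(True)
    return all(presence[first:])

# Hidden wall: absent while a load blocked the beam, then always present.
assert is_fixed_object([False, False, True, True, True])
# A load that appears and is later removed again is movable.
assert not is_fixed_object([False, True, True, False, False])
```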
  • FIG. 18 shows the transition of sensor data output from the fixed sensor 103 when a plurality of objects appear at different times and different positions.
  • the example shown in FIG. 18 is an example based on sensor data output from all the fixed sensors including the fixed sensors 103p to 103s (FIG. 14). In addition, it is assumed that the entire map of the moving space S is prepared in advance.
  • the signal processing circuit 109 reflects the obtained map data on a part of the entire map of the movement space S.
  • FIG. 19 shows an example of a local map 42 in which a fixed object and a movable object are shown distinguishably.
  • The position of a fixed object such as a wall is represented by a set of dots 261.
  • the position of the movable object is represented by a set of marks 263 indicated by “X” (refer to the circle in a dashed dotted line).
  • the operation management apparatus 50 or a person can easily recognize the position of the movable object and determine the traveling route.
  • the manager can easily recognize the position at which a movable object such as a load is likely to be placed, and can adjust the position at which the load is temporarily placed so as not to hinder the traveling of the AGV 10. This makes it easy to improve the working environment of people and the AGV 10.
  • The position of a movable object whose period of existence is equal to or longer than a certain period may be displayed on the local map, while the position of a movable object whose period of existence is shorter than that period may not be displayed.
  • Suppose that the point group of a movable object is included in K pieces of the N sensor data (K: an integer of 1 or more and less than N).
  • the value K is referred to as the “presence frequency” of sensor data of a movable object.
  • the existence frequency can be said to be a ratio of scan data including a point cloud of a movable object to N scan data.
  • the signal processing circuit 109 displays the position of the movable object in a distinguishable manner on the local map when the value of the presence frequency K is equal to or more than the threshold.
  • Each point cloud represents the position of the movable object.
  • Suppose that the presence frequency of the movable object corresponding to the point group within the alternate long and short dash line 231 is equal to or greater than the threshold, and that the presence frequency of the movable object corresponding to the point group within the alternate long and short dash line 233 is less than the threshold.
  • a local map at this time is shown in FIG.
  • The local map 43 includes the point group within the alternate long and short dash line 231, which indicates the position of the movable object whose presence frequency is equal to or higher than the threshold, but does not include the point group within the alternate long and short dash line 233, which indicates the position of the movable object whose presence frequency is less than the threshold.
  • the concept of the presence frequency K can be extended to the entire moving space S to create a local map or a map attached to the local map.
  • the presence frequency K of the point group of the movable object is an integer value of 1 or more and less than N.
  • The value of the presence frequency K of a point group relating to free space, that is, a space where no object exists, such as a passage through which the AGV 10 can travel, is 0.
  • FIG. 21 shows a grid map 44 in which a square is displayed darker as the value of the presence frequency K is larger.
  • color differences are represented by pattern differences, but it is possible, for example, to change the density in gray scale.
  • The signal processing circuit 109 divides the movement space S into a grid, determines a representative value of K for each square, and creates the grid map 44 by rendering each square with a density corresponding to the value of K.
  • A passage (free space) in which the AGV 10 can travel is expressed in white, and a position where an immovable fixed object is present is displayed in a color darker than white. The more frequently a movable object is present in a square, the darker the square is displayed.
  • In the grid map 44, even if a square is not white, if its density is low it can be determined that the passage is only slightly affected by movable objects.
  • Instead of gray scale, each square may be represented by a color according to the value of the presence frequency K.
  • the grid map 44 may be created separately from the local map, or may be created as attribute information attached to the local map.
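The per-square presence frequency and its density rendering can be sketched on a toy example; the numbers and the 0.5 threshold below are illustrative assumptions:

```python
import numpy as np

# Presence frequency K per square: out of N scans, in how many did a point
# group fall in the square? Toy 3x3 grid with N = 8 scans.
N = 8
K = np.array([[0, 0, 0],
              [0, 3, 0],      # a spot where a load is sometimes placed
              [0, 8, 8]])     # squares occupied in every scan (fixed objects)

# Grid map 44: render each square with a density proportional to K.
# 0.0 = white (free passage), 1.0 = darkest (always occupied).
density = K / N

# Squares with low but nonzero density are passages only slightly
# affected by movable objects.
lightly_affected = (K > 0) & (density < 0.5)
```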
  • the AGV 10 can also use the value of the presence frequency K as the “weight” of the position corresponding to each square.
  • the AGV 10 may select a path with a small weight. Alternatively, by using, for example, the reciprocal of the presence frequency K, the weight may be made larger for squares that have less influence on traveling.
  • FIG. 22 is a flow chart showing the procedure of processing of the map creating device 105 for creating a map displaying a fixed object and a movable object.
  • the procedure from the start of the process to step S11 is the same as the procedure of the process in FIG.
  • in step S13, the signal processing circuit 109 creates a local map in which the fixed objects and the movable objects are displayed so as to be distinguishable, according to the determination result in step S11.
  • fixed objects and movable objects can be made distinguishable by, for example, different characters, icons, presence frequencies, or densities corresponding to the presence frequency.
  • the installation position of the fixed sensor 103 has not been specifically discussed; however, if it is installed at a position where relatively many movable objects exist, the local map can be updated more quickly to reflect those movable objects.
  • the position estimation device 14e of the AGV 10 generates “reliability” data indicating the degree to which the local map data matches the map data M.
  • when the AGV 10 actually travels, it can be estimated that a position or region with lower reliability contains more movable objects.
  • the AGV 10 is made to travel in the environment in which it is actually used, and a log of the reliability data is stored in the storage device of the AGV 10.
  • the log may be analyzed later, and one or more fixed sensors 103 may be installed so that their fields of view include a position or region whose reliability is less than or equal to a predetermined threshold.
  • the map creation device 105 is provided separately from the AGV 10 and the operation management device 50.
  • however, the operation of the map creation device 105 may be executed by a microcomputer, a CPU, or the like in the AGV 10 or in the operation management device 50.
  • an AGV traveling in a two-dimensional space is taken as an example.
  • the present disclosure can also be applied to a mobile object moving in three-dimensional space, such as a flying object (drone).
  • when a drone creates a map while flying, the two-dimensional moving space described above can be expanded to a three-dimensional space.
  • the mobile body and mobile body management system of the present disclosure can be suitably used for moving and transporting objects such as luggage, parts, and finished products in factories, warehouses, construction sites, logistics facilities, hospitals, and the like.
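The grid-map idea sketched in the bullets above — counting, per square, in how many of the N scans an object point was observed, then using that presence frequency K both as a display density and as a path weight — can be illustrated with a short sketch. This is not code from the patent; the helper `cell_of`, the data layout, and the weight formula are assumptions for illustration only.

```python
import numpy as np

def build_grid_map(scans, grid_shape, cell_of):
    """For each square, count in how many of the N scans an object
    point fell into it -- the presence frequency K of that square.
    `scans` is a list of point clouds; `cell_of` maps a point to a
    (row, col) square index (both hypothetical helpers)."""
    k = np.zeros(grid_shape, dtype=int)
    for cloud in scans:
        hit = np.zeros(grid_shape, dtype=bool)
        for p in cloud:
            hit[cell_of(p)] = True   # count each square at most once per scan
        k += hit
    return k, len(scans)

def cell_weight(k, n_scans):
    """Use K as a path-planning weight: free space (K == 0) costs
    nothing, a fixed object (K == N) costs most, and a movable
    object falls in between, so a planner minimising total weight
    prefers squares rarely occupied by movable objects."""
    return k / n_scans
```

A display routine could likewise map these weights to gray-scale densities (white for weight 0, darkest for weight 1), matching the description of the grid map 44.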
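The reliability-log workflow described above — drive the AGV in its actual environment, log the reliability per position, then install fixed sensors where reliability is low — could be sketched as follows. The log format, cell-based aggregation, and cell size are hypothetical, not taken from the patent.

```python
from collections import defaultdict

def low_reliability_regions(log, threshold, cell_size=1.0):
    """Aggregate a travel log of (x, y, reliability) samples into
    grid cells and return the cells whose mean reliability is at or
    below `threshold` -- candidate fields of view for additional
    fixed sensors 103."""
    sums = defaultdict(lambda: [0.0, 0])
    for x, y, r in log:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell][0] += r   # running total of reliability samples
        sums[cell][1] += 1   # sample count for this cell
    return {cell: total / count
            for cell, (total, count) in sums.items()
            if total / count <= threshold}
```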

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Mathematical Physics (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Instructional Devices (AREA)

Abstract

The invention concerns a map creation system (101, 200) comprising: a fixed sensor (103), permanently installed in a space in which a mobile body moves, the fixed sensor sensing a portion of the space at a plurality of different times and outputting sensor data for each time; and a map creation device for receiving the sensor data for each time and creating a map of at least the portion of the space. The sensor data for each time (231, 223) indicates the positions of objects (221, 223) present in the portion of the space at each time. The map creation device (105) comprises a storage device (107) for storing the sensor data for each time, and a signal processing circuit (109) for creating local map data, constituting a map of the portion of the space, on the basis of the positions of fixed objects included in common in all of the sensor data for each time.
PCT/JP2018/032448 2017-09-13 2018-08-31 Système et dispositif de création de carte WO2019054209A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019541997A JPWO2019054209A1 (ja) 2017-09-13 2018-08-31 地図作成システムおよび地図作成装置

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017175646 2017-09-13
JP2017-175646 2017-09-13

Publications (1)

Publication Number Publication Date
WO2019054209A1 true WO2019054209A1 (fr) 2019-03-21

Family

ID=65723937

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/032448 WO2019054209A1 (fr) 2017-09-13 2018-08-31 Système et dispositif de création de carte

Country Status (2)

Country Link
JP (1) JPWO2019054209A1 (fr)
WO (1) WO2019054209A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003300186A (ja) * 2002-04-03 2003-10-21 Mitsubishi Heavy Ind Ltd 移動ロボットシステム
JP2010277548A (ja) * 2009-06-01 2010-12-09 Hitachi Ltd ロボット管理システム、ロボット管理端末、ロボット管理方法およびプログラム
US20140207280A1 (en) * 2013-01-18 2014-07-24 Irobot Corporation Environmental management systems including mobile robots and methods using same
JP2017045447A (ja) * 2015-08-28 2017-03-02 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America 地図生成方法、自己位置推定方法、ロボットシステム、およびロボット
JP2017097000A (ja) * 2015-11-18 2017-06-01 株式会社明電舎 局所地図作成装置および局所地図作成方法

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020202510A1 (fr) * 2019-04-03 2020-10-08 三菱電機株式会社 Dispositif de distribution de données cartographiques pour corps mobile et système de corps mobile
CN113711153A (zh) * 2019-04-17 2021-11-26 日本电产株式会社 地图制作系统、信号处理电路、移动体和地图制作方法
CN113711153B (zh) * 2019-04-17 2024-04-19 日本电产株式会社 地图制作系统、信号处理电路、移动体和地图制作方法
JP2021103482A (ja) * 2019-12-25 2021-07-15 財団法人車輌研究測試中心 自己位置推定方法
JP7092741B2 (ja) 2019-12-25 2022-06-28 財団法人車輌研究測試中心 自己位置推定方法
JP2021196214A (ja) * 2020-06-11 2021-12-27 トヨタ自動車株式会社 位置推定装置及び位置推定用コンピュータプログラム
JP7287353B2 (ja) 2020-06-11 2023-06-06 トヨタ自動車株式会社 位置推定装置及び位置推定用コンピュータプログラム
WO2021256322A1 (fr) * 2020-06-16 2021-12-23 ソニーグループ株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme de traitement d'informations
WO2024076020A1 (fr) * 2022-10-07 2024-04-11 삼성전자 주식회사 Procédé et serveur pour générer une carte spatiale

Also Published As

Publication number Publication date
JPWO2019054209A1 (ja) 2020-10-29

Similar Documents

Publication Publication Date Title
JP7168211B2 (ja) 障害物の回避動作を行う移動体およびそのコンピュータプログラム
US20190294181A1 (en) Vehicle, management device, and vehicle management system
WO2019054209A1 (fr) Système et dispositif de création de carte
US20200264616A1 (en) Location estimation system and mobile body comprising location estimation system
JP7081881B2 (ja) 移動体および移動体システム
WO2019026761A1 (fr) Corps mobile et programme informatique
US20200110410A1 (en) Device and method for processing map data used for self-position estimation, mobile body, and control system for mobile body
US20200363212A1 (en) Mobile body, location estimation device, and computer program
WO2019187816A1 (fr) Corps mobile et système de corps mobile
JP7136426B2 (ja) 管理装置および移動体システム
JP2019053391A (ja) 移動体
JP2019148870A (ja) 移動体管理システム
US11537140B2 (en) Mobile body, location estimation device, and computer program
JP2019175137A (ja) 移動体および移動体システム
JP2019175136A (ja) 移動体
WO2019194079A1 (fr) Système d'estimation de position, corps mobile comprenant ledit système d'estimation de position, et programme informatique
JP2019179497A (ja) 移動体および移動体システム
JP2019079171A (ja) 移動体
JP2020166702A (ja) 移動体システム、地図作成システム、経路作成プログラムおよび地図作成プログラム
JP2019067001A (ja) 移動体
WO2020213645A1 (fr) Système de création de carte, circuit de traitement de signal, corps mobile et procédé de création de carte
CN112578789A (zh) 移动体
WO2019069921A1 (fr) Corps mobile
JP2019148871A (ja) 移動体および移動体システム
WO2019059299A1 (fr) Dispositif de gestion opérationnelle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18856992

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019541997

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18856992

Country of ref document: EP

Kind code of ref document: A1